This is a very interesting question. After all, many people nowadays would claim that women and men are treated equally in the US. These people would argue that the gender norms women have been subjected to over the years are steadily fading, given the many advances in gender equality over the past decades.
On closer inspection, however, you will see that while this may appear to be the case on the surface, it does not hold up when you dig a bit deeper.
For example, while it has become somewhat more commonplace for new fathers to take some time off work when a baby arrives, it is still usually the mother who stays at home with the child for longer or who switches to part-time hours in order to spend more time with her child. Nothing stops a man from asking to reduce his hours to be with his children, but society would still regard this as very unusual.
Another example worth mentioning is that women are still expected to wear skirts or other typically feminine clothes when they work in an office. Yes, more and more women wear trousers these days, but those women are then often seen as pushy, dominant, and overly career-driven. And even when wearing a suit, women are sometimes still expected to pair it with heels in order to look smart.