Explain gender roles.

pohnpei397 | College Teacher | (Level 3) Distinguished Educator

Gender roles are the roles that society expects us to play based on our sex. These roles differ from one society to another, of course, and they can also change over time within the same society. However, they are always present: there are always expectations about how we are to act based on what sex we are.

In the United States, for example, gender roles for women are somewhat in flux right now. Traditionally, women's role has been to support men and to nurture their children. Women have been seen as the “gentler sex,” one that should generally not involve itself too much in public life. This role has changed a great deal in the last few decades; women are now very much expected to have careers. However, the role that women are expected to play has not completely changed. Women are still seen as more nurturing, and many people still think it is natural and appropriate for women to do more of the child-rearing and housework than men do.

Gender roles, then, are simply the roles that we are expected to fulfill based on our sex.
