One of the most important social shifts of the last 40 years has been the transformation of sex roles in the United States. This transformation has its origins before 1972, and it is still not complete. Even so, changing sex roles have already reshaped American society dramatically.
Since 1972, women have become far more equal to men in the eyes of society. Women are now expected to have careers almost as much as men are, and a woman was a serious candidate for the presidency in 2008. Meanwhile, more men are willing to share in raising their children and in housework. Let us look at a few causes of these changes:
- World War II. Many women took on “men’s” work while the men were away at war. Most had to give up these jobs after the war ended, but the experience expanded women’s horizons and expectations.
- Birth control. As contraception (particularly “The Pill”) became more available and reliable, women gained much greater control over their childbearing. This made it easier for them to work and eased the burden of housework and child care.
- The declining importance of physical labor. Over the last 40 years, more jobs have become “white collar.” In these jobs, physical strength is not an issue, so women are at no disadvantage.
Together, these factors created a situation in which the roles of men and women in society have become far less distinct from one another.