What is women's role in society based on the culture wars?  

Expert Answers
pohnpei397, Certified eNotes Educator

Some parts of the culture wars are being fought precisely over the status and role of women in society.  This means that the culture wars have not produced a settled role for women; rather, they are still being fought to determine what that role will be.

The liberal side of the culture wars generally wants women's roles in society to be essentially the same as men's.  That is, liberals want women to be able to do everything men can do, so long as individual women are capable of doing those things.  They want women to be able to serve in combat, for example, and to be free to pursue careers of their choice.

The conservative side of the culture wars is generally much more traditional.  Some conservatives would go so far as to say that women should mainly stay at home as wives and mothers.  Others do not go that far but still want to maintain traditional values to the greatest extent possible.

Overall, though, the role of women is gradually moving toward the liberal end of the spectrum.  Thirty years ago, many more conservatives were arguing that women should return to their roles as wives and mothers.  Today, the conservative side of the culture wars typically limits itself to opposing abortion and the most contested aspects of women's equality, such as women in combat.  Very few conservatives now argue that women should not have careers or that men should not help with housework.
