
What is meant by the term "the culture wars"?


samiee | Student, Undergraduate | (Level 1) eNoter

Posted December 5, 2012 at 3:10 AM via web


1 Answer


pohnpei397 | College Teacher | (Level 3) Distinguished Educator

Posted December 5, 2012 at 3:23 AM (Answer #1)


The term “culture wars” refers to conflicts within American society that have existed at varying levels of intensity since the Vietnam era and the time of the counterculture. These are conflicts over what American society and culture should be like.

In the 1960s, American society changed more than it had at any time since the 1920s. African Americans and other minority groups pushed for civil rights. Women began to seek a place in society more equal to that of men. New attitudes toward authority and patriotism emerged as well. Together, these changes struck at many of the foundations of the existing American cultural order.

Since then, there have been “culture wars” between those who approve of and support these changes and those who would like to see a return to traditional ways. Today, the most prominent of these conflicts revolve around gay rights and abortion, though issues of race and of women's place in society remain important as well. Democrats, for example, argued in the most recent election that Republicans were waging a “war on women.” These conflicts between proponents of traditional ways and proponents of new ways are what is meant by the culture wars.
