Homework Help

Did WWI substantially alter American society and culture or were its effects primarily an affair of the mind?

metsmlz | Student, Grade 11 | eNotes Newbie

Posted January 19, 2011 at 2:44 PM via web

Did WWI substantially alter American society and culture or were its effects primarily an affair of the mind?

4 Answers

pohnpei397 | College Teacher | (Level 3) Distinguished Educator

Posted January 19, 2011 at 2:50 PM (Answer #2)

Are those two things really different?  In many ways, society and culture are "affairs of the mind" and so this question seems to be asking you to differentiate between two things that are not necessarily different.

I do think that WWI helped alter American society and culture in important ways. In general, WWI was a disillusioning experience. People had believed that modern ways had made war a thing of the past and that society had progressed to the point where war and savagery were no longer necessary. The war destroyed those illusions.

In part because of that, the 1920s were a time when society changed. People seemed less interested in progress (as the Progressives had been) and more interested in hedonism. It was as if they decided that trying to build a perfect world was pointless and they might as well just go out and have fun instead.

So, one can argue, WWI had a major effect on American society and culture.

brettd | High School Teacher | (Level 2) Educator Emeritus

Posted January 19, 2011 at 3:49 PM (Answer #3)

World War I had a profound effect on the mindset not only of American culture and beliefs but of European societies as well. It was such a huge catastrophe of destruction and death that a whole generation of Americans and Europeans became pacifists. Its reputation as the "war to end all wars" reflects the view of many people in the United States and abroad at the time that governments had failed them and that every effort should be made to ensure no similar disaster would ever happen again.

In Russia, World War I's failures were part of the cause of a full-scale communist revolution. In France, communities of expatriate American writers wrote of the war's atrocities and the failures of American society. A new generation of diplomats attempted to end war permanently with the Kellogg-Briand Pact, arms control treaties, and the League of Nations. Although these efforts all failed, they accurately reflected the sea change in American and European attitudes toward war. Unfortunately, that change lasted only one generation before the Second World War dragged the United States and Europe into another disaster.

larrygates | College Teacher | (Level 1) Educator Emeritus

Posted January 19, 2011 at 4:18 PM (Answer #4)

Few experiences in American history have altered American society more than World War One. The war came during the Progressive Era, when there was a widespread belief that human beings were "progressing." The utter devastation of the war--something never experienced before--led to disillusionment, skepticism, and cynicism. An entire group of American writers, many of whom had served in the war, such as Hemingway, cummings, and T.S. Eliot, left the country; Gertrude Stein called them the "lost generation." At home, there was a renewed interest in patriotism, fundamentalist religion, and even prohibition. There is some argument that it was the Germans' superior attitude, reeking of Social Darwinism, that led to opposition to Darwinism altogether in this country. Even the rules of artistic expression were thrown out the window, and "modern" art was born. All this because of the profound destruction and agony caused by the war.

litteacher8 | Middle School Teacher | (Level 1) Distinguished Educator

Posted August 11, 2011 at 11:54 AM (Answer #5)

World War I picked America up and shook it. We shed our ideals and changed our focus. Like a teenager after a near-death experience, we went from idealism to hedonism. All we cared about was living large, making money, and having fun. Ideals no longer mattered.
