Did WWI substantially alter American society and culture or were its effects primarily an affair of the mind?
Few experiences in American history have altered American society more than World War One. The war came at the close of the Progressive Era, when many believed that human beings were "progressing." The utter devastation of the war--something never experienced before--led to disillusionment, skepticism, and cynicism. A group of American writers associated with the war, such as Hemingway, E. E. Cummings, and T. S. Eliot, left the country; Gertrude Stein called them the "lost generation." At home, there was a renewed interest in patriotism, fundamentalist religion, and even Prohibition. Some argue that it was Germany's attitude of superiority, reeking of Social Darwinism, that led to opposition to Darwinism altogether in this country. Even the rules of artistic expression were thrown out the window, and "modern" art was born. All this because of the profound destruction and agony caused by the war.
World War I had a profound effect on the mindset not only of American culture and beliefs but of European societies as well. It was such a huge catastrophe of destruction and death that a whole generation of Americans and Europeans became pacifists. The label "the war to end all wars" reflected the view of many people in the United States and abroad that governments had failed them, and that any and all efforts should be made to ensure that no similar disaster ever happened again.
In Russia, World War I's failures were part of the cause of a full-scale communist revolution. In France, communities of expatriate American writers wrote of the war's atrocities and the failures of American society. A new generation of diplomats attempted to end war permanently with the Kellogg-Briand Pact, arms control treaties, and the League of Nations. Although these efforts all failed, they accurately reflected the sea change in American and European attitudes toward war. Unfortunately, the change lasted only one generation before the Second World War dragged the United States and Europe into another disaster.
Are those two things really different? In many ways, society and culture are themselves "affairs of the mind," so this question seems to be asking you to differentiate between two things that are not necessarily distinct.
That said, I do think WWI helped alter American society and culture in important ways. In general, WWI was a disillusioning experience. People had believed that modern ways had made war a thing of the past and that society had progressed to the point where war and savagery were no longer necessary. The war destroyed those illusions.
In part because of that, the 1920s were a time of social change. People seemed less interested in progress (as the Progressives had been) and more interested in hedonism. It is as if they decided that trying to make a perfect world was pointless and they might as well just go out and have fun instead.
So, one can argue, WWI had a major effect on American society and culture.