The social ramifications of World War I were numerous. One of these was the emergence of the idea that the United States was now the world's preeminent power. With much of Europe shelled and gutted by the war, eyes turned to America as the social haven that could set the pace for the rest of the world. This was a stark change from before the war, when the world's attention had centered on Europe. America became the center of world celebrity and fame. At the same time, a growing fundamentalist movement emerged in America that wanted to return to "the way things should be." That theme is evident in the social rise of religious fundamentalism as well as nativism in American society.
Industry boomed during World War I. Manufacturing peaked as factories struggled to keep up with wartime demand, and that same pressure drove the development of many new technologies.
Women also entered the workforce. With so many men sent overseas to fight, there was little choice but to bring women into the factories. This began to change how women were viewed, and even how they dressed: women who had usually worn dresses began to wear pants and overalls for factory work.
At the end of the war the economy was not so great. Men returned from overseas and women had to leave the workforce, yet even then there were not enough jobs to go around, and the unemployment rate began to rise.
Also, women entered the industrial workplace in large numbers for the first time and challenged gender stereotypes. This was one factor that helped them finally achieve the right to vote with the Nineteenth Amendment. At least some women in the 1920s asserted themselves on a more equal footing with men, in part because of their role in World War I.
We also brought home a new generation of veterans, some broken mentally and physically by the war, at a time when we were not at all good at caring for them medically or psychologically.
There are quite a number of large issues to consider when looking at the effects of WWI on the social atmosphere in the United States. Even though the casualties the US suffered were small compared to those of the European nations, the backlash against Europe and against outsiders of any kind produced anger and resentment toward immigrants of almost every nationality, and it led Congress to push for tighter immigration quotas and other measures designed to limit the influx of foreigners.
There was an increase in the sense of isolationism, not just in the political sense, but also in the idea of communities taking care of their own and not worrying about everyone else.
There was also a great deal of racism and classism introduced into the world of college admissions, something detailed in Jerome Karabel's book "The Chosen."
You could also look at whether the rowdiness of returning soldiers contributed to the push toward Prohibition. There were other effects from the return of soldiers who had seen the horrible nature of war; for example, membership in peace advocacy groups surged.
To me, there were two major ramifications.
The first of these was an increased intolerance of immigrants and dissenters. Many people became deeply suspicious of them because of the propaganda the government put out during the war. This helped lead to such things as Prohibition, which was seen in part as a strike against German immigrants in particular.
The second was that large numbers of Black Americans began to move to the North to take factory jobs, the beginning of the Great Migration. That movement has had very long-lasting effects, and it also had short-term effects such as a few race riots during WWI.