How did the U.S. change socially after World War I?

Expert Answers


The United States changed socially in many ways after World War I ended. One change was in how the country dealt with immigrants.

After World War I ended, the United States pulled back from involvement in world affairs. Many Americans worried about the effects of immigration on the country, especially immigration from Southern and Eastern Europe, fearing that some of these immigrants were communists and anarchists. Partly because of these concerns, Congress passed very restrictive immigration laws, such as the Emergency Quota Act of 1921 and the National Origins Act of 1924, which sharply limited immigration, particularly from Southern and Eastern Europe.

Women's roles also began to change. Women gained the right to vote in 1920 with the ratification of the Nineteenth Amendment. They wore shorter dresses, voiced their opinions more freely than in the past, smoked and drank in public, and increasingly worked outside the home.

Racial intolerance grew during this time as well. The Ku Klux Klan expanded significantly after World War I ended. African Americans suffered greatly, and lynching was common. Other groups, such as Catholics and Jews, also faced harassment and discrimination.

In these ways, restrictive immigration policy, changing roles for women, and rising intolerance reshaped American society after World War I.
