How did America change after WWI? 

Expert Answers


One very important change America experienced after World War I was in the presidency. President Woodrow Wilson, the last president of the Progressive Era, ended his presidency in failure. He tried repeatedly to get the United States to join the League of Nations; he failed, and his efforts toward that goal led to his physical collapse.

Warren G. Harding, Wilson's successor in the White House, was different from Wilson in nearly every respect. In his campaign for the presidency, Harding promised a "return to normalcy." After his electoral triumph, he sought to undo the achievements of the progressive movement; for instance, he lowered taxes on affluent Americans. Harding served a little more than two years before dying in office, but his brief tenure marked a complete break from previous domestic and foreign policies.

Households changed in the 1920s as modern conveniences and entertainment became widely available. The washing machine, flush toilet, and electricity became much more common. Many families' incomes rose during this decade, so they could enjoy many of the new products on the market. Families gathered around the radio to listen to newscasts or sporting events. Calvin Coolidge, Harding's successor, became the first president to speak to the nation on the radio.

Automobiles changed the culture of the country. Americans could commute to work or drive to one of the new movie cinemas. The Model T became affordable for middle-class Americans.

Airplanes had been used extensively in World War I, and their use accelerated after it ended. Airports were built across the country. Finally, Charles Lindbergh electrified the nation in 1927 when he flew non-stop from New York to Paris.

Approved by eNotes Editorial Team

World War I impacted and changed the United States in a number of notable ways. For starters, it signaled the country's entry onto the world stage. For much of the previous generation, the United States had tried not to involve itself in the affairs of other countries. Although it had begun pursuing imperialist ventures abroad, it tended to stay out of other countries' business when not directly concerned.

This war changed that. It was the first time the United States became involved in world events on such a scale. While many Americans went to war to counter the threat of Germany, the country's postwar mission was to create a world order safe for democracy. After the war, the country briefly returned to its isolationist stance, but it would not be long before the United States became a world player again.

At home, World War I changed much of the domestic landscape. For one thing, it spurred the Great Migration. African Americans from the rural South began moving to the urban North and Midwest in large numbers to fill job vacancies left by soldiers overseas. When the war ended, many stayed, signaling a huge demographic shift. This resulted in a lasting change in African American culture and politics that still reverberates today.



The United States changed significantly after World War I ended. Prior to the war, the United States had pursued an expansionist foreign policy abroad and reform at home. It had expanded its influence in Asia, the Caribbean, and South America, while at home the Progressive Movement attempted to eliminate child labor, clean up politics, control the actions of big businesses, help working-class people, and protect the environment. When Germany interfered with the American right to trade after World War I began in Europe, the United States joined the war in 1917 against Germany and the Central Powers.

After World War I ended, the people of the United States didn't want the country to be so involved in world affairs, and the American people began to look inward. The people also didn't want the government to be so actively involved in the economy and in dealing with social issues. The United States passed strict immigration laws, the Emergency Quota Act and the National Origins Act. The Dawes Plan was adopted to help deal with the issue of Germany's reparations payments. The Washington Naval Conference was held, leading to agreements limiting the naval armaments of the major powers. The Kellogg-Briand Pact outlawed war, although it contained no enforcement mechanism. The government began to pursue a laissez-faire policy toward economic issues and the actions of businesses. Companies were given more latitude to run their businesses, and there was less government involvement in the economy. Taxes were cut with the belief that the prosperity of businesses would trickle down to the average American.

The terms the Roaring Twenties and the Jazz Age are used to describe this decade. Americans wanted to enjoy life and not be so concerned about world affairs and problems facing the country.


The period after World War I is called the Roaring Twenties in American history. It is a period marked by a sharp uptick in the American economy, especially with respect to manufacturing and investment. The United States became a consumer-driven economy, and there was an attitude that prosperity was boundless. Americans began to use consumer credit to finance their purchases. In general, the Republican presidential administrations took a laissez-faire approach to big business and the economy.

The decade was also a period of tremendous cultural conflict. The fear of communism and the influx of immigrants during the decade led to a strong nativist sentiment. This xenophobia helped Ku Klux Klan membership escalate to historic levels.

There was also a conflict between the secular portion of the population and a growing Christian fundamentalist movement, evidenced by the sensationalism of the Scopes Trial and the passage of the Prohibition Amendment. The period was also one in which women and African Americans fought for political, social, and economic equality.


How did the U.S. change socially after World War I?

The United States changed socially in many ways after World War I ended. One way was in our treatment of immigrants.

After World War I ended, there was an effort on the part of the United States to pull back from our involvement in world affairs. We were concerned about the effects of immigration on our country, especially of immigrants coming from Southern and Eastern Europe. We were concerned that some of these immigrants were communists and anarchists. Partially because of these concerns, we passed very restrictive laws that greatly limited immigration to our country, especially from Southern and Eastern Europe. The Emergency Quota Act and the National Origins Act are two examples of laws that limited immigration to the United States.

Women's roles began to change. Women gained the right to vote in 1920. They began to appear in public in shorter dresses, were more outspoken than in the past, and smoked and drank in public. Women also increasingly worked outside the home.

Racial intolerance grew during this time. The Ku Klux Klan grew significantly after World War I ended. African-Americans suffered greatly, and lynching was common. Other groups, such as Catholics and Jews, also faced harassment and discrimination.

There were several social changes in our country after World War I ended.
