History Questions and Answers


How did America change after WWI? 

Expert Answers

Susana Scanlon | Certified eNotes Educator


One very important change America experienced after World War I was in the presidency. President Woodrow Wilson, the last president of the Progressive Era, ended his presidency in failure. He tried repeatedly to get the United States to join the League of Nations; he failed, and his efforts toward that goal led to his physical collapse.

Warren G. Harding, Wilson's successor in the White House, was different from Wilson in nearly every respect. In his campaign for the presidency, Harding promised a "return to normalcy." After his electoral triumph, he sought to undo the achievements of the progressive movement; for instance, he lowered taxes on affluent Americans. Harding served just over two years before dying in office, but his brief tenure marked a complete break from previous domestic and foreign policies.

Households changed in the 1920s as modern conveniences and entertainment became widely available. The washing machine, flush toilet, and electricity became much more common. Heads of...




sophie-carter | Student

WWI opened doors for radical change in the United States, such as advocacy for progressive policy regarding women's suffrage and an economic boom and gave way to the "roaring 20s." At the same time, however, the migration of African-Americans from the South to occupy the jobs of those from the North who joined the military lead to pushback from conservative groups like the Ku Klux Klan, making social issues more tumultuous than they had been since the Civil War.