Where did WWI leave the USA domestically and abroad?  Now that WWI has ended, where does this leave us, the USA, both domestically and abroad? Please, I need help ASAP. Thanks.


pohnpei397 | College Teacher | (Level 3) Distinguished Educator


I will assume for the purpose of this answer that we are talking about the state of affairs right at or very soon after the end of the war.  In other words, before it was clear that the US would retreat into isolationism.

At the end of WWI, the US held a fairly strong position internationally.  The US had been the deciding factor in the war, and President Wilson had a clearly defined agenda that he wanted to accomplish in the peace talks.  Wilson had moral authority because of the way he had defined the US's war aims.  In addition, the other great powers had been economically weakened by the war, whereas the US had not.  Right after the war, then, the US seemed poised to play a major role in international affairs.

At home, the US faced a somewhat more uncertain future.  Many ex-soldiers would soon be coming home to rejoin society.  Would there be jobs for them?  Prohibition had been written into the Constitution.  What would its impact be when it took effect?  There were also tensions between immigrants and "natives," and a rising current of political radicalism among many of the immigrants.  All of this made the domestic situation seem somewhat unsettled at the end of the war.
