World War I


Who Won WW1

Which country won WWI?

Expert Answers

David Morrison eNotes educator | Certified Educator

Educator since 2017

9,184 answers

Top subjects are Literature, History, and Law and Politics

The Allied Powers won the First World War. They included Great Britain, France, Italy, Russia, and from 1917, the United States. What was unusual about the Allied victory was that it didn't take place on enemy soil. The territory of the main Central Power, Germany, wasn't invaded or taken over by enemy forces as would normally have been the case.

This gave rise to a persistent myth, the so-called "stab in the back" legend, which would later be ruthlessly exploited for political gain by the Nazis. According to its proponents, Germany hadn't really lost the war on the battlefield; her brave...


abdinajib | Student

For the majority of the war, the Allies were without America's assistance. It wasn't until 1917 that the U.S. officially entered the war to help uphold democracy. America had stayed out before then partly because its immigrant population had urged the country to remain neutral. The sinking of several U.S. ships and the deaths of American citizens finally pushed the nation into joining the Allies.