Who Won WW1
Which country won WWI?
The Allied Powers won the First World War. They included Great Britain, France, Italy, Russia, and, from 1917, the United States. What was unusual about the Allied victory was that it didn't take place on enemy soil. The territory of the main Central Power, Germany, was never invaded or occupied by enemy forces, as would normally have been the case.
This gave rise to a persistent myth, the so-called "stab-in-the-back" legend, that would later be ruthlessly exploited for political gain by the Nazis. According to its proponents, Germany hadn't really lost the war on the battlefield; her brave...
For most of the war, the Allies fought without America's assistance. It wasn't until 1917 that the U.S. officially entered the war to help uphold democracy. Before then, America had stayed out partly because its large immigrant population urged the country to remain neutral. The sinking of several U.S. ships and the deaths of American citizens finally pushed the nation into joining the Allies.