This is a question that could have many different answers. I would argue that there were two major impacts of this war.
First, this war led directly to WWII, the largest war in the history of the world. WWI left several countries deeply dissatisfied with its outcome, fueling their desire for revenge and/or more territory, and that resentment helped bring about WWII.
Second, this war was a major factor in making people more disillusioned about humanity and the prospects for human progress. Before the war, many people had believed that human beings were making steady progress toward perfecting society. WWI made it clear that a perfect society was nowhere within reach. This caused people to lose faith in the idea that human history was a process in which things constantly got better.
There are many other impacts of the war, but these are the two main ones in my mind.
I would say one of the biggest ways in which WWI changed the world was the advent of modern warfare. Machine guns, trench warfare, mechanized units, air power, and chemical weapons were all part of the horrors of WWI.
The war also fell short of the claim, made by many, that it would be so terrible it would be "the war to end all wars"; the Treaty of Versailles that ended it set the stage for the beginning of WWII.