Was the U.S. right to get involved in WWI?

6 Answers

enotechris | College Teacher | (Level 2) Senior Educator

Wars are fought for economic reasons. Germany's policy of unrestricted submarine warfare threatened to destroy not only British sea power but American shipping as well. If preserving US seafaring can be called "just," then fighting a war with Germany, which was threatening it, is understandable. However, complaining about submarine attacks while shipping war materiel to Britain is duplicitous -- which leads to the next axiom of warfare: any excuse to fight a war will be used if the people doing the fighting will accept it, especially if it can be sold on "moral" grounds.

Ashley Kannan | Middle School Teacher | (Level 3) Distinguished Educator

This is a fairly compelling question that will invoke many responses. The term "right" in the question might cause the greatest amount of debate. Fundamentally, if one accepts the premise put forth by President Wilson, the need for America to "make the world safe for democracy" became the critical justification for the United States' role in the war. The need to stem the tide of Austro-Hungarian and German threats to Western European democracies, as well as aggressive acts towards American interests, might have helped to justify the war. There are others, though, who would argue, and did argue, that American interests in the war were primarily economic, favoring the wealthy and economic elite at the expense of the downtrodden. Thinkers like Eugene Debs and Helen Keller, as well as many advocates for workers' rights, opposed the war effort on these grounds.

pohnpei397 | College Teacher | (Level 3) Distinguished Educator

Once again, a hard question.

The US got involved in WWI largely because of objections to Germany's unrestricted submarine warfare. The US felt that this was wrong because it was banned by international law.

However, the US looked the other way (or at least didn't complain too much) when the British blockaded Germany using surface ships (this was illegal too). It also said nothing when the Germans tried to follow the rules of submarine blockading (having submarines warn ships before sinking them), only to have the British put guns on merchant ships to sink subs that surfaced to do so.

Most historians argue that a major reason the US got involved in the war was that it traded so much with the British. The British owed the US a great deal of money, so the US would have wanted to make sure the British won the war (so US companies could get their money).

So, there does not seem to have been a huge moral issue that compelled the US to go to war.

krishna-agrawala | College Teacher | (Level 3) Valedictorian

The answer to this question is very subjective, depending on the beliefs and values of the person answering it. I personally believe that the USA initially acted with great restraint, trying to avoid entering the war. Though World War I started in July 1914, the USA joined only in April 1917, after direct German hostile action against the USA in the form of attacks on its cargo ships, and Germany's design to persuade Mexico to go to war against the USA. In such a situation it was both wise and honourable for the USA to enter the war and directly fight Germany and its allies. This was necessary to secure the immediate interests of the country, as well as to send the right signal to the world that the USA was not a soft target for any nation to attack.

mkcapen1 | Middle School Teacher | (Level 3) Valedictorian

The United States entered into World War II based on the information that the first editor gave you. However, there were so many other implications once the United States was in the war that I have to feel grateful that we entered it. The atrocities that the German Nazi party inflicted on the Jewish people, Gypsies, Jehovah's Witnesses, and others made American intervention a necessary and decisive move with regard to human rights.

While America did not enter the war with the intent to save the people in the concentration camps, our soldiers still gave their lives in the protection of people who could not defend themselves. America's intervention was greatly appreciated by France, Britain, and the people who were being tortured and dying at the hands of the Nazi Party.

At one time, America had intended to stay out of the politics and wars of other countries, but when our allies were attacked and the German army expanded and spread, concern arose that Hitler might very well become the dictator of Europe. The United States, along with other world powers, recognized that the threat was great enough to make its involvement necessary. I believe that America had to get involved in World War II.

revolution | College Teacher | (Level 1) Valedictorian

Yes, it was a "right" for United States to get involved in the war. 1917 was a crucial year for the victors of the war as USA had entered the war on the side of the Allied Powers, which include Britian, France, Italy and Russia ( which had surrendered at March 1918), against the Central Powers, consisting of Italy (who defected to the other side during May 1915), Germany and Austria-Hungary empire. The involvement of USA in the global war was that Germany had use submarines to attack ships carrying supplies to the Allied Powers, and had the audacity to attack and sank a British ship called the LUsitania, which had lots of American passengers. The loss of many of American lives enraged many citizens living there, who urged their government to join the war to fight against Germany to seek revenge against life losses of many innoncent victims of Americans, which they declared war in April 1917.

Woodrow Wilson was initially against the idea of a war with Germany, but he changed his mind and joined the war to "make the world safe for democracy," making the world a better place. So, in my opinion, the USA was "right" to be involved in the war. It also benefited economically after the war, as Britain and France emerged heavily indebted to the USA and would take years to repay the loans given to them, so the USA came out with some advantages compared to other countries, which suffered the worst of the aftermath.
