What were the effects of the Spanish American War?
The Spanish-American War lasted only four months, but it had far-reaching consequences.
On the foreign affairs front, the war marked the emergence of the U.S. as a world power. The U.S. became a larger player in global politics, entering into treaties and other international agreements. The war also marked the end of the Spanish Empire, which had been slowly crumbling since the early 1800s. Some historians argue that Spain even benefitted economically in the long run, as capital from its former colonies was eventually repatriated.
The U.S., however, began struggling with the idea of becoming an imperialist nation. At first it promised Cuba independence after the war, but the passage of the Platt Amendment kept the island on a very short leash. The U.S. naval base at Guantanamo Bay dates from this arrangement.
Theodore Roosevelt returned to the U.S. a war hero. His popularity resulted in a vice-presidential nomination and, eventually, the presidency.
Strangely enough, the war had a healing effect back home. Photos of southern and northern soldiers fighting together helped heal scars left over from the Civil War. The war also marked the beginning of an extended period of prosperity in the U.S. that lasted well into the 1920s.
The main effect of the Spanish-American War was to make the United States something of a global imperial power. Before the war, the United States had not really had any possessions outside of North America. With the conclusion of the war, the US gained an empire. This led to a major debate within the US as to the advisability of having such an empire.
From this war, the US gained Puerto Rico, the Philippines, and Guam. This meant that the US had possessions stretching from the mainland US almost all the way to the mainland of Asia. It also meant that the US, for the first time, had undertaken to rule a large number of non-white, non-American people. This prompted considerable debate among Americans as to whether it was ethical or wise to have such an empire.
So, the main effects of this war (for the US, at least) were to give the US an empire and to make Americans think about the pros and cons of being an imperial world power.