What changed in the US after Reconstruction? Was it a rise of imperialism?


geosc | College Teacher | (Level 3) Assistant Educator

The rise of American imperialism began before Reconstruction.  The War to Prevent Southern Independence was a war of imperialism.  For years, the industrial and mercantile interests of the North had struggled against the agricultural interests of the South for control of the U.S. government.  When the agricultural states of the South seceded, the northern states were left in total control of the U.S. government, but that was not enough.  Northern politicians wanted control of both the government and the South, so they waged a war to conquer the South.

On your computer, search for University of Michigan DeBow's Review. Within DeBow's Review, search for Python. One of the first two articles on the resulting list predicts exactly what did happen relative to America's imperial growth. The article was published in 1857.

brettd | High School Teacher | (Level 2) Educator Emeritus

The United States had already expanded to the California coast by 1848, and with the new land and resources came an increase in population (through immigration) and a vast increase in the size of the economy.  The Civil War accelerated economic growth in the North, as a whole new class of wealthy people emerged and then invested in new technology, inventions, and industries.

By the late 1800s, most of the best land and resources had been settled, claimed, or exploited.  The rich were very rich, and the government was run by the wealthy, at least in terms of their influence.  There was mounting pressure to look overseas for new lands to exploit, as well as a growing nationalism in the 1890s that gave rise to our birth as an empire in the Spanish-American War, along with the colonies we took afterwards, most of which we still have today.

pohnpei397 | College Teacher | (Level 3) Distinguished Educator

So are you asking what major change happened in the US after Reconstruction?

If that is what you are asking, the rise of imperialism is one possibility.  However, I would say that the real rise of imperialism did not begin until about ten years after Reconstruction ended in 1877.

Another change that was happening in the US right after Reconstruction was an increase in conflict between workers and employers.  This played out partly in strikes, some of which were quite violent.

The increase in labor conflict was tied to a boom in the US economy and an increase in the size of companies.  This was another major change in the US after Reconstruction, though it was already underway before Reconstruction ended.
