Discuss the institution of slavery in America. How did it change when England got out of the slave business, and where did Southerners get their slaves afterwards? How did the opening of new territories raise issues concerning slavery?
Slavery was practiced in America until the end of the Civil War. Slaves were originally brought to America from various African nations by British colonists. Over time slavery became synonymous with skin color, and very few people of color were free citizens. Slaves were bought and sold as the property of their owners, the children of slaves were born into slavery, and enslaved people had hardly any rights or freedom.

Once Britain withdrew from the slave trade and America gained its independence, trade practices changed. At the same time, the Southern states were booming centers of cotton agriculture, and cotton was in great demand in Britain and the rest of Europe. With no fresh supply of slaves from abroad, these agriculture-dependent states began transporting slaves from the Upper South to the Deep South. The wealthy plantation owners also dominated Congress and politics, and so prevented any anti-slavery law from being enacted. Domestic slave trading was a major business until 1860: most slaves were sold from Maryland, Virginia, and North and South Carolina, bound for Georgia, Texas, Mississippi, and Alabama.

With the opening of new territories and the simultaneous expansion of cotton cultivation, domestic slave trading boomed, as slaves were traded and sold to owners in the new territories. Almost always the slaves were separated from their families and forced to walk overland to their new destinations. The opening of new territories thus extended slavery and the slave trade by decades and opened new lands to it.