Ownership of land was central to the founding of America. Although students are often taught that Europeans came here for religious freedom, a more compelling reason was the possibility of land ownership. In Europe, land was precious: most of it was owned by the nobility, and commoners were forced to rent from a lord, who was literally the "landlord." America, with its wide-open spaces, offered more land than most Europeans could even conceive of. Indeed, several of the Southern colonies were founded on the "headright" system, under which a settler received 50 acres for each person whose passage to America he paid.
Since that time, land ownership has been ingrained in American life. The U.S. government has encouraged Americans to own land by insuring mortgages, offering veterans generous terms for buying property, and granting tax deductions for home ownership, policies that echo the country's founding emphasis on land ownership.
I would argue that the right to private ownership of land has been taken as a given throughout US history. Americans have always assumed they have this right. Many events in US history have rested on this assumption, but there has never been any need to fight for the right itself.
Americans have assumed they had the right to own land ever since colonial times. The colonists, for example, were unhappy when the British tried to wall off westward expansion at the Appalachians with the Proclamation of 1763. The colonists wanted instead to exercise their right to private ownership as far west as they could reach.
This desire helped fuel the Free Soil movement, which contributed to North-South tensions before the Civil War. The idea that every American should be able to own a bit of land was a strong motivation for Northerners to oppose the spread of slavery into the western territories.
In ways like this, the assumption that Americans have the right to private property has been basic to American history.