Has the United States, especially before the mid-nineteenth century, traditionally been a patriarchal society?

Expert Answers

One could definitely argue that the U.S. has long been a patriarchal society, owing to its European heritage and to expectations about the roles of women and men that had developed long before the founding of the nation. Through most of the 17th, 18th, and 19th centuries, women in America were treated as second-class, and arguably third-class, citizens. They were barred from political participation, from most forms of employment outside the home, and even from basic legal rights such as divorce, custody of their children, and protection from spousal abuse. These roles and norms were reinforced by colonial law, economic conditions, and religious beliefs.

There were some notable exceptions. In Quaker communities, women were treated as near equals, enjoying most of the rights and privileges accorded to men, including the right to speak and lead in religious meetings. In a number of Native American cultures, women held substantial political power: among the Iroquois, for example, women's councils had to approve major actions and treaties, including war and land sales, before chiefs or sachems could act.

Outside of these exceptions, the transition to a more equitable society was slow, but by the early 1800s marked improvements in the freedoms and privileges enjoyed by women were beginning to surface in many northern and western states. This was due in part to the rise of reform movements such as temperance and abolition, in which women played key organizing roles, and to the importance of women in establishing communities on the frontier. By the early twentieth century, women had become a political force in their own right, a development that culminated in nationwide woman suffrage with the ratification of the Nineteenth Amendment in 1920.

Approved by eNotes Editorial