Has the United States traditionally been a patriarchal society, especially before the...
Middle School Teacher
One could definitely argue that the U.S. has long been a patriarchal society due to its European heritage and the expected roles of women and men that had developed long before the founding of the nation. During most of the 17th through the 19th centuries, women in America were treated as second-class, and arguably third-class, citizens. They were barred from political participation, from most forms of employment outside the home, and even from legal rights such as divorce, custody of children, and protection from spousal abuse. These roles and norms were reinforced by colonial law, economic conditions, and religious beliefs.
There were some notable exceptions. In Quaker society, women were treated as full equals, enjoying the rights and privileges accorded to men. In many Native American cultures, women held real political power: women's circles or councils had to approve major actions and treaties, including war and land sales, before chiefs or sachems could act.
Outside of these exceptions, the transition to a more equitable society was slow, but by the early 1800s marked improvements in the freedoms and privileges enjoyed by women were beginning to surface in many northern and western states. This was thanks to the rise of reform movements such as temperance and abolition, in which women played key organizing roles, and to the importance of women in establishing communities on the frontier. By the 1920s, women had become a political force in their own right, a movement that culminated in nationwide suffrage with the ratification of the Nineteenth Amendment in 1920.
Posted by saintfester on April 18, 2012 at 4:48 PM (Answer #1)