Could someone give me their view of history in America when it comes to women and oppression versus today's world?

Expert Answers
Ashley Kannan, eNotes educator | Certified Educator

I think that part of America's history has involved the reality of women and oppression.  In the earliest stages of the nation's development, women's contributions and voices were not acknowledged.  The framing of the Constitution did not account for women's political voice, as women were denied the right to vote; suffrage did not become part of the nation's founding document until the 19th Amendment was ratified in 1920.  At the same time, women experienced a fundamentally challenging reality as the nation grew and matured.  Women were not encouraged to seek a voice in the workplace or in the political arena. Instead, they were actively encouraged to remain in the realm of domesticity.  The last half century has seen women in America make significant strides toward social, political, and economic equality.  The advances made have been significant in reducing the reality of oppression for women.  In fact, one could make the argument that the advances of the last fifty years have made the term "oppression" less apt as a description of the condition of women as a group in America.

If one were to see such realities as reflective of growth and advancement, then this could cast America as being ahead of the rest of the world.  Those who take the advancements of women in America as representations of progress could very well suggest that being a woman in America is fundamentally better than in many other parts of the world, where oppression of women is a part of daily life.  In many parts of the world, simply being a woman is a struggle.  The denial of voice on political, legal, social, and economic levels helps to marginalize women, condemning them to a life of silence.  In this light, America has advanced further than many other nations, making it an example to the rest of the world.

However, there is a flip side to this coin.  While there have been undeniable advances for women in America, there are still challenges in being a woman in America.  For example, statistics reveal that one in four women will be a victim of domestic violence.  Eighty-five percent of all victims of domestic violence are women, and "almost one-third of female homicide victims that are reported in police records are killed by an intimate partner."  The recent cases involving high-profile athletes and domestic violence have revealed that women in America still experience realities that many men do not have to encounter.  Sexual assault is still a reality that women must confront in realms such as college campuses, workplace settings, and even the military, where "women serving in the military are more likely to be raped by a fellow soldier than killed by enemy fire."  Of course, economic inequality is still present in America, where a man makes about 25% more than a woman for the same work.  Media images of women do seek to display some level of empowerment.  However, the oversexualized and reductive portrayal of women in the media still denies them voice in more subtle ways.  While one could not effectively claim that these realities constitute full-blown oppression, it is clear that the relationship between "America" and what it means to be a woman remains a challenging one, in which greater fairness is a work in progress.

bmrasmussen, eNotes educator | Certified Educator

It is true that in America's history, women have been barred from activities we now consider to be rights - for example, voting and holding certain jobs.  I would argue, however, that the position of women in history was, in some ways, more respected than today.

In today's culture, women are often viewed as objects.  Women and girls are hooted at, catcalled, and plastered on billboards in various states of dress and undress.  The female body is viewed as a source of male gratification, and even some top female executives have to deal daily with comments and looks from the men they work with.

While it is unfair to say that all was rosy for women in American history (because it wasn't!), it would be easy to make the argument that women were once shown more respect.  Women had fewer rights, but more respect was shown to them: hence the traditions of opening doors for ladies, pulling out chairs, offering to carry packages, and so on.  These were all gestures of respect.  Women were offered the most preferred places, such as a seat on an overcrowded bus or a space in a lifeboat on a sinking ship.  This was all part of the culture: women were to be cared for and treated with kind regard, not mentally undressed.

While there have been huge gains in women's rights in recent years, the respect that women enjoyed for hundreds of years has been largely lost. So there are both advantages and disadvantages to the women's rights movements of the last few decades.

devinsmith | Student

The oppression of women in America has shifted from oppression by the government to more of a social oppression. Of course, there are still certain things the government hasn't fixed (yet), like unequal pay in some companies, but for the most part it is oppression from people themselves. Throughout the history of America, women have usually been on the bottom tier, even during the civil rights movement. We weren't able to vote, and even farther back we were generally treated like property. Now, most of our constriction comes from society saying, "A lady doesn't act that way," or "A good woman does this."