Describe the roles of women in the past and the present.
10 Answers
The role of women is something that varies significantly across the world today. However, we can generally say that women's roles have been expanded from what they traditionally were in the past. This is especially true in more developed countries.
In more developed countries in particular, many jobs require little physical strength, which means there are far fewer jobs that women are less able to do. As a result, women's roles have expanded to the point where a majority of women participate in the labor market. Along with this has come a great increase in the status of women. The idea that women and men are equal is much more widely (though not universally) accepted than it has ever been in the past.
Of course, this is not true for all women in the developed world and is even less true in many parts of the developing world.
Posted by pohnpei397 on September 27, 2011 at 11:25 PM (Answer #2)
Obviously, you are asking a broad question. It can be answered with broad generalizations that are relatively meaningless in specific situations, or with detailed responses that vary greatly from place to place and from one period of history to another.
In the far distant past, women's role was to bear young and care for them. As part of that caregiving, women were traditionally very involved in food preparation, which in many cultures expanded to include agricultural work to raise food, as well as collecting water, gathering fuel for cooking, and making objects the family needed, such as pottery and clothing.
As tools and technology have evolved, the assumed role of the woman as caregiver has ceased to be the only acceptable role in some parts of the world and in some cultures, although it persists in others. The generalizations have many more exceptions in today's world.
Posted by stolperia on September 28, 2011 at 3:52 AM (Answer #3)
High School Teacher
I would throw out the thought that in the U.S., at least, the only real sexism comes from people with no sense of reality. Any woman in the U.S. is free to do anything she wants, any time she wants. There are abusive relationships, of course, and there are sexist bosses and so forth, but I have never encountered a woman in my life who couldn't work, or drive, or go out in public without a man and with her legs showing. Really, we can see the progression of women's rights just by comparing certain third-world countries with the U.S. and making a timeline. Some people are simply stuck in the past, and abuse of women and their rights is a part of that past.
Posted by belarafon on September 28, 2011 at 4:01 AM (Answer #4)
High School Teacher
Where to start? For thousands of years, women (in cultures dominated by men) were subservient to men. There are a few exceptions where a family's lineage was traced on the mother's side, surnames came from the mother, and so on. In most cultures, however, lineage generally follows the male line of the family.
In some cultures, women work or worked side by side with men. In the Anglo-Saxon period, women often fought alongside the men. In many very old and traditional cultures (in both social and religious terms), women are seen as second-class citizens. The old adage of keeping women "barefoot and pregnant" still applies in some areas of the world: women are not allowed to make major decisions in the family and are expected to submit in all things to the husband; they cannot own anything, and they are not permitted to go to school. This was certainly the way things were in England hundreds of years ago, when women historically could not inherit money or property, or, if they did, it was forfeited to the husband when the woman married. Widows might choose not to marry again for just this reason.
Today, at least in America, women are on a more equal footing with men in most cases: often by necessity, as women now compete in the workplace for jobs once traditionally held by men (as doctors, for example), and also in light of households where both parents work. In other countries, women are still treated as if they had no value. I recently met a man of a particular faith who would not shake my hand because I was a woman.
However, more than ever, opportunities to become educated and to serve in medicine, politics, and news reporting (in fact, most career opportunities) are open to women in this country. Things have changed a great deal over the last three hundred years in America, even since women won the right to vote near the beginning of the twentieth century and equal opportunity legislation was passed just past the middle of that century.
Posted by booboosmoosh on September 28, 2011 at 4:25 AM (Answer #5)
Middle School Teacher
Women's roles vary by society and time period, but there has been a gradual increase in gender equality, especially in the last hundred years. Women do not have equal rights in some countries, especially in the Middle East. American women may be able to do the same jobs as men, but they are still often paid less and promoted less often.
Posted by litteacher8 on September 28, 2011 at 4:38 AM (Answer #6)
In response to post #4
Even though women legally have the same rights as men, there is still a lot of sexism and there are many double standards in this country. Women are still oversexualized and objectified. Older women in Hollywood are not as successful as the men. The list of discrepancies between the genders goes on and on, and it is certainly not just a thing of the past.
Posted by megan-bright on September 28, 2011 at 10:37 AM (Answer #7)
High School Teacher
There is so much that can be said in relation to this post. To pick up on what other editors have commented upon: many people claim that, although conditions for women have become far more equitable, there is still what is called a "glass ceiling" that prevents women from achieving the same levels of responsibility and salary as men, and it is all the more pernicious because, being "glass," it is invisible.
Posted by accessteacher on September 28, 2011 at 8:44 PM (Answer #8)
Elementary School Teacher
The role of Western women through history is ofttimes oversimplified and misinterpreted through the lens of recent history, for which there are far more records. In the 1950s, women came to be idealized as a result of the effects of two World Wars, with houses in isolated, idealized suburbs and Christian Dior styles that glorified form and restricted movement. In the 19th century, women in Western societies hit their lowest point, as patriarchal pomposity and grandiosity were significantly heightened. These two eras form much of what today is inadequately called the "traditional" role of women.
As antiquities scholar Robin Lane Fox says, while literature cannot stand in lieu of archaeological evidence, it must be taken seriously into account, since it must have rung true in the society in which it was written in order to have been accepted. Literature can therefore give guidelines to the roles women held in more remote eras. Women have been described since Biblical times as running private businesses, managing large pools of domestic servants, and making significant contributions to society. In fact, Fox points to inscriptions at Delphi that identify women as the authorizing and financial sponsors of grand and culturally important civic buildings in antiquity. In Beowulf, Wealhtheow may not have had legal power, but she had personal power and authority; for example, she could give orders to her husband's men and they would obey her. She was not a "traditional" Victorian woman nor an idealized 1950s woman at all.
Further, the historians who wrote an AP textbook report that their research connects the idea of a woman's place being in the home to post-plague times, when marriage vows included a promise to keep the woman safe at home, in other words, not exposed to the terrors of the plague. This places the origin of the connection between a woman's place and the home in love and protection rather than in dominance, suppression, and subjugation. In brief (?!), the role of Western women has changed so vastly throughout the ages, and from such diverse causes, that there is no single descriptive umbrella that can identify a simplistic, "traditional," or uniform role for Western women.
Posted by kplhardison on September 29, 2011 at 4:16 AM (Answer #9)
High School Teacher
The role of women is much different today than it was in the past. Today, far more women work outside the home, in part because a single-income household is often too hard to sustain. Women also have more power than they had in the past. Look at the influence of women like Michelle Obama and Oprah.
Women are also required to balance much more today than in the past: jobs, children, homes, life in general.
Posted by literaturenerd on September 29, 2011 at 7:15 AM (Answer #10)
It would be inaccurate to say that women's roles have simply changed; rather, women have taken on new roles in addition to the ones they played before. Where women were once homemakers alone, they are now breadwinner and homemaker together.
Posted by kinghtalexis on November 20, 2011 at 7:24 PM (Answer #11)