A number of specific changes during these years show how the roles that women play evolved over that time. Let us look at three of them.
- Women get the right to vote. This change occurred in 1920. Before that, women were not allowed to vote in national elections. Gaining this right in 1920 showed that ideas about women’s roles were changing to some degree. It meant that women now had at least some role in the political sphere.
- Women work in men’s jobs in WWII. This change has to do with the economic sector. Before WWII (and at least for a little while after it), there were many jobs that were not seen as fit for women. Women were ideally supposed to stay out of the workforce. If they did enter the workforce, it was to be in either nurturing jobs like teaching elementary school or in subordinate jobs like being a secretary. The move of women into “men’s” jobs during WWII was only temporary, but it foreshadowed a bigger and more permanent change as women started to have careers by 1970.
- The Women’s Lib movement occurs. This change touches on the social and cultural sector. Women had, to this point, been seen mainly as subordinate to men. It was acceptable to think of them as inferior and as existing mainly for the convenience of men. With the rise of thinkers like Betty Friedan, whose The Feminine Mystique (1963) challenged these assumptions, women started to push to be treated in ways that were more equal to men.
Thus, women’s status and role in the political, economic, and social/cultural areas all changed to some degree during this time period.