The position of women changed to some degree in the 1920s, with women becoming more likely to be part of the public world. Women were more likely to take jobs and to engage in politics. They were also more likely to smoke, drink, and wear what would have been seen as daring clothing in public.
All of this had many roots. It was partly based in the Progressive Era of the early 1900s. The reforms of this era had given women the vote, thus bringing them into the political sphere. Economic changes during the early 1900s had also opened up a large number of new professions to women. These were generally not great jobs, often things like secretarial positions, but they did give women the opportunity to work outside the home and some money of their own to spend on leisure. Finally, there were new opportunities for leisure itself. Labor-saving devices had made housework easier, leaving time for things like riding in cars and going to the movies, both of which were new at the time.
In these ways and for these reasons, the position of women in American society changed in the 1920s.