In general, the changes in women's role in society that happened during the war did not last. After World War I, women generally went back to playing the same role that they had previously.
During WWI, many women took up jobs that had been left open by men who went off to fight. Once the war ended, however, women generally did not keep those jobs. Instead, they were encouraged to return to being wives and mothers. For example, Congress passed the Sheppard-Towner Act in 1921, which funded instruction for women in infant and maternal care, reflecting the expectation that motherhood was women's primary role.
However, WWI did change women's roles in some ways. Most importantly, women's wartime contributions were one factor in their winning the right to vote in 1920. Even so, there was no major and immediate change in women's roles that can be attributed to WWI.