The answer to this varies considerably depending on which country or countries you are looking at, since women had different experiences in different countries. In the United States, the main effect of the war is summed up in the words of the link below:
One of the biggest social changes of the war involved the expanded role women played in society.
In the US, women moved to some degree into roles that men had vacated when they went off to war. For example, women took a large number of jobs that had previously been done almost exclusively by men. Most women did not keep those jobs after the war, but the war did foster a greater sense that women could and should participate more fully in public life. This contributed, for example, to women gaining the right to vote in 1920 with the ratification of the Nineteenth Amendment.