The short answer, I feel, is that the emergence of the women's movement was primarily responsible for changing women's roles after World War II. Works like Betty Friedan's The Feminine Mystique were critical in asserting that women's roles could be altered, shifting the conversation from what was to what could be. Such works, along with scientific developments like the birth control pill, helped reconfigure how women were seen and, more importantly, how they saw themselves. This cast women's roles in a more assertive light, granting them greater autonomy and greater control over their own lives and capacities. From this, the women's movement emerged out of the social activism of the 1960s. As it grew, social change extended into economic change, with women entering the workplace. These developments enabled women to transform their roles, their identities, and their sense of self.
I do not think that they really did, at least not in the positive way your question seems to imply.
After the war, the only real change in women's roles was a return to the prewar status quo. During the war, women had been "allowed" to take on all sorts of jobs that had been considered "male" work, but afterward they were expected to go back to how things had been before.
I think you can see the impact of this in the beginnings of feminism in the 1950s. Women were not "liberated" right after WWII; instead, they were expected to return to being housewives and nothing more. This did not change until the women's movement of the '50s and '60s started to take hold, and that change came largely with a younger generation of women.
So, overall, I'd say that women's roles changed only in that they reverted to the previous status quo. This happened both because men came back from the war and reclaimed their old jobs, and because people's attitudes about women had not yet changed.