In the United States, World War II complicated the issue of gender and gender roles. The main way it did this was by forcing (or allowing) women to take on roles that had previously been reserved for men. With men absent, women had to assume greater control over their families and daily lives. They also had to (or were able to) step into jobs that would never before have been considered acceptable for a woman, including manufacturing work and even non-combat military occupations.
But WWII did not truly transform gender roles. Not all men or women believed these changes should last beyond the war; many felt that once it was over, things should return to normal. In this way, the war complicated the issue: it began to change attitudes but did not bring about a complete change.