The answer to this question depends on which country you're referring to. If you mean the United States, then the answer is that the effects of World War II were generally positive on the domestic front. Almost uniquely among the War's participants, the United States emerged from the conflict considerably stronger, with economic, military, and political gains.
For the first time since the onset of the Great Depression, the scourge of mass unemployment had ended. With the economy working at full capacity, everyone who wanted a job had one. The post-war Truman Administration remained committed to significant involvement in the economy, which ensured a consistently high level of demand. The return of American servicemen from overseas stimulated a massive housing boom, which further strengthened an already strong economy.
At the same time, one shouldn't ignore the negative consequences of America's participation in the War. In the wake of the Japanese attack on Pearl Harbor, a tidal wave of racism was unleashed upon the United States. Millions of Americans regarded Japanese-Americans as potential traitors and saboteurs. The vast majority of Japanese-Americans were loyal, but that didn't stop the government from rounding up some 120,000 people of Japanese ancestry, most of them American citizens, and forcing them into squalid internment camps. Although they were eventually released, Japanese-Americans had to live with the legacy of racism created by the government's actions for years to come. For them, the overall domestic effects of the War were most certainly negative.