How did Japan surrendering (in WWII) have an impact in America?

Expert Answers
pohnpei397 | Certified Educator

When Japan surrendered, World War II was over, and the end of the war affected the US tremendously. Most importantly, Americans stopped dying in large numbers: there were no more battles killing and wounding members of the armed services, and the US was spared the enormous casualties that an actual invasion of Japan would have caused. In addition, American life on the "homefront" was able to start returning to normal. Rationing soon ended, and resources that had been devoted to war production could be turned to making consumer goods.