Many things happened after Germany invaded Poland, but the most likely answer to this question is that France and Great Britain declared war on Germany. This is typically seen as the beginning of World War II in Europe.
Germany invaded Poland on September 1, 1939. It did so soon after concluding a nonaggression pact with the Soviet Union (the Molotov-Ribbentrop Pact, signed in late August 1939). The Germans and the Soviets agreed not to attack one another, and they secretly agreed to divide Poland between them. This allowed Germany to attack Poland without fear of drawing the Soviet Union into the war.
When Germany invaded, France and Great Britain had finally had enough. Up to that point, they had appeased Hitler. Now they actually followed through on their threats and declared war on Germany on September 3, 1939, in response to the invasion. They were not in any position to help Poland fight Germany directly, so there was very little real fighting between Germany and the Western Allies at that point. Nevertheless, this is seen as the start of WWII in Europe.