The country most often cited as the primary cause of World War I is Germany. Germany was not the first country to declare war in the conflict, yet historians frequently assign it the greatest share of responsibility. The reason is that Germany was pushing for a larger place in the international system of the time; it wanted more power. As it pursued that power, it provoked serious fear in France, Russia, and Britain. Consequently, all of these countries (Germany included) existed in a state of tension in which even a relatively minor event could trigger a huge war.