Whether or not a war is just is always a matter of perspective.
In the case of the Second World War, I think the Nazis did all they could to justify their actions: the harsh terms and demands of the Treaty of Versailles had left many Germans bitter and disillusioned, and had set the stage for what was to follow. In the minds of those Germans who felt resentment and sought revenge, the war was just.
On the other hand, the British and the French remembered the suffering they had endured during the First World War. The French especially had a well-founded mistrust of their German neighbor, and they knew what destruction awaited them should history repeat itself. Although Germany had not invaded Britain, there was a real fear that invasion was Nazi Germany's ultimate goal, to be followed by world domination. Had it not been for the Battle of Britain, Germany might well have succeeded in invading the British Isles. In this instance, where self-defense became imperative, I believe most British people would feel that the war was just. In light of what we now know of Nazi Germany's actions during the war, I would argue that most of those who fought against the regime would feel the war was justified.