Whether or not a war is just is always a matter of perspective.
In the case of the Second World War, I think that the Nazis did all they could to justify their actions; after all, the harsh terms and demands of the Treaty of Versailles had left many Germans bitter and disillusioned, and had set the stage for what was to follow. In the minds of those Germans who felt resentment and who were looking for revenge, the war was just.
On the other hand, the British and the French remembered the suffering endured during the First World War. The French especially had a well-founded mistrust of their German neighbor, and they knew what destruction awaited them should history repeat itself. Although Germany had not invaded Britain, there was a real fear that this was Nazi Germany's ultimate goal, to be followed by world domination. Had it not been for the Battle of Britain, Germany might very well have succeeded in invading the British Isles. In this instance, where self-defense became imperative, I believe that most British people would feel that the war was just. In light of what we now know of Nazi Germany's actions during the war, I would argue that most of those who fought against the regime would feel that the war was justified.
That is definitely going to depend on whom you ask. It's also going to depend on what dates you attach to "World War II." I'm sure Great Britain didn't think it was "just" as it watched Hitler and the Nazi regime march across Europe. I'm sure France and Poland felt the same way. But if you asked the Nazi party that same question about those same events, they would say, "Yes, it's justified." Hitler saw his war as a way to purify the European gene pool. That wasn't his only reason, but in Hitler's eyes his cause was justified.
If you are referring only to the United States' involvement, then yes, I think it was just. I might feel completely differently if Japan had not attacked Pearl Harbor and the U.S. had still gotten involved, but that's not what happened. The United States was attacked, and the U.S. retaliated by declaring war.
I'm sure that you have heard the phrase "violence doesn't solve anything" before. I once saw a reporter say that very same line to a veteran. His response has always stuck with me. He replied, "It sure seemed to stop Hitler." That doesn't mean I think violence should be the first-choice solution. The question asked for an opinion on whether or not World War II was a justifiable war. Despite all of its destruction and loss of life, yes, I do think it was a justifiable war for the United States. I wish the entire thing could have been avoided, but that's not what happened.
World War II was not a just war in the eyes of Britain; however, if the Nazis had won the war, the answer would be very different.