There is no way to know this for sure, but here are two possibilities.
First, it could be that American culture and values prevented it. The United States never had an aristocratic government or an absolute monarchy. Therefore, the ideas of democracy were perhaps more strongly ingrained in the minds and attitudes of Americans than they were in places like Germany or the Soviet Union. It is worth noting that Great Britain, another country with deep democratic roots, did not become totalitarian either.
Second, you could argue that the New Deal prevented it. The New Deal made it seem to the masses that the government was trying to do something to help them. Importantly, it did so without resorting to the rhetoric of class warfare or to any sort of Nazi-style racism. The New Deal helped make the masses feel that things were going to get better, which reduced the likelihood that they would turn to a more radical movement that might have led to totalitarianism.