This is an interesting question that has, over time, received many different answers, especially regarding when the American public became aware of what we now call the "Holocaust."
Although there was some news coverage in the late 1930s and early 1940s of Hitler's "social" programs, which included the beginnings of the Holocaust, these articles received scant attention for a variety of reasons, not the least of which is that readers simply did not believe most of them and dismissed them as propaganda.
In real terms, the American public did not know the reality of the Holocaust until 1945, when Allied soldiers began to come across and liberate various concentration camps. As the soldiers began recounting their experiences to other soldiers and writing about them to their families in America, knowledge of the Holocaust began its journey through the larger American public. Even then, however, average Americans found it difficult, if not impossible, to believe that the atrocities they heard about were more than an isolated aberration.
In the years immediately after the war, the American public truly began to understand the implications of these seemingly isolated stories. Eventually, it became clear that the atrocities were not anomalies but the result of institutionalized behavior on the part of the Germans. Although the American public was slow to recognize that the Germans had committed genocide against Jews and many other ethnic minorities, when it became clear that these stories were true, most Americans were shocked to their core.