One event I would pick is the introduction of the birth control pill. It is hard to overstate the impact this event had on American society.
With the introduction of the Pill, women's role in American society began to change. Suddenly, women had far more control over their reproductive lives, which gave them, among other things, greater opportunity to pursue careers. This, in turn, helped lead to today's society, in which women commonly have careers and are regarded as far more equal to men.