In this question, you specify that you are asking about “Western culture.” I will therefore focus on developments that affected culture specifically, rather than society in general. Something like the atomic bomb, for example, affected society far more than it affected culture, so I will set such changes aside.
One major change to Western culture in this period has been the “liberation” of women, or the advancement of women’s equality. This has transformed Western culture. Our culture was once built around family units in which the woman stayed home with the children while the man worked. Now we are much more likely to live in families where both parents work, or in families that have been split by divorce. We are also far more likely than before to treat women with respect. These shifts, for good and for ill, have flowed from this one change.
A second major change has been the declining importance of religion, a trend particularly pronounced in Europe. Much of Western culture has moved away from its religious roots, and this, too, has altered our culture. It has helped to liberalize our societies, but it has also weakened the ties between us, since we no longer share a common religious background.
A final major change has been the increased presence and importance of nonwhite people. In the US, this has come about through the Civil Rights Movement and through immigration; in places like France and the UK, it has come about chiefly through immigration. Our increased attention to minority rights has made our societies more just. However, the growing nonwhite population has also made it harder for us to maintain strong social ties (and I say this as a person of color) as we work to create a new culture that is no longer so firmly based on white culture.