What were some of the political & social changes that affected whites after the Civil War?
The Civil War brought significant political and social changes for whites in both the South and the North.
In the South, Reconstruction affected whites as well as blacks. For a time, whites had to adjust to the idea of freed blacks who held civil and political rights. This arrangement did not last, but for as long as Reconstruction endured, it was a major change.
In the North, perhaps the biggest change was the "opening" of the West. After the Civil War, the boom in railroad construction, along with measures like the Homestead Act of 1862, opened the Great Plains and the West for settlement. This brought about a huge change as thousands of people made their way west to try to make their fortunes. Their experiences are often said to have played a major part in shaping the national character of our country.