So are you asking what major change happened in the US after Reconstruction?
If that is what you are asking, the rise of imperialism is one possibility. However, I would say that the real rise of American imperialism did not begin until the 1890s, more than a decade after Reconstruction ended in 1877.
Another change happening in the US right after Reconstruction was growing conflict between workers and employers. This played out partly in strikes, some of them quite violent, such as the Great Railroad Strike of 1877.
The increase in labor conflict was tied to the growth of the US economy and the rise of ever-larger companies. That industrial expansion is another major change in the US after Reconstruction, though it was already under way before Reconstruction ended.