This is an important question. Even in America, women at times earn less than men in the workplace. They may do the same job as well as or better than men, yet still be paid less. It is also true that greater equality is being established, which is good news, but pay is not equal yet.
Even when women and men receive the same base salary, pay can differ in other ways: men may receive larger year-end bonuses, for example. Part of the reason women are paid less is that men have held power in society for a long time and have shaped its institutions to benefit themselves. As the sociologist Peter Berger has argued, social reality is itself socially constructed.
Another reason women are paid less, some argue, is that they bear children and therefore take more time away from work. In response, some Scandinavian countries have introduced parental-leave policies that encourage men and women to take comparable time off for children, in the hope of producing greater equality. More recently, Walmart faced a major lawsuit in which female employees claimed they were denied pay raises and promotions. So this issue remains a real one today.