If robots could reason, have feelings, and be self-aware, what would be the difference between humans and robots?
There are several related factors that would continue to distinguish robots from human beings.
First, robots would presumably not be mortal. It would likely be possible to build robots whose parts could be replaced as needed, making them essentially immortal except in cases such as accidents.
Second, robots would not necessarily have consciences or moral feelings. The mere fact that a robot was self-aware and had emotions would not mean it would also hold moral values or feel guilt when it violated them.
Finally, robots might not share human beings' apparent need for religion. They might feel no need to explain their own existence (particularly since they would know they were created by human beings), and they might lack the urge toward spirituality that we humans seem to have.