Concerning acts 2–3 and the epilogue, do you agree with Alquist that creating robots "was a crime"? Or do you think that there was something noble in trying to create robots that would help alleviate humans' enslavement to work and inequality (as Domin suggests)? Was Helena right to insist that the robots have souls?

Expert Answers

These are certainly very interesting questions! However, your response to these questions will depend upon your own perspectives and belief system.

First, Alquist states that creating robots was a crime. He argues that robots have brought out the worst in humankind. Meanwhile, the shareholders are too busy enjoying their profits to consider the negative ramifications of unleashing those robots on society. Instead of making society better, robots have taken on the worst traits of humanity. They have learned to be selfish and to place undue emphasis on material things. Most ominously, they have learned to kill in order to secure their own survival.

Now, Domin's idea of creating robots to help alleviate human suffering is noble. However, the results are devastating, as the play shows. Meanwhile, Helena's idea of giving souls to robots poses an important metaphysical question that experts are still trying to answer today: should we program any sort of moral agency into robots? In other words, should we give robots "souls"?

By extension, if we give robots "souls," what belief system should we reference when we program moral values into them? More importantly, are moral values programmable at all? Or are moral values intrinsic to humans alone? In the play, the robots kill all the humans except Alquist. They have come to believe that humans are parasites to be destroyed. This raises the question: if we give robots "souls" and humanize them, will they eventually decide to extinguish humanity, as they do in R.U.R.?

These are important questions, ones you and I will have to ponder if we are to entertain the idea of creating legions of robots to alleviate human suffering and inequality.
