Isaac Asimov wrote extensively, in both fiction and non-fiction, about how technology changes society and will continue to do so. One short story that addresses these themes well is "Evidence," written in 1946, which imagines a robotic brain developed to the point where it can convincingly mimic a human being. In the story, humanoid robots are illegal because of the crimes they could commit or facilitate; robots must look like machines, with metallic skin and non-human features. The robopsychologist Dr. Susan Calvin must determine whether a man running for mayor is actually a humanoid robot. If so, he would obviously be ineligible; the candidate, however, argues that he should not be subjected to unwarranted intrusion into his life on the mere possibility that he is a robot, because if he is human, such intrusion violates his rights.
"As a citizen of adult responsibility -- I have the psychiatric certificate proving that -- I have certain rights under the Regional Articles. Searching me would come under the heading of violating my Right of Privacy. That paper isn't sufficient."
"Sure, but if you're a robot, you don't have Right of Privacy."
"True enough but that paper still isn't sufficient. It recognizes me implicitly as a human being."
"Where?" Harroway snatched at it.
"Where it says 'the dwelling place belonging to' and so on. A robot cannot own property."
(Asimov, "Evidence," cdn.preterhuman.net)
In this story, the theme of technology changing society appears in the uproar over the idea that a non-human being could be suited to a leadership role over humans. Because humans habitually consider themselves superior to their technological creations, admitting that a robot could be better suited to run a country implicitly concedes that humans are, in that respect, inferior. And if that is so, then humans have no right to subjugate machines in any way, since the machines have proved themselves the superior beings. This implies that humans are not only determined to create ever more advanced machines but also afraid of what might follow if those machines become truly self-aware. If they are self-aware, then the destruction of a machine would logically be equivalent to the murder of a human, and society would need to alter its laws and perceptions to make room for sentient, individual machines among its populace.