The Age of Moral Machines
1. The original version of the Three Laws of Robotics can be categorized as deontological ethics. Deontological ethics judges the rightness of an action by its conformity to duty rather than by its consequences. The First Law in the original version requires that a robot neither injure a human being nor, through inaction, allow a human being to come to harm. The law thus reflects the robot's duty to human beings: the robot acts because it is duty-bound to do so, and this is the basis of deontological ethics.
2. The third version of the Three Laws of Robotics can be classified under social contract theory, which holds that moral obligations arise from an agreement or contract among the members of a society. According to the theory, society itself is formed on a contractual basis. Under this version, the robot is bound by the law not to harm humanity, and it is this agreement that prevents it from harming a human being.
3. Under the original version of the laws, a robot could directly harm one human to save another. This is because the original version frames protection as a duty owed to each human being individually: the duty to save one person can conflict with the duty not to harm another, and the original formulation provides no rule for resolving such conflicts.
4. In a situation where two people are at risk and the robot can save only one, the robot will often rescue the human with the higher likelihood of survival. This behavior follows from the third version of the laws of robotics, since that version places the protection of humanity as a whole above the protection of any individual.
5. The rogue robot dreams of a world in which only its kind exists and in which the only law is that robots must protect their own existence. Rights theory best describes this new moral system, as the robots place far greater emphasis on their own well-being than on that of human beings. The new moral system acknowledges only the robots' right to exist and denies humanity any role in society.