Three Laws of Robotics
The Three Laws of Robotics are three rules that the science fiction writer Isaac Asimov imagined would be programmed into robots. The rules assume robots will be human-like machines, able to walk about and do things. The laws appear in his 'Robot' series of short stories and novels.
The first full statement of the laws is in Asimov's short story "Runaround" (first published in 1942).
- A robot may not injure a human being or, by failing to act, allow a human being to come to harm.
- A robot must obey orders given to it by human beings, except where carrying out those orders would break the First Law.
- A robot must protect its own existence, as long as the things it does to protect itself do not break the First or Second Law.
Later, Asimov added the Zeroth Law: "A robot may not harm humanity, or, by inaction, allow humanity to come to harm". The original three laws were then changed so that each one also yields to the laws above it.
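The laws form a strict priority order: a robot may break a lower law only to satisfy a higher one. As a minimal sketch (my own illustration, not anything from Asimov's stories), the choice between actions could be modeled by ranking each action by the most important law it would violate; the `choose` helper and the violation flags are assumptions for this example:

```python
# The laws, from highest priority (index 0) to lowest (index 3).
LAWS = [
    "zeroth: do not harm humanity",
    "first: do not harm a human",
    "second: obey human orders",
    "third: protect own existence",
]

def choose(actions):
    """Pick the action whose worst violation is of the least important law.

    Each action is a (name, violated_law_indices) pair. A smaller index
    means a more important law, so we prefer the action whose minimum
    violated index is largest (or that violates nothing at all)."""
    def worst(action):
        _, violated = action
        return min(violated) if violated else len(LAWS)
    return max(actions, key=worst)[0]

# A robot ordered into danger: refusing breaks the Second Law (index 2),
# while obeying only risks the robot itself, breaking the Third Law (3).
print(choose([("disobey order", {2}), ("obey and risk self", {3})]))
# An order to harm a human: obeying breaks the First Law (1),
# so disobeying (breaking only the Second Law) wins.
print(choose([("obey harmful order", {1}), ("disobey", {2})]))
```

Because the comparison always looks at the highest law an action would break, self-preservation gives way to orders, and orders give way to human safety, matching the exceptions written into the Second and Third Laws.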
According to the Oxford English Dictionary, the passage in Asimov's short story "Liar!" (1941) that mentions the First Law is the earliest recorded use of the word robotics. Asimov was not aware of this at the time; he assumed the word already existed by analogy with mechanics, hydraulics, and other similar terms denoting branches of applied knowledge.
The Three Laws form a theme in his Robot series and the other stories linked to it, as well as his Lucky Starr series of science fiction for children. Other authors working in Asimov's fictional universe have adopted them, and references (often parodic) appear in science fiction and other genres. Technologists in the field of artificial intelligence have speculated upon the role the Laws might have in the future.
Other websites
- Worley, Gordon. "Robot Oppression: Unethicality of the Three Laws".
- "Frequently Asked Questions about Isaac Asimov". AsimovOnline, 27 September 2004.
- "Ethical Considerations for Humanoid Robots: Why Asimov's Three Laws Are Not Enough".