fusion telex Telex external link External Link internal link Internal Linkatomjacked inventory cacheInventory Cache

the three laws of robotics

The Three Laws Of Robotics

This nOde last updated November 27th, 2004 and is permanently morphing...
(9 Ix (Jaguar) / 17 Keh (Red) - 74/260)
fusion telex
Invented by Isaac Asimov for his _Robots_ series.

The "Official" Laws

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

_Handbook of Robotics_
56th Edition, 2058 A.D.

The NS-2 robots of "Little Lost Robot" were built with a modified First Law. The new law was stated simply as "No robot may harm a human being", omitting the inaction clause.

Susan Calvin first suggested the existence of a Zeroth Law of robotics: "No robot may harm humanity or, through inaction, allow humanity to come to harm". The First through Third Laws would be amended accordingly.

Elijah Baley claimed, during a murder investigation on Solaria, that the First Law had always been misquoted. He suggested the First Law should be restated as "A robot may do nothing that, to its knowledge, will harm a human being; nor, through inaction, knowingly allow a human being to come to harm".

The Three Laws of Susan Calvin


The planet-organism Gaia adapted the First Law as a philosophy:
1. Gaia may not harm life or, through inaction, allow life to come to harm.

fusion telex

Bishop, the synthetic in _Aliens_, paraphrases the three laws of robotics.

fusion telex

Asimov attributed the Three Laws to John W. Campbell, from a conversation on December 23, 1940. Campbell, however, claimed that Asimov already had the Laws in mind and that they simply needed to be stated explicitly.

The Three Laws were later extended by a fourth law, the 'Zeroth Law', so numbered to continue the pattern in which lower-numbered laws supersede higher-numbered ones. It was invented by R. Daneel Olivaw and R. Giskard Reventlov in _Robots and Empire_, although Susan Calvin had alluded to it earlier in "The Evitable Conflict". In _Robots and Empire_, Giskard became the first robot to act according to the Zeroth Law, but doing so proved destructive to his positronic brain, since he violated the First Law in the process. Daneel, over the course of many thousands of years, was able to adapt himself to obey the Zeroth Law fully.

0. A robot may not injure humanity, or, through inaction, allow humanity to come to harm.

A condition stating that the Zeroth Law must not be broken was added to the original Laws.
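The precedence scheme described above — lower-numbered laws supersede higher-numbered ones, with the Zeroth Law most binding — can be sketched as a toy program. This is purely illustrative and not from Asimov or the source; the function and data names are assumptions.

```python
# Toy sketch (illustrative only): the precedence rule that a
# lower-numbered law supersedes any higher-numbered law.

LAWS = {
    0: "A robot may not injure humanity, or, through inaction, "
       "allow humanity to come to harm.",
    1: "A robot may not injure a human being or, through inaction, "
       "allow a human being to come to harm.",
    2: "A robot must obey the orders given it by human beings except "
       "where such orders would conflict with the First Law.",
    3: "A robot must protect its own existence as long as such protection "
       "does not conflict with the First or Second Law.",
}

def binding_law(violated):
    """Given the set of law numbers an action would violate, return the
    lowest-numbered (i.e. most binding) violated law, or None if the
    action violates nothing and is therefore permissible."""
    return min(violated) if violated else None

print(binding_law({2, 3}))  # -> 2 (Second Law outranks Third)
print(binding_law({0, 1}))  # -> 0 (Zeroth Law outranks all others)
print(binding_law(set()))   # -> None (no law violated)
```

The single `min` call is the whole point: adding the Zeroth Law required no new mechanism, only a new lowest index.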

fusion telex
Calexico - A Feast Of Wire on Quarterstick (2003)
Screeching Weasel - Anthems For A New Tomorrow 12inch on Lookout! (1993)


Flaming Lips - Yoshimi Battles The Pink Robots (2002)

fusion telex
film _Repo Man_ (vhs/ntsc)

Otto and Bud in car in alley

[Bud snorting a line: Jesus Christ.]

Bud: Never broke into a car. Never hot-wired a car, kid. I never broke into a trunk. I shall not cause harm to any vehicle, nor the personal contents thereof, nor through inaction let that vehicle or the personal contents thereof come to harm. That's what I call the Repo Code, kid. Don't forget it. Etch it in your brain. Not many people got a code to live by anymore.

The Repo Man Code

Bud: Hey! Hey, look at that. Look at those assholes over there. Ordinary [fucking] people. I hate 'em.

Otto: Me too.

Bud: What do you know? See, an ordinary person spends his life avoiding tense situations. A repo man spends his life getting into tense situations. [Assholes]! Let's go get a drink.

fusion telex

first mention of the Three Laws on Usenet:

From: David Levine (davidl@orca.UUCP)
Subject: Three Laws needed
Newsgroups: net.sf-lovers
Date: 1983-08-17 21:00:18 PST

The following article cropped up in this evening's paper and I thought that, with the recent interest in this net about how SF prophecies are coming true (someone's query about waldos recently, for one thing) it might be of interest. It seems that the time when the Three Laws of Robotics are required is fast approaching... faster, in fact, than the time when we can build machines which are smart enough to obey them! (This raises intriguing questions about ethics and technology which I don't feel like going into right now.) The alternatives are to surround the robots with safeguards (which reminds me of the laws requiring automobiles to be preceded by a man on foot waving a red flag) or to make them smarter. The additional processor power required to interpret and obey the Three Laws is presently more expensive than mechanical safeguards (e.g. a fence around the robot) and so we won't be seeing moral robots for some time, if ever. A thought to think about: at what point does the phenomenal expense of intelligent robots outweigh the cost in lives and injury incurred by dumb ones? (This, of course, assumes that robots smart enough to distinguish a "human being" from a trash can, never mind avoid harming one, are technically possible.) Given normal business ethics, is there any situation in which the Three Laws would be preferable (i.e. cheaper in the long run) than mechanical safeguards?


The following article appeared in The (Portland) Oregonian, Aug. 11, 1983, p. A18. Reprinted without permission.

                   ROBOT FIRM LIABLE IN DEATH

By Tim Kiska, Knight-Ridder News Service

DETROIT -- The manufacturer of a one-ton robot that killed a worker at Ford Motor Co.'s Flat Rock casting plant must pay the man's family $10 million, a Wayne County Circuit jury ruled Tuesday. The jury of three men and three women deliberated for 2 1/2 hours before announcing the decision against Unit Handling Systems in a suit by the family of Robert Williams, who was killed Jan. 25, 1979. Unit Handling is a division of Litton Industries. It is believed to be the largest personal injury award in state history. The case was tried before Judge Charles Kaufman. At the time of his death, Williams, 25, of Dearborn Heights, Mich., was one of three men who operated an electronic parts-retrieval system at Ford's Flat Rock plant. The plant has since been closed. The system, made by Unit Handling, was designed to have a robot automatically recover parts from a storage area at the plant. On the day of his death, Williams was asked to climb into a storage rack to retrieve parts because the robot was malfunctioning at the time and not operating fast enough, according to the Williams family's attorneys.

The robot, meanwhile, continued to work silently, and a protruding segment of its arm smashed into Williams' head, killing him instantly. The robot kept operating while Williams lay dead for about 30 minutes. His body was discovered by workers who became concerned because he was missing. Attorneys for the family said the robot should have been equipped with devices to warn workers that it was operating. "If they didn't want people up there when the robot was moving around, they should have installed safety devices," said Joan Lovell, one of the two attorneys representing the family. "Human beings are more important than production." The jury's award went to Williams' widow, Sandra, their three children, ages 8, 6, and 5, his mother, and five sisters. The 6-year-old was celebrating his second birthday on the day of his father's death. "They were an extremely close family," said Lovell. "I've seen a lot of people who have been injured, but this family was particularly devastated by this loss."

        -- end of article --

  -- David D. Levine   (...decvax!tektronix!tekecs!davidl)      [UUCP]
                       (...tekecs!davidl.tektronix@rand-relay)  [ARPA]
