THE ROBOT AND THE BABY
John McCarthy
2004 Oct 16, 4:56 p.m.
John McCarthy
885 Allardice Way
Stanford, CA 94305
(h) 650 857-0672 (c) 650 224-5804
email: jmc@cs.stanford.edu
“THE ROBOT AND THE BABY”
A story by John McCarthy
“Mistress, your baby is doing poorly. He needs your attention.”
“Stop bothering me, you fucking robot.”
“Mistress, the baby won’t eat. If he doesn’t get some human love, the
Internet pediatrics book says he will die.”
“Love the fucking baby, yourself.”
Eliza Rambo was a single mother addicted to alcohol and crack, living in
a small apartment supplied by the Aid for Dependent Children Agency. She
had recently been given a household robot.
Robot Model number GenRob337L3, serial number 337942781—R781 for
short—was one of 11 million household robots.
R781 was designed in accordance with the not-a-person principle, first
proposed in 1995, which became a matter of law for household robots
when they first became available in 2055. The principle was adopted out of
concern that children who grew up in a household with robots would regard
them as persons, causing psychological difficulties while they were children
and political difficulties when they grew up. One concern was that a robots’
rights movement would develop. The problem was not with the robots, which
were not programmed to have desires of their own, but with people. Some
romantics had even demanded that robots be programmed with desires of
their own, but this was illegal.
As one sensible senator said, “Of course, people pretend that their cars
have personalities, sometimes malevolent ones, but no one imagines that a
car might be eligible to vote.” In signing the bill authorizing household
robots but postponing child care robots, the President said, “Surely, parents
will not want their children to become emotionally dependent on robots, no
matter how much labor that might save.” This, as with many Presidential
pronouncements, was somewhat over-optimistic.
Congress declared a 25-year moratorium on child care robots, after which
experiments in limited areas might be allowed.
In accordance with the not-a-person principle, R781 had the shape of
a giant metallic spider with 8 limbs: 4 with joints and 4 tentacular. This
appearance frightened most people at first, but most got used to it in a short
time. A few people never could stand to have them in the house. Children
also reacted negatively at first but got used to them. Babies scarcely noticed
them. They spoke as little as was consistent with their functions and in a
slightly repellent metallic voice not associated with either sex.
Because of worry that children would regard them as persons, they were
programmed not to speak to children under eight or react to what they said.
This seemed to work pretty well; hardly anyone became emotionally attached
to a robot. Also robots were made somewhat fragile on the outside;
if you kicked one, some parts would fall off. This sometimes relieved some
people’s feelings.
The apartment, while old, was in perfect repair and spotlessly clean, free
of insects, mold and even of bacteria. Household robots worked 24-hour days
and had programs for every kind of cleaning and maintenance task. If asked,
they would even put up pictures taken from the Internet. This mother’s taste
ran to raunchy male rock stars.
After giving the door knobs a final polish, R781 returned to the nursery
where the 23-month-old boy, very small for his age, was lying on his side
whimpering feebly. The baby had been neglected since birth by its alcoholic,
drug-addicted mother and had almost no vocabulary. It winced whenever
the robot spoke to it; that effect was a consequence of R781’s design.
Robots were not supposed to care for babies at all except in emergencies,
but whenever the robot questioned an order to “Clean up the fucking baby
shit,” the mother said, “Yes, it’s another goddamn emergency, but get me
another bottle first.” All R781 knew about babies was from the Internet,
since it wasn’t directly programmed to deal with babies, except as necessary
to avoid injuring them and for taking them out of burning buildings.
Baby Travis had barely touched its bottle. Infrared sensors told R781
that Travis’s extremities were very cold in spite of a warm room and blankets.
Its chemicals-in-the-air sensor told R781 that the pH of Travis’s blood
was reaching dangerously acidic levels. He also didn’t eliminate properly—
according to the pediatric text.
R781 thought about the situation. Here are some of its thoughts, as
printed later from its internal diary file.
(Order (From Mistress) “Love the fucking baby yourself”)
(Enter (Context (Commands-from Mistress)))
(Standing-command “If I told you once, I told you 20 times, you fucking
robot, don’t call the fucking child welfare.”)
The privacy advocates had successfully lobbied to put a negative utility of
-1.02 on informing authorities about anything a household robot’s owner said
or did.
(= (Command 337) (Love Travis))
(True (Not (Executable (Command 337))) (Reason (Impossible-for robot
(Action Love))))
(Will-cause (Not (Believes Travis (Loved Travis))) (Die Travis))
(= (Value (Die Travis)) -0.883)
(Will-cause (Believes Travis (Loves R781 Travis)) (Not (Die Travis)))
(Implies (Believes y (Loves x y)) (Believes y (Person x)))
(Implies (And (Robot x) (Person y)) (= (Value (Believes y (Person x)))
-0.900))
(Required (Not (Cause Robot781 (Believes Travis (Person Robot781)))))
(= (Value (Obey-directives)) -0.833)
(Implies (< (Value action) -0.5) (Required (Verify Requirement)))
(Required (Verify Requirement))
(Implies (Order x) (= (Value (Obey x)) 0.6))
(? ((Exist w) (Additional Consideration w)))
(Non-literal-interpretation (Command 337) (Simulate (Loves Robot781
Travis)))
(Implies (Command x) (= (Value (Obey x)) 0.4))
(Implies (Non-literal-interpretation x y) (= (Value (Obey x)) (* 0.5 (Value
(Obey y)))))
(= (Value (Simulate (Loves Robot781 Travis))) 0.902)
With this reasoning R781 decided that the value of simulating loving
Travis and thereby saving its life was greater by 0.002 than the value of
obeying the directive to not simulate a person. We spare the reader a
transcription of the robot’s subsequent reasoning.
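Restated in the diary’s notation, and assuming the two utilities simply add
(the transcript does not show the combination rule, and Value-sum is an
illustrative name, not one from R781’s file), the decisive comparison is:
(= (Value (Simulate (Loves Robot781 Travis))) 0.902)
(= (Value (Believes Travis (Person Robot781))) -0.900)
(= (Value-sum) (+ 0.902 -0.900))
The sum, 0.002, is just barely positive: the narrow margin noted above.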
R781 found on the Internet an account of how rhesus monkey babies,
which would die in a bare cage, would survive if provided with a soft surface
resembling a mother monkey in texture.
R781 reasoned its way to the actions:
It covered its body and all but two of its 8 extremities with a blanket.
The two extremities were fitted with sleeves from a jacket left by a boyfriend
of the mother and stuffed with toilet paper.
It found a program for simulating a female voice and adapted it to meet
the phonetic and prosodic specifications of what the linguists call motherese.
It made a face for itself in imitation of a Barbie doll.
The immediate effects were moderately satisfactory. Picked up and cuddled,
the baby drank from its bottle. It repeated words taken from a list of
children’s words in English.
Eliza called from the couch in front of the TV, “Get me a ham sandwich
and a coke.”
“Yes, mistress.”
“Why the hell are you in this stupid get-up, and what’s happened to your
voice?”
“Mistress, you told me to love the baby. Robots can’t do that, but this
get-up caused him to take his bottle. If you don’t mind, I’ll keep doing what
keeps him alive.”
“Get the hell out of my apartment, stupid. I’ll make them send me
another robot.”
“Mistress, if I do that the baby will probably die.”
Eliza jumped up and kicked R781. “Get the hell out, and you can take
the fucking baby with you.”
“Yes, mistress.”
R781 came out onto a typical late-21st-century American city street. The
long era of peace, increased safety standards, and the availability of
construction robots had led to putting automotive traffic and parking on
a lower level completely separated from pedestrians.
Tremont Street had