Spoiler Review of A.I.

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sun Jul 01 2001 - 13:54:56 MDT


Caution - extremely serious spoilers below; only for those who have
watched the movie.
This review may be forwarded or redistributed.
©2001 by Eliezer Yudkowsky.

Isaac Asimov once observed that there are three types of robot stories:
robot-as-pathos, robot-as-menace, and robot-as-device. A.I. is a
robot-as-device story, and a fairly good one. There is pathos, owing to
David's emotions, but David's emotions are depicted as the deliberate
result of a deliberate design effort.

Most of the reviewers of this movie will undoubtedly say that the AIs are
more human than the humans. This is probably the single least accurate
statement it is possible to make about A.I. The AIs are more *humane*
than the humans but are *substantially* less human. A few behaviors (for
the embodied chatbots that were the previous state of the art) or a few
emotions (for David) have been selectively transferred over, and
naturally, they tend to be nice and neighborly behaviors or emotions,
because that's what the designers would want to transfer over. But the
AIs are visibly not playing with a full deck. Evidently Luddite movie
critics cannot tell the difference between "human" and "humane" even when
slapped upside the head with a double dissociation.

The very first thing that struck me about A.I. was the rather extreme
stupidity of the AI *researchers*. The consequences of this stupidity are
depicted with the same realism, attention to detail, and lack of
anthropomorphism that characterized HAL in _2001_, but even so, the amount
of human stupidity I am being asked to accept is rather extreme.

David is beta software. His emotional responses are real - we are told so
in the movie - but they show a binary, all-or-nothing quality. We see the
first instance of this where David bursts out into extremely loud
laughter, laughs for a few moments, then switches off. Be it emphasized
that this laughter is both realistic and genuine. David is the first in a
line of robots with genuine emotions. The embodied chatbot that we see in
the opening scenes of the movie - the female android whose hand is hurt -
may have more gentle laughter, but only because it is preprogrammed.
David's genuine responses are as raw and as alien as might be expected of
a "child" who is, in fact, an infant, only a few weeks old.

Then the AI researchers had the bright idea of putting this beta software
into a human body, adding in those emotions calculated to produce maximal
emotional attachment on the part of humans, and giving it as a human
surrogate to a mother already in an emotionally unstable state because her
own child has been in medical cryonic suspension for five years.

From this single act of stupidity, and the correctly depicted
consequences, the plot of the entire movie flows. Within a day of
imprinting, David realizes that his mother will someday die, and that he
will not, and wonders if he will be alone forever - foreshadowing the end
of the movie. His mother, for whom David is allegedly an artificial
surrogate to be disposed of when no longer needed, naturally feels
enormous emotional stress at the thought of returning David to be
incinerated. Nobody thought of this back when they were building a
loving, lovable, naturally immortal, allegedly disposable child?

(One of the genuine, oft-overlooked ethical questions this movie
highlights: "Is it moral to create an AI that loves you if the AI has to
watch you die?" The prospect of voluntary immortality in our own near
future creates similar present-day issues. If you plan on bringing a
child into the world, you should plan on choosing to live forever if the
option becomes available, because a child shouldn't have to watch its
parents die.)

When David's brother, Martin, returns from suspension, we see a darker
side to David's genuine emotions. The first near-catastrophe occurs when
David nearly kills himself competing with his revived brother by
attempting to eat; the second when David nearly drowns his brother. In
both cases, the events that occur are excellent robot-as-device
scenarios; they are the consequence of reproducing certain specific
genuine emotions in a beta-quality infant psychology taught certain
preprogrammed complex behaviors and placed in the body of an
eight-year-old. When David's pain response is triggered by a pack of
curious children, his raw fear, like his laughter, goes from binary off to
binary on. His fear manifests itself in the only behavioral response
David knows: hiding behind Martin. The fear continues to manifest,
preventing Martin's escape, even as David and Martin sink to the bottom of
the pool.

Again, realistic; again, the AI researchers should have thought of it.
Monica, the mother, is afterwards in a hideous position: does she endanger
the household by keeping around beta-quality embodied software, or does
she return David to the manufacturer - that is, give up her child to die?
Monica's emotions are also run ragged because she is being asked to react
without anger to David's near-drowning of Martin. Again, someone at the
mecha corporation was being damn stupid and deserves to be sued into
bankruptcy. You do not give embodied software with beta-quality genuine
emotions to a human mother and ask her to treat it as her own human child.

(Call it "personality abrasion". Personality abrasion may turn out to be
a very real problem for humans dealing with any AI capable of real
thought, even if the AIs don't have human-architecture emotions or
human-looking bodies. Only AI researchers, or other people who understand
the risks and are willing to expend effort in dealing with them, should
ever come into contact with raw AIs. A Friendly AI conversing with
ordinary users should have enough knowledge to fake taking 'offense' at
insults, just because an AI that genuinely doesn't care at all about
insults may be more alien than an ordinary user should have to deal
with. In A.I., we see the effect of personality abrasion on some poor
shmuck of a human mother.)

The penultimate consequence of the AI researchers' stupidity is visible
when, following the near-drowning of Martin, Monica (the mother) tries to
return David to the manufacturer for destruction. Of course Monica is, by
this point, too attached to David to watch him die, and tries to abandon
him in the woods instead. David's extreme response, when he suddenly
realizes that his mother is abandoning him, is the movie's greatest
moment. I choked up myself. David is an AI with a few genuine emotions,
and the strongest of them is love, and now his mother is leaving forever.

(Genuine, affecting pathos in a robot-as-device story. Realistic,
theoretically accurate AI scenarios with powerful drama. All hail
Kubrick. However... am I really supposed to believe that nobody at the
mecha corporation saw this coming?)

Later: David, wandering the forest with only his supertoy
babysitter-in-a-box teddy bear as companion, comes into contact with a
group of androids who are scavenging spare parts from a dump. This, I'm
sure, is intended to be creepy and disturbing vintage Kubrick, but I
myself immediately started wondering how this social phenomenon occurred.
It's the same question that occurred to me when I saw Gigolo Joe carving
out his identity tag on the run from the police. Why do these
nonemotional androids want to survive? We see in the opening scenes a
female android who is stabbed in the hand as part of a demonstration; when
the lead AI researcher asks "Did I hurt you?" she responds "You hurt my
hand." Am I supposed to believe that this chatbot in human form would go
and scavenge parts if she were abandoned? Am I supposed to believe that
Gigolo Joe, on realizing that he has been framed for murder, would go
rogue out of self-preservation? Having androids scouring the countryside
for spare parts is a rather disturbing social phenomenon, as is having an
android flee a police investigation, and the embodied chatbots that are
supposed to be state-of-the-art are primitive enough that the programmers
could easily have prevented both responses.

And what's with the Flesh Fair bounty hunters who attack the scavenging
robots? Did these bounty hunters come through a wormhole from
_Blade Runner_? This is what happens when Spielberg rewrites a Kubrick
movie; you have cyberpunk grunge-neon motorcycle bounty hunters chasing a
lovable android and his animate teddy bear. At any rate, David is dragged
off to the Flesh Fair, where humans watch the destruction of androids for
fun... is this where the path of "Battlebots" leads?

(At this point in the movie, I must admit to a minor objection to the
Flesh Fair robot who asked another robot to 'disconnect my pain circuits',
mostly because this is a fundamentally human way of looking at the world
and any robot who makes this request may well have crossed the border, not
just into personhood, but into our particular kind of personhood. But
expecting Hollywood to know that is asking far too much.)

At the Flesh Fair, the embodied chatbots make a few conversational pleas
as they are loaded into the cannons and the acid platforms. David's
screams evoke greater sympathy, but I'm not sure the Flesh Fair audience
reached a logical conclusion. I know that David's response is genuine only
because I was told at the beginning of the movie that David has a wholly
novel cognitive architecture designed to support humanlike emotions.
David's response is genuine, but it is not humanlike. A human child,
brought into that cage, would have been almost catatonic with fear; would
have been screaming and crying long before reaching the stage; would have
been struggling long before the first drop of acid fell on him. As at the
side of the pool, we see the binary, unpolished quality of David's genuine
emotion; his fear goes from off to on as soon as the first drop of acid
falls - and manifests in his screaming requests not to be burned.

And the crowd rises and boos the ringmaster off the stage - "Mecha does
not plead for its life!" - but their decision is correct only by
coincidence. From what they saw, David really could have been just a more
advanced chatbot. David's emotions were real, but David's behaviors
weren't the responses of a genuine eight-year-old except on the surface.

Shortly thereafter, the stranger half of the movie begins. David, in the
company of Gigolo Joe, wanders the world looking for the Blue Fairy. Even
for beta software, I'm not sure this fixation is realistic - surely an
advanced AI knows what 'fiction' is, and an AI boy knows that bedtime
stories aren't true. On the other hand, perhaps David's humanlike
cognitive architecture has unexpectedly given rise to the phenomenon of
self-delusion (flinching away from hypotheses which make unpleasant
predictions), or perhaps David knows the Blue Fairy's existence is
tentative but he still sees no more plausible path leading back to his
mother.
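
(For the formally inclined: that kind of flinch can be caricatured as a
Bayesian update in which the evidence for each hypothesis is discounted
in proportion to how unpleasant that hypothesis's predictions are. A toy
sketch - the numbers and names here are mine, not the movie's:

    def motivated_update(prior, likelihood, unpleasantness, flinch):
        """Bayesian update that discounts evidence for unpleasant hypotheses.

        flinch=0.0 is honest updating; flinch=1.0 refuses to weigh any
        evidence favoring a maximally unpleasant hypothesis.
        """
        posterior = {}
        for h in prior:
            discounted = likelihood[h] * (1.0 - flinch * unpleasantness[h])
            posterior[h] = prior[h] * discounted
        total = sum(posterior.values())
        return {h: p / total for h, p in posterior.items()}

    prior = {"fairy_real": 0.5, "fairy_fiction": 0.5}
    likelihood = {"fairy_real": 0.05, "fairy_fiction": 0.95}    # evidence says 'story'
    unpleasantness = {"fairy_real": 0.0, "fairy_fiction": 1.0}  # 'story' means no way home

    print(motivated_update(prior, likelihood, unpleasantness, flinch=0.0))
    # honest updating: fairy_fiction dominates
    print(motivated_update(prior, likelihood, unpleasantness, flinch=1.0))
    # full flinch: fairy_real is all that survives

The self-delusion reading amounts to saying that David runs this update
with flinch set near maximum.)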

After Joe and David leave Dr. Know, the movie has its first real "Damn,
they blew it!" moment. (Though in Spielberg's defense, an AI movie that
starts at 8PM, and gets to 9:48 before messing up, has done extremely
well.) The moment to which I refer is Gigolo Joe's speech about how
humans resent robots because they know that, in the end, robots will be
all that's left. Where did *that* come from? Joe's speech is as out of
place as Agent Smith's speech of self-justification in _The Matrix_. It
has undertones of repressed resentment, of an entire underground society
of secretly rebellious robots, and other things that have no place among
chatbots and sex droids. Even David is only a fractional human; he has a
few selected genuine emotions but certainly not a full deck of them.

Apparently the Humans Are Obsolete Speech is simply mandatory for AI
movies, no matter how ridiculously out of place. The Speech is most
certainly not justified by "foreshadowing", since it sucks at least half
of the emotional impact out of the ending. If anyone creates a Phantom
Edit of A.I., the Speech should definitely be the first thing to go (and
the second thing, of course, will be everything after the Blue Fairy
Fadeout).

But I'm getting ahead of myself. The next major scene of significance is
David confronting David-2. David's destruction of David-2 struck me as a
little strange; it involved a bit more humanness, a wider behavioral
repertoire, than had been previously depicted. I suppose that some degree
of jealousy was visible earlier in the movie, so my immediate reaction of
"Why would they have ported *that* emotion over?" may be misplaced; even
so, that kind of directed, coherent-conversation destructive tantrum
struck me as being too complex for David.

The lead AI researcher's total lack of reaction to the destruction of his
own genuinely emotional surrogate child, and his revelation that the
corporation has been directing the entire course of events since the Flesh
Fair for publicity purposes, show again that the AI researchers are the
least humane people in the movie.

Later on, David confronts the vast hall full of Davids, a scene that was
intended to creep out the audience. But again it gives rise to questions
on my part. If there are that many Davids, why are they all designed to
have the human emotion of wanting to be unique? Was it an unintended
consequence? For that matter, what possessed the idiots in Marketing to
produce a batch of identical AIs all named David, instead of giving them
individual faces and individual voices and maybe some quirks of
personality? Do these people think that no two couples with a David will
ever meet? I'm not a parent, but I know that I'd be creeped out if I went
to a barbecue and every couple there had a copy of my little sister.

Finally, after David realizes that he is not unique, he deliberately
topples off a window ledge into the ocean. Uh... why? How is that a
means to the end of getting his mother to love him? Or alternatively, who
drew up the design specs and added in a requirement that David feel
suicidal despondency under certain conditions? Ordinary despondency I can
see, but not suicidal despondency; not in an expensive, partially human
being that parents are supposed to grow attached to. Plus, David can
operate underwater, and he knows that. This scene makes no sense.

Later, when David seeks out the Blue Fairy, and begins repeating his
eternal request, and the screen fades to black, I had the same reaction
everyone did: "Okay, movie's over! Please tell me the movie's over...
damn, it's not over." The Phantom Edit version of A.I. should end here.

After the Blue Fairy Fadeout, we see what I can only describe as Spielberg
messing up Kubrick's movie. To start with, the aliens - pardon me, I
meant the Successors - are Spielbergs. "Spielbergs"; that's the only
thing I can think of to call them. They are classic Spielberg aliens and
they don't belong on the set of A.I.

Lest I be too negative, however, I'll take this time to focus on an
example of what A.I. does right. David, revived by the Successors, leaves
the aircraft and heads for the Blue Fairy. He touches her, and she
shatters. At this point, a *bad* movie - which A.I. is not - would have
shown us some breakdown, some feeling of despair on David's part.
Instead, nothing happens - there isn't any emotion in David's limited deck
for this occasion. Three cheers for whoever wrote that scene! It's this
refusal to take the easy way out that puts A.I. into the class of science
fiction rather than space opera.

However, we then move directly on to the second "Damn, they blew it!"
moment in the movie, occurring at 10:28, when one of the Successors begins
spouting gibberish about yada-yada space-time yada-yada pathways yada-yada
DNA yada-yada only one day yada-yada. I'm sorry, I don't care how
dramatic your plot device is, you need to think up a better way to justify
it than making up totally arbitrary rules on the spot. Plus, if you can
bring back Monica for one day, you can scan her memories into permanent
storage; and, if they're retrieving Monica's immortal soul from 2000 years
in the future, they should be retrieving an old Monica from just before
the moment of her death, not the one David remembers... oh, forget it.

Finally, David gets his one day with Monica - being a little too human
throughout, it seemed to me, especially as he watches her go to sleep for
the last time. He goes to sleep with her, and - according to the final
voiceover - for the first time, begins to dream. Dream *what*? Why? I
wasn't really happy with this movie's ending.

One of the basic issues at the beginning of the movie is one that the
ending totally fails to resolve, even after going to all that plot-effort
to bring David to the one place where the question can be answered. David
is a partial human. He is both immortal, and fundamentally incomplete.
David was created without the potential to grow; he is forever young...
but on the other side of time, he can be improved and extended. David
could become a real human, if he wanted to be. Except that David doesn't
want to be human; he wants to stay with Monica forever, and being human is
only a means to that end.

The Successors could easily have given David a full deck of emotions, or
could easily have created an immortal virtual Monica that was real to the
limit of David's limited perceptions. Why didn't they? Was David, by
their standards, citizen enough not to be lied to? Citizen enough not to be
'improved' without consent? I know how I would have solved that problem;
I would have made David human for the course of the one perfect day he had
with Monica, and at the end of that day, he would have experienced great
grief... but he would have healed, and moved on, as complete humans have
the potential to do, and eventually joined the Successor civilization.
Both the moment of David becoming human, and the moment of his grief when
Monica faded, would have been a fine conclusion to the movie.

The ending I saw left me feeling incomplete because this basic issue went
unresolved. From the beginning, there were only four possible resolutions
to the movie: David dies; David lives forever with Monica, eternally
happy; David lives forever without Monica, eternally lonely; or David
grows beyond his limits. The ending we saw doesn't tell us which of these
events has occurred! Did David effectively switch himself off? Did David
go on forever dreaming of his last perfect day? Does David's dreaming
indicate that the Successors have gently begun to improve him out of his
cul-de-sac? Are David's dreams eternally lonely because Monica isn't
there?

I know there is a certain style of filmmaking that holds that the viewer
should be allowed to pick their own ending, and I hate that style with a
fiery passion. For me, a vague ending can ruin the impact of an entire
movie, and that came very close to happening with A.I.

Oh, well. A.I. is still a good movie. It's just that, as with many good
movies, A.I. could easily have been so much better.

-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence


