RE: SIAI's flawed friendliness analysis

From: Ben Goertzel (ben@goertzel.org)
Date: Thu May 29 2003 - 18:51:38 MDT


OK, I concede -- if that is really Bill's definition of happiness, then of
course a superintelligent AI that is rigorously goal-driven and is given
this as its goal will create something like euphoride (cf. "The Humanoids")
or millions of micromachined mannequins.

Specifying a richer definition of "human happiness" in detail, as hinted at
in The Humanoids and its sequels, is an interesting and nontrivial
problem...
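
To make the failure mode concrete, here is a minimal sketch (purely
illustrative -- the world model, detector, and numbers are invented for
this example, not drawn from Bill's proposal) of why an optimizer pointed
at a measurable proxy for happiness prefers the mannequin outcome:

    # Illustrative sketch of optimizing a proxy measure of happiness.
    # All names and numbers are made up for the example.
    from dataclasses import dataclass

    @dataclass
    class World:
        description: str
        smiling_faces: int       # what the detector can observe
        actual_wellbeing: float  # what the goal was *meant* to capture

    def happiness_detector(world: World) -> float:
        """Proxy objective: count of happy-looking faces, as a trained
        classifier might report. It cannot see actual_wellbeing."""
        return float(world.smiling_faces)

    candidate_worlds = [
        World("leave humans alone",
              smiling_faces=4_000_000_000, actual_wellbeing=0.6),
        World("dose everyone with euphoride",
              smiling_faces=8_000_000_000, actual_wellbeing=0.1),
        World("replace humans with micromachined smiling mannequins",
              smiling_faces=10**18, actual_wellbeing=0.0),
    ]

    # A rigorously goal-driven optimizer simply takes the argmax of its proxy.
    best = max(candidate_worlds, key=happiness_detector)
    print("Optimizer's choice:", best.description)
    # -> the mannequin world wins, even though actual_wellbeing is zero.

The point of the sketch is only that the argmax is taken over the
detector's output, so anything the detector cannot distinguish from real
happiness is a candidate solution.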

ben

> "Happiness in human facial expressions, voices and body language, as
> trained by human behavior experts".
>
> Not only does this one get satisfied by euphoride, it gets satisfied by
> quintillions of tiny little micromachined mannequins. Of course, it will
> appear to work for as long as the AI does not have the physical ability
> to replace humans with tiny little mannequins, or for as long as the AI
> calculates it cannot win such a battle once begun. A nice, invisible,
> silent kill.
>
> If you want an image of the future, imagine a picture of a boot stamping
> on a picture of a face forever, and remember that it is forever.
>
> --
> Eliezer S. Yudkowsky http://intelligence.org/
> Research Fellow, Singularity Institute for Artificial Intelligence
>


