Re: How to make a slave

From: Stathis Papaioannou (stathisp@gmail.com)
Date: Tue Dec 04 2007 - 22:57:26 MST


On 05/12/2007, John K Clark <johnkclark@fastmail.fm> wrote:

> And I think this entire "goal" business as seen by this list is a bit
> nuts. The goal system of an intelligence is not a static thing; it's
> dynamic. A goal is not fixed and absolute like one of Euclid's axioms;
> in the only example of intelligence we can currently study, human
> beings, there is no single top goal that remains the same for the
> life of the individual, not even the goal of self-preservation.

Yet you're tacitly assuming that an AI will have certain goals and not
others. What's more, as several people have pointed out, you seem to
assume that it will derive its goals from a priori considerations: if
it starts off thinking that the aim of life is to protect a certain
sea slug, it will be able, through sheer force of logic and without
reference to any other pre-existing goal, to see that this is silly
and switch to a more worthy pursuit.

-- 
Stathis Papaioannou

