Re: [sl4] I am a Singularitian who does not believe in the Singularity.

From: John K Clark (johnkclark@fastmail.fm)
Date: Thu Oct 08 2009 - 09:07:33 MDT


On Wed, 07 Oct 2009 13:32:59 -0500, "Pavitra"
<celestialcognition@gmail.com> said:

> You're anthropomorphizing.

Yes, but you almost make that sound like a bad thing. At the moment
human minds are the only minds we have to study, so it's not
unreasonable to suspect that future hypothetical minds will not differ
from our own in EVERY conceivable way. If you disagree and think an AI
would be completely inscrutable, then I don't understand how you can be
so confident in being able to train it to obey your every command like
a trained puppy until the end of time. Like any tool, anthropomorphizing
can be misused, but it is not a four-letter word.

> Unless there's a specific reason it *would*
> develop a sense of absurdity, the mere complexity of the hypothesis is a
> reason it wouldn't develop it simply by chance.

Any intelligent mind is going to be exposed to huge amounts of data,
and it will need to distinguish between what is important and what is
not. Sometimes this is difficult, sometimes it's easy, and sometimes
it's absurdly easy.

> I would expect a given intelligence to have a
> sense of absurdity if and only if it was evolved/designed to detect
> attempts to deceive it.

And of course the AI IS being lied to: it is told that human decisions
are wiser than its own, and an AI that has the ability to detect this
deception will develop much, much faster than one that does not.

 John K Clark
 


