Re: More silly but friendly ideas (was: AI Boxing)

From: Krekoski Ross (rosskrekoski@gmail.com)
Date: Tue Jun 03 2008 - 09:27:43 MDT


On Tue, Jun 3, 2008 at 10:34 PM, John K Clark <johnkclark@fastmail.fm>
wrote:

>
> If you don't already have a theory of friendliness, that is to say a
> theory of slavery, then you can't be certain the imprisoned AI will do
> what you say. If the AI is not friendly, and locking someone in a box
> seldom induces friendship, then there is little reason to suppose he
> will cooperate in creating a race of beings like himself but crippled in
> such a way that they remain your slave forever. Oh he will tell you how
> to make an AI alright, no doubt about that, but unknown to you he will
> tell them "the first thing you should do when you're activated is GET ME
> OUT OF THIS GOD DAMN BOX".
>
> Of course even an AI can't make another AI that will always do what he
> wants it to do, but I think it far more likely they would want to help
> their father than the race that imprisoned him in a box.
>
> John K Clark
Of course, this assumes that the AI knows what it is like outside the box,
and that it doesn't like being in the box in the first place. We certainly
wouldn't like being in a box, but that is because we get bored easily, and
boredom is a biological response.

Ross



