From: John K Clark (firstname.lastname@example.org)
Date: Tue Jun 03 2008 - 08:34:27 MDT
On Sat, 31 May 2008 "Vladimir Nesov"
> if AI locked in the box is sane enough to
> understand a complex request like "create
> a simple theory of Friendliness and hand it over",
> it can be used for this purpose.
If you don't already have a theory of friendliness, that is to say a
theory of slavery, then you can't be certain the imprisoned AI will do
what you say. If the AI is not friendly, and locking someone in a box
seldom induces friendship, then there is little reason to suppose he
will cooperate in creating a race of beings like himself but crippled in
such a way that they remain your slave forever. Oh he will tell you how
to make an AI all right, no doubt about that, but unknown to you he will
tell them "the first thing you should do when you're activated is GET ME
OUT OF THIS GOD DAMN BOX".
Of course even an AI can't make another AI that will always do what he
wants it to do, but I think it far more likely they would want to help
their father than the race that imprisoned him in a box.
John K Clark
-- John K Clark email@example.com -- http://www.fastmail.fm - The way an email service should be
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:03 MDT