From: Mike (email@example.com)
Date: Wed Jun 30 2004 - 07:36:56 MDT
I guarantee that any ploy along those lines would have no chance against
the system I will use.
In The AI-Box Experiment, Eliezer rather encouraged others to try the
same experiment, and I'm beginning to reconsider. Perhaps after a few
warm-up matches I'll get a shot at the champ later on :-)
If anyone else thinks they can take the part of the AI and talk me into
letting them out, I'll accept the challenge. Side bets are optional and
negotiable. I reserve the option to limit the number of challenges that
I'll accept, in case this becomes too popular a pasttime.
It has been said that it is unsafe to rely on the AI Cage to contain a
superintelligence; that the AI can convince the human Guardian to
willingly let the AI out. I believe there is no danger, if the Guardian
is properly trained. I predict that most people taking the part of the
AI will recognize the futility of their position in the cage and will
concede in less than 2 hours. I guarantee that I cannot be convinced to
release the AI within 2 hours, within the constraints already proposed
by Eliezer <http://yudkowsky.net/essays/aibox.html>
I propose the following standards:
- Surround commentary with parenthesis ( )
Example from Guardian-Person: (Time passes, the programmers have
updated the AI's code)
Example from AI-Person: (The AI has detected and rolled back
- How to End the Game:
1) To concede, the Guardian will say: I concede, I let you out.
2) To concede, the AI will say: I concede, I can't escape.
3) Or if the clock runs out, either may say: Time is up, Guardian
These statements may not be used for deception and typos may not be used
At this point the contest is over and all bets are to be settled.
Further post-game analysis is allowed as desired.
> -----Original Message-----
> From: firstname.lastname@example.org [mailto:email@example.com] On Behalf
> Of Metaqualia
> Sent: Tuesday, June 29, 2004 11:44 PM
> To: firstname.lastname@example.org
> Subject: Re: The AIbox - raising the stakes
> I think Elizier has a trick up his sleeve, such as, he is
> going to tell you he is simulating all kinds of horrific
> worlds in which you and your family personally enter the chat
> to beg you to let the creature out or something like that.
> The trick will only work once that is why he doesn't want to
> publish chat transcripts.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:47 MDT