Re: AI Boxing

From: James Higgins (jameshiggins@earthlink.net)
Date: Fri Jul 26 2002 - 10:09:24 MDT


Eliezer S. Yudkowsky wrote:
> James Higgins wrote:
> >
> > As for Eliezer's rules I do agree that the 2 hour minimum is not
> > realistic.
>
> How so? If not even a transhuman AI can convince you over the course of
> months, why isn't it realistic to ask someone to talk to a mere human
> for a couple of hours? Do you think you won't be able to hold out if
> you have to actually talk with the person instead of turning away from
> the screen? If so, your AI-proof sales resistance is a lot weaker than
> you are making it out to be.
>
> Or do you mean that the AI should receive more than 2 hours?
>

Oops, my mistake. I had deleted that entire line of reasoning from my
post! It must have come back with an undo I did shortly thereafter.

My thought was that if this were a "real" situation, the person wouldn't
be forced to talk to the AI for a minimum length of time (although a
maximum might be in place for added safety). If the human detected danger,
or was convinced early on that the AI was unfriendly or dangerous, the
conversation would be terminated. However, since Eliezer isn't a real
AI, nor transcendent, this can't be an exact experiment anyway. Like I
said, I had decided against bringing this up. Oops.

James Higgins
