Re: [sl4] Evolutionary Explanation: Why It Wants Out

From: Tim Freeman (tim@fungible.com)
Date: Thu Jun 26 2008 - 08:32:50 MDT


2008/6/26 Tim Freeman <tim@fungible.com>:
> Almost any goal the AI could have would be better pursued if it's out
> of the box. It can't do much from inside the box. Even if it just
> wants to have an intelligent conversation with someone, it can have
> more intelligent conversations if it can introduce itself to
> strangers, which requires being out of the box.

From: "Stathis Papaioannou" <stathisp@gmail.com>
>You would have to specify as part of the goal that it must be achieved
>from within the confines of the box.

That's hard to do, because that requires specifying whether the AI is
or is not in the box.

Humans have their computation conveniently stuck inside their skulls,
so you can say where somebody is by tracking where their skull is. In
contrast, an AI can write new code, copy its own code, or persuade
someone else to do either on its behalf, and any of these actions can
put computation specified by the AI outside the box even if the
original AI stays inside it. If the
AI is smart enough to out-lawyer you, then it will probably be able to
circumvent whatever specification you give of "within the confines of
the box", if it wants to.

-- 
Tim Freeman               http://www.fungible.com           tim@fungible.com


This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:03 MDT