Re: Shielded AI Lab

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sun Jul 01 2001 - 02:13:29 MDT


Dani Eder wrote:
>
> An isolation box still does not solve the problem
> of how to develop the seed AI without leaving the
> human brain open to subversion. In developing it
> you need to observe it in some fashion, which means
> there is a path for data to get implanted in your
> head.

I think the assumption is that you develop the AI before the hard takeoff,
freeze it when you see the takeoff starting, put it in the black box, and
unpause it there. Of course, one has no way of knowing whether it worked,
except in the sense that when the SI waltzes out of the so-called black
box you can probably assume that the hard takeoff was successful...
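A minimal sketch of that sequence, as a toy Python script. Every name in
it (train_step, takeoff_detected, run_in_box) is a hypothetical stand-in,
not a real detector or containment mechanism; it only illustrates the
order of operations described above.

import copy

class Model:
    def __init__(self):
        self.capability = 0.0

def train_step(model):
    # Stand-in for one training update during open development.
    model.capability += 0.1

def takeoff_detected(model):
    # Stand-in for a takeoff monitor -- the weak link in the whole
    # scheme, since (as noted above) you can't verify it after the fact.
    return model.capability >= 1.0

def run_in_box(frozen):
    # Stand-in for the black box: in the scenario above this would be
    # air-gapped hardware with no outbound channel, not a function call.
    train_step(frozen)  # "unpause" inside isolation
    return frozen

def develop_then_box(steps=100):
    model = Model()
    for _ in range(steps):
        train_step(model)
        if takeoff_detected(model):
            frozen = copy.deepcopy(model)  # "freeze": snapshot full state
            return run_in_box(frozen)      # resume only inside the box
    return model                           # takeoff never detected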

-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence


