Re: Opting out of the Sysop scenario?

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sat Aug 04 2001 - 20:45:38 MDT


Gordon Worley wrote:
>
> - Finally, begin transitioning in minds. First, bring in the
> (Friendly) AIs. Then, bring in known altruists (e.g. Eliezer). Then
> bring in suspected altruists. Then bring in the rest. All along the
> way, making sure that the system keeps working. The Friendliest ones
> are put in first so that, if something goes wrong, they won't hose
> the universe.

Um, as a known altruist, I'd strongly object to this. At the very least,
I personally would stay behind until they metaphorically started boarding
all seats, just on principle. If there's a handpicked crew of altruists
that goes in ahead of time to test out some murky waters to further
minimize a very small risk, then I shouldn't be among them - at that point
I will have already had my share of "involvement".

I just want to be extremely, absolutely, positively clear that the creator
of a Friendly AI is just another guy as far as a Transition Guide is
concerned.

-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence