Re: Deliver Us from Evil...?

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Wed Apr 04 2001 - 22:36:46 MDT


Samantha Atkins wrote:
>
> Of course the rub is that it might end death and misery forever by
> simply wiping out all biological sentients, by accident or on purpose.

I object to the phrasing of this sentence. It should say: "The rub is
that it might wipe out all biological sentients, by accident or on
purpose." Otherwise it looks like you're proposing a Devil's Contract
definitional problem, which is only one single type of possible failure,
and not necessarily even a very worrying one.

> Or we might find the Sysop, no matter how wise and benevolent and
> transparent, intolerable for the types of creatures we are, but be
> unable to choose differently.

Well, then, I guess the challenge lies in creating a Friendly AI that
doesn't like intolerability. Your statement also implicitly assumes that
less frustration exists in a Sysop-free Universe, which doesn't
necessarily follow; the next most probable alternative to a Sysop might be
imprisonment within a less benevolent entity. If a non-Sysop scenario is
both desirable and temporally stable, then it still looks to me like
you'll need a superintelligent, Friendly, Transition Guide as a means of
getting there without any single human upload taking over until there's a
stable base population. Friendly AI is still the best tactic in either
case.

> We are spinning the barrel and pulling the trigger in a cosmic game of
> Russian roulette. The barrel holds thousands of rounds and only a few
> chambers are empty.

Where do you get *that* set of Bayesian priors from? Not that it makes
much of a difference, I suppose; all that counts is the proportion of
empty chambers, not the absolute number of empty chambers.
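
To make the point about proportions concrete, here is a toy sketch in
Python (the specific chamber counts are purely hypothetical, not figures
from anywhere in this thread): the odds of landing on an empty chamber
depend only on the ratio of empty chambers to total chambers, so scaling
both numbers by the same factor changes nothing.

    # Toy illustration only: the chance of landing on an empty chamber is
    # empty / total, so only the proportion matters, not the raw count.
    def survival_probability(empty_chambers, total_chambers):
        return empty_chambers / total_chambers

    print(survival_probability(3, 6000))        # 0.0005
    print(survival_probability(3000, 6000000))  # 0.0005 -- identical odds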

> If we "win" we either are set for all eternity or
> get the chance to play again some other time. Except that it is the
> entire world and all of us forever that the gun is pointing at. To do
> that we have to be very, very damn sure that there is no other way
> and/or that this (building the SI) offers the best odds we have.

One, all we need is the conviction that (a) time is running out and (b)
building the SI is better than any other perceived course of action.
Certainty is a luxury if you live in a burning house.

Two, I am very, very damn sure.

-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence


