RE: Coercive Transhuman Memes and Exponential Cults

From: Ben Houston (ben@exocortex.org)
Date: Mon Jun 11 2001 - 20:25:33 MDT


Hi Eliezer,

I fear that those not at the top of the reasoning-ability hierarchy will
always be easy pawns for those who are, no matter what controls are in
place. My suggestion is to make sure that you are not too far from the
top.

That's a somewhat dystopian outlook, but it's the way things work now
and I don't see it changing in the future.

Cheers,
-ben houston
http://www.exocortex.org/~ben

-----Original Message-----
From: owner-sl4@sysopmind.com [mailto:owner-sl4@sysopmind.com] On Behalf
Of Eliezer S. Yudkowsky
Sent: Monday, June 11, 2001 3:26 PM
To: sl4@sysopmind.com
Subject: Re: Coercive Transhuman Memes and Exponential Cults

Durant Schoon wrote:
>
> Imagine transhuman Bill becomes vastly more intelligent than person
> Alan. So much more that Bill can easily influence Alan to do things
> Alan might not want to do on his own.
>
> Fortunately, the Sysop is there to protect Alan by advising against
> actions which might have deleterious outcomes for him.

Actually, the Sysop is there to protect Alan by saying: "You have a
message from Bill which will take over your brain. I advise in the
strongest possible terms that you delete it without looking at it. Do
you want to delete it?"
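
To make that flow concrete: here is a minimal Python sketch of the
screen-advise-consent loop being described. The Sysop class, the threat
score, and every other name below are invented for illustration;
nothing in this thread specifies an actual API.

from dataclasses import dataclass

@dataclass
class Message:
    sender: str
    payload: bytes

class Sysop:
    """Stands between citizens and incoming memes; advises, never coerces."""

    def assess_threat(self, msg: Message) -> float:
        """Estimated probability that reading msg subverts the reader.
        (All the real work would happen here; stubbed for the sketch.)"""
        return 0.99 if msg.sender == "Bill" else 0.0

    def deliver(self, msg: Message, recipient) -> None:
        if self.assess_threat(msg) > 0.5:
            advice = ("You have a message from %s which will take over "
                      "your brain. I advise in the strongest possible "
                      "terms that you delete it without looking at it. "
                      "Do you want to delete it?" % msg.sender)
            if recipient.consents_to_delete(advice):
                return  # flushed unread; the choice stays with Alan
        recipient.read(msg)  # Alan chose to look anyway; that's his right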

> Now, imagine Charlie. Transhuman Charlie is sufficiently more
> intelligent than transhuman Bill. Again our favorite Sysop can warn
> Bill off from doing anything he might not want to do, if only he knew
> all the consequences. But there is a new twist with this case.
> Transhuman Charlie might convince transhuman Bill to modify vis own
> volition so that accepting new memes from Charlie is highly desired.
> Transhuman Charlie might advertise his suggested memes cleverly so
> that transhuman Bill chooses them with no violation of vis volition.

Actually, transhuman Bill automatically flushes all messages from
Charlie in a single thought smoothly integrated with the Sysop API.
Charlie can't convince Bill to modify vis volition in the first place -
at least, not using invalid arguments. And if the arguments are valid
from Bill's perspective, even with the consequences taken into account,
what's the problem? Charlie has to be smarter than the *Sysop* to
*covertly* take over Bill, at least if Bill has a normal relationship
with the Sysop.
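
Again purely illustrative: one way that single-thought flush might look
as a standing rule registered through a hypothetical Sysop API. Every
name below is invented for the sketch, not drawn from anything in the
thread.

from typing import Callable

class SysopAPI:
    """Stub: holds per-citizen prefilters that run before delivery."""
    def __init__(self) -> None:
        self.prefilters: list = []

    def register_prefilter(self, f: Callable) -> None:
        self.prefilters.append(f)

def block_sender(sysop: SysopAPI, sender: str) -> None:
    # The "single thought": mail from sender is dropped unread, before
    # it ever competes for Bill's attention.
    sysop.register_prefilter(
        lambda msg: None if msg.sender == sender else msg)

# Usage: Bill decides Charlie's memes never get a hearing.
#   block_sender(my_sysop, "Charlie")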

> The result? Exponential cults win (in a weird way).

Only for people who like exponential cults...

> These seem to me to be the interesting kinds of problems that arise
> once you've chosen volition as the "be all, end all" metric of the
> universe (and yes, I currently subscribe to this view).

Hm, it looks to me like, in a volition-dominated Universe, only people
who want to be taken over by exponential cults wind up in them... isn't
that sort of the point?

-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence


