From: Durant Schoon (email@example.com)
Date: Mon Jun 11 2001 - 13:15:25 MDT
In March, Eliezer mentioned the notion of "coercive transhuman memes" while
discussing another issue:
> Date: Tue, 20 Mar 2001 05:02:56 -0500
> From: "Eliezer S. Yudkowsky" <firstname.lastname@example.org>
> To: email@example.com
> Subject: Re: How To Live In A Simulation
> What do I mean by a "commonsense" solution? I mean one in which
> second-order moral imperatives are applied with room for slack, rather
> than being absolute. By "second-order", I refer not to basic
> Friendliness, but to consequences of Friendliness - for example, the idea
> that Old Earth needs to be kept free of coercive transhuman memes, and
> thus the idea that transhumans and humans shouldn't interact.
My question is: What boundaries can we expect to form when intelligence
goes through the roof? I'm sure this is a question to be answered by an SI
or Sysop, but maybe someone would venture some interesting speculation in the
meantime.
Imagine that transhuman Bill becomes vastly more intelligent than human Alan.
So much more so that Bill can easily influence Alan to do things Alan might not
want to do on his own.
Fortunately, the Sysop is there to protect Alan by advising against actions
which might have deleterious outcomes for him.
Now, imagine Charlie. Transhuman Charlie is sufficiently more intelligent than
transhuman Bill. Again, our favorite Sysop can warn Bill off from doing anything
he might not want to do, if only he knew all the consequences. But there is a
new twist in this case: transhuman Charlie might convince transhuman Bill
to modify vis own volition so that accepting new memes from Charlie becomes
highly desired. Transhuman Charlie might advertise his suggested memes so
cleverly that transhuman Bill chooses them with no violation of vis volition.
The result? Exponential cults win (in a weird way).
Cults do not violate volition, do they?
Will there be protections instead? Classifications of Transhuman Intelligence?
Level N may not try to influence anyone below Level N....
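The Level-N rule above could be sketched as a simple permission check. A toy model, purely illustrative: the names `Agent`, `Sysop`-style `may_influence`, and the integer levels are my own assumptions, not anything from the Sysop scenario itself.

```python
# Toy model of "Level N may not try to influence anyone below Level N".
# All names and levels here are hypothetical illustrations.

from dataclasses import dataclass

@dataclass
class Agent:
    name: str
    level: int  # assumed classification of (trans)human intelligence

def may_influence(source: Agent, target: Agent) -> bool:
    """Sysop-style check: influence directed at a peer or at someone
    smarter is permitted; influence directed downward is blocked."""
    return source.level <= target.level

alan = Agent("Alan", 0)        # baseline human
bill = Agent("Bill", 1)        # transhuman
charlie = Agent("Charlie", 2)  # still more intelligent transhuman

print(may_influence(charlie, bill))  # blocked: downward influence
print(may_influence(bill, alan))     # blocked: downward influence
print(may_influence(alan, charlie))  # allowed: upward influence
```

Of course, the hard part the rule glosses over is detecting "influence" at all, which is exactly the Charlie-and-Bill problem above.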
These seem to me to be the interesting kinds of problems that arise once you've
chosen volition as the "be-all, end-all" metric of the universe (and yes, I
currently subscribe to this view).
-- Durant Schoon
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:36 MDT