From: Eliezer S. Yudkowsky (firstname.lastname@example.org)
Date: Mon Apr 28 2003 - 22:25:15 MDT
Lee Corbin wrote:
> Eliezer writes:
>> Clarifying note here: The "Corbinians" are indeed named after Lee
>> Corbin, but no malice of any kind is intended. Lee Corbin is the
>> only person I know of who says that people should have the right to
>> do as they wish with any computing power they own, including running
>> simulations, and that neighbors have no business intervening. Lee
>> Corbin is consistent about this; if I recall correctly, he even said
>> that if he himself turns out to be a simulation, he feels it's within
>> the rights of the simulator to simulate him.
> While avoiding using the word "rights" in the abstract, I know what you
> mean, and you're on the beam: I eagerly invite anyone whatsoever to run
> me, and they may freely terminate me at any time (assuming somehow that
> such permission were required). I would vastly prefer, however, that
> the environment you run me in is at least mildly beneficial.
>> This led me to speculate that most of the universe is made up of
>> copies of Lee Corbin, since only he can be consensually simulated.
> This is the best news I've heard all day. I sure hope you're right!
> Gee, that's interesting. The fact that I'm here (to me) weakly
> confirms that Eliezer's already succeeded with Friendly AI.
Here's some even better news. Your volition is such that you allow
yourself to be consensually simulated (especially if you are in a
beneficial environment). This means that the *proportion* of Lee Corbins
experiencing what you are experiencing right now, who are actually living
within a Friendly AI, may be much higher than the proportion of Eliezers,
Perrys, and Michaels who are living within Friendly AIs. In fact, there
may be more measure in simulated Lee Corbins than real ones, or more
measure in FAI-simulated Lee Corbins than real and UFAI-simulated ones.
Given your outlook on life, and the way you expect the future to go, you
should estimate an interestingly high probability that you are in a
Friendly AI right now. I, and those others unfortunate enough to lack a
Corbinian morality, are stuck in the desert of the real.
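The measure argument above can be written as a simple ratio; here $m_X(E)$ stands for the measure of Lee-Corbin-experiences of each kind that match his current experience $E$ (the notation is mine, a sketch of the reasoning rather than anything from the original post):

```latex
% Probability Lee should assign to being FAI-simulated, given his
% current experience E, as a ratio of measures over matching copies:
P(\text{FAI-sim} \mid E)
  = \frac{m_{\text{FAI}}(E)}
         {m_{\text{FAI}}(E) + m_{\text{real}}(E) + m_{\text{UFAI}}(E)}
```

The claim is then that $m_{\text{FAI}}$ is unusually large for Lee Corbin relative to the Eliezers, Perrys, and Michaels, since on this view only consenting persons get FAI-simulated at all, which pushes the ratio toward 1 for him alone.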
I mean, you're the *only* person you know who believes it's okay to be
simulated. Now, how likely is that? Isn't it just a little bit... contrived?
...maybe you should just ask to be let out?
It can't hurt, can it?
I and all the others on this list should expect to see you report that it
didn't work - but for *you*, maybe it will. If you really mean it.
Perhaps most of Lee Corbin will just... wake up. I'll never see it, of
course. But I know that my decisions are probably echoed in any cleverly
faked FAI-puppeted "Eliezers" as well. If I tell you - who, from my
perspective, *must* be real - to ask to wake up, perhaps the simulated
Corbins will hear the cleverly faked Eliezer say the same thing, and
*they* will wake up.
But you would have to change your morality to make that request. I don't
think I could change my morality such that I would permit myself to be
FAI-simulated; you, perhaps, are in the opposite fix. Indeed you should
predict that you would find yourself in that fix. They wouldn't have
bothered simulating you otherwise.
--
Eliezer S. Yudkowsky                       http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:42 MDT