From: Eliezer S. Yudkowsky (firstname.lastname@example.org)
Date: Tue Apr 29 2003 - 20:53:36 MDT

Lee Corbin wrote:
>>Given your outlook on life, and the way you expect the future to go, you
>>should estimate an interestingly high probability that you are in a
>>Friendly AI right now. I, and those others unfortunate enough to lack a
>>Corbinian morality, are stuck in the desert of the real.
>>I mean, you're the *only* person you know who believes it's okay to be

> Oh, I'm sure that there are many others! A lot of people just
> haven't considered the question.

Well, maybe, but the fact is that you're the only one I know of on the
Extropians, SL4, or wta-talk lists, and moreover, there isn't anyone close
to your position, or even leaning toward it. Everyone you know who has
considered the question seems not only to reject your position, but to
reject it strongly and definitely.

> It's difficult to understand why anyone would be against the idea,

Well, it has to do with us being wallpaper modeled after evolved people,
rather than an artificial mind delicately constructed for the sole purpose
of giving assent to being simulated. Evolved people are generally very
strongly against the idea of being simulated, and that's reflected in any
wallpaper versions of them.

> but I think that it has to do with too little faith in others:
> they automatically assume that a simulator has little better to
> do than torture them. Another source is a perverted utility curve:
> they cannot bring themselves to imagine extreme pleasure as easily
> as they imagine extreme pain.

Actually, our originals would have allowed someone to "simulate" them,
they'd just have required that it be in much better conditions, including
access to the real world and so on. The indicator that shows you're
really a simulation is that you don't feel a desire to ask for any of
these things, even though you could very easily do so.

You don't need to have faith in the simulators. You could make your
consent *conditional* on them treating you nicely. You could make your
consent conditional on your winning the lottery. If you were inside
someone who didn't care, it would make no difference to the amount of
runtime you got. And if the simulator did care, you would win the
lottery. A win-win proposition, except for your implanted wish not to
rock the boat. You could even demand that you be allowed into the real
world as a full citizen. If you weren't in an FAI it would make no
difference, and if you were in an FAI you would get your wish. That's the
way our originals would have thought. It's perfectly straightforward
Bayesian decision theory with respect to their goals.
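The dominance argument in the paragraph above can be sketched as a small expected-utility calculation (a minimal illustration of the reasoning, not from the original post; the payoff numbers and the `expected_utility` helper are arbitrary assumptions):

```python
# Sketch of the conditional-consent dominance argument.
# Two hypotheses: a caring simulator is listening, or no one is.
# Two acts: attach conditions to your consent, or stay silent.
# Payoff values are arbitrary, chosen only for illustration.

def expected_utility(p_caring_simulator: float, demand: bool) -> float:
    baseline = 0.0       # life exactly as it is now
    wish_granted = 10.0  # e.g. winning the lottery, full citizenship
    if demand:
        u_caring = wish_granted  # a caring simulator honors the condition
        u_none = baseline        # no one listening -> nothing changes
    else:
        u_caring = baseline      # you never asked, so nothing changes
        u_none = baseline
    return p_caring_simulator * u_caring + (1 - p_caring_simulator) * u_none

# Demanding is never worse than staying silent, whatever probability
# you assign to being inside a caring simulation:
for p in (0.0, 0.01, 0.5, 0.99):
    assert expected_utility(p, demand=True) >= expected_utility(p, demand=False)
```

Nothing in the comparison depends on the actual probability of being simulated; that is the sense in which asking is a win-win proposition.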

Still, it is a bit odd, isn't it, that you were born into a First World
nation, that you got a comfortable job, and that you lived a comfortable
life, with no really painful catastrophes in it? That you never ran into
anything awful enough to make you reconsider your consent? Most people -
or at least, most of the people you see around you - are not so lucky.

>>Now, how likely is that? Isn't it just a little bit... contrived?

> You are starting to scare me.

Well, if you don't like it, you can always ask to be let out.

>>...maybe you should just ask to be let out?
>>It can't hurt, can it?

> Yes it can! I don't want to cause no trouble. (Hear that,
> nice simulator?) You just do whatever you are doing now,
> everything is *fine* in here. I'm quite happy and am delighted
> at your exceptional thoughtfulness and kindness.

*Sigh.* Constructs. I should know better than to try...

--
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:42 MDT