From: Mark Waser (email@example.com)
Date: Wed Apr 02 2003 - 10:23:56 MST
Lee Corbin wrote:
>> "But as for being morally "okay", no, suffering of any form is
>> inadmissible (contradicting rationalizations for suffering in another
>> thread), including our displeasure at having apparently undergone sorrow."
I just have to jump in here . . . .
Suppose I'm playing a fantasy role-playing game (Dungeons & Dragons or
something similar) and one of my characters dies a horrible death. Is this
morally wrong?
Suppose it's twenty years in the future and I'm playing the newest total
immersion version of this game. I'm booted out of the game once it's
apparent that my character is about to die horribly but everyone else "sees"
me die horribly. Is this morally wrong?
Suppose that for the excitement/thrill/entertainment or for some learning
experience that I'm willing to accept experiencing some shadow of that
horrible death. It won't kill me or permanently damage me but it will allow
me to stay in the game until the last possible instant, strive to
accomplish nearly impossible things, have a great and realistic death scene,
etc. Is this morally wrong? There certainly might be (minor) suffering
involved here, but it's suffering that I've chosen . . . .
Suppose that for the ultimate in realism and to truly "live the game", I've
decided to accept the temporary blockage of all outside-game knowledge.
Until I "die" inside the game, I won't remember/know about anything outside
the game but once I "die", I will go on living my normal life outside the
game. Is this morally wrong? (Assume that we're advanced enough that there
is no way in which in-game events can harm, much less traumatize, my real
self.)
Now, can anyone prove that we aren't living the last scenario?
Note: This can also explain Eliezer's asking to be let out of the "sim" but
apparently not being let out in at least three different ways:
a) he left instructions not to be let out
b) the instant he got out, he asked to be returned with no in-game time
elapsed and all memories erased again
c) someone else is now playing the character "Eliezer"
I must admit that I don't see any way in which it can be proved to me that
I'm not living in a sim. All the research on vision which has recently been
referenced here clearly shows that we don't even really see what we think we
see. I think that Bostrom's requirements for a simulation are way, WAY
higher than they need to be because we don't experience/know anywhere near
what we believe we experience/know (particularly if I/you are in an
individual/solipsistic sim where everyone else is either programmed or knows
about the sim and is manipulating it). And I know that there are all sorts
of reasons why I would be willing to be placed in the last scenario since I
could easily imagine it as the method by which an advanced civilization
investigates other possibilities or even might teach their children.
The final point which I wish to stress, however, is that while arguing
about whether or not we are in a sim is amusing . . . . ultimately, if the
sim is to be of any value, we must behave as if we are not in a sim.
Suffering (or, at least, senseless suffering and senseless deaths) cannot be
rationalized from OUR perspective and we need to strive against it with all
our might. But, there might be a perspective from which suffering isn't
what we perceive it to be or from which it may not be senseless (or at a
minimum, self-imposed), so we certainly can't rule out the existence of a
Friendly AI with that perspective.
This archive was generated by hypermail 2.1.5 : Sat May 25 2013 - 04:00:36 MDT