Re: Time and Minds

From: Mitch Howe (mitch_howe@yahoo.com)
Date: Thu Sep 27 2001 - 20:50:24 MDT


Samantha Atkins wrote:

> Sure. Except for one thing. In the scenario outlined above this tactic
> doesn't really work. You will not "graduate" from this matter of existence
> with its attendant suffering as long as you employ such tactics. After
> enough go-rounds it is likely you will choose differently. Death is not
> release but time for evaluation and for living out consequences/further
> learning opportunities in another life.

I might feel pretty cheated if I were enjoying this life when someone else decided to cut it short with a hatchet. You imply that I might be justly rewarded with an improved reincarnation, but that sounds like buying a child a pony to make her forget the dog you backed over with your car. I would think it inevitable that some people would keep coming up with the short end of the stick in a recirculating world where humans are left to "work out all their kinks and problems" on their own. One might even call it cruel, like gladiatorial combat or cock fights.

But generally, I think the Thunderdome of Life seems like a needlessly complicated environment to maintain, especially given that you imply the friendly SI might use "rewiring" as part of its approach to improving the manners of those in its keeping. Why not just rewire them outright, appropriate memories included, if what the SI wants is well-behaved beings with memories consistent with lifetimes of ethical effort?

But if we're talking about simulations of startling complexity, maybe an SI has so much power at its disposal that it doesn't worry about efficiency and can instead maximize the quality of the result. Rewiring would still seem most effective, but a close second might be the creation of a custom environment for each individual, providing precisely the experiences that would yield the intended growth, whatever that happens to be. I might be living in such a simulation, having never in my so-called life encountered another self-aware entity -- another "real" human being. You would deny this, of course, but I would expect that from a sim.

This takes us all the way back to the question of consciousness, which I believe is actually the stickiest one of all -- much stickier than the Boolean value of God. Is it possible to create sims as realistic as those I might regularly associate with, without their having consciousness and being entitled to the same rights and protections as I am? Can consciousness be created and/or destroyed? If it is possible to destroy a consciousness, is ceasing to exist a "negative consequence" that makes such murder evil? It seems a little like dividing by zero.

While on the topic of friendliness and unfriendliness by individuals in simulations, I want to briefly bring back the idea that the human mind is in many ways a simulator. The brain seems to use the visual cortex as a workspace in which to model speculations, dreams, and strategies. The question of whether it is wrong to play out a murder or other unfriendly act in one's own mind is a valid one, especially if it is possible, either now or in the future, for someone to recognize that they are living in a simulation. Consider the following scenarios:

Scenario 1: A pair of teenagers enjoy competing for frags in Quake. They also dislike many people in their school who make them feel inferior, and in one of their more creative moods they create a Quake map based on their school, filled with enemies "skinned" with scanned photographs of their most hated peers. They spend many hours enjoying themselves in this virtual environment.

Scenario 2: A woman really, really hates her boss. To her mind he is an arrogant, manipulative predator, and she gets through long hours of work pondering different poetic ways in which she could kill him.

Scenario 3: A monk from some mountainous nation, after much meditation, becomes thoroughly convinced that life is but a dream -- a simulation to be escaped at death. He gets into a heated argument with a fellow monk over this issue. The next day he murders the man with whom he had the disagreement, reasoning that this will surely prove to the victim that life is just a dream. Because he knows it is a dream, he feels no remorse and sees no fault in his action, and soon kills himself to enter a more real existence.

Each of these scenarios involves "simulated" murder. How different are they, really? It is easy to say that whatever action one takes in the outermost shell of simulation is what "counts", but why should one simulation be inherently superior to another (especially if it turns out that each individual is alone in a private virtual world)?

What if the inherent value of actions that occur in a simulation is just a question of the simulation's bit rate and resolution? Could a virtual environment become more real and important, morally, than life as we know it?
 

> That death seems irreversible doesn't mean it is or that it will always
> be so. Unbelievers who believe that this life is it (plus or minus
> cryonics, longevity advancements, uploading and so on) are also likely
> to believe that just offing their enemies gets rid of them for good
> and, if they don't get caught or punished locally, then the strategy
> has paid off. Relatively materialistic and atheistic regimes do not
> have a very sterling history for sparing people from genocide either.
> This suggests that what leads to genocide is orthogonal to religious
> belief or the lack of same.

No argument on that last point. I think religion is often mistaken for the cause of evil acts, when in fact it is usually just used for PR spin, whether public or private.

--Mitch Howe



