From: Jeff Bone (firstname.lastname@example.org)
Date: Mon Dec 03 2001 - 23:34:27 MST
Emil Gilliam wrote:
> I find it difficult to believe that nonconsensual simulations should
> not be allowed, ever ever ever.
(1) Human beings (as examples of "intelligent" beings) are great at justifying
nonconsensual / coercive exercise of will over other intelligent beings.
(2) You might want to be aware that there are a large number of software "toys"
today whose purpose is exactly that --- simulating "life" in various
environments. Do you really think that we'll stop being amused by these at some
point, just because the quality of the simulation advances to the point that the
simulated beings are self-aware?
I'm not saying this is right, I'm just saying let's not be naive about human beings.
> And of course, you see where this is going. Suppose we have six
> billion happy puppeteers controlling six billion "suffering" human
> puppets in a simulation, including some who have the line "I hope I'm
> not in a simulation; if I were, I'd want out of it now!" (Don't
> nominate this drama for a Pulitzer.) Is this immoral?
What a question. What do you mean by morality?
> The possible resolutions:
> 1.) What Bob does is immoral and should not be allowed.
Yes, but now you are letting your own subjective morality override Bob's expression
of free will. Is that moral? Is it *ever* moral for party A to interfere with the
actions of another party B, when B's actions have no impact on party A? Isn't that
just moral "busybody-ness?"
> I truly fear
> for our freedom if a Friendly AI decides that a street puppeteer
> Pedestrian can't even put on an innocuous tragic puppet show (or, for
> that matter, that nobody can perform Shakespeare).
I fear for our freedom if we cede control of the physical world to any Power,
Friendly or otherwise, for "protective" purposes.