From: Lee Corbin (email@example.com)
Date: Thu Apr 03 2003 - 20:32:47 MST
> > [Eliezer writes]
> > > No. I am reasonably certain that fictionally imagined characters
> > > don't have qualia. I'm in really, really deep trouble if they do.
There is no chance they have "qualia". The imagined characters
are only depicted, only described, only portrayed. It requires
specific code to go beyond that and emulate a being.
> > So, as AI empowered characters in such games advance, perhaps to
> > the point of having real qualia, it eventually becomes morally
> > wrong. Where is this point? How can it be determined?
If this were a gradual process, i.e., providing characters
in games with experiences (or, if you don't follow Dennett,
giving them "qualia"), then it would always be a matter of
degree. How many fleas *must* your dog have before you do
something about it? How can that point be determined?
Come now! We know how to deal with continua; the problem
only comes up about 10^6 times a year.
> > If we exist within a sim/VR for the amusement of more powerful
> > beings then that at least is immoral even though we might be
> > strictly their creations.
All one ever NEED do for characters is provide *portrayals*.
Nothing more; nothing more is *ever* needed unless for some
reason you wish to equip your simulation with *emulations*.
If (for some benevolent reason, shall we say) you decide to
equip your characters with feelings and thoughts, so that in
effect they are emulating some real or imagined person, then
you must give them thoughts and feelings they'll enjoy. But
for a game or story, portrayal is all that is required.
> If we are [simulated and emulated], then they [the VR
> controllers] have apparently decided that it isn't immoral.
Any bit of egregious pain is immoral.
> Maybe from their perspective, we are predictable automatons, and
> comparable to their insects. In their opinion, we probably lack some
> property that is the essence of experiencing pain and suffering.
> A higher order of qualia, maybe? "Qualia^2"?
Yes, one might at first suppose that beings sufficiently advanced
beyond us wouldn't care, so comparatively trivial are our emotions
to theirs. But this isn't right. When we get to the point, and
it will be soon, that we can simulate ants, then we'll quickly
reach the point beyond that where we must choose either to continue
to merely portray them, or instead to emulate them.
If we emulate them and furnish them with even a modicum of unpleasant
experience, then we are guilty of a moral crime. Anyone here should
be able to see that it is unacceptable to emulate creatures who are
having pain, or even unpleasant experiences. Depicting or portraying
characters never crosses that line unless we specifically build it in.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:42 MDT