From: Lee Corbin (email@example.com)
Date: Sun Apr 06 2008 - 10:47:05 MDT
> [Lee wrote]
>> [Matt wrote]
>> > That is what I meant. But choose your own definition if you prefer.
>> > If we could agree on a definition then the rest would be easy.
>> I'm pretty leery of definitions (Korzybski explained very well
>> how easy it is for Aristotelian definitions to be useless or
>> even dangerous), but I would say that consciousness is
>> the process whereby a program or operating brain reports
>> upon its own substates to itself, and is then able to distinguish
>> these reports from the substates (informally, "so that they can
>> be 'thought about'"). By "substate" I mean a particular sense
>> organ's state, a report of a sense organ's state, or a recall
>> of some memory, all of which is recursive, so that a report
>> of a report may also be considered a substate.
>> This is admittedly pretty rough, but I've heard people say much
>> the same thing, and it sounded right to me.
> In that case Homer without a hippocampus is conscious,
You seem amazingly neutral about the meaning of such
concepts. Are you so sure that mathematics is meaningful
to any of our inquiries if the real-world meanings of
any terms used in the math are entirely arbitrary?
> but so are many computational processes, for example a
> database that logs read-transactions to itself.
I would suggest that yes, to a very minimal degree, programs
(and bacteria) achieve consciousness. Not only is this important
for future actions that we may take, e.g., whether to bother
getting yourself cryopreserved or uploaded, but it is also a
necessary guide to certain moral actions (e.g. whether it is
better for humans to go without certain medical treatments or
for animals to suffer, and how we might go about systematically
arriving at consistent answers to such questions).
> But consciousness is just a distraction, a futile attempt to
> extend our ethics to AI.
Well, I'm not quite sure why you consider it futile. After all,
you will sooner or later (assuming that things go well for you)
be in a position to choose certain outcomes at the expense
of certain other outcomes. Have you no desire for a guide
to which actions of yours will be consistent with other
actions? May I inquire as to whether you consider your
survival beyond normal human lifespan to be possible, and
if possible, desirable?
> Which machines have rights?
I shy away from the term "rights", having found expressions
involving it to be very misleading or meaningless. For example,
you often hear "X has the right to do Y", which so far as I
have ever been able to tell is equivalent to "I (or we) approve
of X being able to do Y".
But so as not to evade your question, my own opinion is that of
the two things that have contributed most to the progress
of western civilization, (1) respect for legal rights and (2) respect
for private property, it is the second that should take
precedence in these discussions. In other words, we have
to begin by regarding the present owners of hardware and
software as entitled to do as they please with their property.
(Eliezer and I argued about this as early as 2001 or 2002,
however.) Further comments about it should *definitely*
be begun in a different thread. But you asked, and I have
answered.
> Is teleportation ethical?
Absolutely. In the first place, in my opinion (again), if any process
whatsoever is available, then any adult human being or other
citizen may freely utilize said process insofar as it directly
affects only himself or herself. On the other hand, to teleport people
against their will would be as ethically questionable as, say,
strapping them onto a roller-coaster against their will, even
though we believe that ultimately the ride is harmless.
> (Can I kill you if I just made a copy of you?)
I would consider that to be very impolite. I do not approve
of one person going around making copies of arbitrary
people he encounters. However, if I have granted you
permission to make a copy of me, then in my opinion you
may do to the copy what you like, provided only that you
do not commit the immoral action of providing me with
additional experiences that have negative worth (e.g.
unpleasantness). Again, I realize that these statements
are incendiary, and beg that new threads be started to
discuss any of them.
> Is a nondestructive upload really you?
It *certainly* is me, if the upload has been what we call
successful, i.e., if with any foreseeable realistic technology
we obtain a program that is functionally equivalent to me.
(Now *this* question does totally pertain to the subject
line, for the heresy that I have committed is my specific
claim that, under extreme conditions that are probably
impossible in practice, a fully functional version of me
implemented through a GLUT could *appear* to be me, but
actually have no experiences whatsoever.)
Now to focus on one of your words, here, whether or not
it is *non-destructive*, as you have stipulated, is totally
irrelevant. The "original" may be light-years away, and
so the upload at hand either has certain characteristics (e.g.
"is me") or it doesn't.
> Forget consciousness. Just do the math.
I will again say that writing one mathematical equation
after another, unless its terms pertain to valid concepts
outside the mathematics, is only an exercise best left to those
just acquiring facility with mathematics.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:02 MDT