Re: The role of consciousness (Re: The GLUT and functionalism)

From: Jeff L Jones (jeff@spoonless.net)
Date: Mon Apr 07 2008 - 22:56:45 MDT


On Mon, Apr 7, 2008 at 8:54 PM, Lee Corbin <lcorbin@rawbw.com> wrote:
> By the way, Jeff mentioned that it would be cruel and
> unethical to not pay the $2, but we get to the heart
> of the problem by supposing---of course, entirely
> hypothetically---that Matt (or whoever is speaking,
> i.e., the subject of the query) is what I have called
> an MSI (a most selfish individual). We wish to ask
> what an MSI would do, and what are the logical
> reasons one way or the other for his choices, given
> that the MSI values existing.

I think the question of what a "most selfish" individual would do is
not well-defined, since the self is not well-defined. It all depends
on which of your future copies you regard as your "future self", which
is purely a language convention.

To add to Matt's original thought experiment, here is a chain of
related thought experiments I find interesting:

1. Is it ethical to use a "date rape" drug on someone who isn't going
to remember much in the morning, but will be unable to resist your
advances during the night?

2. Is it ethical to rape someone and then erase their memories
completely (or fix them afterwards so that they believe it never
happened, or they have a blackout)?

3. Is it ethical to torture someone and then erase their memories of it?

4. Would you be upset if someone started torturing you and then said
"don't worry, I'll erase your memory later."?

5. Would you be upset if someone started torturing you and then said
"don't worry, I've made a backup of your brain and I will use it to
make an exact copy of you on my home planet, as soon as I figure out
how to do so... which should only take a few years. I'm not going to
torture your copy when it wakes up, so it's fine if I torture you now.
 In fact, I'm going to kill you soon so it won't matter at all. Your
copy won't remember any of this."?

6. Would you be upset if someone started torturing you and then said
"don't worry, I've *already* made a copy of you, so you are redundant."?

7. Would you be upset if someone started torturing you and then said
"don't worry, I've already made a copy of you *and* I'm going to erase
your memories soon, so you won't be you at all but the copy will be."?

8. Would you be fine with such torture happening to you, as long as a
copy was made somewhere else, and as long as you consented to it
beforehand?

I think number 8 is essentially the question that Matt is asking...
but I don't really see how 8 is different from 7, or 7 from 6, etc. I
would answer "no" to all of these for the same reason. Being tortured
would upset me... I wouldn't want anyone to do it to me, and I
wouldn't want to do it to anyone else (including my "original" future
self), regardless of how many copies of me were made and when.

Jeff



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:02 MDT