Objective Anticipation

From: Jeff L Jones (jeff@spoonless.net)
Date: Sun Mar 16 2008 - 02:08:20 MDT


On Sat, Mar 15, 2008 at 9:24 PM, Stathis Papaioannou <stathisp@gmail.com> wrote:
> I'm not sure what you mean by "objective anticipation".

Let me explain it a bit further, comparing and contrasting what I'm
calling "subjective anticipation" with "objective anticipation"...

Subjective anticipation is anticipating what "your" next experience
will be, based on some subjective notion of personal identity.
Reincarnation is a good example of subjective anticipation. Anyone is
free to believe that when they die, their
mind/soul/consciousness/essence will be reincarnated as someone else,
perhaps even someone in the past. So, for example, if I believe that
when I die I will be reincarnated as Jackie Kennedy... then on my
deathbed I might be (subjectively) anticipating that my next
experience is going to be Jackie's formative years in 1929. I don't
think there is anything inconsistent about this belief, but it's
important to note that it is a *subjective* belief and doesn't affect
any objective properties of the world. Someone might object that
Jackie has different genes from me, and different memories. But
people can get amnesia and you would still call them the same person.
So as long as you hold onto a notion of subjective personal
identity... that there is some *subject* experiencing things, rather
than simply your brain, as an object, processing information... then
you are free to believe whatever you like. Taking the reincarnation
example further, there is nothing stopping me from anticipating that
every 5 seconds I experience different snippets of different minds
all over the globe, or perhaps even each one at a different time.
Every 5 seconds, of course, I forget where I was during the last 5
seconds and instead immediately become aware of whatever memories the
new person I'm in has. There is no experiment anyone could do, or
evidence they could present to me, to convince me otherwise, since
this is just as
consistent as John Clark's belief that if he is copied 1000 times and
tortured 999 of those times, he will have a 50% chance of subjectively
experiencing torture in the next moment. Both beliefs are fine; they
just have no impact on the objective world and no relevance to
discussions about building a good AI.

In contrast, objective anticipation does impact the world. I don't
have to view myself as a subject at all. I can just consider the
facts... regardless of where I believe my personal identity is stored,
I know that if my brain is copied 1000 times and put into 1000
different bodies and sent out into the world, then it is going to have
1000 times the impact on the world. Each of those 1000 copies can
make his own decisions, but all of them will have my memories, skills,
genes, desires, way of thinking, etc. So whatever goal I have, I can
accomplish it in 1/1000th the time if I make the copies; all I need
to do is perform the task in parallel. Since I know how my mind
works, I know that if I set my intention to do something, then unless
a good reason presents itself to change my mind, I carry through on
that plan. This means that, regardless of whether I view
those 1000 copies as "me" or simply people with my memories, I know
that I have a strong influence on them. Much stronger than any
influence I have on anyone else in the world. If I set my intentions
to do something, it affects what they do. If I hone my skills then
they are given those skills. If I figure something out or expand my
knowledge, then they are given that knowledge.

So given this way of thinking about the copying process objectively,
instead of worrying about some "personal identity" surviving or not
surviving (which I would argue is meaningless), there is only one
right answer to what I should anticipate *objectively*, regardless of
what my goals are. I want to maximize the survival of my copies, not
because I'm worried about my personal identity surviving, but because
that maximizes my current causal influence on the world. No matter
what my goals are, I have a better chance of accomplishing them if
more of my copies survive. (No doubt, evolution has helped implant
this sentiment in me!) So to help them survive, I should put my brain
into a state that is useful for them. Believing that I'm going to be
Jackie Kennedy in the next moment, right before I get copied, does not
help prepare them for what's going to happen to them. They will have
my memories, so the first thought that goes through all of their heads
will be "oh,... I'm not Jackie Kennedy. guess I was wrong!" If I'm
told that 999 of them are going to have a giant rock fall on them 1
second after they are booted up, and the other 1 of them is going to
have giant rocks fall all around him but *not* exactly on him... then
I should anticipate a 99.9% chance that a giant rock is going to fall
on me. If I do that, then all of them will run as soon as they boot
up, which will cause 999 of them to avoid being hit by the boulder,
and the other 1 to be killed. Taking John Clark's strategy of
anticipating a 50% chance that the rock will fall on him will result
in... on average... 500 of them running and 500 of them staying put
(assuming they choose to do one or the other with probability .5, which
would be the rational thing to do assuming your objective anticipation
is correct). The result of John Clark's anticipation would be roughly
500 of them being killed and 500 of them surviving, plus or minus 1
depending on what the one who had the rocks drop all around him
instead of on him decided to do.
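
(To make the arithmetic concrete, here's a rough Python sketch of the
scenario above. It's purely my own illustration; the constant names
are made up, and the numbers just mirror the paragraph.)

import random

N_COPIES = 1000   # 999 copies have a rock aimed directly at them; 1 does not
N_TRIALS = 10000  # Monte Carlo trials for the coin-flip strategy

def survivors(run_probability):
    # Count surviving copies when each copy runs with the given probability.
    # A copy with a rock aimed at it survives only if it runs; the one copy
    # with rocks falling *around* it survives only if it stays put (running
    # moves it into the path of a surrounding rock).
    alive = 0
    for i in range(N_COPIES):
        runs = random.random() < run_probability
        targeted = i < N_COPIES - 1   # the first 999 copies are targeted
        if targeted:
            alive += runs             # survives only by running
        else:
            alive += not runs         # survives only by staying put
    return alive

# Anticipating a 99.9% chance of being hit: every copy runs.
print("always run:", survivors(1.0), "copies survive")            # -> 999

# John Clark's 50% anticipation, acted on with a coin flip as above.
average = sum(survivors(0.5) for _ in range(N_TRIALS)) / N_TRIALS
print("coin flip: about", round(average), "survive on average")   # -> ~500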

I hope this clears up what I mean by "objective anticipation", why
it's more useful than subjective anticipation, and why using
subjective anticipation instead of objective anticipation can
seriously cripple your chances of carrying out future goals, whatever
they may be.

> Suppose you
> have the choice of either being given a million dollars, or having 10
> copies of yourself made who will then have their memories wiped and
> each given a million dollars. Subjectively, I would like to be given
> the million dollars. Objectively, my copies will prosper if I choose
> the second option,

First, it depends on what you mean by "their memories are wiped". How
much do they remember? They must remember *some* things (like how to
walk and talk), otherwise they will certainly not be able to "prosper"
even if given the money. They won't even know what money is! But
assuming you give them something similar to amnesia where they
remember most of your skills, and some general life principles, but
not specifically who they are or what their life used to be like... I
think the situation is somewhere in between what I outlined above
(with full memories) and having kids... probably somewhat closer to
the having-kids scenario. You're passing on your genes, but
not your memories. So your causal influence on your clones is
limited. It's still better than just having kids, because you could
perhaps train yourself in certain things or give yourself certain
skills or subconscious desires before the memory wipe. But it's not
nearly as good as cloning yourself with the memories intact. And it
doesn't matter much what you anticipate in that case since they can't
take advantage of it. At any rate, you could ask the same question
without the cloning part. Would you rather get a thousand dollars and
keep your memories, or get a million dollars and get amnesia? I (like
most people, I suspect) would go for the thousand dollars, because my
memories are worth more than a million dollars to me. But if you make
the amount high enough, perhaps it starts to sound like a good deal :)

Jeff


