From: Charles D Hixson (firstname.lastname@example.org)
Date: Wed Jul 26 2006 - 11:59:47 MDT
R. W. wrote:
> Yes. ACCEPTING mortality. I don't expect love or even rationality.
> In fact, I don't expect any response. What good is there in outliving
> all the stars in the universe?
You will probably be interested in a book that Charles Stross is working
on. Its working title is "Halting State".
I, personally, expect that there is a finite amount of extension that is
reasonable for any particular person. I also suspect that it's not the
same for all. Of course, my suspicion is on very poor footing,
evidentially speaking. The current evidence is basically that when
people's bodies start breaking down in ways that they believe to be
irreversible, they tend to lose interest in keeping them going. Mine is
just a suspicion that any finite state machine will eventually end up in
either a loop or a halting state. And that if you notice the loop, you'll
eventually choose to get out of it. So if you are intelligent enough to
notice the loops you get into, you'll eventually arrive at a halting state.
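That "loop or halt" intuition can be made concrete. A minimal sketch (my illustration, not anything from the original post): for a deterministic process with finitely many states, following the transition function from any starting state must either reach a state with no successor (a halting state) or revisit a state it has already been in (a loop). The names below are hypothetical.

```python
def run(transition, start):
    """Follow `transition` (a dict of state -> next state) from `start`.
    A missing key means no successor, i.e. a halting state.
    Returns ("halt", None) or ("loop", first_revisited_state)."""
    seen = set()
    state = start
    while state is not None:
        if state in seen:
            return ("loop", state)    # revisited a state: stuck in a cycle
        seen.add(state)
        state = transition.get(state)
    return ("halt", None)             # no successor: a halting state

looping = {0: 1, 1: 2, 2: 1}   # 0 -> 1 -> 2 -> 1 -> ... a cycle
halting = {0: 1, 1: 2}         # 2 has no successor, so the run halts

print(run(looping, 0))   # ("loop", 1)
print(run(halting, 0))   # ("halt", None)
```

With finitely many states, the pigeonhole principle guarantees one of these two outcomes; the post's further step -- that a mind which *notices* its loops will keep exiting them until only halting remains -- is the speculative part.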
Now as to whether this would occur before or after the last star has
died... I don't see this specified by the problem conditions, so I
expect a variability in the responses. A question that is to me more
interesting is "would these minds see merger as a viable option?" (Think
Spock's mind meld made permanent.) If some trans-human entity has a
stronger "theory of mind" of your mind than *you* do, can you die
without it being willing to allow its model to terminate? (I.e., what
do you mean by "I"?) Etc.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:56 MDT