From: Samantha Atkins (firstname.lastname@example.org)
Date: Fri May 02 2003 - 00:55:07 MDT
Simon Gordon wrote:
> The reason I am more concerned with Hell worlds than
> Apotheosis worlds is that in Apotheosis worlds the
> sentients have it easy; they don't need to suffer. Of
> course we can still empathize with beings that don't
> suffer, but that's a bit of a waste of our empathy. As
> humans it is natural for us to react more to those who
> do suffer, and to extend our sympathies most to those
> who suffer the most.
Well, I don't tend to center my concerns or my drives
around suffering. It feels healthier, and I function better,
when I am focused on what is possible to achieve for myself
and for as many others as I can possibly gift with new
possibilities and share parts of life's adventures with. It is
a question of focus.
I have plenty of empathy for suffering beings, but it is not a
primary driver. I would much prefer, to the extent I can, that
there be no suffering beings requiring that kind of empathy. I
will not miss that sort of empathy (closer in some ways to pity)
when there is no longer a good target for it. Nothing stands
alone as a good to be preserved outside its useful context.
> Assume that FAI, or a proliferation of FAIs sharing a
> basic ethical system NOT alien to humankind, is
> more or less ubiquitous throughout the multiverse.
> Then the subjective probability of your quickly jumping
> into an Apotheosis state, if you are already unlucky
> enough to be in a Hell state, is very VERY high.
How so? It is doubtful that such a transition is just a matter
of the FAI's desire. It might take a much more gradual movement
to get *you*, rather than something merely wearing your skin,
across that much of a gap.
> But but but... when I talk about "the cusp of
> infinitesimalism" I'm not joking. I'm well aware of the
> ridiculously low subjective probability of remaining
> in a Hell world for any extended length of time. Those
> sentients on the very cusp, however, who
> follow a trajectory a hypothesized devil would
> delight in, i.e. those who necessarily experience
> indefinitely extended (and heightened) pain and
> suffering, are just as real as you and I. The fact
> that we are aware of their existence is indeed
> shocking. It's especially shocking to learn that there
> is absolutely nothing we can do about it, just as the
> word "necessary" implies.
Well, it is all pretty darn hypothetical, really. It is amusing
how the list is beginning to look like a discussion of Tibetan
Buddhism, with hell worlds and the equivalent of cloud dakinis.
> Given what I have said above about how our empathy works:
> it is directed towards those who suffer most. Far future AIs
> (assuming they haven't already abandoned their human
> empathy anchors) are going to have some obvious
> problems with this "necessary-ness". In their time and
> space, all sentients around them will presumably be
> living in relative bliss (the Apotheosis state, as you call
> it), and it's very difficult to see a need for local use
> of empathy in such a state. Why would you need to care
> about someone who you already know is in absolutely no
Why would it be rational to care about what you cannot
conceivably do anything about?
> I suggest that in order for these future beings to
> keep their anthropic empathy, they will have to target
> it elsewhere, at those beings that actually suffer.
I don't see why this is of overriding importance.
> And I see the obvious target of this as the necessary
> cusp of suffering implied by the Level IV multiverse.
> All such worlds would be accessible by simulation, so
> the need of advanced beings to express their
> heightened humanlike empathy may end up becoming quite
> a morbid pastime, as beings try to simulate the
> feeling of being in a Hell world and experiencing
> intense pain.
Whatever in the multiverse for? If experiencing such things
cannot help the beings in question, then it is little more than
emotional masturbation. It certainly is not rational.
> [Aside: can there be such a thing as the
> qualia of infinite pain felt by a consciousness in a
> single moment? We had better hope not!!] I'm beginning
> to think that at some stage, when advanced sentients
> finally reach the state of maximum local apotheosis,
> it might be a very wise idea for them to abandon their
> empathy, at least as we know it in its human form.
> This could be an unnecessary sacrifice of at least one
> quality of humanness, though, so I'm still in favour of
> extended ignorance holidays :-)
I am not terribly interested in preserving "qualities of
humanness" or any other artifacts and aspects of past states of
being beyond their usefulness and/or the happiness I derive from
them. Why would I be? I think it would be a disservice to
ourselves to cling to what no longer serves us just out of
squeamishness or sentimentality about letting go of
where/what we used to be. Carrying all that baggage could
really slow us down and even cripple us.
As I am currently in the process of jettisoning a lot of
mystical/religious baggage, I have some immediate understanding
of this theme.
This archive was generated by hypermail 2.1.5 : Mon May 20 2013 - 04:00:32 MDT