Re: Rationality and altered states of consciousness

From: Samantha Atkins (samantha@objectent.com)
Date: Mon Sep 23 2002 - 15:42:42 MDT


Gordon Worley wrote:
> ... really believe it is simply a numbers game?
>
>
> Given my outlook on the Singularity, it is mostly a numbers game.
> Because someone has to create the Singularity, some people end up being
> worth more than your average human because if you killed some or all of
> them the Singularity wouldn't happen.
>

The last is singularly unfortunate language. The Singularity is
a subgoal, not the supergoal, to me. The supergoal is the
evolution and transformation of humanity, including what various
paths call "liberation," and the continued growth and well-being
of all sentients. I see the Singularity as very helpful (to say
the least), as a stage in this process. It is not the end
goal to me. My primary goal does not support the view that some
sentients "are worth more" than other sentients.

>> There is also the small matter of shaping both the Singularity and the
>> people to insure, to the extent we can, the best outcome. This is not
>> simply quantifiable. Nor again, do I believe creating a Singularity
>> intelligence is sufficient to solve everything.
>
>
> I still don't think that this will be an issue. If we have a Friendly
> Singularity, some Friendly means of transition will be employed.

Transition is a process already occurring now. Everything from
here on out to Singularity is part of it. All of that will
determine whether we even arrive at Singularity, how soon, how
many of us arrive there, and to some extent, how Friendly that
stage is and how friendly all the stages before were. We can't
just shrug off everything till the sweet by and by of Singularity.

> This,
> for example, might be the angels in Mike Deering's story. I don't know
> what would work best nor what would be most Friendly. This is something
> that the first FAI or the Transition Guide that it creates will handle.
> I'm not shirking responsibility for seeing to it that humans are
> transitioned to the Singularity, only that I am not going to be able to
> make the decision of how to transition humans because by that point an
> SI will be in town.
>

But by that point, if it comes at all, it will be much too late
for many millions if not billions of sentients.

> Anyway, if we have a safe Singularity where humans get a happy ending
> then I think that humans will be transitioned in a friendly manner
> regardless of their current state. But, you clearly disagree with me
> squarely on this issue, since you seem to think that a super intelligent
> Transition Guide will still not be capable of transitioning the average
> human into the Singularity. Why do you consider this to be the case?
> In what ways will the Transition Guide fail when dealing with normative
> humans?
>

I don't think the Transition Guide is the problem. What may
happen *if* we get to Singularity and *if* it is Friendly is not
what I am most concerned about today or in the immediate future.

- samantha
