Re: nagging questions

From: xgl (xli03@emory.edu)
Date: Mon Sep 04 2000 - 23:39:33 MDT


On Mon, 4 Sep 2000, Samantha Atkins wrote:

>
> Even granted that this Power is a much higher sentience, I still feel as
> if I am betraying humankind, betraying my own primary motives in working
> to bring it about sometimes. How do the rest of you deal with this?
> What am I missing?
>
> I know that the Singularity is eventually inevitable for some
> intelligent species and inevitable for us barring major disaster or some
> totally unforeseen bottleneck. But how can I be in a hurry to bring it
> about and still claim I work for the good of humanity?
>

        there's no denying it -- the creation of a yudkowskyian
transcendent mind (ytm) may be our salvation; it may just as well be our
doom. however, the same can be said for other ultra-technologies,
especially nanotech. the issue here is mainly one of navigation.

goal: survive the next 50 years;

facts:
        - accelerating technological progress is virtually inevitable;
        - any technological revolution carries significant risk;
        - different technologies differ in risk;
        - while we might not be able to suppress any one technology, we
          may be able to influence the order in which technologies
          arrive;

action:
        contribute my effort to increase the probability that the
technology with the least risk arrives first.

        in other words, one doesn't need to be certain of the eventual
outcome of one's work -- all one needs to believe is that under the
circumstances, one's present course of action is the most likely to lead
to a good end.

        singularitarians, for instance, believe that the creation of a
ytm is the best bet for humanity. if a ytm indeed arrives first, it will
trump all risks posed by other ultra-technologies; however, other
technologies, especially nanotech, have the lead in the race -- hence the
hurry.
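
        to put toy numbers on the above (every figure here is purely
hypothetical, chosen only for illustration -- not anyone's actual
estimate): if survival odds depend on which ultra-technology arrives
first, then shifting the race toward the safer one raises the overall
odds, even without certainty about either outcome. a minimal python
sketch:

    # toy model of the navigation argument above; all numbers are
    # hypothetical and for illustration only.

    def p_survival(p_first, p_survive_given_first):
        # law of total probability over which technology arrives first
        return sum(p_first[t] * p_survive_given_first[t] for t in p_first)

    # hypothetical odds of a good end given each technology wins the race
    p_survive_given_first = {"ytm": 0.8, "nanotech": 0.5}

    # baseline: nanotech has the lead in the race
    print(p_survival({"ytm": 0.3, "nanotech": 0.7}, p_survive_given_first))  # 0.59

    # effort shifts the race toward the safer technology
    print(p_survival({"ytm": 0.6, "nanotech": 0.4}, p_survive_given_first))  # 0.68

        the point isn't the particular numbers, of course -- it's that
the arrival order is the lever.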

-x


