Re: Article: The coming superintelligence: who will be in control?

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Thu Aug 02 2001 - 01:24:53 MDT


"Amara D. Angelica" wrote:
>
> Brian, those are interesting and important points. I've forwarded your post
> to Ray for his thoughts.
>
> Is there a consensus on this list that the Singularity is good and should be
> accelerated?

Consensus? On SL4? You'd be lucky to get consensus that the sun will
rise tomorrow. (Followed by twenty posts questioning my definition of
"sunrise" since it's actually the Earth rotating.)

Well, you know my opinion. The Singularity is good, and should be
accelerated. There *are* certain things you need to do to
preserve/implement the goodness and make sure that an actual Singularity
is the outcome of your Singularity-accelerating actions. You don't want
to wind up in a Gandhi situation where you work for fifty years and get
remembered as the archetypal nice guy but basically your entire life
backfires because you focused on solving the wrong problem. But that
said, a successfully concluded Singularity is the happy outcome for
humanity, if any happy outcome exists. Accelerating the Singularity both
preserves the lives of those who would otherwise die and reduces the risk
to humanity. Tomorrow would be fine with me, and if James Higgins or
anyone else feels they didn't get enough time and wants to go on playing at
being human, I'm sure it will be technically feasible.

-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence
