Re: How hard a Singularity?

From: James Higgins (jameshiggins@earthlink.net)
Date: Sat Jun 22 2002 - 16:23:16 MDT


At 01:07 PM 6/22/2002 +0200, you wrote:
>On Sat, 22 Jun 2002, Eliezer S. Yudkowsky wrote:
>
> > I do not consider a soft Singularity to be any less scary than a hard
> > Singularity. I think this is wishful thinking. A Singularity is a
>
>I disagree. There's a qualitative difference if changes ordinarily
>occurring during a decade are happening within a second. We can handle a
>lot of change during a decade. Lumping that change into a single second
>completely overloads our capability to adapt. Considerable leverage is
>available to people to inhibit the kinetics of early stages via legacy
>methods.

I don't think the Singularity could possibly be slowed down to take a
decade, if for no other reason than that some other group (probably several,
actually) will also be working on it, and at least one of them will go for
the hard-takeoff option (or will botch the slowdown). Somewhat slower,
however, should be strongly considered. The more we (humanity) can keep
an eye on the progression and guide it, the better.

> > Singularity; the Singularity doesn't come in a "soft" version that
> > lets you go on a few dates before deciding on a commitment. That
> > option might be open to individual humans (or not), but it is not a
> > real option for humanity.
>
>You're arguing that you can influence the onset, but not the general shape
>of the Singularity. I have to disagree on the latter, since the foothills
>(defined by presence of basically unmodified people) can be obviously
>engineered.

It is, unfortunately, most likely that we won't be able to greatly
influence the shape of the Singularity. That scares the heck out of me, but I
still don't see it otherwise. Unless we could somehow manage to
continually upgrade a handful of HIGHLY TRUSTED humans to at least somewhat
keep pace with the Singularity, I don't see how it's possible. And I haven't
heard of any real options for doing that...

> > I would call it dead certain in favor of a hard takeoff, unless all the
> > intelligences at the core of that hard takeoff unanimously decide
> otherwise.
>
>Wonders have been known to happen.

I truly hope we get one of those when the time is right...

> > All economic, computational, and, as far as I can tell, moral indicators
>
>My moral indicator might be broken, but I don't see how activities
>involving a very real probability of complete extinction of all biological
>life on this planet can be called moral.

Well, Eliezer has a problem with people dying. The longer you wait to bring
the Singularity into full force, the more people die (that's my interpretation
of his vision, at least). Personally, I'd rather let a few hundred
thousand people die while making certain that the Singularity won't just
wipe everyone out. I mean, what's the point in rushing to save lives if
everyone gets converted into computronium anyway? My best guess is that
the final result will be somewhere in between (but probably closer to
Eliezer's wishes than I'd like).

> > fall through to much faster substrate than our 200Hz neurons. The closest
> > we might come to a slow Singularity is if the first transhumans are pure
> > biological humans, in which case it might take a few years for them to
> build
> > AI, brain-computer interfaces, or computer-mediated broadband telepathy
> with
> > 64-node clustered humans, but my guess is that the first transhumans would
> > head for more powerful Singularity technologies straight out of the gate.
>
>I should hope not. It would seem to be much more ethical to offer
>assistance to those yet unmodified to get onboard, while you're still
>encrusted with the nicer human artifacts and the player delta has not yet
>grown sufficiently large that empathy gets eroded into indifference.

Here I'm split. Personally, I think if we try to wait that long we're
making a major mistake. There are STILL people who aren't comfortable with
computers today. To try to get humanity "on-board" the Singularity would
take centuries, which we don't have. The most moral (in my opinion) option
is to create the Singularity as fast as possible while maintaining a
strong measure of safety (i.e., no extinction, slavery, or indifference).
And yes, I know actual slavery would be pointless (we would be
useless junk to an SI). But we could be "slaves" by losing much of our
freedom and choice, which is all too likely. But, as mentioned earlier, I don't
think we'll actually get this option (due to the Eliezers out there).

> > very little point in trying.
>
>Once again we disagree. There's tremendous point in trying if human lives
>are at stake.

While I doubt we'll get the chance for a moderate takeoff, we should by ALL
MEANS try. It is never pointless to try to protect the entire human race!

James Higgins
