Re: How hard a Singularity?

From: Eugen Leitl (eugen@leitl.org)
Date: Sat Jun 22 2002 - 05:07:12 MDT


On Sat, 22 Jun 2002, Eliezer S. Yudkowsky wrote:

> I do not consider a soft Singularity to be any less scary than a hard
> Singularity. I think this is wishful thinking. A Singularity is a

I disagree. There's a qualitative difference if changes ordinarily
occurring over a decade happen within a second. We can handle a lot of
change over a decade. Compressing that change into a single second
completely overloads our capability to adapt. Considerable leverage is
available to people to inhibit the kinetics of the early stages via
legacy methods.

No such leverage is available for later stages. This is our window of
operation to reduce our vulnerabilities by addressing some of our key
limitations.

> Singularity; the Singularity doesn't come in a "soft" version that
> lets you go on a few dates before deciding on a commitment. That
> option might be open to individual humans (or not), but it is not a
> real option for humanity.

You're arguing that you can influence the onset, but not the general shape
of the Singularity. I have to disagree on the latter, since the foothills
(defined by the presence of basically unmodified people) can obviously be
engineered.
 
> I would call it dead certain in favor of a hard takeoff, unless all the
> intelligences at the core of that hard takeoff unanimously decide otherwise.

Wonders have been known to happen.

> All economic, computational, and, as far as I can tell, moral indicators

My moral indicator might be broken, but I don't see how activities
involving a very real probability of complete extinction of all biological
life on this planet can be called moral.

> point straight toward a hard takeoff. The Singularity involves an inherent
> positive feedback loop; smart minds produce smarter minds which produce
> still smarter minds and so on. Furthermore, thought itself is likely to

This assumes that a) you are in possession of a human-equivalent mind,
b) it profits immediately from self-enhancement, and c) it is allowed
unfettered operation.

All these points are questionable.

> fall through to much faster substrate than our 200Hz neurons. The closest
> we might come to a slow Singularity is if the first transhumans are pure
> biological humans, in which case it might take a few years for them to build
> AI, brain-computer interfaces, or computer-mediated broadband telepathy with
> 64-node clustered humans, but my guess is that the first transhumans would
> head for more powerful Singularity technologies straight out of the gate.

I should hope not. It would seem much more ethical to offer assistance
to those still unmodified to get on board, while you're still encrusted
with the nicer human artifacts and the player delta has not yet grown
large enough that empathy erodes into indifference.

> Beyond that point it would be a hard takeoff. I see no moral reason for
> slowing this down while people are dying.

I see a reason for slowing down (that's the wrong picture, really, since
most things need to be sped up, and only a couple of dangerous ones need
to be slowed down) if a hard takeoff will kill all life on this planet in
a few hours flat.
 
> I would say that the Singularity "wants" to be hard; the difficulty of
> keeping it soft would increase asymptotically the farther you got, and I see

We don't have to shovel the inhibition agent by the ton all the time.
Later stages can't and shouldn't be controlled.

> very little point in trying.

Once again we disagree. There's tremendous point in trying if human lives
are at stake.
