Re: [Fwd: Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards]

From: Gordon Worley (redbird@rbisland.cx)
Date: Mon May 14 2001 - 21:52:55 MDT


At 12:58 PM -0400 5/12/01, Eliezer S. Yudkowsky wrote:
>This paper is scary and well worth reading.

After reading this, I am certainly more aware of some things. To be
honest, I had never really given much thought to scenarios in which
SL4 technologies lead to a capping of potential developments.

Some of the stuff is obviously eliminated by Friendliness (especially
some of the bangs), but if Friendliness is not implemented in some
fashion, some of these seem very likely (as in > 75% chance of
happening). And, in case you're wondering, yes, this means that I've
decided that Friendliness matches closely enough with what I'd want
that I endorse it (as if Eliezer or anyone needed my endorsement :^)).

One thing that I was disappointed with was the assessment of the
simulation bang. The kind of society that would kill billions of
sentient beings, no matter how dumb by their standards, doesn't seem
very likely to last long enough to develop the technology to even run
such a simulation. I'd put it at roughly a 0.001% chance of
happening. Of course, the simulation whimper or shriek scenario
still seems possible, but I'd hope that they'd realize how unFriendly
it would be to keep us locked up and prevent our reaching our full
potential.

And, if you haven't read the paper yet, but have been interested by
what I wrote, go read it at:

>http://www.nickbostrom.com/existential/risks.html
>http://www.nickbostrom.com/existential/risks.doc

-- 
Gordon Worley
http://www.rbisland.cx/
mailto:redbird@rbisland.cx
PGP Fingerprint:  C462 FA84 B811 3501 9010  20D2 6EF3 77F7 BBD3 B003

