Re: singularity arrival estimate...

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Fri May 17 2002 - 12:34:25 MDT


Ben Goertzel wrote:
>
> I am befuddled by your sudden insertion of a quantitative figure, 10%, into
> this conversation
>
> How did you derive this?? ;)
>
> This seems like what, at Webmind Inc., we used to call an "ass number" ...
> [I will omit the .gif file showing the production of an ass number
> graphically...]

10% is my standard figure for "Minimum, irreducible chance that you don't
know what the heck you're talking about, for any given really complex
problem." I.e., the chance that the whole "simulation" business is
fundamentally a case of barking up the wrong tree. Of course this is only a
numerical name for a special kind of subjective probability. My point is
that even if Nick Bostrom should come up with an argument that the chances
are billions to one against our occupying bottom-level physical reality, the
chance that this argument and all similar arguments are barking up the wrong
tree - wrongly framed against the backdrop of genuine reality - has a
certain minimum subjective probability which, to me, feels like it should
translate into around 10%.
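
To make that concrete, here is a quick back-of-the-envelope sketch of how a
fixed chance of model error puts a floor under the probability that this
world is real, no matter how extreme the inside-view odds get. The specific
numbers below are placeholders for illustration, not anything I've derived;
the only real input is the ~10% model-error figure.

    # Placeholder numbers; only the ~10% model-error figure is meaningful.
    p_framework_wrong = 0.10   # chance the whole simulation argument is misframed
    p_real_if_wrong   = 1.0    # take reality as the default if the framing fails
    p_real_if_right   = 1e-9   # the "billions to one" inside-view estimate

    p_real = (p_framework_wrong * p_real_if_wrong
              + (1 - p_framework_wrong) * p_real_if_right)
    print(p_real)   # ~0.10, however small p_real_if_right gets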

> Eliezer wrote:
> >
> > I challenge this. If this world is a simulation and [insert speaker
> > variable] is the main character, then at most one life is at stake and
> > probably not even that. If it's a mass communal simulation, then six
> > billion lives are at stake and there is little or nothing we can do about
> > it in any case except trying for the Singularity - which, if there's any
> > remotely helpful exit scenario, is likely to be it. If this world is real,
> > then six billion lives and the entire (*)illionfold greater future of
> > Earth-originating intelligent life is at stake. So as long as there's at
> > least a 10% chance that this world is not a computer simulation, and it's
> > hard to see how the probability could drop below that, it makes sense for
> > me to act as if the world I see is the real one.
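
Spelling out the expected-value arithmetic behind that last paragraph (again,
the stake magnitudes below are arbitrary stand-ins, not estimates): as long
as there is roughly a 10% chance the world is real, and the real-world stakes
dwarf anything we could affect from inside a simulation, acting as if the
world is real dominates.

    # Arbitrary stand-in magnitudes, chosen only to show the shape of the comparison.
    p_real = 0.10
    value_at_stake_if_real = 6e9 * 1e6    # six billion lives times an "illionfold" future (placeholder)
    value_we_can_affect_if_simulated = 0  # little or nothing we can do in that branch anyway

    # Difference in expected value between acting as if real vs. writing the world off:
    ev_difference = (p_real * value_at_stake_if_real
                     + (1 - p_real) * value_we_can_affect_if_simulated)
    print(ev_difference > 0)   # True: the real-world branch carries the decision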

-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence


