RE: singularity arrival estimate...

From: ben goertzel (ben@goertzel.org)
Date: Fri May 17 2002 - 13:13:50 MDT


I suppose we all react differently to these various existential
realizations.

For me, realizing there's a high chance we are living in a "simulation" was
part of a larger process of recognizing the semi-reality of the world
around me ("semireal" being a PhilDickian term) -- and not just recognizing
it intellectually, but learning to live with it as a part of everyday,
minute-by-minute existence. I did not react by deciding to feel "as if the
world were probably real anyway", nor by becoming entirely nihilistic, but
quite differently in fact.

I am sure that, similarly, as the Singularity becomes more tangible to more
people, we will see a fascinating diversity of reactions and
interpretations -- we already have a hint of this in the diversity of views
on this list.

ben g

-----Original Message-----
From: Eliezer S. Yudkowsky [SMTP:sentience@pobox.com]
Sent: Friday, May 17, 2002 12:34 PM
To: sl4@sysopmind.com
Subject: Re: singularity arrival estimate...

ben goertzel wrote:
>
> I am befuddled by your sudden insertion of a quantitative figure, 10%,
> into this conversation
>
> How did you derive this?? ;)
>
> This seems like what, at Webmind Inc., we used to call an "ass number"...
> [I will omit the .gif file showing the production of an ass number
> graphically...]

10% is my standard figure for "Minimum, irreducible chance that you don't
know what the heck you're talking about, for any given really complex
problem." I.e., the chance that the whole "simulation" business is
fundamentally a case of barking up the wrong tree. Of course this is only
a numerical name for a special kind of subjective probability. My point is
that even if Nick Bostrom should come up with an argument that the chances
are billions to one against our occupying bottom-level physical reality,
the chance that this argument and all similar arguments are barking up the
wrong tree - wrongly framed against the backdrop of genuine reality - will
have a certain minimum subjective probability which, to me, feels like it
should translate into around 10%.
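
[Editor's note: a minimal sketch of the arithmetic behind this floor. The
0.10 figure and the "billions to one" estimate come from the text above;
the combination rule, the function name, and the fallback assumption (if
the framing is wrong, trust appearances) are illustrative choices of mine,
not anything stated in the email.]

```python
def floored_probability(p_within_model: float, p_model_wrong: float) -> float:
    """Subjective probability that we occupy bottom-level reality.

    With probability (1 - p_model_wrong) the simulation argument is
    correctly framed, so defer to its in-model estimate; with probability
    p_model_wrong the whole framing is wrong, and we fall back to trusting
    appearances (world is real) -- one simple fallback among many.
    """
    return p_model_wrong + (1 - p_model_wrong) * p_within_model

# Bostrom-style in-model estimate: billions to one against base reality.
p_in_model = 1e-9
p = floored_probability(p_in_model, p_model_wrong=0.10)
print(round(p, 3))  # the floor keeps this near 0.1 however small p_in_model is
```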

> Eliezer wrote:
> >
> > I challenge this. If this world is a simulation and [insert speaker
> > variable] is the main character, then at most one life is at stake and
> > probably not even that. If it's a mass communal simulation, then six
> > billion lives are at stake and there is little or nothing we can do
> > about it in any case except trying for the Singularity - which, if
> > there's any remotely helpful exit scenario, is likely to be it. If this
> > world is real, then six billion lives and the entire (*)illionfold
> > greater future of Earth-originating intelligent life is at stake. So as
> > long as there's at least a 10% chance that this world is not a computer
> > simulation, and it's hard to see how the probability could drop below
> > that, it makes sense for me to act as if the world I see is the real
> > one.
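
[Editor's note: a hedged sketch of the expected-stakes comparison in the
quoted argument. The six-billion population figure and the 10% floor come
from the email; the scenario probabilities and the stand-in size of the
"(*)illionfold greater future" are illustrative placeholders of mine.]

```python
LIVES_NOW = 6e9           # population figure used in the email (2002)
FUTURE_MULTIPLIER = 1e12  # stand-in for the "(*)illionfold greater future"

scenarios = {
    # name: (assumed probability, lives at stake under that hypothesis)
    "solipsist simulation": (0.10, 1),
    "communal simulation":  (0.80, LIVES_NOW),
    "bottom-level reality": (0.10, LIVES_NOW * FUTURE_MULTIPLIER),
}

# Probability-weighted stakes per hypothesis.
expected = {name: p * stakes for name, (p, stakes) in scenarios.items()}
for name, value in expected.items():
    print(f"{name}: {value:.2e}")
```

Even at only a 10% chance of base reality, its term dominates the weighted
sum, which is the sense in which acting as if the world is real wins.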

-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:38 MDT