Re: ESSAY: How to deter a rogue AI by using your first-mover advantage

From: Stathis Papaioannou (stathisp@gmail.com)
Date: Tue Aug 28 2007 - 08:31:12 MDT


On 28/08/07, Vladimir Nesov <robotact@mail.ru> wrote:

> SP> All simulations which provide an identical 1st person POV are
> SP> equivalent, and in a significant sense redundant, because it is
> SP> impossible even in principle to know which one you are in. If the
> SP> number of simulations were suddenly increased or decreased, it would
> SP> be impossible for you to know that anything unusual had happened: your
> SP> next subjective moment will be in one and only one simulation, any of
> SP> the simulations will do as that one simulation, so provided there is
> SP> at least one simulation available to choose from you can never know
> SP> what's "really" going on.
>
> What then if ALL the simulations are available at least once, through a
> TM enumerator? Does that mean that ALL other simulators are irrelevant,
> since for each simulation there is at least one protected identical simulation (with
> identical past and future) running on the TM enumerator?

By "TM enumerator" I take it you mean a program that enumerates and
runs all possible programs, like a universal dovetailer. If so, then
yes: in the sense I have described, all the other simulations are
irrelevant. However, where there are multiple competing futures (as
below) the weighting of each one matters. There are theories in which
it is assumed that the universe is the set of all possible programs
(which perhaps need only exist as Platonic objects), but I don't know
whether it has been successfully shown that this idea yields the known
laws of physics.
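For concreteness, the dovetailing idea can be sketched as follows. This is only a toy model; representing "programs" as Python generators and the `make_prog` factory are my own illustrative assumptions, not anything from the original discussion:

```python
from itertools import count

def dovetail(make_prog, rounds):
    """Toy universal dovetailer: in round n, start program n and then run
    every program started so far for one more step, so every program
    eventually receives arbitrarily many steps of execution."""
    gens, trace = [], []
    for n in range(rounds):
        gens.append(make_prog(n))        # start program n
        for i, g in enumerate(gens):     # one more step for each program so far
            trace.append((i, next(g)))
    return trace

# Hypothetical "programs": program i emits i*10, i*10 + 1, i*10 + 2, ...
make_prog = lambda i: (i * 10 + s for s in count())
trace = dovetail(make_prog, 5)
# After 5 rounds, program 0 has run 5 steps and program 4 has run once.
```

The point of the interleaving is that no single program ever has to finish before the next one starts, which is how a single enumerator can "run" all programs at once.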

> SP> However, if there are two or more competing "next moments" then the
> SP> number of simulations is relevant. If there are X simulations in which
> SP> you are tortured and Y simulations in which you are not tortured in
> SP> the next moment, then you have a X/(X+Y) chance of being tortured.
>
> You can't distinguish between the worlds where you will be tortured, and
> you can't distinguish between the worlds where you won't. Prior to the
> potential torture time point, you also can't distinguish between any of
> these worlds taken together. Even if you assume the subjective POV is
> located in individual simulations, you can't jump to a different
> simulation. All you can do is prepare for different future
> possibilities to different degrees, balancing resources among possible
> futures. Will your actions be different if the ratio of futures with
> torture to futures without is 1000:1 rather than 1:1000? In all 1000
> simulations of one outcome or the other you will be a completely
> identical entity, thinking the same thoughts about the same events at
> the same moments. How is 1000 identical copies experiencing torture
> worse than 1 copy experiencing torture, if it is the same experience?

I dispute the statement that I can't jump to a different simulation.
If one second ago the simulation of me in world A was halted and the
simulation of me in world B immediately started up, so that one second
ago only the A version was running and now only the B version is
running, I could not possibly know that this had happened. I still
find myself in one and only one simulation at any given moment, but
since I cannot know which simulation, my POV can jump from one to
another at every moment.

A consequence of this is that if I am being tortured in, say, 100
parallel simulations, I will not notice anything different if this
number is increased to 1000 or decreased to 1: in the former case my
POV will randomly shift to one of the 1000, while in the latter case
it will definitely shift to the single remaining simulation; in both
cases I will continue experiencing the torture with probability 1.

But suppose instead I am given the option of having 900 new
simulations start up, identical to me up to this point, in which the
torture suddenly stops while the simulation otherwise continues. At
that point, my POV will randomly be allocated either to one of the
original 100 simulations, where the torture continues, or to one of
the 900 new simulations, where the torture stops, since any one of
these 1000 simulations is equally well qualified to continue my
identity. I will therefore experience a 9/10 chance of having the
torture stop.
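The allocation argument above amounts to simple counting. A toy calculation, assuming (as the argument does) that each of the 1000 simulations carries equal weight as a candidate next moment:

```python
import random

def p_next_moment(stop_count, continue_count):
    """Probability that the next subjective moment lands in a
    torture-stops simulation, counting each simulation equally."""
    return stop_count / (stop_count + continue_count)

# The 100-continue / 900-stop example: 900 / (900 + 100) = 0.9
p = p_next_moment(900, 100)

# The same claim as a Monte Carlo sampling of "which simulation am I in":
# simulations 0..99 continue the torture, 100..999 stop it.
samples = [random.randrange(1000) >= 100 for _ in range(100_000)]
estimate = sum(samples) / len(samples)   # clusters around 0.9
```

The Monte Carlo version just restates the counting: if the next moment is drawn uniformly from the 1000 equally qualified continuations, 9 out of 10 draws land where the torture stops.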

> To the point of my previous message: how do you count simulations?
> What is your solution to the counting paradox I wrote about in my previous message?
> Does the presence of 2^N simulations within a single implementation
> somehow influence the probability of you being in a certain simulation?

If I understand you correctly, you are suggesting that doubling the
number of implementations in a recursive simulation will increase the
total number of entities in that implementation not by a factor of 2,
but by a factor of 2^N, where N is the number of levels. I don't see
why this shouldn't also increase the weighting by 2^N for the purpose
of probability calculations, although it does suggest a possible
experimental method for determining whether we are in a recursive
simulation.
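The 2^N counting can be sanity-checked with a toy model. The uniform branching factor and the `sims_per_level` parameter are my own simplifying assumptions:

```python
def bottom_level_entities(sims_per_level: int, levels: int) -> int:
    """If every level of a recursive simulation hosts `sims_per_level`
    copies of the entire level below it, the counts multiply down the
    stack: the bottom level holds sims_per_level**levels copies."""
    return sims_per_level ** levels

N = 10
# Doubling the per-level count scales the bottom level by 2**N, not by 2:
doubled = bottom_level_entities(4, N)
single = bottom_level_entities(2, N)
# doubled == single * 2**N
```

So a change at one level propagates multiplicatively through all N levels below it, which is why the factor is 2^N rather than 2.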

-- 
Stathis Papaioannou


This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:58 MDT