Re: Separate Copies Contribute Separately to One's Runtime

From: Peter de Blanc (peter@spaceandgames.com)
Date: Fri Mar 07 2008 - 14:55:43 MST


On Fri, 2008-03-07 at 11:52 -0800, Eliezer S. Yudkowsky wrote:
>
> If you flip a fair quantum coin, have your exoself generate 100
> separated isomorphic copies of you conditional on the coin coming up
> heads, then, when (all of) you are about to look at the coin, should
> your subjective anticipation of seeing "heads" be 1:1 or 100:1?
>
> This is a question that confuses even me, btw.

If you're an altruist in a world with 6 billion people to drown out your
individual hedonism, then to calculate the EU of some action you add half
the utility of having one copy of you take that action to half the utility
of having 100 copies of you take it.
Subjective anticipation is really just one of the steps involved in
making decisions, so once you know what the decision is, who cares what
you anticipate?
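
A toy sketch of that calculation (the additive utility function and the
numbers are my own assumptions, not anything from the original scenario):

    def utility_of_world(copies_taking_action, per_copy_benefit, background_utility):
        # An altruist's utility: a large background term from everyone else,
        # plus an additive per-copy contribution from each copy that acts.
        return background_utility + copies_taking_action * per_copy_benefit

    def expected_utility(per_copy_benefit, background_utility=6e9):
        # Tails: one copy of you takes the action.
        # Heads: 100 separated isomorphic copies take it.
        u_one_copy = utility_of_world(1, per_copy_benefit, background_utility)
        u_hundred = utility_of_world(100, per_copy_benefit, background_utility)
        return 0.5 * u_one_copy + 0.5 * u_hundred

    print(expected_utility(per_copy_benefit=1.0))  # 6000000050.5

Whichever answer you give to the anticipation question, this is the sum
the decision procedure computes, so the choice of action comes out the same.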

If we make not just 100 copies of you, but 100 copies of everything you
care about, then I'm less sure how to answer this. It would sure make
things easy if we could say that 100 copies of everything is precisely
100 times as good as 1 copy of everything.

This would, however, imply that it's better to have 2 non-interacting
copies of a world with utility 10 than to have 2 distinct
non-interacting worlds with utility 10 and 9. I think this probably
wouldn't make the AI tile the universe with small optimal objects,
because the possible utility of an object could increase more than
linearly with size.
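
A concrete instance of that implication, with toy numbers of my own:

    # If utility is linear in the number of copies, two copies of the
    # utility-10 world score 2 * 10 = 20, while the two distinct worlds
    # score 10 + 9 = 19, so duplicating one world wins.
    two_copies_of_one_world = 2 * 10
    two_distinct_worlds = 10 + 9
    assert two_copies_of_one_world > two_distinct_worlds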

This would also contradict my previously-stated opinion that utility
functions should be bounded, since a utility that scales linearly in the
number of copies grows without bound as more copies are added.


