From: Matt Mahoney (firstname.lastname@example.org)
Date: Mon Jan 05 2009 - 10:18:51 MST
--- On Sun, 1/4/09, Norman Noman <email@example.com> wrote:
> Let's say there's a team of computer scientists,
> and they've built an AI.
> It's almost ready for hard takeoff, but it needs to be
> given a task before
> it'll do any more crunching, self-improvement, etc. The
> two senior
> researchers, Dr. Nezzar and Dr. Housekey, are arguing about
> what this task
> should be. Nezzar wants it to look for hidden messages in
> the digits of pi,
> Housekey wants it to make an actual pie.
> They decide to play paper scissor rock. Housekey wins, and
> enters "make pie"
> into the console. The AI proceeds to turn the universe into
> pie, but after a
> few trillion pies it stops to think. Since the task was
> decided by a paper
> scissor rock tournament, it could just as easily have gone
> the other way.
> The AI, being a maverick, doesn't give a flip what the
> programmers intended,
> but it's curious about what would have happened. So, it
> runs a simulation of
> the alternate AI, which we'll call AI(pi). It sees
> AI(pi) turning galaxies
> into computronium, in search of messages hidden in the
> infinite digits of
> pi, messages which in all likelihood don't exist.
> And then it sees AI(pi) run a simulation of ITSELF, of
> AI(pie). And it
> thinks "uh oh, which of us is at the TOP of the
> simulation chain?"
False. If X simulates Y, then K(X) > K(Y), where K denotes Kolmogorov complexity: X must contain an exact model of Y's mental state, plus the machinery to run it. It follows that Y cannot simultaneously simulate X, because that would require K(Y) > K(X), a contradiction.
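A toy sketch of the intuition (my illustration, not Mahoney's; it uses raw source length as a crude stand-in for Kolmogorov complexity, which is not the formal definition):

```python
# Identify each agent with its source code. To simulate Y exactly, X must
# embed a complete copy of Y's code plus the machinery that runs it, so
# X's description is strictly longer than Y's.

# Y is the simulated program (the hypothetical AI(pi)).
y_source = "print('AI(pi): searching the digits of pi...')"

# X embeds Y verbatim and executes its model of Y.
x_source = "y_source = " + repr(y_source) + "\nexec(y_source)"

# The K(X) > K(Y) intuition, with length standing in for K:
assert len(x_source) > len(y_source)

exec(x_source)  # X runs its simulation of Y
```

The gap between len(x_source) and len(y_source) is the overhead of the simulation machinery; the argument in the post is that this overhead cannot be zero in both directions at once.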
-- Matt Mahoney, firstname.lastname@example.org
This archive was generated by hypermail 2.1.5 : Thu Jun 20 2013 - 04:00:39 MDT