From: Matt Mahoney (firstname.lastname@example.org)
Date: Mon Jan 12 2009 - 11:21:10 MST
--- On Mon, 1/12/09, Benja Fallenstein <email@example.com> wrote:
> Hi Matt,
> On Mon, Jan 12, 2009 at 3:23 PM, Matt Mahoney
> <firstname.lastname@example.org> wrote:
> > Simulating past versions of each other does not allow
> > 2 way communication.
> Why not?
> * You say something (to yourself).
> * You simulate the other AI until *it* has simulated *you*
> enough to see what you said.
> * You simulate it some more, until it says something (to
> itself) in return.
> * You say something in return (to yourself).
> Do you see a reason that wouldn't work?
If X simulates Y, then X must know Y's state, and that knowledge is part of X's state. Likewise, if Y simulates X, then knowledge of X's state must be part of Y's state. So X's state contains Y's state, which in turn contains X's state, and so on without end; the regress can close only if X = Y.
Perhaps you could write a simple program of two agents simulating each other and prove me wrong.
-- Matt Mahoney, email@example.com
This archive was generated by hypermail 2.1.5 : Thu May 23 2013 - 04:01:41 MDT