From: Matt Mahoney (firstname.lastname@example.org)
Date: Sun Mar 16 2008 - 18:44:33 MDT
--- John K Clark <email@example.com> wrote:
> On Sat, 15 Mar 2008 "Matt Mahoney" <firstname.lastname@example.org> said:
> > If an agent performs N independent experiments […]
> Stop right there. In this thought experiment 99 were NOT independent and
> were in fact absolutely identical, so N=2.
That's not what I said. N is the number of trials, not the number of agents.
Let me rephrase the question. A robot zombie with no qualia or consciousness
flips a possibly biased coin N times. The coin has a probability p of coming
up heads. The robot does not know the value of p. If the coin comes up
tails, then the robot is killed with probability q and we restart the
experiment with another robot; or, equivalently, its memory is erased and we
continue the experiment with the same robot. (Its memory consists of two
counters, one for heads and one for tails, both reset to 0.) The robot does not know it might be
killed/reset and does not know q. We continue the experiment until the two
counters sum to N. At this point, what is the robot's expected estimate of p?
For concreteness, say p = 0.5, q = 0.99.
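A quick Monte Carlo sketch of the setup above (my illustration, not part of the original post; it assumes the robot's estimate of p is simply its heads counter divided by N):

```python
import random

def run_experiment(N, p, q, rng):
    """One robot's run: flip until the two counters sum to N.
    Each tails triggers a kill/reset with probability q, which
    zeroes both counters (equivalent to restarting with a fresh robot)."""
    heads = tails = 0
    while heads + tails < N:
        if rng.random() < p:
            heads += 1
        else:
            if rng.random() < q:
                heads = tails = 0  # killed/reset: memory erased
            else:
                tails += 1
    return heads / N  # the surviving robot's estimate of p

rng = random.Random(0)
N, p, q = 100, 0.5, 0.99
trials = [run_experiment(N, p, q, rng) for _ in range(2000)]
avg = sum(trials) / len(trials)
print(f"true p = {p}, surviving robot's average estimate = {avg:.3f}")
```

The surviving record is biased: each counted flip is heads with probability p / (p + (1-p)(1-q)), which for p = 0.5, q = 0.99 is about 0.99, so the simulation's average estimate lands near 0.99 rather than 0.5.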
Now replace the robot with a human. How does consciousness affect the result?
-- Matt Mahoney, email@example.com
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:02 MDT