Re: how to do something with really small probability?

From: Rolf Nelson (rolf.h.d.nelson@gmail.com)
Date: Mon Nov 05 2007 - 20:13:39 MST


On 11/5/07, Wei Dai <weidai@weidai.com> wrote:
> I was using the standard prior for Solomonoff Induction.

Sounds good. But there's no constraint that a super-intelligence has
to strive for Solomonoff Induction; I know of no AI project being
built solely on Solomonoff Induction, and no human being on Earth
directly uses it. So if you're assuming Solomonoff Induction in this
scenario, that's significant enough that you should have said so up
front as part of the premise. That's the only reason I said you're
"begging the question".
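
(For concreteness, and assuming Wei means the usual construction: the
standard prior assigns a string x the weight

    P(x) = sum over programs p with U(p) = x of 2^(-|p|)

where U is some fixed universal prefix machine and |p| is p's length
in bits. Every number below depends on the choice of U, which is part
of why the premise needs stating.)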

I would agree that if this AI asks, "what's the probability that my
next action will look like x?", then the answer should normatively be
> 1/3^^^3 for any x that the AI can remotely begin to *distinctly*
think about (that is, any x that isn't an anonymous part of an
ensemble).
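
The reason: any x the AI can distinctly think about has some finite
description, say K bits, and the standard prior then assigns x at
least roughly 2^(-K), up to a machine-dependent constant (one program
just prints the description). No physically realizable mind holds a
description anywhere near log2(3^^^3) bits long, so the bound follows.
A toy comparison (Python; K is a made-up description length):

    # Compare magnitudes via exponents, since neither number fits
    # in a float. K is a hypothetical, generous description length.
    K = 10**6          # assume x is distinctly described in 10^6 bits
    # Under the standard prior, P(x) >= roughly 2**(-K).
    # 3^^^3 is a tower of 3s of height 3^^3 = 3**3**3 = 7625597484987,
    # so log2(3^^^3) dwarfs K beyond comparison; even the tower's
    # *height* already exceeds K:
    print(3**3**3 > K)  # True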

> >> Now is it possible that SI can take an arbitrary string x and
> >> tell us whether P(x) < 1/2^(2^100)?

<snip>

> This P is supposed to be the same function as before (i.e., the standard
> prior for Solomonoff Induction).
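
Note also that this P is only semicomputable from below: by
enumerating programs you can produce ever-better *lower* bounds on
P(x), but no procedure ever certifies an *upper* bound, and an upper
bound is exactly what deciding P(x) < 1/2^(2^100) would require. A
minimal sketch of the enumeration (Python; run_program is a
hypothetical stand-in for a real universal prefix machine):

    def run_program(p, max_steps):
        # Hypothetical: interpret bit-string p on a universal prefix
        # machine for at most max_steps steps; return its output if
        # it halts, else None. Stubbed here so the sketch runs.
        return None

    def lower_bounds_on_P(x, max_len=20, max_steps=10**4):
        """Yield increasing lower bounds on P(x): credit 2^-|p| for
        every program p of length <= max_len that halts with output
        x within max_steps steps."""
        total = 0.0
        for length in range(1, max_len + 1):
            for code in range(2 ** length):
                p = format(code, '0%db' % length)  # a |p|-bit program
                if run_program(p, max_steps) == x:
                    total += 2.0 ** -length
                    yield total
        # However long this runs, total only climbs toward P(x);
        # nothing here ever yields an upper bound on P(x).

(With a real machine plugged in, growing max_len and max_steps gives
lower bounds converging to P(x) from below, and that is the best any
algorithm can do.)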

By the same logic, you could never do anything with probability .5,
since you can't consistently prove that the measure of any reasonable
x lies in the range .5 - 1/3^^^3 < p < .5 + 1/3^^^3. The same holds
for any other real-valued probability target.


