From: Peter de Blanc (firstname.lastname@example.org)
Date: Fri Jun 06 2008 - 11:25:14 MDT
Byrne Hobart wrote:
> Slightly off-topic, perhaps, but does this apply to logical operators?
> It's important to state our attitude towards uncertainty very clearly,
> so we avoid internal contradiction or nihilism: I would argue that
> logical rules, mathematical premises, etc., should not be considered
> open to question. If your AI is only 99.99999% certain that "+ 1" is
> equivalent to "count to the next natural number", this infects all
> mathematical operations it performs with some uncertainty, and means
> that the higher it counts, the less sure it is (since any natural number
> can be stated as 0 + 1 + ... + 1, and no datum can have two different
> levels of uncertainty, this could cause the AI to be less sure that
> 100 + 1 = 101 than that 1 + 1 = 2).
If a universal rule has a probability p of being correct, then n
applications of the rule have a probability >= p of being correct, not
p^n. The rule's correctness is a single shared fact: if the rule is true
at all, then every application of it succeeds, so the errors across
applications are perfectly correlated rather than independent.
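The distinction can be sketched with a small simulation (the function
name and parameters are illustrative, not from the original post): in the
"correlated" model the rule's truth is drawn once per trial, as for a
universal rule, while in the "independent" model each application fails
on its own, which is the p^n picture.

```python
import random

def simulate(p, n, correlated, trials=50_000, seed=1):
    """Estimate P(all n applications of a rule are correct).

    correlated=True:  the rule's truth is drawn once per trial (a
                      universal rule is right everywhere or nowhere),
                      so the estimate is about p, regardless of n.
    correlated=False: each application fails independently with
                      probability 1 - p, so the estimate is about p**n.
    """
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        if correlated:
            # One draw decides the rule; all n applications share it.
            ok += rng.random() < p
        else:
            # Each application is an independent chance to fail.
            ok += all(rng.random() < p for _ in range(n))
    return ok / trials
```

With p = 0.9 and n = 50, the correlated estimate stays near 0.9 while
the independent estimate collapses toward 0.9^50, roughly 0.005, which
is the divergence the reply is pointing at.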
This archive was generated by hypermail 2.1.5 : Wed Jun 19 2013 - 04:01:39 MDT