From: Shane Legg (firstname.lastname@example.org)
Date: Thu Aug 23 2007 - 09:57:55 MDT
You're right, it does end up sounding like a contradiction :-)
In order to straighten things out, it's necessary to get into some of
the technical details...
When defining intelligence I use the algorithmic complexity of the
environment. Each environment is an enumerable semi-measure, and
the set of all such environments is itself enumerable. Thus we can
describe an environment by giving its index in this enumeration, and
the complexity of the environment is then the prefix Kolmogorov
complexity of this index. The important point for you is that the environment
is a kind of "probability distribution" (roughly speaking). Thus an
environment that generates bits randomly (by that I mean with a
uniform distribution, for example by flipping a fair coin) has a low
complexity because the uniform distribution is a very simple function.
However, the sequence that it generates (with probability 1) has
infinitely high Kolmogorov complexity, as you can't describe the
exact sequence simply.
In short: A simple process (uniform random distribution) can generate
a sequence that is very complex (to describe exactly you would have
to list the bits).
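To make this concrete, here is a small sketch (my own toy example, not
from the original papers) that uses zlib's compressed size as a crude
upper-bound proxy for description length. The generating process for the
random bytes is trivially simple, yet its output barely compresses at
all, while an equally long but regular sequence compresses to almost
nothing:

```python
import random
import zlib

# A very simple process: generate n bytes uniformly at random.
random.seed(0)  # fixed seed only so the demo is reproducible
n = 100_000
random_bytes = bytes(random.getrandbits(8) for _ in range(n))

# An equally long but highly regular sequence.
simple_bytes = b"\x01\x00" * (n // 2)

# Compressed size is a rough upper bound on description length.
print(len(zlib.compress(random_bytes)))  # close to n: nearly incompressible
print(len(zlib.compress(simple_bytes)))  # tiny: the pattern is simple
```

Compression is of course only an upper bound on Kolmogorov complexity,
but it makes the asymmetry visible: the process is simple in both cases,
while the outputs differ enormously in how compactly they can be described.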
Now, in the other paper of mine that Matt was talking about I prove
a bound on the performance of AI systems based on their complexity.
In short, in order to be able to learn to exactly predict a complex
sequence that is deterministic (i.e. not a random one; random sequences
obviously can't be predicted), the predictor itself has to be quite complex.
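As a toy illustration of that bound (this is my own stand-in example,
not the construction from the paper): take a deterministic but complex
sequence, here the bits of an iterated SHA-256 chain. A simple predictor
that only tracks the majority bit does no better than chance, while a
predictor complex enough to contain the generator predicts perfectly.

```python
import hashlib

# Deterministic but complex sequence: bits of an iterated SHA-256 chain.
def sequence_bits(n, seed=b"seed"):
    bits, state = [], seed
    while len(bits) < n:
        state = hashlib.sha256(state).digest()
        for byte in state:
            for i in range(8):
                bits.append((byte >> i) & 1)
    return bits[:n]

bits = sequence_bits(10_000)

# Simple predictor: always guess the majority bit seen so far.
correct_simple = 0
ones = 0
for t, b in enumerate(bits):
    guess = 1 if 2 * ones > t else 0
    correct_simple += (guess == b)
    ones += b

# A predictor that embeds the generator itself predicts every bit.
correct_complex = sum(g == b for g, b in zip(sequence_bits(10_000), bits))

print(correct_simple / len(bits))   # around 0.5: no better than chance
print(correct_complex / len(bits))  # 1.0
```

The point is only qualitative: to predict the sequence exactly, the
predictor has to carry roughly as much descriptive machinery as the
generator itself.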
I hope this helps a little...
On 8/23/07, Peter de Blanc <email@example.com> wrote:
> On Wed, 2007-08-22 at 17:10 -0700, Matt Mahoney wrote:
> > > So you're saying that noise is smarter than anything?
> > No.
> This contradicts your statement that intelligence is proportional to
> algorithmic complexity. What did you really mean when you said that?
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:58 MDT