Re: Simulation argument in the NY Times

From: Matt Mahoney (matmahoney@yahoo.com)
Date: Wed Aug 22 2007 - 18:10:09 MDT


--- Peter de Blanc <peter@spaceandgames.com> wrote:

> On Sun, 2007-08-19 at 17:44 -0700, Matt Mahoney wrote:
> > Sorry, I did not express myself correctly. I mean that a population of
> > organisms is intelligent, and evolution is the algorithm. My argument is
> > that the intelligence of the population is proportional to its algorithmic
> > complexity, the length of the shortest program that could output all of
> > the DNA of the population.
>
> So you're saying that noise is smarter than anything?

No. An additional property of intelligent systems such as DNA, brains, and
large software projects seems to be that they can be updated incrementally.
Kauffman [1] observed that intelligent systems lie on the boundary between
stability and chaos, i.e. the Lyapunov exponent is near zero, so that a small
perturbation grows by a factor of about 1 per step (using the discrete-system
analogue, and doing some handwaving here). I believe the reason for this is
that if the system is stable (a large change of state causes only a small
change in behavior), then it is not very intelligent, and if it is chaotic (a
small change of state causes a large change in behavior), then there is no
efficient learning algorithm.
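
(A minimal sketch, not from the original post: the snippet below estimates the
Lyapunov exponent of the logistic map x -> r*x*(1-x), a standard discrete
system I am using only for illustration, at a stable, a near-critical, and a
fully chaotic value of r. The parameter values and starting point are my own
arbitrary choices.)

    import math

    def lyapunov(r, x=0.1234, burn_in=1000, steps=10000):
        # Average of log|f'(x)| along the orbit, where f(x) = r*x*(1-x)
        # and f'(x) = r*(1-2x); negative = stable, ~0 = edge, positive = chaotic.
        for _ in range(burn_in):          # discard the transient
            x = r * x * (1 - x)
        total = 0.0
        for _ in range(steps):
            total += math.log(abs(r * (1 - 2 * x)) + 1e-12)
            x = r * x * (1 - x)
        return total / steps

    for r in (2.8, 3.5699, 4.0):   # stable, near the edge of chaos, chaotic
        print("r =", r, "Lyapunov exponent ~", round(lyapunov(r), 3))

Running it should give a negative exponent at r=2.8, something close to zero
near r=3.5699, and about log(2)=0.69 at r=4.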

[1] Kauffman, Stuart A., “Antichaos and Adaptation”, Scientific American, Aug.
1991, p. 64. Kauffman studied randomly connected logic gates. He found that
the system transitions from stable to chaotic as the number of connections per
gate increases from 2 to 3. At the boundary, the number of attractors is
about the square root of the size of the system. He also studied gene
regulation in human DNA (genes turn other genes on or off, resulting in stable
states and cell differentiation). He found that the number of cell types
(about 250) is very roughly the square root of the number of genes (about
30,000).
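
(Again a rough sketch of my own, not Kauffman's code: it wires up a random
Boolean network in which each of N gates reads K randomly chosen gates through
a random truth table, then runs the network from many random starting states
and counts the distinct attractors reached. N, K, the number of trials, and
the random seed are arbitrary choices for illustration; N is kept small so the
chaotic case still terminates quickly.)

    import random

    def random_network(n, k, rng):
        # each gate reads k distinct gates through its own random truth table
        inputs = [rng.sample(range(n), k) for _ in range(n)]
        tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
        return inputs, tables

    def step(state, inputs, tables):
        # synchronous update of every gate from the current state
        new = []
        for ins, table in zip(inputs, tables):
            idx = 0
            for j in ins:
                idx = (idx << 1) | state[j]
            new.append(table[idx])
        return tuple(new)

    def attractor(state, inputs, tables):
        # iterate until a state repeats, then return the cycle it lies on
        seen = set()
        while state not in seen:
            seen.add(state)
            state = step(state, inputs, tables)
        cycle, s = [state], step(state, inputs, tables)
        while s != state:
            cycle.append(s)
            s = step(s, inputs, tables)
        return frozenset(cycle)

    rng = random.Random(0)
    n, trials = 16, 100
    for k in (2, 3):
        inputs, tables = random_network(n, k, rng)
        found = set()
        for _ in range(trials):
            start = tuple(rng.randint(0, 1) for _ in range(n))
            found.add(attractor(start, inputs, tables))
        print("K =", k, "distinct attractors:", len(found))

With K=2 the attractors found should be few and short, reflecting the ordered
to critical regime; with K=3 the cycles tend to be much longer, reflecting the
chaotic regime Kauffman describes.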

-- Matt Mahoney, matmahoney@yahoo.com


