Re: strong and weakly self improving processes

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sat Jul 15 2006 - 16:53:23 MDT


Eliezer S. Yudkowsky wrote:
> Eric Baum wrote:
>>
>> Eliezer> Considering the infinitesimal amount of information that
>> Eliezer> evolution can store in the genome per generation, on the
>> Eliezer> order of one bit,
>> Actually, with sex it's theoretically possible to gain something like
>> sqrt(P) bits per generation (where P is population size); cf. the
>> Baum-Boneh paper, which can be found on whatisthought.com, and also a
>> MacKay paper. (This is a digression, since I'm not claiming huge
>> evolution since chimps.)
>
> I furthermore note that gaining one standard deviation per generation,
> which is what your paper describes, is not obviously like gaining
> sqrt(P) bits of Shannon information per generation. Yes, the standard
> deviation is proportional to sqrt(N), but it's not clear how you're
> going from that to gaining sqrt(N) bits of Shannon information in the
> gene pool per generation. It would seem heuristically obvious that if
> your algorithm eliminates roughly half the population on each round, it
> can produce at most one bit of negentropy per round in allele
> frequencies. I only skimmed the referenced paper, though, so if there's
> a particular paragraph I ought to read, feel free to direct me to it.

Yeah, so, I reread my paragraph above and it doesn't make any sense.
Standard deviations are not proportional to the square root of the
population size (duh). N in Baum, Boneh, and Garrett (1995) is the
length of the string, not the population size. How
http://citeseer.ist.psu.edu/82728.html shows that you can gain anything
proportional to sqrt(P) per generation, I have no idea. As for gaining
sqrt(P) bits of negentropy in the gene pool by eliminating merely half
the population, I just don't see how that would work. Maybe you're
thinking of sqrt(log(P)), which would be roughly how many standard
deviations you could gain in one generation by culling all but the
uppermost tail of a Gaussian distribution?
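
For concreteness, here's a toy Monte Carlo sketch of those two figures
(my own code, nothing from the Baum-Boneh paper; the function name and
parameters are made up for illustration). Truncation selection on a
standard Gaussian: keeping the top half moves the survivors' mean by
about 0.8 standard deviations and conveys at most log2(2) = 1 bit of
selection per round, while keeping only the single best individual out
of P moves it by roughly sqrt(2*ln(P)) standard deviations, which is
where a sqrt(log(P)) gain would come from.

import math
import random

def mean_of_survivors(pop_size, survivors_count, trials=100):
    # Draw a population of standard-Gaussian "fitnesses", cull all but
    # the top survivors_count individuals, and return the survivors'
    # mean fitness (in sd units), averaged over many trials.
    total = 0.0
    for _ in range(trials):
        pop = sorted(random.gauss(0.0, 1.0) for _ in range(pop_size))
        top = pop[-survivors_count:]
        total += sum(top) / len(top)
    return total / trials

P = 10000
# Keep the top half: survivors' mean is ~0.80 sd above the old mean.
print("top half of P: %.2f sd" % mean_of_survivors(P, P // 2))
# Keep only the best individual: comes out around 3.9 sd for P = 10000.
print("top 1 of P:    %.2f sd" % mean_of_survivors(P, 1))
# The asymptotic figure; a slight overestimate at this population size.
print("sqrt(2 ln P):  %.2f sd" % math.sqrt(2 * math.log(P)))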

-- 
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence

