Re: The Human Brain.

From: Eugen Leitl (eugen@leitl.org)
Date: Sat Jun 22 2002 - 02:41:41 MDT


On Sat, 22 Jun 2002, Mike & Donna Deering wrote:

> Estimates of how much conventional computing hardware is required for
> human level intelligence based on observations of the human brain have
> been seriously inflated for the following reasons.

Of course you realize that the trend in the growth of past estimates, made by
people with otherwise impeccable track records, speaks against your
statement. Granted, smart as they were, they had no idea of modern
neuroscience. We may not be as smart, but we have several more decades'
worth of literature to study.

If you track the neuroscience publications more or less closely you'll
find little basis for optimism there. It's not just the complexity that
makes accurate numerical models of wet systems a stark nightmare, it's the
number of bells and whistles, and how many of the system's intrinsic
artifacts are being hijacked to implement features.

The trend does not look good here.
 
> #1. The human brain uses a lot of its neurons for autonomic bodily
> maintenance functions like blood pressure, heart rate, endocrine
> system control, balance and locomotion, motor control of speech, and
> many others that we don't generally associate with higher level
> reasoning tasks and would certainly not assist in solving the protein
> folding problem or design of a nano assembler.

That's what the rest of the CNS is there for. Your brain is generally not
concerned with low-level functions. If you want to include low-level
homeostasis, please factor in the neurons in the entire body.

Conversely, since human-equivalent intelligence needs to be embodied, or
at least contain representational systems of the outside world in order to
seize control of actuators, about the same fraction of computational
resources is required.
 
> #2. The human brain is very physically robust in a dangerous
> environment with delicate components, by using massive neural
> redundancy. Redundant neurons are rewired around damaged areas to

Redundancy is not the word. I would call this graceful degradation and
adaptive remapping. Pretty small localized lesions can knock out complete
functionalities. If you think you can randomly destroy every second or
even every third neuron in your brain and not notice anything out of the
ordinary, you're kidding yourself.

Structure shrinkage in circuitry will soon force us to use similar
techniques to be able to fabricate and operate systems with >>10^9
components. Molecular circuitry won't be possible without this. The
fabbing defect density and the failure rate are brutal with molecular
components.
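
To put a rough number on it (a back-of-envelope sketch; the defect rates
below are illustrative assumptions, not measured figures): if each component
independently survives fabrication with probability 1-p, a circuit with N
components comes out perfect with probability (1-p)^N ~ exp(-p*N), which
collapses for large N unless you can remap around defects.

# Back-of-envelope yield of a perfect, non-defect-tolerant circuit.
# Defect probabilities below are illustrative assumptions only.
import math

for n_components in (1e6, 1e9, 1e12):
    for p_defect in (1e-9, 1e-6):
        # P(all N components good) = (1 - p)^N ~= exp(-p*N) for small p
        perfect_yield = math.exp(-p_defect * n_components)
        print(f"N={n_components:.0e}, p={p_defect:.0e}: yield ~ {perfect_yield:.3g}")

At N=10^9 and p=10^-6 the perfect-chip yield is effectively zero; hence the
remapping.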

> recover functions lost in strokes or other trauma. You can lose half
> your brain with only a slight reduction in effective IQ according to
> neuroscientists.

I guess that says something about the relevance of IQ. If you think you
can lose a hemisphere and not show cognitive and motor deficits, you
should put down that glass pipe.
 
> #3. The computational methodologies resulting from evolutionary
> functional adaptations are very inefficient and subject to
> optimization by intelligent design.

Calling evolutionary algorithms inefficient is neat. Inefficient in
relation to what? Intelligent design, huh?

Intelligent design a) was itself invented by evolutionary algorithms
starting from the prebiotic ursoup, a pretty harsh handicap; b) a number of
plausible hypotheses explain intelligent action as Darwinian selection on
a population of noise-driven planners (<http://williamcalvin.com/>); c) in
a number of narrowly defined areas, evolutionary algorithms on today's
hardware already outperform humans at design tasks. Koza's stuff is pretty
weak, but he has a few patents done by a machine.

The latter doesn't mean humans are such hotshots design-wise, but it clearly
means noise-driven stochastic algorithms have no obvious hard limitations, and
achieve human-level performance in specific tasks under very adverse
conditions (lousy algorithms, lousy hardware, lousy population size). I
think we'll be rather impressed with what mature evolutionary algorithms
on large-scale molecular hardware can achieve.
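
For the flavor of it, a minimal noise-driven optimizer, a toy (1+1)-style
evolution strategy (nothing like Koza's actual GP setup, and the objective
is made up, but it shows the mutate-and-select loop):

# Toy (1+1) evolution strategy: mutate with Gaussian noise, keep the better candidate.
# Purely illustrative; the fitness function is made up.
import random

def fitness(x):
    return -abs(x - 42.0)   # toy objective: closer to 42 is better

parent = random.uniform(-100.0, 100.0)
for _ in range(10000):
    child = parent + random.gauss(0.0, 1.0)   # noise-driven variation
    if fitness(child) >= fitness(parent):     # selection
        parent = child

print(f"best candidate found: {parent:.3f}")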
 
> #4. Estimates of the number of neurons times the maximum firing rate
> to get the computational capacity are highly inflated due to the fact
> that at any given moment not all the neurons are doing productive
> work. If all the neurons were firing at their maximum rate you would

Estimates based on average firing rate are complete hogwash, because they
ignore other forms of processing, other sites of processing, and all
timing information.

> have something like super epilepsy. Many neural circuits are
> specialized for a function and are idling if the function isn't being
> presently called.
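
Just to show how rubbery the neurons-times-firing-rate numbers are (the
figures below are the usual textbook ballparks, pick your own): depending on
what you decide counts as an operation you get anything between ~10^15 and
~10^17 synaptic events/s, before timing, dendritic processing, or chemistry
even enter the picture.

# Crude brain "crunch" estimate: neurons * synapses/neuron * firing rate.
# All figures are rough ballparks; the point is the spread, not any single value.
neurons = 1e11                      # ~10^11 neurons
synapses_per_neuron = (1e3, 1e4)    # commonly quoted range
firing_rates_hz = (10, 100)         # "average" vs. near-maximum rates

for syn in synapses_per_neuron:
    for rate in firing_rates_hz:
        events = neurons * syn * rate
        print(f"{syn:.0e} syn/neuron at {rate} Hz -> ~{events:.0e} synaptic events/s")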
>
> #5. The human brain as a technological problem solving device is
> seriously screwed up. Many of the supposedly "productive functions"
> are actually counter productive and it is a miracle that any useful
> problem solving gets done at all.

Well, it's all we've got for now. (I don't see how this gives you a bound
on crunch, or design constraints, or anything).
 
> Therefore, I estimate, that the $1000 desk top of 2002 is not more
> than three doublings away from human level complex problem solving
> general deliberative software implementation capability.

You're still confusing crunch (measured in what? remember memory
bandwidth) with a generally intelligent system. You'll observe that a kg
of neuron suspension in liquid culture is usually not very intelligent.

Even if we assume you're not wildly overoptimistic, you're completely
ignoring the bootstrap issue. It may require a lot of crunch to be able to
do neat stuff with a modest amount of crunch.
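
And just on the arithmetic (assuming, generously, ~10^10 ops/s for a $1000
box in 2002): three doublings is a factor of eight, which still leaves you
several orders of magnitude under even the most charitable of the
synaptic-event counts above, never mind memory bandwidth.

# Three doublings of a 2002-vintage $1000 desktop, compared with the crude
# synaptic-event counts above. The desktop figure is a generous assumption.
desktop_2002_ops = 1e10
after_three_doublings = desktop_2002_ops * 2**3
print(f"~{after_three_doublings:.0e} ops/s")   # ~1e11, vs. ~1e15..1e17 above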


