Human intelligence is obviously absurd

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sat Jan 29 2005 - 07:00:12 MST


Suppose that humanity, instead of evolving intelligence on a hundred
trillion 200 Hz synapses, had instead evolved essentially equivalent
intelligence on a million 2 GHz processors using slightly more efficient
serial algorithms (my example postulates a factor-of-ten efficiency
improvement, no more). Let's call these alternate selves Humans.
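
To make the bookkeeping explicit - and assuming, purely for illustration,
that a synapse contributes about one useful operation per cycle, which is
my own simplifying assumption and not an established fact - the raw
throughputs compare like so:

    # Rough throughput comparison for the hypothetical above (Python).
    # Assumption (illustrative only): one useful operation per synapse
    # per cycle.
    synapses = 1e14                  # a hundred trillion synapses
    synapse_rate_hz = 200            # ~200 Hz each
    brain_ops = synapses * synapse_rate_hz   # 2e16 ops/sec

    processors = 1e6                 # one million processors
    clock_hz = 2e9                   # 2 GHz each
    cpu_ops = processors * clock_hz          # 2e15 ops/sec

    print(brain_ops / cpu_ops)       # -> 10.0; the postulated factor-of-ten
                                     # algorithmic gain closes exactly this gap.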

Would anyone here dare to predict, in advance, that it was even
*theoretically possible* to achieve Human-equivalent intelligence on 200 Hz
processors no matter *how* many of them you had?

Even I wouldn't dare. Trying my best to be conservative and to widen my
confidence interval, I would probably have guessed 10 kHz, or 1 kHz given a
superintelligent programmer - and that would likely have been the lowest
guess in the crowd, both because I suspect that intelligence doesn't require
much crunch, and because I knew to widen my confidence intervals.

Ben Goertzel would laugh at me, saying that Human-equivalent intelligence
carried out at one thousand serial operations per second was obviously
impossible. Perhaps Ben would suggest that I try writing code bounded to
ten thousand serial operations per second, to get a feel for how
restrictive that limit was.

And if you suggested two hundred serial instructions per second - pfft!
Now you're just being silly, they would say; and while I might credit you
for fearless audacity, I probably wouldn't defend you, lest I be tarred
with the same brush. Like the reaction you might get if you suggested that
intelligence could run on a 286, or use less than 4 KB of RAM, or be
produced by natural selection.

If any computational neurobiologists were present, they might even be able
to provide a quantitative mathematical argument, showing that some of the
basic algorithms known to be used in Human neurobiology intrinsically
required more than ten thousand serial steps per second. So too did Lord
Kelvin prove by quantitative calculation that the Sun could not have burned
for more than a few tens of millions of years.
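
The form such an argument would take - and this is my own illustrative
reconstruction, not something any actual neurobiologist has handed me - is
a critical-path bound: if an algorithm's longest chain of dependent steps
has depth D, then hardware stepping at f Hz cannot finish it in fewer than
D/f seconds, no matter how many units run in parallel.

    # Illustrative critical-path bound (my own sketch; the 10,000-step
    # figure is hypothetical, not a measured property of any neural
    # algorithm).
    def min_wall_clock_seconds(critical_path_depth, step_rate_hz):
        # Dependent steps cannot be parallelized away.
        return critical_path_depth / step_rate_hz

    print(min_wall_clock_seconds(10_000, 200))   # 50.0 seconds at 200 Hz
    print(min_wall_clock_seconds(10_000, 2e9))   # 5e-06 seconds at 2 GHz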

One of the great lessons of history is that "absurd" is not a scientific
argument. The future is usually "absurd" relative to the past. Reality is
very tightly constrained in the kinds of absurdity it presents you with; the
human history of the 20th century might be absurd from the perspective of
the 19th century, but not one of those absurdities violated the law of
conservation of momentum. Even so, "absurd" is not good evidence because
of the historical observation that the answers we now know were "absurd" to
people who didn't grow up with our background assumptions. "Obvious" is
often wrong, and "absolutely certain" rarely corresponds to a calibrated
probability anywhere near 1.0.

Widen the bounds of your confidence interval. Spread the wings of your
probability distribution, and fly.

-- 
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence

