Re: Fwd: We Can Understand Anything, But are Just a Bit Slow

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Apr 24 2006 - 13:45:36 MDT


justin corwin wrote:
>
> OR, more plausibly, it could apply only to the boundary capabilities
> of intelligence in general, where humans and chimpanzees represent
> implementations nowhere near "optimal", and thus can have vast
> differences in 'performance' without conforming to the proportional
> computational resource differential. Chimpanzees and Neanderthals
> could very plausibly be simply missing architectural elements that
> reduce computational cost for intelligent behavior.

From a first-person perspective, here on pre-Singularity Earth, the end
effect of this variant hypothesis is the same: you still get I. J.
Good's intelligence explosion, but it runs even faster because there was
so much room to run in. There are a variety of things that could
finally upper-bound an intelligence explosion; physical resource
limits, which current physical models already predict, strike me as
more plausible than elaborate computer-science hypotheses with no
supporting evidence. But so long as the top is much, much higher than
where we are, and the trajectory there is subjectively fast from the
perspective of a modern-day human, it seems a distinction without a
difference. From
our perspective, it just adds up to: AI go ker-FOOM.

I admit that my perspective here is a bit narrow; but then, existential
risks and existential benefits tend to dominate the information value
of questions.

-- 
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence

