Re: How hard a Singularity?

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sat Jun 22 2002 - 15:33:55 MDT


Eliezer S. Yudkowsky wrote:
> Michael Roy Ames wrote:
>
>> Eli1: What is the mistaken part? I don't see Eli2 drawing an analogy
>> between chimpanzees and humans... I see him drawing an analogy between
>> every-other-species-before-now and humans. The point being, once you
>> can think 'as well as' a human, a large number of things become
>> possible. Where's the mistake?
>
> The mistake is that every-other-species-before-now was not generally
> intelligent, whereas humans were.

If I can amplify on what Eliezer says here: The notion is that the shift
from chimpanzee to human may have been much more of a qualitative,
system-level shift than the transition from infrahuman to human-level AI
would be, and consequently a transition that opens fewer doors. To cite one
concrete difference between the two respective developmental lines: the
point at which an AI crosses the line into human-level smartness would
probably not be the point at which the AI first became capable of what we
would consider "abstract thinking"... although it might be the point at
which the AI gained some other, equally valuable ability.

-- 
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence

