RE: Uploading with current technology

From: Ben Goertzel (ben@goertzel.org)
Date: Tue Dec 10 2002 - 18:17:22 MST


> Basically, this is an intro into why I think it is very difficult (nigh
> impossible) to build an AI that is superior to a human (across
> the board).
> There isn't a clear target to aim at, or a meaningful measure of success.

This strikes me as a rather silly argument. The fact that "superiority" is
subjective is no reason we can't build AIs whose intelligence is vastly
superior to humans' in many meaningful senses.

Are cars and trains "better" than horse-drawn carriages? Are planes
"better" than birds?

Is human intelligence superior to dog intelligence? Perhaps dog
intelligence is superior in some senses. It may be superior at finding
other dogs to mate with, for example....

The vagueness of the word "superior" should be dealt with by proposing
particular, contextually-meaningful measures of superiority.

I can define a lot of meaningful measures of "superior intelligence."

For example, if I can create a program that can:

1) Prove any mathematical theorem from the math research literature (without
being shown the proof)

2) Figure out how to modify humans (via genetics and/or pharmacology and/or
nanotech) so that they don't age.

3) Create a molecular assembler a la Drexler

4) Create original music that the majority of humans find extremely
emotionally moving

5) Effectively manage a team of people working on a project via e-mail and
chat communication

6) Interact with humans in a way that makes many humans feel positively
spiritually touched

7) Formulate a unified theory of the four known physical forces

8) Defeat any human in all games of mental skill

9) Perform surgical operations on humans with more skill than any human

10) Operate mobile robots that drive cars, fly airplanes and spaceships, and
climb mountains

then I will rate this program as having "superior general intelligence" to a
human. Perhaps you don't agree with this list of goals [of course, the
details could be modified], but it's certainly clear and meaningful....

If you don't agree that this constitutes "superior intelligence", it doesn't
really matter to me. You can go along [choosing to have your life extended
by this program's inventions, perhaps!] thinking that your intelligence is
superior or equal to its. I'll bet it won't be too concerned with your
interpretation of the human concept of "superiority" either ;)

-- Ben G


