RE: Seed AI (was: How hard a Singularity?)

From: Ben Goertzel (ben@goertzel.org)
Date: Tue Jun 25 2002 - 10:56:16 MDT


Eli wrote:
> > and then set about collaborating with our AGI on building
> > uploading-enabling and life-extension technology... ;>
>
> Swatting a fly with a sledgehammer.

Swatting a fly with a sledgehammer only if you are right that the interval
between human-level AGI and superhuman AGI is going to be very, very brief.

> I didn't work on the moonshot or the Genome Project, but I think AI will
> take substantially less money and substantially more intelligence
> to solve.
> I could be wrong about the money part.

Yes, and you could ALSO be wrong about the other part! A huge amount of
human intelligence, accumulated over decades and centuries, went into the
other two projects you cite. The same is true of AI: if it's solved by one
of us on this list, or by another of our contemporaries, the bulk of the
credit still goes to the vast community of scientists and engineers who
built the concepts and tools we are using, i.e. to the collective
intelligence of the human race.

I think it gratifies human vanity to think that creating intelligence is
"vastly harder than any other science/engineering problem ever confronted."
Maybe it is, maybe it isn't... at the moment we simply don't have enough
information to know how hard the problem really is. A lot of problems seem
far harder before they're solved than after!

-- Ben G



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:39 MDT