From: Richard Loosemore (firstname.lastname@example.org)
Date: Sun Apr 23 2006 - 17:11:54 MDT
This is an important thread.
I think that Kurzweil's tendency to emphasize exponential curves of
general technology is really a nuisance. I don't care how many curves
he finds, I think that without AGI we simply cannot get more than a
certain rate of new technological development. Frankly, I think it will
hit a plateau. (I even have my suspicions that we already did hit the
plateau.)
And as for AGI, this for me is *the* definition of the Singularity.
None of the rest is very important. Without it, I see so many
limitations on the ability of human engineers to handle the complexity
of ordinary technology that I do not believe there will be any dramatic
advances in the near future. I think spacecraft will continue to blow
up because of single-line errors in their code, and the IRS will fail
again and again to upgrade its computer software.
And there seems to be some confusion about the timeline. When the first
AGI reaches near or at human level intelligence, it will still take a
while before the takeoff. After thinking through all the factors
involved, my guess would be a year or two. I get the impression some
people think we will have human-level AGI one day, and then
superintelligence later the same day.
So Kurzweil has done a disservice to the idea by emphasizing those
curves. It just does not make sense to look at a curve and suppose that
no new factors will kick in as it heads off to infinity. It was the
same sort of silly reasoning that made people predict, a hundred years
ago, that horse dung would soon be filling city streets to a depth of 10
feet, because of the rate at which horse drawn traffic was increasing.
Phillip Huggan wrote:
> The point of the non-Vinge (Kurzweil's?) Singularity is that we get tech
> progress as usual but at some point it will happen so quickly that
> present social power structures won't be able to keep up. I guess the
> danger would be tech gadgets as weaponry. At the same time, maybe the
> progress will plateau if the AI (not AGI) gadget-programming doesn't
> keep up.
> The difficulty I have with the AGI Singularity is that some people think
> it means actual brains, and some think it means intelligently behaving
> but not sentient software. This is very confusing to people.
> */Robin Lee Powell <email@example.com>/* wrote:
> On Sun, Apr 23, 2006 at 01:06:34PM -0400, Philip Goetz wrote:
> > The classic Singularity, as popularized by Vinge and now Kurzweil
> > (I think Ben G. said that Von Neumann used the term in the same
> > way?), is that the Singularity is a divide by zero on the
> > timeline, the place at which you can't calculate the summed change
> > because your calculations go to infinity. This relies on
> > exponential curves.
> Actually, that's a common misconception: Vinge makes it *very* clear
> that his singularity is about the rise of greater than human
> intelligence in his original essay. He extrapolates infinite rate
> of change from that event, but the core event is about intelligence,
> not change.
> The more people talk about the singularity as a rate-of-change
> event, the more I hate that definition. It's amazingly easy to
> argue against, and really not all that useful.
This archive was generated by hypermail 2.1.5 : Mon May 20 2013 - 04:01:04 MDT