From: Phillip Huggan (email@example.com)
Date: Sun Apr 23 2006 - 14:02:02 MDT
The point of the non-Vinge (Kurzweil's?) Singularity is that we get tech progress as usual but at some point it will happen so quickly that present social power structures won't be able to keep up. I guess the danger would be tech gadgets as weaponry. At the same time, maybe the progress will plateau if the AI (not AGI) gadget-programming doesn't keep up.
The difficulty I have with the AGI Singularity is that some people think it means actual brains, and some think it means intelligently behaving but not sentient software. This is very confusing to people.
Robin Lee Powell <firstname.lastname@example.org> wrote:
On Sun, Apr 23, 2006 at 01:06:34PM -0400, Philip Goetz wrote:
> The classic Singularity, as popularized by Vinge and now Kurzweil
> (I think Ben G. said that Von Neumann used the term in the same
> way?), is that the Singularity is a divide by zero on the
> timeline, the place at which you can't calculate the summed change
> because your calculations go to infinity. This relies on
> exponential curves.
Actually, that's a common misconception: Vinge makes it *very* clear
that his singularity is about the rise of greater than human
intelligence in his original essay. He extrapolates infinite rate
of change from that event, but the core event is about intelligence.
The more people talk about the singularity as a rate-of-change
event, the more I dislike that definition. It's amazingly easy to
argue against, and really not all that useful.
This archive was generated by hypermail 2.1.5 : Thu May 23 2013 - 04:01:16 MDT