RE: Article: The coming superintelligence: who will be in control?

From: Ben Goertzel (ben@webmind.com)
Date: Wed Aug 01 2001 - 14:08:15 MDT


Amara helped me edit an article I wrote on Webmind for the site, but the
"powers that be" at kurzweilai.net wouldn't let it appear -- ostensibly
because of the "commercial" aspects of Webmind, but I don't believe that was
the real reason (after all, Webmind Inc. no longer exists; WM is being
developed by a bunch of us working for free at the moment). I think that
Ray Kurzweil is a genuine visionary, but that he has a relatively narrow
vision of how the Singularity is going to unfold, and that his site is
REALLY devoted to promoting that narrow vision, although it's positioned as
a general site about AI and the future. This doesn't mean that interesting
stuff going beyond Kurzweil's particular views can't get on the site, but it
does mean that getting such things on there takes a lot of care. Frankly, I
think we'd have a better chance of getting really deep content up by
starting our own site. Obviously, there are many people on this list who
could contribute significantly to such a site, but we would need ONE
person willing to devote a significant chunk of their time to keeping the
thing going.

-- Ben G

> -----Original Message-----
> From: owner-sl4@sysopmind.com [mailto:owner-sl4@sysopmind.com]On Behalf
> Of Brian Atkins
> Sent: Wednesday, August 01, 2001 4:00 PM
> To: sl4@sysopmind.com
> Subject: Re: Article: The coming superintelligence: who will be in
> control?
>
>
> What I really find lacking at the Kurzweilai.net site is articles that
> go beyond trend tracking and guessing what will happen. I'd like to see
> people take the issue further and try to figure out a) whether we can
> manipulate the timing and character of the Singularity significantly, and
> b) if so, whether we should accelerate it. Perhaps Amara could do an
> article on that with some of our help? Of course all Singularitarians
> recognize the responsibility to take action to support and accelerate the
> Singularity safely, but many other people seem either not to have
> considered this yet, or to be content to sit back and "let history take
> its course", which IMO is not rational.
>
> Peter Voss wrote:
> >
> > For those of you who haven't seen this on KurzweilAi.net
> >
> > "The coming superintelligence: who will be in control?"
> >
> > http://www.kurzweilai.net/meme/frame.html?main=/articles/art0223.html
> >
> > Peter
> >
> > www.optimal.org - Any and all feedback welcome: peter@optimal.org
>
> --
> Brian Atkins
> Director, Singularity Institute for Artificial Intelligence
> http://www.intelligence.org/
