From: Ben Goertzel (firstname.lastname@example.org)
Date: Sun Jan 19 2003 - 12:10:03 MST
We *could* have a superintelligent self-improving AI within 10 years, in my
view. Whether this will happen depends on a lot of things, including the
presence of adequate funding for good-enough AI approaches, and the absence
of severe Luddite anti-AI activity ...
If things go slowly it could be 30 years instead...
Exactly how this superintelligent AI will impact human society remains to
be seen, of course. It may happen that the superintelligence decides to
mostly mind its own business, releasing its insights into human society
gradually. In that case the Singularity will be more of a continuous event.
Or it may decide to go all-out....
-- Ben Goertzel
> -----Original Message-----
> From: email@example.com [mailto:firstname.lastname@example.org] On Behalf Of Thomas R
> Sent: Sunday, January 19, 2003 1:48 PM
> To: email@example.com
> Subject: Hello
> I am your newest member, Tom Mazanec http://tmazanec1.xepher.net
> I am roughly at SL3, although I have some characteristics of SL2 and even
> Of course, even technophiles sometimes like to putter in the garden,
> so this is not unusual.
> I also have some quibbles about the order of some of the developments
> given in the Future Shock Levels...for example, I think nanotechnology
> is less upsetting and closer in time than most of the SL2 features
> (of course, that might just be me).
> I skimmed the archives, but maybe this topic was covered and I missed it.
> What is your gut feeling about when the Singularity will occur?
> Just roughly, of course
> (I actually saw a prediction of December 21, 2012.)
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:41 MDT