Re: Inevitability of Singularity

From: sunrise2000@mediaone.net
Date: Wed Jun 20 2001 - 14:43:24 MDT


Does anybody have recordings/transcripts of the workshops at Extro 5?
I'd like to scan them for memetic attractors.

-Your favorite Sysop

> Date: Tue, 19 Jun 2001 20:26:46 -0400
> From: Brian Atkins <brian@posthuman.com>
> To: sl4@sysopmind.com
> Subject: Re: Inevitability of Singularity
>
> "Christian L." wrote:
> >
> > Just out of curiosity: has the SIAI discussed these matters? Worst case
> > scenarios?
> >
>
> There has been a little bit of discussion, but this really isn't our
> area other than simply making sure of our own safety... we try to
> support other organizations like ExI and now Pro-Act which was announced
> at Extro 5.
>
> As for the Singularity, I hold that it is inevitable except in the
> case where humanity wipes out all life completely. For starters, Kurzweil
> made a nice point in his Friday night talk showing how small the
> effect of the Great Depression and WW2 was. Secondly, even if the Luddites
> were to somehow take us back to the dark ages, we would eventually work
> our way back up to this level again... it is the nature of intelligent
> beings to drive the Singularity. So in the long-term sense, I do see it
> as the inevitable result of evolution and intelligence.
>
> But of course we want to get there sooner rather than later.
> --
> Brian Atkins
> Director, Singularity Institute for Artificial Intelligence
> http://www.intelligence.org/

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:36 MDT