From: Ben Goertzel (firstname.lastname@example.org)
Date: Wed Dec 31 2003 - 14:43:09 MST
IMO, how easy it is to accidentally create gray goo is something we're not
really going to know until we have strong nanotech in our hands...
-- Ben G
> -----Original Message-----
> From: email@example.com [mailto:firstname.lastname@example.org]On Behalf Of Robin Lee
> Sent: Wednesday, December 31, 2003 2:33 PM
> To: email@example.com
> Subject: Re: An essay I just wrote on the Singularity.
> On Wed, Dec 31, 2003 at 12:48:52PM -0500, Ben Goertzel wrote:
> > It's just not true that humans developing strong nanotech will
> > *necessarily* lead to destruction.
> Agreed. However, I personally see it as far more likely than with
> any previous technology (all one of them: nukes). Mostly because
> you can accidentally create grey goo, but you can't accidentally
> bomb the world back to the stone age.
> > So far as preparing for the future goes, the most important thing
> > is to get ourselves -- individually and collectively, with
> > whatever human and/or AI intelligence is available -- into a mental
> > position capable of dealing adequately with the advent of
> > wild surprises. This is much more important than planning for any
> > particular conjectural contingency.
> I think that's a good point.
> Me: http://www.digitalkingdom.org/~rlpowell/ *** I'm a *male* Robin.
> "Constant neocortex override is the only thing that stops us all
> from running out and eating all the cookies." -- Eliezer Yudkowsky
> http://www.lojban.org/ *** .i cimo'o prali .ui
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:43 MDT