From: Ben Goertzel (firstname.lastname@example.org)
Date: Sat Aug 10 2002 - 07:22:38 MDT
Well, I disagree with moderate force ;>
I'd say that software robustness is exactly as relevant to SL4 as many other
things that have occurred on this list. It's certainly very important for
seed AI: we do want our seed AI to behave as specified, rather than having
its behavior directed by bugs.
Whether quasi-neural or engineered software is more robust is pretty damn
relevant to seed AI. The conversation may not be focusing on this point
explicitly, but it's there implicitly...
Also, it's very annoying to continue off-list a discussion that started
on-list and involves *so many people*.
> -----Original Message-----
> From: email@example.com [mailto:firstname.lastname@example.org]On Behalf
> Of Eliezer S. Yudkowsky
> Sent: Saturday, August 10, 2002 6:33 AM
> To: email@example.com
> Subject: META: project COSA
> Unless someone strongly objects, I'm ruling that this thread has
> insufficient future-shock content for SL4.
> Eliezer S. Yudkowsky http://singinst.org/
> Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Sat May 25 2013 - 04:00:35 MDT