From: Richard Kowalski (email@example.com)
Date: Tue May 24 2005 - 17:03:49 MDT
On Tue, 2005-05-24 at 16:22 -0700, Eliezer S. Yudkowsky wrote:
> ornamentation and tinsel. I don't think humans could build an AI that had no
> goal system at all until it was already a superintelligence.
Have you published any papers or talks that further clarify or support this claim? Do you know of anyone else who has reached the same or a similar conclusion independently?
Be the Singularity
This archive was generated by hypermail 2.1.5 : Sat May 25 2013 - 04:00:56 MDT