From: Tennessee Leeuwenburg (email@example.com)
Date: Mon Aug 28 2006 - 23:18:46 MDT
Suppose that it is possible to build expert systems that perform very well
within their domains. Might not a very powerful AI entity be built from such
components, yet have no consciousness of those components?
If an entity achieves singularity, what does it mean if it is not
conscious of all of its parts?
Also, might not a very powerful AI entity prevent the Singularity from
occurring? Is there more to be read on these topics?
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:57 MDT