From: Mark Waser (firstname.lastname@example.org)
Date: Wed Mar 12 2008 - 21:05:11 MDT
> Try reading http://yudkowsky.net/singularity.html to get an idea of
> the potential power behind AGI. Note that this paper was originally
> written in 1996.
Please assume that I have read (several times) and assimilated all of the
papers on the SIAI and Yudkowsky websites (unless you can see specific
points that you believe I have missed). I am not so arrogant that I
would have attempted something like this without first assimilating all of
the necessary background information. *No one* is good enough to skip that.
> This is the vast majority of systems. In general, there are going to
> be many more simple systems than complex systems, because each
> additional bit of complexity requires additional optimization power.
> This is the principle behind Solomonoff induction.
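The weighting described above can be sketched in a few lines (my addition, not part of the original thread): a Solomonoff-style universal prior assigns each program of length n bits a weight proportional to 2^-n, so every additional bit of complexity halves the prior weight, and short descriptions collectively dominate.

```python
# Sketch of the Solomonoff-style prior discussed above (illustrative
# only; real Solomonoff induction is uncomputable and defined over
# prefix-free programs for a universal Turing machine).

def prior_weight(description_bits: int) -> float:
    """Prior weight of a hypothesis needing `description_bits` bits."""
    return 2.0 ** -description_bits

# Each extra bit of required complexity halves the prior weight:
assert prior_weight(11) == prior_weight(10) / 2

# All 2**n distinct descriptions of length n together carry weight
# 2**n * 2**-n = 1, so no single complex system is individually likely
# even though complex systems are numerous in aggregate.
total_mass_at_length_20 = (2 ** 20) * prior_weight(20)
assert total_mass_at_length_20 == 1.0
```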
Of course. The vast majority of systems are going to be simple, but they
will also be unintelligent. The intelligent systems are going to be complex
and have many goals (or else all the effort in making them complex was
wasted).
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:02 MDT