Re: Definition of strong recursive self-improvement

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sun Jan 02 2005 - 10:22:43 MST


maru wrote:
> So, in other words, you merely want a recursive optimizer that can code
> 'well enough' until it has leveraged weak superintelligence enough to
> solve that problem somehow?

No, because such an optimizer would have lost the original optimization
target (its goal system) by the time it solved the problem. I do not
think you can ask an SI to solve a problem without a well-formed
description of that problem. If you could produce a well-formed
description of how to infer well-formed descriptions from vague human
requests, you would be home free; but then you still have to solve that
initial problem yourself, and it is not simple.
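
To make the regress concrete, here is a toy sketch in Python (the names
Spec, solve, and formalize are purely illustrative assumptions of mine,
not a proposed design):

  from dataclasses import dataclass
  from typing import Callable

  @dataclass
  class Spec:
      # A well-formed problem description: a formal test of "solved".
      is_solution: Callable[[object], bool]

  def solve(spec: Spec) -> object:
      # An SI can optimize against this, because "solved" is fully
      # defined by the spec itself.
      raise NotImplementedError

  def formalize(vague_request: str) -> Spec:
      # Writing this function requires a well-formed description of
      # what counts as a correct formalization of a vague request --
      # which is exactly the initial problem that must still be
      # solved by hand, before any SI can be pointed at it.
      raise NotImplementedError

The regress is that formalize is itself an instance of producing a
well-formed description, so it cannot simply be delegated to the SI.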

-- 
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence
