Re: How hard a Singularity?

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sun Jun 23 2002 - 04:39:58 MDT


Eugen Leitl wrote:
> On Sat, 22 Jun 2002, Eliezer S. Yudkowsky wrote:
>
>>which successfully increases her own intelligence (though it might take much
>>more than a month to manage this, for an unprepared upload) will increase
>
> I challenge you to make nontrivial progress in a month if given full low
> level access to your current cognition processes (emulated wetware at
> molecular level), besides producing a lot of neat neuroscience papers.

That's why I said it might take much more than a month, *for a human*!
Under the scenario given, the uploaded human arrives on her new substrate
with no prior experience in recursive self-improvement: no tools, no skills,
no protocols. That is not true of a human-level seed AI; how do you think
the seed AI got there in the first place?

-- 
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence


This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:39 MDT