[Fwd: Re: How Kurzweil lost the Singularity]

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Thu Jun 20 2002 - 01:57:46 MDT


From Max More:


At 04:57 PM 6/19/2002 -0400, Eliezer S. Yudkowsky wrote:

>The role of individuals in scientific history is often exaggerated but it
>is certainly no exaggeration to say that a stroke of genius can accelerate
>progress by years or decades. I believe that by encouraging individuals
>to direct their efforts specifically toward the Singularity, it may be
>possible to place on a firm basis scientific projects that might otherwise
>suffer from poverty of resources or poverty of genius at a critical moment
>in history.
>
>To put it bluntly, you have enormously more resources at your disposal
>with which to effect a Singularity, but it appears to many of us in the
>Singularity community that those resources are going unused. [...] If
>Friendly AI is important, why not fund the Singularity Institute?

Eliezer, I understand and appreciate the points you made. I, too, would
like to see enormously increased efforts towards intelligence augmentation
(more than towards AI, though the two will overlap).

However, since you cc'ed me on your message, I want to point something out
that I'm not sure you are particularly aware of. Though your message called
for Ray in particular to fund technological research that you see as
important to the Singularity (a term about which I continue to have
reservations), it's clear that you believe that your own work should
be funded. I know that you are supremely confident in your own abilities.
You are surrounded by those whom you have convinced. I feel that it's up to
me to point out more explicitly that not everyone shares your view of your
research. I have no idea of Ray's opinion on this, but I do know the
opinion of a number of other intelligent and informed people. I share the
view of those who are highly skeptical of your expectations for your work.
I and others would no doubt be more impressed if you were to demonstrate
some actual results.

I hope you show us to be mistaken. And I write this with no intention of
offending you. It's just my view and, I know, that of some others. If Ray
-- or anyone else -- does not fund your work, it could be that the reason
is not the goal but lack of confidence in your abilities at this stage.

If I had a few billion, I would probably throw some resources your way just
in case. As it is, I may not be the only one who supports your general
sense of urgency to fund intelligence augmentation (not necessarily the
approach you favor) but who isn't at all convinced that you are the One.

On a more positive note, someone looking for funding might be interested in
this:

DARPA ANNOUNCES RESEARCH INITIATIVE IN COGNITIVE SYSTEMS

The Defense Advanced Research Projects Agency (DARPA) today announced a new
research initiative in the field of cognitive systems, releasing a Broad
Agency Announcement (BAA) seeking innovative research proposals.

The DARPA Cognitive Information Processing Technology Initiative will
develop the next generation of computational systems with radically new
capabilities. "Cognitive systems" will demonstrate levels of autonomy and
reasoning far beyond those of today's systems. With the ability to reason,
learn and adapt, and with facilities for self-awareness, these will
literally be systems that know what they're doing.
http://www.darpa.mil/body/NewsItems/pdf/iptorelease.pdf

Max

_______________________________________________________
Max More, Ph.D.
max@maxmore.com or more@extropy.org
http://www.maxmore.com
Strategic Philosopher
President, Extropy Institute. http://www.extropy.org
________________________________________________________________
Director of Content Solutions, ManyWorlds Inc.: http://www.manyworlds.com
--- Thought leadership in the innovation economy
m.more@manyworlds.com
_______________________________________________________


