Re: How Kurzweil lost the Singularity

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Thu Jun 20 2002 - 02:13:22 MDT


Max More wrote:
>
> However, since you cc'ed me on your message, I want to point something
> out that I'm not sure you are particularly aware of. Though your message
> called for Ray in particular to fund technological research that you see
> as important to the Singularity (to use the term that I continue to have
> reservations about), it's clear that you believe that your own work
> should be funded. I know that you are supremely confident in your own
> abilities. You are surrounded by those who you have convinced. I feel
> that it's up to me to point out more explicitly that not everyone shares
> your view of your research. I have no idea of Ray's opinion on this, but
> I do know the opinion of a number of other intelligent and informed
> people. I share the view of those who are highly skeptical of your
> expectations for your work. I and others would no doubt be more
> impressed if you were to demonstrate some actual results.
>
> I hope you show us to be mistaken. And I write this with no intention of
> offending you. It's just my view and, I know, that of some others. If
> Ray -- or anyone else -- does not fund your work, it could be that the
> reason is not the goal but lack of confidence in your abilities at this
> stage.

Max, I am fully aware of this, which is why I cited the Foresight Institute
and Neural Signals, Inc., neither of which I am personally involved with, as
well as the work of researchers whom Kurzweil specifically mentioned as
important at the Foresight Gathering. (Perhaps I should have mentioned the
Extropy Institute as well.) This is not about the merits of any one course
of research, but a question of how we are to fulfill those moral obligations
that devolve upon us when we become aware of the Singularity. I think that
first we have to resolve the moral issue of "Can we speed up the
Singularity? Do we have a moral obligation to try?" before considering the
pros and cons of particular approaches. This is why I did my best to
mention a broad spectrum of paths to the Singularity in my reply to
Kurzweil, concentrating specifically on those research directions
(brain-computer interfaces, integrative computational neurology) that
Kurzweil has singled out as important.

If Kurzweil chooses to become an activist on behalf of *any* technology that
he believes is the most critical, the most neglected link on the path to the
Singularity, I will applaud the ethics and responsibility of that moral
decision *before* raising any issues with his choice of technology. If
Kurzweil decides to throw his full efforts behind implementing the
Singularity, others will follow in his footsteps, with different choices of
key technology, and whatever real critical links exist will be funded
eventually. What I fear is a world in which people hear about the
Singularity and use it to rationalize whatever they were already doing.
What I fear is a world in which Kurzweil, as the first presenter of the
Singularity meme, inadvertently inoculates his audience with the idea that
we can derive comfort from the Singularity concept without *doing* anything
to achieve the actual Singularity. If this happens, spreading the word
about the Singularity will do nothing to achieve the humanistic goal that is
the Singularity itself, and the people who insist that the Singularity is a
religion will be right.

-- 
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence

