RE: How hard a Singularity?

From: Ben Goertzel (ben@goertzel.org)
Date: Tue Jun 25 2002 - 13:58:08 MDT


Eliezer,

I'm sorry if I've misinterpreted your statements over the past couple of
years...

In our various interactions, you have seemed to me to display HUGE
confidence that

a) you personally are somehow uniquely suited or even "destined" to play a
key role in bringing the Singularity about

[others have gotten this impression from you as well; I recently received a
personal e-mail in which someone else referred to your idea (his wording)
that you are "The One"]

b) your approach to Friendly AI is the right way to ensure the Singularity
comes out well

My impression has been that your confidence in a) and b) is at least a
little excessive -- enough so to make me a bit uncomfortable sometimes.

However, my subjective impressions of other human beings are not always
accurate, and apologies are due if they have been inaccurate here.

I hasten to add that I am far from a perfectly rational being myself, and am
surely just as full of flaws as any other human (just ask my wife ;). I'm
not trying to be "holier than thou" here...

-- Ben G

> Don't worry. My alleged "self-confidence" is Ben's invention. I happen
> to be fairly confident that many of Ben's theories are wrong; it's not
> at all the same as being confident that my own theories are right.
>
> Nobody who thought the Singularity was understandable would ever have
> invented Friendly AI.
>
> --
> Eliezer S. Yudkowsky http://intelligence.org/
> Research Fellow, Singularity Institute for Artificial Intelligence
>


