Re: An essay I just wrote on the Singularity.

From: Michael Anissimov (altima@yifan.net)
Date: Wed Dec 31 2003 - 10:40:08 MST


Robin, this is an interesting and entertaining essay! Congratulations
on getting the motivation to write down some of your ideas and reasoning
regarding the world-shaking issue of how humanity ought to approach the
Singularity. I disagree with the way you present/argue some things
though, so here I go with all the comments:

1. Why do you call Singularitarianism your "new religion"? I know it's
basically all in jest, but thousands of people have already
misinterpreted the Singularity as a result of the "Singularity =
Rapture" meme, and I don't think they need any more encouragement. I
would personally prefer that Singularitarians have the reputation of
being extremely atheistic and humanistic.

2. Like Tommy McCabe, I too have a problem with the "FAI means being
nice to humans" line. This gives a lot of people the mistaken
impression that FAI is going to be anthropocentric, unfortunately.

3. This is a fun paragraph:

"Combining a few issues here. I believe that strong superintelligence is
possible. Furthermore, I believe that to argue to the contrary is
amazingly rank anthropocentrism, and should be laughed at. Beyond that,
I think full AI is possible. It's the combination of the two that's
interesting."

I agree that people who believe strong superintelligence is impossible
are memetically distant enough from Singularity-aware thought that
trying to avoid offending/confusing them is pointless. Saying that the
combination of the two is what's interesting unfortunately gives the
reader the impression that AI and strong superintelligence in concert
are the only thing capable of initiating a Singularity (when
self-improving IA seeds are in fact possible, albeit unlikely). It
might cause readers to mistakenly overestimate the safety of the IA
path. The Singularity is complicated and confusing enough that little
wording issues like these can actually influence how the paper is
interpreted by casual surfers (if that matters).

4. It seems like you're saying the range of possible Singularities
basically breaks down into either "seed AI" or "uploading", when other
IA techniques are indeed possible. Pre-uploading technology could
probably be applied to yield substantial human intelligence
enhancements, even though AI would almost certainly come before that as
well.

5. " Source code, /any/ source code, is a paragon of clarity by
comparison." gives the audience the impression that you are worshipping
code. :) Of course code will be "clearer" in a mathematical sense but
"paragon of clarity" in the sense of "it works cleanly" would take a lot
of programming effort, of course, and not any code would qualify.

6. "You see, Eliezer <http://yudkowsky.net/beyond.html> has convinced
me that a Friendly AI must be the first being to develop strong nanotech
on Earth, or one of the first, or we are all going to die in a mass of
grey goo." makes you sound like a cult victim, unfortunately. :( I
know it's fun to write down stuff exactly as it sounds in our heads, but
with Singularity issues, the wrong presentation can really damage your
credibility... I also think it's important that we present the FAI meme
in a way that doesn't focus on Eliezer so much - even though he
originated the idea, FAI-esque thinking has been going on for the past
decade or two, and its present-day supporters include people like Nick
Bostrom, Brian Atkins, etc., not just Eliezer. Placing too much emphasis
on Eliezer will also make you look like a cult victim.

7. "Please understand that if someone gets to strong nanotech before
everyone else, they rule the world. This is not a subject for debate,
you can't fight back, there is no passing Go or collecting two hundred
dollars." is put very clearly, and concisely, and correctly. A little
skimp on the explanations again, but I suppose that if people seriously
question you here, they aren't likely to understand the issues
surrounding FAI anyway.

8. It could be nitpicking, but near the end of the essay, I would
personally say we're working towards a "successful" or "benevolent"
Singularity, rather than a "sysop scenario". "Sysop scenario", sadly,
gives people the wrong idea 90% of the time.

Anyway, congratulations again on writing something. Politics is indeed
largely irrelevant. This becomes clear around high SL2, as a matter of
fact. At the very least, politics is something we cannot influence
unless we pursue high-leverage goals, like devoting our lives to
politics, or, far better yet, building a Friendly AI.

Michael Anissimov

Robin Lee Powell wrote:

>This kind of captures, I think, why I believe as strongly as I do.
>
>http://www.digitalkingdom.org/~rlpowell/beliefs/sysop.html
>
>Of course, it's almost 2 in the morning, so it might not capture as
>much as I think. 8)
>
>-Robin
>
>
>


