Re: Article: The coming superintelligence: who will be in control?

From: Brian Atkins (brian@posthuman.com)
Date: Wed Aug 01 2001 - 23:19:05 MDT


"Amara D. Angelica" wrote:
>
> Brian, an intriguing idea. Can you or anyone else elaborate?
>
> > -----Original Message-----
> > From: owner-sl4@sysopmind.com [mailto:owner-sl4@sysopmind.com]On Behalf
> > Of Brian Atkins
>
> > going beyond trend tracking and guessing what will happen. I'd like to see
> > people take the issue farther and try to figure out a) can we manipulate
> > the timing and character of the Singularity significantly b) if so, should
> > we accelerate it?

I'm not sure whether you wanted me to elaborate on the fact that no one is
talking about this issue, or to elaborate on possible answers to A and B.
I'll do the former, because as you can see below I think Ray and the
people on this list have already decided for themselves that the answer
to B is Yes. The answer to A, IMO, is also Yes, and I think Ray would agree
with that at least to a limited extent. What frustrates me is seeing
people realize that both A and B are Yes, but then not helping to push the
Singularity closer to us in time. So I'd like to get the word out about
this issue, since (again IMO) it is even more important than simply
realizing a Singularity is coming. Once you see it's coming, you have
to go a step further and "pick a side". Sitting on the fence like a reporter
(no offense :-) is not a rational choice in this situation. If you believe
the Singularity will be good for you (and I mean good in the sense of saving
your life, not in the sense of whether Congress lowers your tax rate 3%),
then you should try to advance it. If you believe the Singularity will be
bad for you, then you should try to prevent it. This is a world-changing
bit of history.

Here are some examples of the "blind spot" I'm talking about. Go look at
part 4 of the Extro-5 Kurzweil talk here:

http://www.kurzweilai.net/meme/frame.html?main=/articles/art0235.html

Skip ahead to around 13:15, at which point Eliezer asks Ray something
along the lines of "Well, you've described the Singularity and our
progress to it so far, but you haven't said what kind of Singularity
you would like to see or what time you would /prefer/ it to happen".
Then Ray sits there for like 7 seconds (which makes me think he might
not have thought about this much) before someone in the back says
something that sends him off on a tangent without answering
the question. Very frustrating, since that was the one question I wanted
an answer to! :-)

If you or anyone else here has ever seen him address this, I'd like to
know about it. He seems to have adopted a clinical observer style when
it comes to the Singularity, which lets him make predictions about what
the future might be like, yet not consider that someone
with his excellent grasp of the situation would be exactly the right
kind of person to help guide and support the actual development. Here's
a quote from his book precis:

"Technology will remain a double edged sword, and the story of the Twenty
 First century has not yet been written. It represents vast power to be
 used for all humankind's purposes. We have no choice but to work hard to
 apply these quickening technologies to advance our human values, despite
 what often appears to be a lack of consensus on what those values should
 be."

Confusing to say the least, unless he's running a secret AI or brain
scanning project we don't know about :-) He advocates working to achieve the
Singularity, and points out that its history is not yet written, but he offers
no advice (outside of some possible future scenarios) on what the best way
to achieve it might be, on whether we should try to accelerate its arrival
(is it "ethical" to accelerate the Singularity?), or on how we might do so
if we decide we want to.

He goes on to talk about the purpose of life, which he sees as evolving
toward the Singularity. He says that, but then almost immediately goes back
to pointing out that the real reason the Singularity will happen
is economics. However, if he really believes that achieving a
Singularity is the goal of life, then I'd like to ask him: what are your
plans for after you finish your book? How will you help to achieve this
goal? I don't see any answers to that, nor anyone (besides people who hang
around here) asking this question of themselves or of Ray.

He ends his precis with the admonition that we should all "stick around
so you might see the Singularity". That really clinches it for me: if
he had really internalized the "Singularity is the goal of life" idea,
then he would instead be telling people to go out and help bring the
Singularity here more quickly, so that even more people now living would
be able to survive to see it.

Now, I don't want to look like I'm stuck on Ray. He is just the most
glaring example to me because I've read so much of his stuff lately.
He is also the biggest proponent of the Singularity in the mainstream
world, and yet he still seems to be missing the final pieces of the picture.

-- 
Brian Atkins
Director, Singularity Institute for Artificial Intelligence
http://www.intelligence.org/

