From: Brian Atkins (email@example.com)
Date: Sun Apr 07 2002 - 16:05:46 MDT
Evan Reese wrote:
> I will think about it some more. Perhaps I will change my mind. I think I
> can kinda get the idea of his wanting to try to influence the *tenor* or
> direction of a singularity, but I just don't see the doomsday scenario if
> this seed AI doesn't work out.
Hi Evan, if you want to think about it in strictly utilitarian terms, we
can do some simple math.
As you note, there are large industries out there right now producing
more and more powerful computers. The end result of this trend is that
smaller and smaller organizations can afford more computing power. It
won't take something on the scale of a Manhattan Project to fully
implement a seed AI. So the answer to question 1: "Is it theoretically
possible that a small group in the post-2010 period could cause a
Singularity?" is yes. Maybe even before that, if computing power grows
quickly enough and the seed AI designs of said group are good enough.
Question 2: "Why bother when someone else will do it even if we don't
or can't?" The answer: 150,000 people die every day. Although getting to a
Singularity a week earlier might not be life-and-death decisive for you
or anyone you know, it will in general matter greatly to about a million
people. If, by working on our designs for seed AIs now, we or Ben or
anyone else can be ready to let it fly the moment we can get our hands
on good enough hardware, the effort is well worth it. There are many
examples of technologies only being developed well after they theoretically
could have been invented... this is not a situation where we want that to happen.
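The "simple math" above can be sketched explicitly; the 150,000 deaths/day figure is the rough number assumed in this post, not an exact statistic:

```python
# Back-of-the-envelope arithmetic from the paragraph above:
# at roughly 150,000 deaths per day, a one-week delay in reaching
# a Singularity corresponds to about a million lives.
DEATHS_PER_DAY = 150_000  # rough figure assumed in the post


def deaths_over(days: int) -> int:
    """Total deaths over the given number of days at the assumed rate."""
    return DEATHS_PER_DAY * days


print(deaths_over(7))  # one week at that rate: 1,050,000 people
```

At the assumed rate, even a single week of earlier arrival matters to on the order of a million people, which is the whole point of the utilitarian framing.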
Last thing to contemplate: if the concepts of Friendly AI turn out to be
quite important in greatly reducing the risks of a seed-AI-driven
Singularity, then the work SIAI has already done was well worth the
pittance of dollars (yes, Eliezer deserves a better salary :-) and
personal effort we put into it.
Frankly, it's hard to understand how anyone can think about something like
a Singularity and come up with the thought "why bother?" Even if
you don't care about the 149,999 other deaths per day, you should be
worrying about reaching safety yourself. This isn't some event that
will pass you by untouched. For anyone convinced a Singularity is likely,
the conservative way to behave in this era is not "why bother?". Now is
the time to seize every day.
-- Brian Atkins Singularity Institute for Artificial Intelligence http://www.intelligence.org/
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:38 MDT