Re: Suicide by committee (was: How hard a Singularity?)

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Thu Jun 27 2002 - 15:06:50 MDT


Ben Goertzel wrote:
>
> It does not seem at all strange to me to partially rely on the advice of
> an appropriate group of others, when making an important decision. It
> seems unwise to me *not* to.

Anyone incapable of taking advice or accepting criticism, who will therefore
tend to fail at Friendly AI, will by exactly the same logic tend to fail at
building AI. If Friendly AI weren't an issue and the only question were
building AI as fast as possible, we'd be having the same argument, except the
punchline would be "And if you don't take my advice, you'll fail at building
AI and we'll be wiped out by smallpox++!"

Incidentally, one very real danger of having a Friendly AI Advisory Board is
that the FAI programmers will listen to the advisory board... and *no one
else*, including Chinese or Indian geniuses who write in broken English and
never went to college but who happen to know exactly what the problem is and
how to fix it.

You know what I'd really like to see? The Friendly AI Argument Repository,
devoted to archiving all discussion of Friendly AI, which would also perform
the service of examining new arguments and notifying interested programmers
of anyone who seems to have anything really new and interesting to say.
This would prevent programmers who are already working full-time on an
incredibly absorbing AI project from tuning out all unsolicited emails on
Friendly AI, because 95% of them are repetitions of criticisms they've already
heard. (Rationality, in feeding on criticism, uses it up.)

We have different opinions about the Voice That Must Be Heard. I think that
such a voice is more likely to come from a complete unknown, or from some
very smart person with no life, and is really not all that likely to come
from the kind of people ordinarily asked to serve on advisory committees.

-- 
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence


This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:39 MDT