Re: AGI funding (was Re: Some bad news)

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Fri Nov 08 2002 - 20:06:22 MST


Slawomir Paliwoda wrote:
>
> Yes, in order to create a message that would resonate with people, one needs
> to put a good spin on Singularity. Instead of talking about Jupiter Brains,
> Transition Guides, Singularitarianism, or boring AI technical details you
> might just as well say something like "FAI could help to cure cancer and
> aids", "FAI would be helpful in figuring out your material problems", or
> that "FAI could save people from dying". That's basically how the
> nanotechnology gets "sold" to the masses. And since FAI could do nanotech,
> well........ you get the idea. If anybody is interested enough, the FAI
> websites would provide the further details.

Dumbing down the Singularity for "the masses"... is that really what an
FAI would do? It doesn't seem very compassionate. Or very honest. Or
very effective. I have seen newbie Singularitarians try to spin the
Singularity for what they fondly imagine to be the lowest common
denominator, and I have never once seen that trick work. All that happens
is that the real message - the message that originally produced the
Singularitarian - is lost in a morass of unconnected technical details
and unsupported, noncredible assertions. If you're saying things that you
genuinely believe, somewhere in there will be things that even other
people may find to be worth believing. If you're trying to be
manipulative, saying things you think are real clever and that will really
pander to those stupid masses, you will make amateurish manipulative
statements that will instantly be filtered out by an audience trained to
resist the polished manipulation of experienced professionals.
Is the solution to get more practice at manipulating people? No. That
high a competence at manipulation takes too much evil. There's no way
those skills could be employed at that level while keeping the essential
message of the Singularity intact. Remember that we are not here to make
ourselves famous. We are here for the Singularity. If you make yourself
famous and lose the Singularity, that is anti-progress.

It sure is an inconvenient constraint that theories actually be true,
isn't it? Science is very useful and it needs more funding. We need to
sell Science. Well, Science would have a much easier time competing
with astrology if we could just get rid of all that experimental evidence
and focus on finding theories that are more dramatic and easily
understandable to the audience. Let's get rid of that Newtonian
gravitation thing and replace it with angels. People like angels. And
yet... it was a popular theory, we got to be really famous and everything,
and somehow, even though Science has more funding than before, it
isn't accomplishing as much as it used to.

-- 
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence

