Approaching the end of Today and Tomorrow

From: Eliezer Yudkowsky (sentience@pobox.com)
Date: Sat Oct 23 2004 - 19:08:00 MDT


SIAI's 72-hour lightning campaign ends in eight hours - 5AM Eastern time on
Sunday morning. It's down to the last ninth.

I'm sorry I wasn't able to answer all the questions asked in the time
available. Hopefully people chose a sensible criterion, realizing that I
can't answer everything, and that reasonable answers to some questions are a
good sign for those not yet answered. I hope to answer more, but
that won't happen until after Today and Tomorrow.

I think that transhumanism has some ways yet to go before we can compete
with flying-saucer cults, but people still donated despite the naysaying,
choosing to be strong without certainty. It looks to me like the stream of
donations slowed substantially after people began sharing their
justifications for not donating, and did not quite regain its momentum
even after donors also began speaking up. But the campaign had a fair run
before the naysaying started.

The right to disagree is both morally and pragmatically necessary. A bias
in which thoughts we choose to communicate is a bias in collective
reasoning. Still, just because our ethics mandate an act doesn't mean the
act is without consequences. We've learned to show our disagreements with
one another. I don't think we're quite as good at consciously deciding
that it is possible to act coherently despite disagreements, even
disagreements that seem important; or at consciously correcting for the peer
pressure felt when disagreement is more likely to be publicly aired than
agreement. It takes work. It isn't natural to us. But I think we can do
it if we try.

Some of the justification for not donating to the Singularity Institute
took the form, "Why aren't you further along / doing more?" Well, that's
rather a Catch-22, isn't it? If you think the Singularity Institute should
be doing XYZ... go ahead, don't let me stop you. It's your planet too.
No, seriously, it's your planet too. We took responsibility. We didn't
take responsibility away from you.

The Singularity Institute is a banner planted in the ground, a line drawn
in the sands of time: This is where humanity stops being a victim and
starts fighting back. Sometimes humans wander up to the banner, see that
not much of an army has gathered, and then wander away. It can be hard to
get the party rolling, if people only want to join after the room is
already crowded. No matter who else comes and goes, you will find Eliezer
Yudkowsky standing by that banner, gnawing steadily away at the challenge
of Friendly AI, which is one of the things that humanity needs to be doing
at this point. For SIAI to grow another step we need three things: enough
steady funding to pay one more person, one more person to pay, and a
worthwhile job that person can do. It's the second requirement that's the
most difficult, and what makes the second requirement difficult is the
third requirement. It isn't easy to find people who can do worthwhile
jobs. SIAI doesn't want to invent make-work, token efforts to show we're
doing something. But even if there were *no* active workers yet present at
the banner, not even Eliezer Yudkowsky, there would still need to be a
Singularity Institute. There would still have to be a rallying point, a
banner planted in the ground, a gathering place for the people who wanted
to make it happen. It would have to begin somewhere, and how else would it
ever begin?

One year ago we didn't have Tyler Emerson or Michael Wilson or Michael
Anissimov. Progress is being made. If it's too slow to suit you, get out
and push. It's your planet and your problem. We took responsibility but
we didn't take it from you.

To those who still haven't donated anything at all: ask yourself whether the
Singularity Institute has been worth more to you, and to the transhumanist
community, than the price of a movie ticket. Our suggested donation was a
hundred dollars, but if you can't afford that, ten dollars is better than
nothing.

http://intelligence.org/donate.html

-- 
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence

