From: Slawomir Paliwoda (firstname.lastname@example.org)
Date: Thu Oct 21 2004 - 17:53:50 MDT
> Here's a thought experiment: If I offered people, for ten dollars, a pill
> that let them instantly lose five pounds of fat or gain five pounds of
> muscle, they'd buy it, right? They'd buy it today, and not sometime in
> indefinite future when their student loans were paid off. Now, why do so
> few people get around to sending even ten dollars to the Singularity
> Institute? Do people care more about losing five pounds than the survival
> of the human species? For that matter, do you care more about losing five
> pounds than you care about extending your healthy lifespan, or about not
> dying of an existential risk? When you make the comparison explicitly, it
> sounds wrong - but how do people behave when they consider the two
> in isolation? People spend more on two-hour movies than they ever get
> around to donating to the Singularity Institute. Cripes, even in pure
> entertainment we provide a larger benefit than that!
Eliezer, I think your involvement in this project has cost you some of the
objectivity necessary to evaluate the actual options in your thought
experiment, and I infer that from your question: "Do people care more about
losing five pounds than the survival of the human species?"
This question implies the assumption that donating to SIAI equates to
preventing existential risks. Your question has an obvious answer. Of course
people care more about the survival of the human species than about losing
five pounds, but how do we know that SIAI, despite its intentions, is on a
straight path to implementing humanity-saving technology? What makes your
organization different from, say, an organization that also claims to save
the world, but by different means, like prayer, for instance? And no, I'm
not trying to imply anything about cults here. I'm trying to point out the
common factor between the two organizations, which is this: assuming it's
next to impossible to truly understand CFAI and LOGI, commitment to these
projects requires faith in their implementation and belief that the means
will lead to the intended end. One cannot aspire to rationalism and rely on
faith at the same time.
I noticed a Matrix quote in your essay ("Don't think you are, know you
are"). There is an equally interesting quote from Reloaded you might agree
with, when Cornel West's character responds to one of Zion's commanders:
"Comprehension is not requisite for cooperation." And even though I'm
convinced that the Matrix trilogy is an overlooked masterpiece, much further
ahead of its time than Blade Runner ever hoped to be, I don't think Mr.
West's character was correct. Comprehension is indeed requisite for
cooperation, and as long as you are unable to find a way to overcome the
"comprehension" requirement, I don't think you should expect to find donors
who don't understand exactly what you are doing and how.
Let's say I'm a potential donor. How do I know, despite the organization's
sincere intentions to save the world, that the world won't "drift toward
tragedy" as a result of FAI research made possible in part by my donation?
How do I know what you know to be certain without spending the next five
years studying your work? Why would the SIAI team need so much money to
continue building FAI if the difficulty of creating it does not lie in
hardware? What are the real costs?
Why has the pursuit of fame now become a just reason to support SIAI? Are
you suggesting that SIAI has acknowledged that the ends justify the means?
Increased donations give you greater power to influence the world. Do you
see anything wrong with entrusting a small group of people with the fate of
the entire human race? What would you tell people who object to that idea?
Do we have the right to end the world as we know it without their approval?
These are difficult questions, which perhaps illustrate the difficulty of
the deceptively simple choice to support, or not to support, SIAI.