From: Eliezer S. Yudkowsky (firstname.lastname@example.org)
Date: Sat Jan 26 2002 - 11:03:14 MST
"Christian L." wrote:
> Regarding the recent discussion about ethics and if/how the Singularity meme
> should be spread, I have two quick and direct questions for Eliezer (or
> anyone else whom it might concern):
> 1) Do you think that you have the ability to get a majority of the
> population of the world (or even 30-40%) to have positive feelings towards
> the Singularity, before the event?
If I decided I was going to do nothing else with my life, and I had enough
time before the Singularity, then maybe. That is, I can see how I would
go about doing it, but there would be the usual caveats about anything in
life being difficult to accomplish.
> 2) If there is widespread condemnation and, say 98-99% of the people on the
> Earth did not want the Singularity to happen, how will this change your
> work? Would you consider abandoning the project altogether?
*That* is an interesting ethical question. I would have to answer that,
in the end, the *only* acceptable excuse for obeying public opinion in a
case like that is if you genuinely think that the large majority opinion
is correct. Obviously there should be a significant bias in favor of the
will of the majority, but I don't think it overrides everything else. If
the verdict of the Earthweb (a la Marc Stiegler) said "Eliezer, we want
you to break off on this temporarily while we try something else," I'd
probably obey, but that's because the Earthweb (by Marc Stiegler's
hypothesis) has a demonstrated ability to produce correct verdicts.
For a vote of Congress it would be equally straightforward: I'd move to
Japan and keep working. Congress has demonstrated too little competence
at producing complex strategies for complex issues. It doesn't matter
whether Congress is acting based on the goals of a democratic majority, or
is the emergent output of representatives trying to pander to their
concept of the lowest common denominator, because Congress as an electoral
body is not smart enough to produce strategies that can achieve complex
goals. The responsibility of an individual to give some credence to votes
of an electoral body is not totally one-way; the electoral body has a
responsibility to show some intelligence in return.
So you see, I don't believe this to be a straightforward yes-or-no
question, and it would depend on the *cause* of the widespread
condemnation - which I believe is answer enough. In matters like these,
opinions have weight because of their factual correctness or
incorrectness. In terms of ultimate goals, if 99% of the people on Earth
don't want the Singularity for itself, they are welcome to stay on Earth
after the Singularity finishes, but I must deny them their moral right to
deny me transhumanity based on their own moral judgements. *That* part is
unambiguous. It's only the question of managing a shared existential risk
that's ethically ambiguous.
-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence