Re: Augmenting humans is a better way

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sat Jul 28 2001 - 16:00:10 MDT


Gordon Worley wrote:
>
> <http://www.rbisland.cx/doc/sing_moral_systems.html>

Yeah, I've been meaning to comment on that.

"Post Singularity, it will suddenly be possible to implement morals as
constants that are always enforced. Such is desirable because it would
eventually lead to a utopian society..."

That has to be the worst opening presentation of the Sysop Scenario I've
ever heard. No offense. :>

This is *exactly* what James Higgins means by 1984. *I* wouldn't want to
live in that world. "Morals", to me, usually carry the connotation of
sexual morality, religious morality, and so on - areas in which many
moral systems pass judgement on things that have no business being
judged.

The goal of the Sysop Scenario is to implement an extremely small subset
of the rules that most people would consider to be part of "morality",
such that even transhuman or superintelligent citizens cannot break them
in such a way as to harm innocent bystanders.

The Sysop Scenario will almost certainly be nonutopian from the point of
view of the kind of people who are always sticking their nose in other
people's business. There will be Pedestrian child abusers taking it out
on virtual dolls instead of real people, all kinds of open blasphemy of
various religions, and so on. Furthermore, meddlers will be totally
unable to interfere with this without the permission of the meddlees - no
witch-burning permitted.

-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence
