Re: Augmenting humans is a better way

From: James Higgins
Date: Sat Jul 28 2001 - 16:20:23 MDT

At 06:00 PM 7/28/2001 -0400, Eliezer wrote:
>Gordon Worley wrote:
> >
> > <>
>Yeah, I've been meaning to comment on that.
>"Post Singularity, it will suddenly be possible to implement morals as
>constants that are always enforced. Such is desirable because it would
>eventually lead to a utopian society..."
>That has to be the worst opening presentation of the Sysop Scenario I've
>ever heard. No offense. :>
>This is *exactly* what James Higgins means by 1984. *I* wouldn't want to
>live in that world. "Morals", to me, usually have the connotation of
>sexual morality, religious morality, and so on - things for which many
>moral systems pass judgements about matters that have no business being
>judged at all.


>The goal of the Sysop Scenario is to implement an extremely small subset
>of the rules that most people would consider to be part of "morality",
>such that even transhuman or superintelligent citizens cannot break them
>in such a way as to harm innocent bystanders.
>The Sysop Scenario will almost certainly be nonutopian from the point of
>view of the kind of people who are always sticking their nose in other
>people's business. There will be Pedestrian child abusers taking it out
>on virtual dolls instead of real people, all kinds of open blasphemy of
>various religions, and so on. Furthermore, meddlers will be totally
>unable to interfere with this without the permission of the meddlees - no
>witch-burning permitted.

And if done perfectly it might be a really good thing. But even the
smallest mistake in how this is done could lead to a horrible existence for
some or all of the individuals involved. Plus it will, by its nature, be
impossible to change, ever. I don't like that much either. Thus the
whole Sysop scenario makes me very nervous.

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:37 MDT