Re: ESSAY: Forward Moral Nihilism

From: John K Clark (jonkc@att.net)
Date: Mon May 15 2006 - 01:01:17 MDT


<m.l.vere@durham.ac.uk> Wrote:

> morality is built (artificially), and people follow it
> because of evolution-induced emotions.

Well, what the hell is "artificial" about that? Humans didn't invent
evolution, and for that matter, who cares if it is artificial? You keep
using the word as if it were a horrible insult, but I like artificial
stuff.

> I believe it is very unlikely that there will be multiple transhumans
> of anywhere near the same level (at least at the top level) of
> power/intelligence - the most powerful would never let anyone else catch
> up. Morality for cooperation will be unnecessary.

The universe is a big place, perhaps the biggest, so I think it's likely
there is room for more than one Jupiter brain in it; and even if there are
only two, morality will be essential. And if your scenario is true and
transhumans try to suppress the advancement of other transhumans, then they
will be doomed to eternal war.

So I think morality will come in very handy, but it won't be the naïve
morality espoused by most on this list with their friendly AI meme. The idea
that we could engineer a Jupiter brain in such a way that it considered our
well-being more important than its own is ridiculous; such a situation would
be about as stable as balancing a pencil on its tip. And in a way it's not
even moral. I find the idea of a slave who is superior to me in every way
imaginable but is nevertheless subservient to me repulsive; but that's just
me, your mileage may vary.

  John K Clark



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:56 MDT