RE: Ethical theories

From: Keith Henson (hkhenson@rogers.com)
Date: Sun Feb 01 2004 - 22:16:11 MST


At 10:35 PM 01/02/04 -0500, you wrote:

>Hi Keith,
>
>Yes, I'm aware of the recent work on evolutionary psychology and ethics.
>
>I think this work is VERY relevant in terms of helping explain why human
>ethics are the way they are.
>
>However, when thinking about the ethics of posthuman digital minds,
>evolutionary ethics probably aren't so relevant, because these minds won't
>be evolving by Darwinian natural selection.

I am not so sure. The scale may be off in space and/or time, but it seems
to me that posthuman digital minds are going to face much the same kinds
of limited-resource problems that human groups faced. Being able to
directly change their own nature might not make much difference in the
long run to the kinds of ethics they adopt.

For example, nearby matter and energy are more useful to you than they are
to others who are far away. And unless you get FTL, minds are going to be
limited in the amount of matter they can use: ultimately because piling up
too much matter forms a black hole, but more likely just because of
light-speed delays.
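
A rough back-of-envelope sketch of that limit, in Python (my own
illustrative numbers, not anything from this discussion): a sphere of
matter at fixed density rho becomes a black hole once its radius reaches
sqrt(3*c^2 / (8*pi*G*rho)), and well before that the one-way light delay
across the region is already substantial.

import math

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8     # speed of light, m/s

def critical_radius(rho):
    """Radius (m) at which a sphere of density rho (kg/m^3) becomes a
    black hole: set (4/3)*pi*R^3*rho equal to the Schwarzschild mass
    limit R*c^2/(2*G) and solve for R."""
    return math.sqrt(3 * c**2 / (8 * math.pi * G * rho))

for rho in (1000.0, 5500.0):   # water, mean Earth density
    R = critical_radius(rho)
    delay = R / c              # one-way light delay across the radius
    print(f"rho={rho:g} kg/m^3: R ~ {R:.2e} m, light delay ~ {delay/60:.0f} min")

At roughly water density the critical radius comes out around 4e11 m, a
few AU, with light delays of tens of minutes across it.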

Does physics limit posthuman minds to a population rather than a single
megamind? If so, do cooperation and/or competition become options?
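
If it is a population, the standard formalism for competition over a
finite resource applies regardless of substrate. A minimal sketch,
assuming nothing beyond textbook hawk-dove replicator dynamics (the
payoff values V and C below are arbitrary illustrations):

# Toy hawk-dove replicator dynamics: strategies contest a resource of
# value V, escalated fights cost C.  The equilibrium mix of aggressive
# and peaceful strategies is set by the resource economics.
V, C = 2.0, 3.0

def payoff(p_hawk):
    """Expected payoffs to hawk and dove when a fraction p_hawk play hawk."""
    hawk = p_hawk * (V - C) / 2 + (1 - p_hawk) * V
    dove = (1 - p_hawk) * V / 2
    return hawk, dove

p = 0.5
for step in range(200):
    hawk, dove = payoff(p)
    mean = p * hawk + (1 - p) * dove
    p += 0.1 * p * (hawk - mean)   # discrete replicator update
    p = min(max(p, 0.0), 1.0)
print(f"equilibrium hawk fraction ~ {p:.3f} (analytic: V/C = {V/C:.3f})")

The mix settles at V/C, i.e. it is fixed by the economics of the
resource, not by what the agents happen to be made of.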

>As per Jef's comments, it probably makes sense to tie a "universal theory of
>ethical systems" in with the process of "general evolution" as observed in
>the cosmos at large -- more so than with the ethical systems that emerge
>from Darwinian natural selection.

As far as I know, we have not observed evolution in the cosmos at large
yet. We have, though, seen a mess of evolution simulated in computers. It
is interesting that one kind of parasitism first observed in a simulation
was later found to exist in the biological world.

>However, even in this perspective,
>Darwinian natural selection is interesting as an important special case of
>general evolution.

Genetic programming is becoming a major method for finding solutions. It
works well enough when you have enough cycles to burn, though it's still
way below the level where it needs ethics.
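
For anyone who has not seen it, here is a toy sketch of the idea in
Python; a deliberately minimal setup of my own, not any particular GP
system (real ones add crossover, typing, and bloat control):

import random, operator

# Evolve an arithmetic expression tree that approximates f(x) = x*x + x
# on sample points, using only mutation and truncation selection.
OPS = {'+': operator.add, '-': operator.sub, '*': operator.mul}
TERMINALS = ['x', 1.0, 2.0]

def random_tree(depth=3):
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    op = random.choice(list(OPS))
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    if tree == 'x':
        return x
    if isinstance(tree, float):
        return tree
    op, a, b = tree
    return OPS[op](evaluate(a, x), evaluate(b, x))

def fitness(tree):
    # Negative squared error against the target function on a grid.
    xs = [i / 4 for i in range(-8, 9)]
    return -sum((evaluate(tree, x) - (x * x + x)) ** 2 for x in xs)

def mutate(tree):
    # Replace a random subtree with a fresh random one.
    if not isinstance(tree, tuple) or random.random() < 0.2:
        return random_tree(2)
    op, a, b = tree
    if random.random() < 0.5:
        return (op, mutate(a), b)
    return (op, a, mutate(b))

pop = [random_tree() for _ in range(200)]
for gen in range(60):
    pop.sort(key=fitness, reverse=True)
    pop = pop[:50] + [mutate(random.choice(pop[:50])) for _ in range(150)]
pop.sort(key=fitness, reverse=True)
print(fitness(pop[0]), pop[0])

Burn more cycles (bigger population, more generations) and it tends to
find x*x + x exactly. No ethics required at this level.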

One thing is for sure: a neighborhood of vastly powerful AIs duking it out
would not be a good place to raise a family. :-)

Incidentally, my recent thoughts at the included URL, on rising (or at
least not falling) income per capita as a way to keep human groups from
starting wars, might remotely have application to AIs.

Keith


