From: pdugan (firstname.lastname@example.org)
Date: Sun Jan 22 2006 - 17:32:56 MST
Phillip's solution seems functional enough to be a sustainable meta-ethic. I
think the key to his solution lies in its evaluation of the mode of
interaction, as well as the degree, frequency, etc. Ethics as social codes have
evolved in human societies to mitigate risks involved in our interactions.
When new forms of interaction become possible there typically follows a phase
transition where the old morality is challenged and re-figured, if not
completely abandoned for a new worldview, as happened during the "age of
reason" in Europe and early America.
The point isn't that morals are relative and meaningless, though that
conclusion could be drawn, but rather that an increase in wealth and complexity
of information (regarding both society and technology) generally facilitates
the evolution of new, formerly impractical ethics. Taken to the logical
extreme of the singularity, it's possible that ethics may exist in increasingly
diverse and transitory modes, and that a moralist/ethicist/theologian looking
at a post-singularity civilization may see a mind-boggling version of Sodom
and Gomorrah, where "do what you will" is the only absolute dictum.
The point, once again, is not that "do what you will" is the whole of the
post-singularity law (to paraphrase Aleister Crowley). The point is that as
information diversity/complexity increases the underlying ethical constraints
of our behavior and thoughts will become increasingly transparent, facilitated
by a bounty of relative perception.
>===== Original Message From Phillip Huggan <email@example.com> =====
>Fine, fine. Here is the answer you are looking for. The level of actual
>interaction a living organism should be permitted with other living organisms
>is proportional to how negatively the behaviour of the organism affects the
>contacted community. A lion that can only kill gazelles should be restricted
>to inert simulations of the killing. A person who needs to kill for pleasure
>should similarly be restricted to fulfilling this desire in a holodeck world
>or with nervous system implants. If you can't play nice, you have to sit in a
>corner, so to speak.
> This system works as long as resources don't become scarce. That shouldn't
>become an issue for a long time.
>Philip Goetz <firstname.lastname@example.org> wrote:
> I began this thread to discuss problems in constructing an ethical
>code that works across species, in a society or ecosystem composed of
>species of many different levels of intelligence. This is an
>important and valid SL4 problem.
>So far, AFAIK not one person has responded to this aspect of what I
>posted. All you have done is posit the same moral dilemmas that
>humans have faced for millennia, substituting "AI" for "government".
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:55 MDT