Re: physical pain is bad (was Re: Dynamic ethics)

From: Jef Allbright (jef@jefallbright.net)
Date: Tue Jan 24 2006 - 18:55:49 MST


On 1/24/06, Michael Roy Ames <michaelroyames@yahoo.com> wrote:
>
>
> The whole process of evolution, with its progress requiring the
> giga-deaths of living things, can appear horrific beyond description
> to those who look upon death as an unacceptable state. And yet, that
> is how we have come into existence. When it comes to be within
> humanity's power to end evolution on Earth, should we in fact do so?
> Should we put an end to the process that produced us?

Biological evolution is becoming irrelevant to the human organism and its
societies. Our choices with respect to the quality of life of others will
continue to be based on *our* values, subject to our instrumental power, and
perceived as moral to the extent that they are seen to promote values that
are increasingly shared (because they work) over increasing scope.

The "process that produced us" is broader and more fundamental than
biological evolution by natural selection. For example, in the present era
we are increasingly defined by our interactions with others and with our
technology, and the process of change continues to accelerate through us;
its drivers are more fundamental than we could even imagine we could
control.

Unless one were to choose isolation, which would offer only a limited
respite, the only moral attitude (that which furthers one's values) is to
heed the Red Queen's words and run as fast as possible toward uncertain
stability within a co-evolutionary environment.

> My current best judgment is to answer: no. Or perhaps, hell no! Some of
> my arguments are laid out earlier in this thread, but additionally, there
> is something I find bothersome about the possibility of humanity imposing
> its will universally on every other living thing. It appears a
> short-sighted action, something a child might do, or an obsessive. I
> think we are smart enough to leave our options more open. In fact, that
> is a significant part of what SIAI is about - opening options and keeping
> them open.
>
>
You touched on the essence of the moral question: how to "impose our will"
(promote our values into the future) while doing so with the broadest
possible appreciation of the consequences. Knowing that we can never
achieve such a god's-eye view, we must apply our ever-increasing
(objective) knowledge of how the world appears to work, in the service of
an increasingly broad view of our (subjective) values, to develop
principles of effective action that, for all intents and purposes,
represent our evolving morality.

This approach to morality encompasses other moral compasses. [Can I get
away with that phrase?] For example, it encompasses the instinctual
indicators we share with other mammals, such as feelings of disgust, pride,
and love. It encompasses cultural frameworks of morality, including
religious and philosophical codes of conduct and social reciprocity. It
will also encompass the social interactions of machine intelligence, with
no concern for substrate, but only for those shared values that appear to
work over increasing scope within a competitive environment.

- Jef
http://www.jefallbright.net


