From: Samantha Atkins (email@example.com)
Date: Tue Dec 03 2002 - 11:46:23 MST
Cliff Stabbert wrote:
> Samantha Atkins:
> SA> Complexity as such is no meaningful measure of "rightness" at
> SA> all. It certainly will not help us make moral decisions,
> SA> difficult or otherwise.
> My sense is that a "higher" ethics or morality does have some relation
> to increasing complexity (or consciousness or intelligence). On an
> intuitive or perhaps esthetic level I feel this is the "purpose" of
> sentience -- although I wouldn't propose that as an objective,
> measurable reality.
I have less problem with a particular kind of complexity such as
intelligence being part of a standard than just "complexity" in
general. Values are more specific in my thinking. What still
seems to be missing is quality of life (QoL) portions of the
morality/ethics meta-justification. Very loosely speaking I
would say "The Purpose" is increasing the amount of intelligence
and its potentials, as well as opportunities for actualizing
that potential. But that doesn't roll off the tongue easily.
> I can't properly formulate my exact "reasons" for this feeling, as it
> isn't something arrived at through reasoning. But roughly:
> Say we gave an AI the goal to minimize human suffering and maximize
> human pleasure. We could well end up with endless pleasure-center
> stimulation or its pharmaceutical or virtual equivalent. And although
> this would give us pleasure, it's an inward-oriented and stagnant,
> decadent path for sentience to take.
And it fails, of course, my loose stab above at general
criteria for morals.
> (Pure pleasure also is less esthetically pleasing. It's superficial,
> it's not "deeply" satisfying, it doesn't fulfill.)
> In Christian terms it would be a sinful path. I am not religious nor
> was I raised as such, but in certain ways the statement that the
> above would "go against God's plan" resonates -- perhaps if you
> substitute "Life, the Universe & Everything" for "God". To say this
> is, indeed, to anthropomorphize the Universe, to ascribe it goals or
> purpose, for which there is no scientific excuse.
Not to mention that the above is massively boring. You would
have to remove part of human intelligence before many people
would be "happy" with simply continuous pleasure. Pleasure is
also quite relative for us. Too much of a "good thing" results
in the devaluation of that pleasure and even eventual repugnance.
> Ultimately, it feels more Right to me for sentience to be oriented
> towards the complexity found "outside" itself, i.e. to embrace the
> Unknown, rather than to navelgaze. Facing challenges and hardship,
> assimilating more of the Universe's complexity, seems ethically on a
> higher plane than "just" refraining from hurting other sentients. (I
> put "outside" in scare quotes because there is much of the Unknown
> inside ourselves as well -- generally the scarier, darker, more
> difficult stuff). Increasing intelligence.
All the above said, though, I have no right to choose for anyone
else. If they want the equivalent of being a wirehead, then they
must have room to choose that, although not to bind others to
supporting their decision directly.
> To take a different stab at this than ethics, one can judge answers to
> "What is the Meaning of Life?" esthetically. E.g., "it's all just
> random coincidence" is ugly and nihilistic; "it's so that God could
> judge you and reward/punish you for Eternity" seems trite and
> simplistic; "to become more and more aware of everything around you,
> to get to know things Deeply, until even the most ugly thing to you
> becomes beautiful and the most hated thing loved" seems a step up from
> that. Although still trite when put in those terms...
Ethics is generally considered more philosophically fundamental
than aesthetics, so using the latter to formulate and justify
the former is not very workable.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:41 MDT