RE: Ben's "Extropian Creed"

From: Samantha Atkins (satkins@intraspect.com)
Date: Tue Nov 14 2000 - 14:10:52 MST


-----Original Message-----
From: owner-sl4@sysopmind.com [mailto:owner-sl4@sysopmind.com] On Behalf
Of Mark Plus
Sent: Monday, November 13, 2000 8:39 PM
To: sl4@sysopmind.com
Subject: RE: Ben's "Extropian Creed"

Ben Goertzel wrote,

[snip]
>There are many possible beliefs here to be sure.

One line of thinking, which Moravec often gravitates toward, is that humans
are basically inferior compared
to what's going to come after -- and we all die in the end anyway -- so
whether a few humans live or die
is ultimately not that crucial.

samantha:
But humans can transform themselves so that the inferiority (if we even
bother to keep such distinctions) becomes largely illusory. What is inferior
about a human persona freed of the ills of the flesh, able to create or take
on new physical forms at will, and with vastly improved intellectual abilities?

ben:
Another line of thinking, which Sasha explicitly maintained in many
conversations with me -- and Moravec and
many other libertarians hint at in their writings -- is that the "hi-tech"
breed of humans is somehow superior
to the rest of humanity and hence more deserving of living on forever in the
grand and glorious cyber-afterlife...<

samantha:
Augmented humans will be precisely that: augmented, and thus superior in
the augmented aspects. Whether greater abilities, greater intelligence and
so on bring any greater wisdom or value to self and others is a different
question. Greater value seems certain as far as practical benefits are
concerned.

[snip]
Mark Plus:
Have you considered the more benevolent form of Transhumanism (called "the
Expansionary Theory") in Michael Zey's new book, _The Future Factor_? Link
to <http://www.amazon.com/exec/obidos/ASIN/0071343059/> for my review on
Amazon.

Zey advocates enhancing humans in a definitely Transhumanist way, but
criticizes Moravec and Kurzweil for speculating that advanced AI's will
subordinate humans. He points out that this scenario resembles the
Neo-Luddite proposal to subordinate humans to other species or to "Gaia."
Why do humans have to become subordinated to anything, he asks?

samantha:
I have read parts of the book. Zey is no transhumanist. He does not
believe in augmenting human capabilities in any major way. Hence his notion
of humans is rather stifling. If I understand him correctly, it is he who is
being a Luddite. Subordination is passe. Humans will either transform or
perish.

- samantha


