RE: Ben's "Extropian Creed"

From: Ben Goertzel (ben@intelligenesis.net)
Date: Mon Nov 13 2000 - 20:42:40 MST


There are many things to say on this topic, and for starters I'll say only a few of them.

First of all, I am aware that libertarianism and extropianism are diverse bodies of thought, and that no two adherents to either of these philosophies can be expected to agree with each other except in fairly general terms.

I knew Sasha really well, and I think I have a deep understanding of what his views were, and I don't think I misrepresented them. For all his warm-heartedness in person, I think his intellectual view really was Social Darwinist. I think that Moravec's views really are that way too -- though I judge this from his writings, not from knowing him personally, which means I have a lower degree of confidence in this case. This doesn't mean I think Eliezer's views are the same.

My apologies if I misrepresented things in this regard in my article -- as anyone who has written journalistically knows, you're always walking a fine line, wanting to say things interestingly, excitingly, and provocatively, but not to say things sensationalistically or even inaccurately.

There are many possible beliefs here to be sure.

One line of thinking, which Moravec often gravitates toward, is that humans are basically inferior compared to what's going to come after -- and we all die in the end anyway -- so whether a few humans live or die is ultimately not that crucial.

Another line of thinking, which Sasha explicitly maintained in many conversations with me -- and which Moravec and many other libertarians hint at in their writings -- is that the "hi-tech" breed of humans is somehow superior to the rest of humanity and hence more deserving of living on forever in the grand and glorious cyber-afterlife...

The former line of thinking is one I have some emotional sympathy for -- I've definitely felt that way myself at some times in my life. The latter line of thinking really gets on my nerves.

> I also don't believe that the poor will be left out of the Singularity. I
> think that the successful seed AI will blaze straight to strong
> transhumanity
> and superintelligence and invent advanced nanotechnology, at
> which point the
> cost of personal transcendence falls to zero. I think that who gets to
> participate in the Singularity will be determined by the answer of each
> individual human to the question: "Hi! Do you want to participate in the
> Singularity?" You say "Yes", you get uploaded, you run on your
> six-billionth
> of the mass of the Solar System; whoever you were before the
> Singularity, you
> are now a Citizen of Terran Space, coequal with any other Citizen.
>
> One of the major reasons I am *in* the Singularity biz is to wipe out
> nonconsensual poverty.
>
> Ben Goertzel, does that satisfy your call for a humanist transhumanism?

Well, Eliezer Yudkowsky, it does and it doesn't...

On the one hand, I'd like to say: "It's not enough to ~believe~ that the poor will automatically be included in the flight to cyber-transcendence. One should actively strive to ensure that this happens. This is a very CONVENIENT belief, in that it tells you that you can help other people by ignoring them... this convenience is somehow psychologically suspicious..."

On the other hand,

a) I'm not really doing anything to actively strive to ensure that this happens, at the moment, because building a thinking machine while running a business is a pretty all-consuming occupation. So to criticize you for doing as I'm doing would be rather hypocritical. Of course, once Webmind has reached a certain point, I ~plan~ to explicitly devote attention to ensuring that the benefits of it and other advanced technology encompass everyone ... but for all I know, so do you -- and anyway, plans are far cheaper than actions. The best I can say for myself in this regard is that half of my employees are in a third-world country, Brazil. Of course, they weren't poor before I hired them, but at least I'm pumping money into the country, which benefits everyone there, including the poor (who are VERY poor, not like the "poor" in the US who have cars and big TVs, etc.).

b) Hey, maybe you're right. I don't KNOW that you're wrong.... Sometimes you get lucky and the convenient beliefs are the right ones!!

I do have a mistrust of "ends justifies the means" philosophies -- but I also know that sometimes the end DOES justify the means...

Saying that the best way to help the poor is to bring the Singularity about is definitely an "ends justifies the means" philosophy -- because the means involves people like you and me devoting our talents to bringing the Singularity about rather than to helping needy people NOW....

This really gets back to the point that, although Sasha and I had very different nominal political philosophies, we actually lived our lives in very similar ways. (Well, he was single, whereas I'm married with 3 kids ... he was a Russian immigrant and I'm not ... so we weren't exactly clones ... but on the spectrum of human beings we came pretty damn close together.) Only, I always felt guilty about not spending more time helping the needy (not being comfortable with the "ends justifies the means" approach), whereas he always felt bad about not making a lot of money (because to him money was an objective measure of value).

In other words, I'm happy enough to critique Extropianism as a collection of words written down on paper, a conceptual philosophy.... I'm happy enough to critique my friends, like Sasha, whom I knew well (and I know he wouldn't mind me picking on him even after his death; he loved this kind of argumentation). But I have no desire to pass judgment, positive or negative, on Eliezer and others whom I don't know.... I guess this makes me a weirdo, but that's not really news now, is it...

OK, I know that was not a satisfactorily articulate response, but I have a lot of work to do, so there you go for now...

ben


