From: Ben Goertzel (firstname.lastname@example.org)
Date: Fri Mar 01 2002 - 08:46:11 MST
> I see. I guess I'd better administer the oath, then, with the
> remainder of
> the SL4 list standing in for the human species as witnesses.
Ah, Eli.... As you know, you're bringing up the good old
Eli = "100% Singularity dedication"
versus
Ben = "Powerful Singularity dedication plus some ordinary human concerns"
debate.
This is familiar ground for SL4, and I'd guess that many of the members of
this list are probably bored with this particular quibble between the two of
us.
But I'll answer you anyway ;)
I don't particularly believe in "oaths", although about 20 years ago, I had
some Dungeons & Dragons characters that did....
If I did believe in oaths, however, I wouldn't take your oath.
> Do you, Ben Goertzel, having declared your intention to entangle
> yourself in
> the Singularity and thereby make the death or survival of
> humanity partially
> and perhaps wholly dependent on your actions, acknowledge that the
> responsibility you have accepted transcends all other considerations, and
> swear that you shall hold it inviolable? Do you swear that,
> insofar as you
> may have any especial ability to influence the course of
> humanity, you shall
> never use this ability for any personal consideration, and you shall act
> only under the name of all humanity or all sentient life, and not for any
> lesser group or faction?
As we've discussed before, Eli, my answer is: Almost but not quite.
I find that this question, when taken seriously, poses a "Sophie's Choice"-type
dilemma.
(If anyone doesn't know this William Styron novel, later adapted to film: it
concerns a young Polish mother who, while standing in line waiting for entry
to a Nazi death camp with her two children by her side, is given the choice
of sparing at most *one* of her kids from death. The child
selected for death, and the child spared, will see exactly what her choice
was. The child selected for death will watch their sibling walk off to
safety, and watch their mom with fearful eyes as they're dragged away,
knowing their mom has just chosen them for death. What should she do?
Choosing one is an impossible choice -- she loves them both infinitely and
equally -- but choosing none means they both die. She chooses her son, and
torments herself psychologically for the rest of her life. Her son is
killed a few years later, anyway.)
Your oath immediately leads me to the question: Would I murder my own
children for the Singularity?
And my answer is: I really don't know.
Now, Eli, perhaps you believe yourself to be *so* altruistic and so
dedicated to the Singularity that,
a) you'd never have children except under very unusual circumstances,
because it would interfere with your work toward the Singularity
b) if you did have kids, and it were somehow necessary, you would
*definitely* murder them for the Singularity
However, even if you *do* believe yourself to be this purely altruistic and
Singularity-dedicated, *I* do not have faith that *your* altruism and
Singularity-dedication are that absolute. I don't consider it *impossible*
that they are, but I also have a lot of respect for the diverse complexity
and perversity of human nature, and the ability of novel situations to bring
out novel responses in human beings.
I very much appreciate the deep philosophical/ethical approach to
Singularity work that you advocate and embody. However, I do not feel that
the kind of 100% altruistic Singularity-dedication that you describe is
necessary in order to work effectively and beneficially toward Real AI and
the Singularity.
I have a deep-seated skepticism of "the ends justify the means" type
thinking, and, in my view, the extreme nature of your proposed oath has a
bit of this flavor to it.
This isn't to say that there couldn't be a circumstance in which I *would*
make the choice to murder my kids in order to bring about the Singularity.
However, it would have to be a pretty extreme circumstance. Under very many
circumstances, I would decide that a Singularity that required me to murder
my children wasn't the right kind to bring about anyway....
Anyway, rather than worrying about bizarre ethical worst-cases, I am
proceeding under the assumption that bringing the Singularity about is most
likely NOT going to involve any terrifying Sophie's-choice-type dilemmas for
me.
In fact, Eli, I reckon that in 100 years or so, our uploaded selves are going
to look back on conversations like this and emit some powerful fully-digital
laughter -- at oaths, children, seriousness, and all the other ridiculous
though somewhat charming human concepts and mental habit-patterns with which
they are bound up.
Hmmm... a conversation about circumstances in which I might butcher my
offspring to save the human and posthuman race is really a cheerful way to
begin the day ;) . (Of course, I have only my own subconscious to blame for
turning the conversation in this particular direction, I suppose... it just
seemed the best way to illustrate why I can't 100% accept your oath.) Time
to do something more entertaining, maybe some unanesthetized
self-administered dental work or something...
-- Ben G