From: Ben Goertzel (firstname.lastname@example.org)
Date: Thu Mar 24 2005 - 10:21:10 MST
> I don't think I'd take oblivion for any purpose, even
> to save ten other people. Although I'd really love to
> be that philosophical, I just can't pay my life for
> ideals like utilitarianism.
I believe you.
However, some feel differently. I know several individuals, each of whom
*would* trade their life in order to save 10 random strangers with whom
they have minimal genetic relatedness -- an action that would go against the
interests of their selfish genome as well as their selfish organism.
You could argue this isn't "true altruism" because their goal may be
personal satisfaction or personal ego-boosting or something, rather than
"pure altruism" -- but I don't tend to find such arguments very meaningful.
As far as I'm concerned, this is an example of genuine altruism, and it's
not explained very well by the neo-Darwinist orthodoxy. It's better explained by
the variant of evolutionary theory that emphasizes self-organization alongside
selection. Altruism (in the sense I'm using it here) is a psychological attractor, and
the quasi-altruism that the selfish genome promotes has pushed some human
brains toward that attractor. Guiding AGIs into this psychological
attractor will be an important topic in AGI psychology...
-- Ben G
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:51 MDT