From: Ben Goertzel (email@example.com)
Date: Wed Jun 02 2004 - 06:15:31 MDT
> 'Oh I don't know what morality is, I'll just throw the
> question entirely back on you humans! Tell you what,
> you humans can vote on it - give me your opinions and
> that's what I'll take to be morality. Cheers! ' ;)
> Seems like a terrible waste of super-intelligence to
> me. Perhaps the whole concept of 'Collective
> Volition' has gone over my head, I don't know. I'm
> sticking to my guns and still favoring the 'Objective
> Morality' approach.
To me, the concept of Objective Morality is an obvious oxymoron. There
is not, and can never be, an "objective morality." The
whole idea of morality is to impose some value system, some criterion
regarding how things "should be." But objectivity is about how things
ARE, not about how they should be. The very nature of should-ness
implies diversity, as opposed to singularity (in the sense of a single
right answer, not the technological Singularity).
I do think, however, that some moral systems are more "universe-friendly"
than others, in the sense that they have some prayer of actually guiding
the dynamics of real parts of the universe.
I am not sure whether Eliezer's "collective volition" moral system is
really workable (i.e. universe-friendly) or not.
The "joy, growth and choice" moral system that I proposed a while back
is at least *slightly* universe-friendly. At least, in the domain of
humans and slightly-trans-human beings, it can be applied in a useful
way. However, I'm not so sure it's universe-friendly in a grander
sense, because the very concepts of "joy", "growth" and "choice" are
part of the human nexus of concepts and feelings, and probably don't
mean that much to profoundly nonhuman beings (at least, not enough that
they would want these concepts to guide their lives and actions). Thus,
Singularity-wise, I tend to view any moral system as something that's
going to help launch the Singularity beyond human comprehension -- but
only a little bit beyond human comprehension ... Once things get too far
beyond human comprehension, then human-created moral systems probably
aren't going to be very relevant.
The closest thing there could be to an objective morality is some set of
rules or principles regarding which moral systems tend to be more
universe-friendly than others -- either in our universe in particular,
or as a function of the universe in question. This is a very
interesting area for speculation and investigation.
-- Ben G
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:47 MDT