RE: ethics, joyous growth, etc.

From: Ben Goertzel (ben@goertzel.org)
Date: Tue Feb 03 2004 - 05:56:28 MST


> By the way, my guess would be that if one adopted a universal
> no-out-group philosophy, it would have to have some gradation of
> consideration - otherwise the person adopting the philosophy would not
> be able to act in any way for fear of negatively affecting the rest of
> the universe - can't walk 'cos the dust beneath my feet might be
> affected.

But if an AGI had sufficient computing power, it could estimate that letting
its robot body walk down the road to talk to some people was in the best
interest of the cosmos as a whole (according to its value system), even
though this walking might squash a few ants.

Managing a locally oriented, self-focused value system is EASIER
computationally than managing a universe-focused, unselfish one.
But with the vast computing power AGIs will have in the future, the latter
may also be manageable.
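
To make the "gradation of consideration" idea concrete, here is a
minimal Python sketch (the parties, weights, and effect numbers are all
invented for illustration, not taken from anyone's actual proposal): an
action is acceptable when its consideration-weighted net effect is
positive.

    # Toy sketch of graded consideration. All names and numbers
    # below are made up purely for illustration.

    def graded_value(effects, weights):
        """Weighted sum of an action's estimated effects on each party."""
        return sum(weights[party] * delta for party, delta in effects.items())

    # Consideration weights: nonzero for everything, but graded --
    # people count fully, ants a little, dust almost not at all.
    weights = {"people": 1.0, "ants": 0.01, "dust": 1e-6}

    # Estimated effects of walking down the road for a chat:
    effects = {"people": +5.0, "ants": -2.0, "dust": -1.0}

    print(graded_value(effects, weights))  # ~4.98 > 0, so the AGI walks

The grading is what keeps the agent from being paralyzed: the dust gets
some consideration, but not enough to veto a walk that benefits people.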

-- Ben G


