Re: Imposing ideas, e.g. morality

From: Phillip Huggan (cdnprodigy@yahoo.com)
Date: Tue May 16 2006 - 12:10:22 MDT


Okay, stop calling it *morality* and the problem is solved. Call the AGI's goals: energy configurations of the universe that permit minds to experience preferred brain-states. All this bickering and confusion would be resolved.
  

m.l.vere@durham.ac.uk wrote:
  I don't think this is so. Whilst, IMO, the balance of evidence is overwhelmingly against the existence of an objective morality, loads of people believe their relative moralities to be objective. I am certain it is theoretically possible to give an AI any set of goals we like, and have the AI follow them as if they were an objective morality.

                


