Re: Will FAI develop a high priority self-preservation goal

From: Eliezer S. Yudkowsky
Date: Mon Dec 16 2002 - 23:10:54 MST

Gary Miller wrote:
> Will an FAI develop a sense of self-preservation and self-interest? It
> seems prudent from an evolutionary perspective to ensure an organism
> does not engage in risky behavior for no reason, thereby risking its
> very existence. Such as radically altering its own code without doing
> a backup :)

This one has been pretty exhaustively covered; see:

Eliezer S. Yudkowsky                
Research Fellow, Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:41 MDT