Re: friendly ai

From: xgl (xli03@emory.edu)
Date: Mon Jan 29 2001 - 08:19:28 MST


On Mon, 29 Jan 2001, Samantha Atkins wrote:

>
> It seems like a pretty limited SI if its only goal is Friendliness.
> Some other goals like to explore the universe or expand its
> understanding and create interesting things would seem to be almost a
> requirement of a real intelligence. For that matter I would expect a
> real SI to have the ability to formulate its own goals including
> questioning deeply even the pre-defined supergoal. And Friendliness to
> what? Only to humans? What about other sentiences when/if it
> encounters them?
>

        well, its only *super-duper* goal is friendliness. why couldn't
all other goals just be subgoals of friendliness? i guess the word
"friendliness" has different connotations in the human sphere; to some it
is merely a passive quality, to others perhaps something more active. on
the other hand, coding the super-duper goal of friendliness is admittedly
very, very tricky.

-x



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:35 MDT