Friendly AGIs will need to be friendly to more than humans

From: Philip Sutton (Philip.Sutton@green-innovations.asn.au)
Date: Sat May 24 2003 - 08:02:02 MDT


When we try to design and/or educate AGIs to have a moral structure, I
think it is critical to get beyond our human-centrism.

There are three reasons for this that I can currently think of.

The first is that AGIs will be a new form of sentient life, and they will
need a moral code to guide their interactions with other AGIs. To help
AGIs create a moral structure that relates only to humans is to fail to
recognise the moral worth of the AGIs themselves.

Secondly, AGIs that go into take-off will most likely not restrict their
presence or impact to Earth. There is most likely life (biological or
otherwise) elsewhere in the universe, and we should not release
intelligences into it that cannot relate in a moral fashion to whatever
they might meet.

Thirdly, there is more life on Earth than just humans, and that life
deserves consideration and moral regard too.

If you agree that AGIs need a moral structure or dynamic (whatever form it
takes) that goes beyond human-centrism, then we need to start expressing
this in the language we use - so that we have a greater chance of
embedding it in the tangible activities and programs that lead to the
creation of AGIs.

Cheers, Philip
