Re: guaranteeing friendliness

From: Robin Lee Powell
Date: Tue Nov 29 2005 - 00:44:23 MST

On Tue, Nov 29, 2005 at 07:08:13AM +0000, H C wrote:
> It's not so ridiculous as it sounds.
> For example, provide an AGI with some sort of virtual environment,
> in which it is indirectly capable of action.
> Its direct actions would be confined to a text-only area
> (imagine its only direct action being typing a letter on the
> keyboard, such as in a text editor).

Oh god, not again.

Quick tip #1: if it's *smarter than you*, it can convince you of
*anything it wants*.

Quick tip #2: what you're describing is called "slavery"; it has
teensy little moral issues.

Quick tip #3: Search the archives/google for "ai box".


-- 
Reason #237 To Learn Lojban: "Homonyms: Their Grate!"
Proud Supporter of the Singularity Institute

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:53 MDT