Re: Self-modifying FAI (was: How hard a Singularity?)

From: James Higgins (jameshiggins@earthlink.net)
Date: Wed Jun 26 2002 - 09:51:46 MDT


At 02:12 PM 6/26/2002 +0200, Eugen Leitl wrote:
>On Wed, 26 Jun 2002, Eliezer S. Yudkowsky wrote:
>
> > And how do "referents'" have a "floating database" or a
> "morality"? Are you
> > confusing the idea of an external referent with the programmers? A
>
>No. I think I understood the gist of it just fine.

Humans have a terribly hard time understanding each other, even when they
are mostly thinking the same way. Fully and exactly communicating the
essence of an abstract concept to an intelligence that isn't even wired the
same way will almost certainly be more difficult. If people can only rarely
accomplish it with each other, the odds of accomplishing it with an AI are
very, very low. Please note that appearances can be very deceiving: even
when two people discuss something and agree that they are completely in
sync on the meaning and details involved, they rarely if ever are.

James Higgins



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:39 MDT