Re: Questions about any Would-Be AGI

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue May 21 2002 - 14:46:02 MDT


Ben Goertzel wrote:
>
> Let me be sure I understand this one by exploring it in a Novamente context
>
> Your question is about what happens if, in Novamente, I do the following
> process:
>
> --> For all WordNodes and WordInstanceNodes in the system, replace the
> ListLink of CharacterNodes that is connected to the WordNode or
> WordInstanceNode by a ConcatContained Link, with a ListLink to a random
> sequence of CharacterNodes. Do this in a way so that the WordInstanceNodes
> with MemberLinks to WordNode X all get the same random sequence of
> CharacterNodes

Yes, that sounds right.
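The scrambling procedure described above can be sketched roughly as follows. This is only an illustration: the `WordNode` and `WordInstanceNode` classes here are hypothetical stand-ins, not Novamente's actual data structures or API.

```python
import random
import string

# Hypothetical stand-ins for Novamente's node types (illustrative only).
class WordNode:
    def __init__(self, characters):
        # Stands in for the ListLink of CharacterNodes attached to the WordNode.
        self.characters = list(characters)
        # WordInstanceNodes holding MemberLinks to this WordNode.
        self.instances = []

class WordInstanceNode:
    def __init__(self, word_node):
        self.word_node = word_node
        self.characters = list(word_node.characters)
        word_node.instances.append(self)

def scramble_word_forms(word_nodes, rng=random):
    """Replace each word's character sequence with a random one,
    giving all instances of the same WordNode the same new sequence."""
    for word in word_nodes:
        scrambled = [rng.choice(string.ascii_lowercase)
                     for _ in word.characters]
        word.characters = scrambled
        for inst in word.instances:
            inst.characters = list(scrambled)
```

After running this, every WordInstanceNode of a given WordNode still shares one (now meaningless) surface form, so the system's internal semantic links are untouched while its mapping to external text is destroyed, which is exactly what the thought experiment is probing.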

> So your question is designed to identify systems that are capable of
> language understanding based on programmed-in rules, but not of language
> learning, e.g. systems like the current version of Cyc, right?

If I'm kidnapped by aliens and set down in China, or in an alien culture, or
in the middle of interstellar space with no sentient beings around for
thousands of light-years, I can still play chess, walk in a gravitational
environment, program a computer if I have a compiler, solve N equations in N
unknowns, invent AI designs, learn to play a new (humanly possible) sport,
think rationally about the predicament I'm in, and so on.

The thought experiment is supposed to single out abilities which the AI
possesses whether or not anyone is observing them.

-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:39 MDT