Re: Curriculum for AI

From: Cliff Stabbert (cps46@earthlink.net)
Date: Wed Jan 01 2003 - 11:23:01 MST


Wednesday, January 1, 2003, 11:40:42 AM, Ben Goertzel wrote:

BG> A question is: Can you conceive a set of tests with sufficient breadth that,
BG> if an AGI system could pass them, it would be pretty damn clear the system
BG> possessed a high degree of general intelligence. I.e., a set of tests that
BG> can't be viably handled by a narrow-AI system, but yet still possess the
BG> abstract character of your tests, rather than being specialized to the world
BG> of human knowledge or to some particular embodiment...

I should have checked for new messages before sending mine; this is
what I was suggesting, in a more roundabout way, that the curriculum
could be developed into. Developing such a testing curriculum would
have a number of advantages:

1) It would be impossible to "accidentally" or "subconsciously" build
   in the heuristics necessary to pass it (which would be possible
   with the current tests).

2) It could form the basis for automated testing rather than human-
   judged Turing "tests". The tests would thus be more objective, in
   the sense of fair, and in principle they could be used as an
   evolutionary driver of sorts; e.g., to tweak the balance among a
   number of starting parameters of an AI mind (the second sketch
   below suggests how).

3) It could give AI developers some very specific targets to aim for.
   A goal such as "hold a convincingly human conversation for ten
   minutes" is very murky, whereas choosing specific goals in a
   domain like CopyCat's (e.g., arrive at "wyz" from "abc->abd" +
   "xyz->?", with an explanation) can lead to design inspiration (a
   sketch of such an automated check follows this list).

--
Cliff
