Re: How big is an FAI solution? (was Re: [sl4] to-do list for strong, nice AI)

From: Matt Mahoney (matmahoney@yahoo.com)
Date: Thu Oct 22 2009 - 08:06:20 MDT


From: Tim Freeman <tim@fungible.com>
> From: Matt Mahoney <matmahoney@yahoo.com>
>> That's a guess. I think the training data is 10^17 bits.
>>
>> Also, your design has the same shortcoming as CEV and my cheating
>> ideal-market definition. It doesn't define "human".

> The purpose of the training data is to define "human" and to define
> what voluntary actions these humans are taking and what perceptions
> they are experiencing.

Training an AI by watching people will define "human" only as far as the cultural beliefs of the people it observes. Depending on your test subjects, you will get different beliefs about the relative rights of animals, children, embryos, men vs. women, native-born vs. foreigners, rich vs. poor vs. prisoners, etc. It will fail utterly in the case of future human-software hybrids that don't yet exist.

> Humans seem to be able to do this without any deep thought, so an
> ideal extrapolator shouldn't need an absurdly large amount of training
> data to learn to do it.

Humans can recognize faces in videos without much effort. That doesn't make it easy. In fact, the training data that allows you to do this consists of, among other things, several years' worth of high-resolution video.
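
A rough back-of-envelope calculation shows why "several years of video" lands near the 10^17-bit figure quoted above. Every constant here (resolution, frame rate, waking hours) is an illustrative assumption, not a measurement:

    import math

    # All numbers below are assumed for illustration, not measured.
    PIXELS = 1e6              # ~1 megapixel effective resolution
    BITS_PER_PIXEL = 24       # uncompressed 24-bit color
    FPS = 30                  # 30 frames per second
    SECONDS_PER_YEAR = 3.15e7
    WAKING_FRACTION = 2 / 3   # ~16 waking hours per day

    def raw_visual_bits(years):
        """Raw (uncompressed) visual input over `years` years."""
        return (years * SECONDS_PER_YEAR * WAKING_FRACTION
                * PIXELS * BITS_PER_PIXEL * FPS)

    for years in (3, 20):
        print(f"{years} years -> ~10^{math.log10(raw_visual_bits(years)):.1f} bits")
    # 3 years -> ~10^16.7 bits
    # 20 years -> ~10^17.5 bits

Compression and attention would cut the effective number down by orders of magnitude, but the point stands either way: the training data is enormous even when using it feels effortless.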

> One might argue that humans have a large amount of state information
> in their heads, but seriously, how much of that do you think pertains
> to being able to make statements like "Joe went to the far side of the
> room, saw the chair, and moved it with his left hand" in reaction to
> seeing that happen? Nearly everybody can do that sort of thing, and
> they don't feel like they're doing any deep thought at the time.

You are guessing. Landauer measured the complexity of human long-term episodic memory to be on the order of 10^9 bits. If you think that's wrong, then give me different numbers and justify them. A couple of years ago I estimated the complexity of your friendly AI program to be on the order of 2^800 steps. A quantum-level simulation of the universe would be faster.
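
To put 2^800 steps in perspective: Seth Lloyd's well-known estimate puts the total number of elementary operations the observable universe could have performed at roughly 10^120. Treating that figure as the cost of a quantum-level simulation is this sketch's assumption:

    import math

    # 2^800 steps, expressed in decimal orders of magnitude.
    fai_steps_log10 = 800 * math.log10(2)   # ~240.8

    # Lloyd's bound on the universe's computational history (~10^120 ops);
    # using it as a proxy for a quantum-level simulation is an assumption.
    universe_ops_log10 = 120

    print(f"2^800 steps ~= 10^{fai_steps_log10:.0f}")
    print(f"gap: ~10^{fai_steps_log10 - universe_ops_log10:.0f}x more steps than "
          "the universe's entire computational history")
    # 2^800 steps ~= 10^241
    # gap: ~10^121x more steps than the universe's entire computational history

So even if the 2^800 estimate is off by a hundred orders of magnitude, the conclusion still holds.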
 -- Matt Mahoney, matmahoney@yahoo.com


