Re: Selling AI versus selling knowledge...

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sun Oct 29 2000 - 14:48:16 MST


Samantha Atkins wrote:
>

> Hmm. But to do that you would have to take the position that knowledge
> is something that can/should be "owned", that the commons of knowledge
> and information should be all fenced and parceled out. It seems to me
> that would be a very major drag on human progress and advancement. In the
> face of the rapidly accelerating change we are in, it could very well
> kill us.

I don't think Ben Goertzel was talking about ownership of facts, just
ownership of the facts in a ready-to-think format for your favorite AI.

I have done some thinking along those lines, but mostly back when I was still
thinking in terms of an open-source model for developing AI. It's not easy to
see a good way that knowledge could be sold for closed-source AIs - except by
companies with access to the source - unless we're assuming that the knowledge
format is both simple and independent of the proprietary thought processes.
This doesn't seem too likely to me; more likely, the "format" is a third of
what you need to know to create an AI, and another third is composed of
thoughts stored in that format.

Another variable strongly affecting the saleability of knowledge is the degree
to which the knowledge can be made independent of the particular context of
the AI that created it, or the extent to which that context can be packaged
with the knowledge. We aren't just talking about things like what type of
fluctuations indicate anomalies, but about the fact that real knowledge
grounds in experience, and experience is perceived in ways that link into the
rest of the knowledge base. So to export knowledge plus grounding, you have
to export the
experience, and to export the particular experiences, you may need to export
the rest of the experiential and knowledge base.
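To see why that snowballs, here is a toy sketch; the data structures and
names are purely my own illustrative assumptions, not anything from an actual
AI design. Exporting one grounded item means walking the closure of the
experiences it grounds in and the knowledge those experiences are perceived
through:

# A toy sketch -- data structures and names are illustrative assumptions only.
def export_with_grounding(item_id, knowledge, experiences):
    """Collect everything needed to ship one knowledge item intact.

    knowledge:   maps knowledge id -> ids of experiences it grounds in
    experiences: maps experience id -> ids of knowledge it is perceived through
    """
    needed_k, needed_e = set(), set()
    frontier = [("k", item_id)]
    while frontier:
        kind, node = frontier.pop()
        if kind == "k" and node not in needed_k:
            needed_k.add(node)
            # Each piece of knowledge grounds in particular experiences...
            frontier.extend(("e", e) for e in knowledge.get(node, ()))
        elif kind == "e" and node not in needed_e:
            needed_e.add(node)
            # ...and each experience is perceived through other knowledge.
            frontier.extend(("k", k) for k in experiences.get(node, ()))
    return needed_k, needed_e

# Toy base: one "anomaly" fact whose grounding fans out through the base.
knowledge = {"anomaly-pattern": {"exp1"}, "market-basics": {"exp2"},
             "price-concept": set()}
experiences = {"exp1": {"market-basics"}, "exp2": {"price-concept"}}
print(export_with_grounding("anomaly-pattern", knowledge, experiences))
# -> three knowledge items and two experiences, just to export one fact

Scale the fan-out up past this toy example and exporting one fact starts to
mean exporting most of the base.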

The upshot is that brute-force exportation of knowledge may be prohibitively
expensive; packing fully grounded knowledge into smaller forms requires either
a really dumb AI or a highly sophisticated one. This may not hold equally true
of all forms of knowledge, and sufficiently fault-tolerant AIs may be able to
deal easily with imperfect integration of experiential bases.

Also, the problem in its rawest form holds only for exporting knowledge that
one AI formed without human cooperation to another AI whose pre-existing
knowledge base was also formed without human cooperation. Where first-time
sales of AIs are concerned, there may be a legitimate separation between
selling the AI itself and selling the various thought processes needed to
make up the AI. It may not be prohibitively expensive to get knowledge bases
and experience bases manufactured by the same company to integrate
automatically, as long as the AI is just starting up. After that, you have
the problem of integrating with the previous experiential base.
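In the same illustrative spirit as the sketch above, the cost model here is
my own back-of-the-envelope assumption, not anything anyone has built:

def integration_effort(ai_experiences, vendor_experiences):
    """Crude cost model for installing a vendor-matched knowledge/experience
    pair: a freshly started AI simply adopts the vendor base, while a mature
    AI must reconcile each imported experience against everything it has
    already lived through."""
    if not ai_experiences:          # the AI is just starting up
        return 0
    return len(vendor_experiences) * len(ai_experiences)

print(integration_effort(set(), {"exp1", "exp2"}))                     # 0
print(integration_effort({"old1", "old2", "old3"}, {"exp1", "exp2"}))  # 6

The exact numbers mean nothing; the point is just that the second case grows
with the experiential base the AI has already accumulated.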

The same thing goes for selling sensory modalities for source code, financial
accounting packages, or commodities trading. Hey, Ben, aren't you glad I
didn't patent that business model back in 1998?

-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence


