From: Ben Goertzel (firstname.lastname@example.org)
Date: Tue Aug 13 2002 - 14:13:34 MDT
> This sounds not all that different from human language to me. In
> order for the above to be useful, there needs (it seems to me) to be a
> set of "common" atoms -- i.e., a more or less similar set of
> structures in each Novamente brain.
Yes, two minds that share no common concepts will not be able to communicate.
> But it is in that "more or less"
> qualifier that we find the similarity to the vagueness of human
> language -- that is to say, I expect when I say "language" that some
> "more or less" similar neural structures get activated in your brain
> as do in mine.
> I can see where it's arguable that some future AI will be able to
> introspect /better/ or /more honestly/ than humans can, but not where
> such introspection can be perfect (any consciousness cannot contain
> a full representation of its own inner workings simply because those
> are by necessity more detailed). Similarly, I cannot see where any
> sort of language can be "fully" representational or accurate. I'll
> buy that an inter-AI language could be "more accurate", at
> least to some extent. How non-linear that needs to or can be is IMO
> still open to question.
Yes, of course Psynese is not going to be a perfectly accurate language.
Qualitatively, however, I believe it will be orders of magnitude more
accurate than human language.
If I could send my thoughts straight to your brain, some of them would be
opaque and incomprehensible.
But some of the images, sounds, formulas, feelings and dreams I'd send you
would be tremendously evocative, far beyond what we can convey with our
language. Thus will it be with Psynese, I believe...
I think the step from human language to Psynese and other inter-AI languages
will be a quantitative difference that makes a qualitative difference, even
though, of course, perfect communication between two nonidentical minds will
never be possible.
-- ben goertzel
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:40 MDT