Re: answers I'd like from an SI

From: Norman Noman
Date: Sun Nov 11 2007 - 23:07:53 MST

On Nov 11, 2007 8:48 PM, Matt Mahoney <> wrote:
> > what if someone from 17000 BCE asked you how fire works?
> > Would you say "I know the answer, but I am not able to communicate it to
> > you"?
> Not the same. An SI trying to communicate with a human would be like a human
> trying to communicate with an insect.

That's just stupid. The smarter you are, the better you are at
explaining anything to anyone. If an explanation exists that a human
would understand, a superintelligence should be able to find it. It
should be able to explain general relativity to an eight-year-old so
well that their intuitive understanding of the skewing of reference
frames is better than Stephen Hawking's.

I don't even agree with the idea that the answers to various big
questions are simply beyond our comprehension. Unlike the mind of an
insect, the human mind is a system of sufficient complexity that it
can essentially be reprogrammed. The limitations are hardware issues:
speed, short- and long-term memory, and so on.

All the philosophical questions we've solved turn out to have
extremely simple answers; they're just nonintuitive. I very much
expect the rest of them are the same.

Then again, maybe not. In the words of Gordon Creighton: we dream of a
logical universe; what if it isn't logical at all, but a vast
surrealist nightmare?

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:00 MDT