From: Anthony Berglas (email@example.com)
Date: Fri Jun 13 2008 - 20:11:44 MDT
Having scanned the literature, I decided to write a paper on the
dangers of intelligence. I have tried to keep it short and sharp.
I took the trouble to write it because I could not find any other
paper that put it all together succinctly without philosophical,
technical, egotistical and other distractions. There are a few ideas
in it that I have not seen in the Singularity community, such as DNA
size and brain size/speech understanding. But the main purpose of the
paper is to be succinct and convincing.
It mainly addresses issues raised in discussions with "ordinary"
people and software engineers -- that is the target audience. In
particular: "computers obviously can never be intelligent", "they
would just do what we tell them", "they would be just like us but
smarter", and "but what about global warming, biotechnology,
nanotechnology and other distractions".
So all comments are most welcome, especially as to what the paper does
not need to say.
Dr Anthony Berglas, firstname.lastname@example.org Mobile: +61 4 4838 8874
Just because it is possible to push twigs along the ground with one's nose
does not necessarily mean that is the best way to collect firewood.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:03 MDT