Re: Paper: Artificial Intelligence will Kill our Grandchildren

From: Thomas McCabe (pphysics141@gmail.com)
Date: Fri Jun 13 2008 - 21:16:27 MDT


On Fri, Jun 13, 2008 at 10:11 PM, Anthony Berglas <anthony@berglas.org> wrote:
> Having scanned the literature, I decided to write a paper on the dangers of
> intelligence. I have tried to keep it short, sharp and very focused.
>
> I took the trouble to write it because I could not find any other paper that
> put it all together succinctly without philosophical, technical, egotistical
> and other distractions. There are a few ideas in it that I have not seen in
> the Singularity community such as DNA size and brain size/speech understanding.
> But the main purpose of the paper is to be succinct and convincing.
>
> It mainly addresses issues raised in discussions with "ordinary" people and
> software engineers -- that is the target audience. In particular,
> "computers obviously can never be intelligent". "They would just do what we
> tell them". "They would be just like us but smarter". And "but what about
> global warming, biotechnology, nanotechnology and other distractions".
>
> So all comments most welcome, especially as to what the paper does not need
> to say.
>
> http://berglas.org/Articles/AIKillGrandchildren/AIKillGrandchildren.html
>
> Anthony
>
> Dr Anthony Berglas, anthony@berglas.org Mobile: +61 4 4838 8874
> Just because it is possible to push twigs along the ground with one's nose
> does not necessarily mean that is the best way to collect firewood.
>
>

With reference to this excerpt:

"What is certain is that an intelligence that was good at world
domination would be good at world domination. So if there were a
large number of artificial intelligences, and just one of them wanted to
and was capable of dominating the world, then it would. That is just
Darwin's evolution taken to the next level. The pen is mightier than
the sword, and the best intelligence has the best pen. It is also
difficult to see why an AI would want humans around competing for
resources and threatening the planet."

You don't explain where the AI gets the motive for world domination.
See http://intelligence.org/upload/futuresalon.pdf.

"For evolution consumes all that is good and noble in mankind and
reduces it to base impulses that have simply been found to be
effective for breeding children."

All of our impulses were generated by evolution, not just the bad
ones. Altruism is a result of evolution, just as much as the urge to
reproduce is. See
http://www.overcomingbias.com/2007/11/adaptation-exec.html,
http://www.overcomingbias.com/2007/11/evolutionary-ps.html,
http://www.overcomingbias.com/2007/11/thou-art-godsha.html.

"We maintain an illusion of immortality because we need to live to
breed, and our thirst for knowledge helps provide us with material
resources to do that. "

You're confusing the goals of evolution with the desires of evolved
organisms. We *know* that simply pumping out kids as fast as possible
is not what every individual actually wants to do. Indeed, there have
been cultures where anyone caught not pumping out kids was stoned to
death, and even in the wake of that extreme selection pressure, some
people revert to not pumping out kids within a few generations of the
pressure being lifted.

"And we seek explanations for death and the unknowable, and invent God."

We invented God (er, well, not the modern God, but the idea of a
divine being) over ten thousand years ago, comfortably predating
modern communication and contraception.

"After all, 10 mega hertz/1 mega byte is about the power computers had
in 1990, and those computers were very functional."

It isn't really important, but this isn't accurate. The Intel 286
(introduced 1982) could address 16 MB of RAM and was later clocked up
to around 25 MHz. These enhancements were, admittedly, expensive and
unwieldy, so they weren't generally used by the mass market.

"The force of evolution is just too strong."

There is no evolution without reproduction over large numbers of
successive generations. See
http://www.overcomingbias.com/2007/11/no-evolution-fo.html.

-- 
 - Tom
http://www.acceleratingfuture.com/tom


This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:03 MDT