[SL4] Re: Abandon Ship

From: patrick@kia.net
Date: Sat Feb 12 2000 - 16:24:24 MST


I'm responding to a digest message through the onelist web site, so this may
mess up any kind of thread tracking your mailer may be doing. Sorry.

>Message: 1
> Date: Thu, 10 Feb 2000 22:50:08 +0000
> From: Marc Forrester <A1200@xxxxx.xx.xx.xx>
>Subject: Abandon Ship
>
>> What we want and what the military want are two different things.
>> That doesn't mean they're mutually exclusive. We can't design them
>> to be 'good', or rather we can, but they could easily be changed.
>
>I don't think any militaries want AI as intelligent as themselves,
>anyway. They want loyal, obedient animal-like minds with blinding
>reaction speed and instincts hard-wired into their missile racks.
>Scary stuff, certainly, but not apocalyptically so.

 My understanding is that the reason the US military put so much money into AI
research decades ago was not to develop smarter weapons or meaner soldiers, but
rather to create superior intelligence analysis agents. 'Intelligence' in this
sense meaning spy stuff, satellite photos, etc.

 The reason for this is that it takes a great deal of training and a special
kind of knack for a human to decide whether a satellite photograph has a
concealed tank in it, or how many soldiers are in a bunker, and so on. These
skills are expensive to develop, and so the 'MI' gained from them is expensive
as well. Automating some or all of these tasks would, they reasoned, be more
cost-efficient in the long term.

 There are more examples, like monitoring intercepted radio and phone traffic.
For instance, during a stalemate (the Cold War in Germany in the 70s, China
vs. Taiwan, etc.), if someone happened to notice that a general's wife was
burning up the phone lines looking for her husband, you might (possibly)
conclude that the brass had been sequestered prior to a sneak attack (which
I've been told is a likely inference, though I can't quote my source).

 Military people might be very happy to put up with an AI that is 'smart' in
the sense of being able to track disparate data across huge spectra and draw
MI conclusions from it.

 That's not to say that the militaries of the world won't think up all kinds of
ways to use AI to kill people. I'm only speaking historically.

Patrick McCuller




This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:06 MDT