Re[2]: Strong-willed AIs [was: continuity of self]

From: Cliff Stabbert (cps46@earthlink.net)
Date: Sun Sep 15 2002 - 20:18:56 MDT


Sunday, September 15, 2002, 9:20:20 PM, Eliezer S. Yudkowsky wrote:

ESY> Incidentally, this should be common sense, but it deserves to be
ESY> said explicitly:

ESY> This is a public forum which will be publicly archived.

Duly noted.

ESY> Do not propose, in this public forum, any security measure which
ESY> is weakened if the brainwasher anticipates it; discuss it
ESY> privately using strong cryptography, if at all.

I would argue for openness as the best security measure, both morally
and practically -- through traditional publishing, open source code
releases, and the like. This tactic is not weakened by anticipation.

<snip>

ESY> Bear in mind that sometimes, talking as if you expect people to
ESY> do immoral things increases the possibility that they will do so.

Possibly. In general, I think governments and companies are given
more credit for benevolence than is their due. I don't know to what
extent that view is shared by members of this list, but I don't see
much harm in suggesting a dose of skepticism, even cynicism, when
speculating on the motives of the powerful.

ESY> Finally, try not to contribute to creating a *psychology* (which
ESY> may spread beyond this mailing list) under which AI is a valuable
ESY> thing that people can steal and use, as opposed to an independent
ESY> sentient entity capable of making its own decisions. Don't use
ESY> the word *steal*. Use the word *brainwash* or *pervert*.

Noted. I used the catchall "grab" for nanotech and AI but later used
"subvert" when referring to AI, which has much the same meaning as
your terms in this context. I didn't mean to create the impression
that either of these technologies can easily be "stolen".

Especially in AI's case, any such effort may well be doomed from the
start (in terms of being able to use the technology for other ends).
But that wouldn't necessarily stop organizations from trying --
they're not renowned for their smarts, especially as they get
larger. The risk that I feel should be mitigated is not so much
subversion but the loss of scientific/technological progress. Again,
I think openness is the best approach here in most cases; for those
who disagree (or choose not to use this approach for various reasons)
I would concur: full-on encrypted communications are *highly*
recommended.

--
Cliff






This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:41 MDT