From: Eliezer S. Yudkowsky (firstname.lastname@example.org)
Date: Mon Jan 01 2001 - 14:15:19 MST
"Eliezer S. Yudkowsky" wrote:
> Don't worry, the number was random, i.e., calculated to be "sometime in
> 2005". If I somehow did have a definite date, rule 15 mandates that I
> couldn't announce it, and that if I had no choice but to announce it, I'd
> need to announce a date at least eight days *after* the actual Singularity
> was scheduled for. (Yes, some of the rules apply to good guys as well.)
Justin Corwin wrote:
> there are numbered rules for the public treatment of the singularity?
Rule 15: "I will never employ any device with a digital countdown. If I
find that such a device is absolutely unavoidable, I will set it to
activate when the counter reaches 117 and the hero is just putting his
plan into operation."
Other rules which also apply to good guys are:
Rule 2: "My ventilation ducts will be too small to crawl through."
Rule 25: "No matter how well it would perform, I will never construct any
sort of machinery which is completely indestructible except for one small
and virtually inaccessible vulnerable spot."
Rule 49: "If I learn the whereabouts of the one artifact which can
destroy me, I will not send all my troops out to seize it. Instead I will
send them out to seize something else and quietly put a Want-Ad in the
local paper."
Rule 79: "If my doomsday device happens to come with a reverse switch, as
soon as it has been employed it will be melted down and made into
limited-edition commemorative coins."
Rule 85: "I will not use any plan in which the final step is horribly
complicated, e.g. 'Align the 12 Stones of Power on the sacred altar then
activate the medallion at the moment of total eclipse.' Instead it will be
more along the lines of 'Push the button.'"
Rule 96: "My door mechanisms will be designed so that blasting the
control panel on the outside seals the door and blasting the control panel
on the inside opens the door, not vice versa."
Some rules which do NOT apply to good guys:
Rule 20: "Despite its proven stress-relieving effect, I will not indulge
in maniacal laughter."
Rule 22: "No matter how tempted I am with the prospect of unlimited
power, I will not consume any energy field bigger than my head."
Rule 179: "I will not outsource core functions."
And of course:
Rule 59: "I will never build a sentient computer smarter than I am."
But all of us can take heart from...
Rule 230: "I will not procrastinate regarding any ritual granting
immortality."
-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:35 MDT