Cultishness as a high-entropy state

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sat Nov 26 2005 - 09:45:31 MST


Olie L wrote:
>> From: "David Picon Alvarez" <eleuteri@myrealbox.com>
>>
>> Eliezer said:
>>> The following is quite an interesting opinion, and it is, of
>>> course, your own - why preface it by attributing it to me?
>>
>> I believe that this kind of thinking, the "what would Eliezer do"
>> thinking, no matter how well one knows Eliezer and how tractable
>> Eliezer is about his thought processes, is one of the reasons why
>> many people consider the SIAI and other SL4 groups as cultish.
>
> Are you insinuating that just because I have a poster of Eliezer on
> my bedroom wall, a "Yudkowsky is our saviour" poster in my lounge and
> a "What would Eliezer do?" bumper sticker, that I'm somehow
> subscribing to some sort of cult?

I think it is important to emphasize that *every* Cause, good, bad, or ugly,
*wants* to be a cult, in the same sense that every thermal differential
wants to equalize itself and every computer program wants to become a
collection of ad-hoc patches. It's a high-entropy state into which the
system trends, a trap laid in human psychology. It has nothing to do
with whether the Cause is a worthy one. *Every* Cause wants to be a cult.

If you know anything at all about cults in the real world, you know that
transhumanism is not a cult. Prospective transhumanists are not
surrounded by wall-to-wall proselytizers showing perfect agreement among
themselves. People who join are not separated from friends and family.
People who question the prevailing opinion are not ostracized,
deprived of sleep or worse, or threatened with expulsion.

But there's a deeper question behind this, which most people don't think
to ask because they're too busy arguing that transhumanism is a cult or
alternatively that transhumanism is not a cult. *Why* isn't
transhumanism a cult? Why *aren't* prospective transhumanists
surrounded by wall-to-wall proselytizers? This is an anomaly that
needs explaining. Usually, when a roomful of people all agree with
each other on something, they are a lot more obnoxious.

The first reason is that transhumanists tend to be recruited from the
ranks of secular rationalists and SF fans, and we already have
anticultish habits of thought - "herding cats" is the usual phrase.

The second reason is that it's harder for cults to form over the
Internet. The wall-to-wall agreement effect is much weaker if the
proselytizers aren't present in person, and if the recruit isn't
separated from friends and family.

The third reason is that the leading lights of transhumanism, such as
myself in the case of Singularitarianism, know damn well that every
Cause wants to be a cult. We exert a continuing, deliberate effort to
prevent transhumanism (or Singularitarianism) from becoming a cult.
It's like expending energy to keep a room refrigerated: fighting
entropy takes work.

People think of cultishness as if it were a special case, a defect of
certain causes or certain people. It's not. Cultishness happens by
default, unless you do the work to keep the Cause in an unnatural
condition of noncultishness.

What goes into this work? Off the top of my head, here are some of the
obvious ones:

1) Deliberately restrict the scope of the meme. If you have ideas
about AI, then *just* talk about AI. Don't tell people what food to
eat, what clothes to wear, which music to listen to, what art to view,
who to marry, who to sleep with, who to vote for, which games to play,
or how to live their own damn lives.

2) Exert a deliberate effort to tolerate dissent - visibly, publicly
so. One of the reasons Marc Geddes is still on this list is that I am
reluctant to kick off a resident crackpot who at least knows how to
spell and is more or less polite. If I remove Marc Geddes, then someone
*else* will stand out as the most disagreeable person - should I remove
them too? While Marc Geddes stays on the list, other people will feel
that much more comfortable about disagreeing - there's an example to
follow, someone who dared to disagree, who is in fact clearly an idiot,
yet was not struck down for his sins. Of course this does annoy the
other readers, and I did eventually have to moderate Geddes. I worry,
in fact, that transhumanists have swung too far the other way: that we
no longer tolerate public conformity, that people are afraid to agree in
public for fear of being labeled cultish. And this is just as severe a
mistake. At least cults get things done instead of arguing endlessly -
the wrong things, but they get them done.

3) Don't wonder what the local leading lights would say about an idea.
Don't wonder what Transhumanism thinks of a concept, or what
Singularitarianism has to say about it, or what SIAI's position on the
issue is. Ask yourself whether it's a good idea or a bad idea, and then
speak for yourself. How do you think I steer my own life? Not by
asking myself, "What would Eliezer do?" for that is an infinite loop.

4) Draw on the strengths of Traditional Rationalist culture. Exerting
a deliberate effort to tolerate dissent is just one case of this. And:
attend to arguments, not to authorities; demand reasoned justification
no matter who the speaker is. And so on. I don't have to explain this
stuff, you know it already.

-- 
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence

