Re: How To Live In A Simulation

From: telocity (mentat@telocity.com)
Date: Thu Mar 22 2001 - 18:06:28 MST


  ----- Original Message -----
  From: telocity
  To: sl4@sysopmind.com
  Sent: Thursday, March 22, 2001 2:33 PM
  Subject: Re: How To Live In A Simulation

      I told Outlook Express to send this message later, as I hadn't finished it yet. But, like someone else on this list, I just can't always make it do what I want.

        Herewith, the intended version.

    ----- Original Message -----

    From: Chris Cooper
    To: sl4@sysopmind.com
    Sent: Tuesday, March 20, 2001 10:42 AM
    Subject: Re: How To Live In A Simulation

      
      Now maybe I'm just damaged by growing up in a heavily Southern Baptist environment, but these discussions of the Singularity begin to sound like New Testament/Revelations-type stuff. I'm fascinated by this. I'm not a religious person myself, an atheist in fact, but the parallels often give me pause. Could these similarities be due to the universal nature of human faith and zealotry? I have no doubt that the folks on this list BELIEVE in the inevitability of the Singularity, with just as much passion as a devoted Christian (or Jew, Muslim, etc.). Is the Singularity the current, technology-friendly version of the same world-ending/transforming myths that are at the heart of all the world's religions?

          I think you're on to something here. If you read Eliezer's _Staring into the Singularity_, especially the last part, it has a definite singularity-as-salvation ring to it. Remarks like "I've had it with this world." (among others; I don't have the text in front of me) indicate a desire for something, in this case the AI, to come and make everything all better. Other remarks, such as "If we can keep the world economy from disintegrating in the next ten years ...", offer no evidence as to why this is likely; it just seems to be taken for granted. It's the kind of outlook on which salvation theologies are based: a glass-half-empty philosophy in which the world is basically a bad place, humans have really screwed it up, and something or someone needs to come and fix it. In addition, the AI will answer all the ancient philosophical questions. It will know, and perhaps tell us, the meaning of life (whatever that phrase is supposed to mean) and anything else we want to know. Of course, if we join with it, we will get all these answers directly. And somehow, if drug dealers become Powers, they will morph into good citizens.

          I didn't expect to see such a thing, but in retrospect I guess it was inevitable that the singularity would become a kind of salvation theology for some people. A great many people who reject notions of the supernatural seem to have turned to aliens of one sort or another to fill the same role. The singularity has all the ingredients to fit the bill at least as well: a being that is omniscient and omnipotent (for all practical purposes, from a human perspective), that can end all suffering, answer any question (whether the question is meaningful or not), and give people whatever they want, if it chooses to. Also, humans themselves may transcend to join or create this being and become entities with, for all practical purposes, all the attributes of the traditional deities. That's as good a definition of heaven as I've heard.

          I do believe the singularity is coming. I first heard of it from Vinge's _Marooned in Real Time_ in 1986. By Eliezer's definition, I'm an SL4. But my ideas about how it will arrive are markedly different from what seem to be those of most people on this list. It seems to me more likely to result from the interaction of lots of connected humans and computers and sensors, just the way intelligence emerged in the first place (replace "humans and computers" above with "neurons"). It is a natural result of evolution, now consciously guided: an opening up to new possibilities, a growth process, not a race against catastrophe, to borrow a phrase from H.G. Wells. The latter is hardly an inspiring vision anyhow. "We must have a singularity, and we must have it as soon as possible or we're doomed." Baloney! We'll have it because it's natural to have it, not because we're toast if we don't.

          Don't get me wrong. I certainly respect Eliezer's intelligence, and he certainly does not lack for imagination. I like the no-milquetoast approach to things that he evinces. I'm not picking on him in particular, but since he's written more on the topic than anyone else I know of, and is a primary advocate of trying to advance the singularity, he's an easy target. But I CERTAINLY DON'T MEAN ANYTHING PERSONAL, and I hope I will be permitted to remain on the list after this is posted.



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:36 MDT