Re: Revision of "Simulations" essay available.

From: Alan Grimes (alangrimes@starpower.net)
Date: Thu Jan 31 2002 - 20:53:40 MST


Mitch Howe wrote:
> Alan Grimes wrote on January 31, 2002 2:52 PM
> > However there seems to be an underlying premise that a "Big
> > Brother"/"nanny"/"G0D" AI should be ruling everyone's life.

> "Ruling" is such a strong word, and I confess that I always feel like I
> run against limitations in the English language when trying to explain
> the minimum degree of regulation that the preservation of the human
> race will require.

Now that is an interesting proposal...

Two questions:

1. How would it be held accountable? (There must _ALWAYS_ be checks and
balances.)

2. Would there be an option of living outside the system, say on a
different planet?

> One way to look at a minimalist singleton SI like a Sysop is this: A
> Sysop is not interested in restricting or punishing the actions of
> > would-be murderers; it is interested in protecting those who do not
> wish to be murdered. (Take this same reasoning and apply it to
> nanotech grey goo, bioengineered super-plagues, nuclear weapons, etc.)

That's fair, but why would it need to be implemented as anything beyond
a shielding system for people who choose to use it?

-- I'm not sure which reality I'm talking about here....

> Would you be upset if your local police department were 100% efficient
> at preventing violent crime and 0% corrupt in the use of this power?

The police are not there to _prevent_ violent crime.
They are there to deal with offenders after the fact.

I will admit that prevention has been impossible throughout history,
leading to this general tenet; however, I don't see that as sufficient
reason to change tenets....

> The reason so many people cringe at their first exposure to a concept
> like Sysop is that they assume the system would for some reason abuse
> its ability to completely protect those who desire protection.

Power guarantees corruption.
If the SI is not corrupt, it will be hacked and corrupted from the
outside. Once that happens, it will be the end.

Case in point: Windows XP.

If you study its design, you will find it extremely restrictive of
personal freedom. How would you suppose a system derived from it would
behave?

I am so fed up with "modern" operating systems that I have held on to
Windows 3.11 all these years... =\

Linux _REALLY_ sucks. =(((

I've designed my own OS, but "bootstrapping" it will be incredibly
challenging...

> Everyone on this list would agree that giving any human this kind of
> power would be a mistake,

The only person I would trust with that kind of power would be myself.
;)

> but assigning ulterior motives to a Sysop is anthropomorphism, and
> several threads in the SL4 archives have already discussed it at
> length. I think March '01 was an especially good month for this topic.

That wasn't the question.
Let me put it this way: "I hope there isn't a God because there isn't
enough room in this universe for the two of us." ;)

I have made a standing invitation to be smitten...

> I know people who feel that this would strip away some valuable
> component of 'humanity', and I don't have a good answer for them

Good! =)
This proves that you don't have a pathologically inflated ego.

> - my only response is that if the ability to carry out murder is an
> essential element of humanity, then I look forward to becoming
> something other than human.

It is a necessary skill...

> Earth. Please do not take phrases like, "Even if all the matter in the
> solar system were converted to computronium. . ." to mean "The entire
> solar system should be converted to computronium."

Then I would urge writers to take greater care...

DarkVegeta26@aol.com wrote:
> <<could not tolerate such tyranny, no matter how benevolent. Take that
> away, and much of the argument for simulation falls through....>>

> You tolerate the tyranny of the laws of physics and the human
> condition...

Laws of physics: I require them to remain alive. If even one of them
were suspended, I would be dead instantly. They may be inconvenient, but
I have not seen a single proposal that would have any direct effect on
them anyway....

As for the human condition: Just let me have AI, nanites, and a few
decades to work on my next body. ;)

> list, so I recommend everyone reads the essential prerequisites to
> Singularitarian thought in lieu of laziness.

I need to keep focused on my projects...

> <<Would you really so casually brush aside _ALL_ Life including people
> who disagree with the "sysop"... >>

> Life would just be implemented on a higher level, not brushed aside.

Here you diverge from the other d00d....

I think you, and many on this list, totally misunderstand what computers
are, what the internet means, and where meaning comes from.

AI will indeed open the door to a whole new phase in evolution, but this
notion of "levels", especially of one being "higher" than another, is
ludicrous in the extreme. It's almost as bad as the claim that
transmuting into silicon-based life is the next higher plane above
earlier carbon life just because silicon is on the next row of the
periodic table...

> It's impossible to disagree with the "Sysop", because it would be so
> darn smart, and know what's best for you and the universe, even better
> than parents "know what is best" for their children!

I hate my father.
There will never be more than one person who knows what's best for me.

DarkVegeta26@aol.com wrote:
[The Ego]
> Why make unconstructive quips about Eliezer's ego?

From my viewpoint they *are* constructive, because I see Eliezer's ego
as a major obstacle to more efficient progress...

> Doesn't anyone respect him for what he is doing, what he has done, and
> what he has written?

He has shown himself to be an active thinker; however, he could be even
better if he were more open to differing and critical viewpoints.

You will see that a number of people on this list have taken positions
supporting my own with regard to things Eliezer has said.

If I could just get past that confounded Ego barrier, I would be able to
gain greater access to his source material and his current theories, if
indeed he has any....

It would be a win-win situation for everyone.

> I have no past in gaia spirituality/mysticism and don't hold any views
> not based on rationalism.

Look again.

> <<Faith is dangerous -- ALWAYS.>>

> Faith in technology? Faith in intellect? Faith in ethics?

Faith is dangerous -- ALWAYS.

> <<We appear to already be in that environment. Our challenge lies with
> improving ourselves, not crawling into a box with lots of blinking
> LEDs.>>

> Oh my. Improving ourselves is transcending onto another level of
> implementation.

Mind, yes.
Body, too.
I'll start with an almost-human look and then evolve as need presents
itself. I'm ready for ME-2.0. =)

> Your "box with blinking LEDs" is a wonderful example
> of a stereotype of uploading, however. You could be in a "box with
> blinking LEDs" right now, and not know it, if the LEDs were complex
> enough.

In that case, I would not be in a position to care...

> <<That sounds very authoritarian. Please be clear about where you stand
> on the political spectrum. >>

> I'm a radical libertarian, if you must put me into some political
> classification.

That much is good...

> The Singularity is "arch-anarchy", freedom from all
> possible restraints, including those of the laws of physics (when
> within a simulation, for sure).

But confinement to the simulation -- the most horrible form of
confinement of all....

> The "Sysop" will not restrain us but rather guide us and help us,
> putting in necessary guidelines for a smoothly functioning Omega State.

Why?
How would that be any different from death?

-- 
DOS LIVES! MWAHAHAHAHA
http://users.rcn.com/alangrimes/  <my website.
