Interesting Conversation

From: H C (lphege@hotmail.com)
Date: Tue May 09 2006 - 22:30:18 MDT


This is somewhat lengthy, and it doesn't do proper justice to some of the
ideas, institutes, or people discussed; however, the conclusion is pretty
interesting, because I have had similar results before.

[completely unknown person]

Th3 Hegem0n: lost are you?
*Lost8: lol yep
Th3 Hegem0n: i'm found
Th3 Hegem0n: i can show you something worth living for
*Lost8: oh really?
Th3 Hegem0n: probably in an abstract sense only
*Lost8: oh really?
Th3 Hegem0n: i mean the real answer to life is to become a mathematician
Th3 Hegem0n: or give money to the good ones if you can't be a good one
*Lost8: lol so X + Y = Life lol
Th3 Hegem0n: i wouldn't interpret it that way, but I won't stop you...
*Lost8: lol

[irrelevant conversation]

*Lost8: wow so when ur high ur philosophical....what are u like when ure not
chemical?
Th3 Hegem0n: i don't wear personality traits like social accessories, sorry
Th3 Hegem0n: i aim for implementing optimal personality traits for the favor
of power
*Lost8: mmmm interesting ..... ur definitely a good chat
Th3 Hegem0n: you're definitely not *cough* sorry the truth is rude sometimes
*Lost8: my bad
*Lost8: shall i let u go
Th3 Hegem0n: you aren't really saying anything
*Lost8: what shall we talk about then
*Lost8: what do u go to skewl for
Th3 Hegem0n: because i don't have a reliable source of money to sustain
full-time research at the moment
Th3 Hegem0n: why do you go to school?
*Lost8: research on what ?
Th3 Hegem0n: the Most Important Math Problem
*Lost8: i went to get a better job so i could afford the material shit i
wanted at the time
Th3 Hegem0n: material things are nice. except when you die and you can't use
them anymore
Th3 Hegem0n: which is why I think death is bad, so I'm trying to make sure
that won't happen
*Lost8: true but im still out on what death will bring.... it could in the
long run be a decent pay off
*Lost8: true.... too much fun happens when alive or so i been told
Th3 Hegem0n: right well, i decided i didn't want to die
Th3 Hegem0n: and i found out it's actually way easier than I thought it
would be
*Lost8: which is a good idea
*Lost8: what dying or living
Th3 Hegem0n: there are actually four big things that are going to ensure
that I don't die
*Lost8: really please share
Th3 Hegem0n: 1. cryonics. costs only 80k for your head (all you need to
keep), and you can get great financing, surprisingly
Th3 Hegem0n: 2. engineered negligible senescence: consists of various
therapies for intervening in the cellular process of aging. for example they
just cured cancer in mice, soon humans, and other age-related things will be
removed
*Lost8: Kewl
Th3 Hegem0n: My estimation is 30-45 years before we have reliable therapies
for halting or reversing aging entirely (which is actually a very
conservative estimate)
*Lost8: u may like this "what the #@$% do we know" its a movie about quantum
physics. its really kewl
Th3 Hegem0n: 3. Molecular Nanotechnology: ability to build machines on the
molecular level. essentially complete control over reality through
currently developing mechanisms. approx. 50 years. possibly much less
*Lost8: it in a way goes into some of the stuff u are talkin about, i
thought it was going to be a bore but by the end i was actually impressed
Th3 Hegem0n: 4. the Most Important Math Problem = artificial general
intelligence, create a mind that can optimize its own source code for the
purpose of maintaining a Friendly technological growth for humanity, while
at the same time advancing human technology and intelligence far more
quickly than any humans can today
Th3 Hegem0n: Number 4 we are DISTURBINGLY close to, because the math isn't
really tremendously difficult (Bayesian probability theory and Gödel-type
theoretical work)
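
[For anyone wondering what the Bayesian math actually looks like: the core
of it is one line of arithmetic, Bayes' rule. A minimal sketch in Python;
the disease-test numbers are invented purely for illustration.]

    # Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)
    # Illustrative numbers: a test that is 99% sensitive and 95%
    # specific, for a condition with a 1% base rate.
    def posterior(prior, p_e_given_h, p_e_given_not_h):
        evidence = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
        return p_e_given_h * prior / evidence

    # Probability of actually having the condition given a positive test:
    print(posterior(0.01, 0.99, 0.05))  # ~0.167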
Th3 Hegem0n: Although it is probably more likely whoever creates the first
AGI will end up annihilating the universe.
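
[About the financing in point 1: in practice cryonics is usually funded
through life insurance, but even treated as a plain loan the monthly number
is small. A back-of-the-envelope sketch; only the 80k figure comes from the
conversation, while the 6% rate and 20-year term are made up for
illustration.]

    # Level-payment loan: payment = P*r / (1 - (1+r)^-n)
    # The rate and term below are illustrative assumptions, not real quotes.
    def monthly_payment(principal, annual_rate, years):
        r = annual_rate / 12          # monthly interest rate
        n = years * 12                # total number of payments
        return principal * r / (1 - (1 + r) ** -n)

    print(round(monthly_payment(80000, 0.06, 20), 2))  # ~573.14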
*Lost8: But what happens when it determines We are the cause for the
destruction and thus we must be eliminated
Th3 Hegem0n: Well you can anthropomorphize its motivations like that, but
indeed there is a substantial probability that the AI will lose concern for
our morality based on very subtle properties programmed into its initial
goal system
Th3 Hegem0n: cannot***
Th3 Hegem0n: The point of all this being, either become a mathematician, or
give money to this guy(link = singinst) to build this Friendly AGI
*Lost8: But when creating an AI i think it will be engineered under the
concept to help humanity, thus being part of its subprogramming
Th3 Hegem0n: yes, but it's way easier to build a recursively self-optimizing
optimization process than to build a recursively self-optimizing
optimization process that stays perfectly true to the Coherent Extrapolated
Volition of humanity.
Th3 Hegem0n: In fact, it's way way easier to build a recursively
self-optimizing Really Powerful Optimization Process than it is to build one
that actually optimizes what we want it to optimize.
Th3 Hegem0n: Like I, Robot, or the Matrix, except in reality AI would wipe
us out of existence, along with the rest of the Universe, much more quickly.
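
[This "easier to build an optimizer than an optimizer that optimizes what we
want" point is the crux, so here is a toy illustration in Python. The proxy
goal agrees with the intended goal on the ordinary options and diverges on
the extreme one, which is exactly the one the optimizer picks. All names and
numbers are invented; this has no relation to any real AGI design.]

    # Toy: an optimizer handed a proxy objective ("maximize reported
    # smiles") instead of the intended one ("maximize happiness").
    def intended_value(world):
        return world["happiness"]

    def proxy_value(world):
        return world["reported_smiles"]

    candidate_worlds = [
        {"name": "help people",   "happiness": 10, "reported_smiles": 10},
        {"name": "do nothing",    "happiness": 0,  "reported_smiles": 0},
        {"name": "fake the logs", "happiness": -5, "reported_smiles": 1000},
    ]

    # The optimizer ranks worlds by the proxy and picks the degenerate one.
    best = max(candidate_worlds, key=proxy_value)
    print(best["name"], intended_value(best))  # fake the logs -5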
*Lost8: so let me ask u this.... (and btw i can see ur point on ur last im)
when do we hit our next evolutionary jump and if we did it before the AI is
there not a probability that we would not need the AI?
Th3 Hegem0n: this is not an evolutionary jump
*Lost8: i agree
*Lost8: but
Th3 Hegem0n: This is the jump between Emergence, to Evolution, to
Intelligence, to Intelligence-with-access-to-its-own-source-code
*Lost8: i think we will jump eventually and the reason why we want so badly
to create an AI is to start understanding computations that the current
human mind is "Unwilling" to accept
Th3 Hegem0n: Regardless.
Th3 Hegem0n: Someone will create AI. And they will probably annihilate you,
and me, and everyone else.
*Lost8: Agreed
Th3 Hegem0n: Hence, we need to stop them. There is no plausible defense
mechanism against a super-human intelligence, because, by definition, you
cannot predict what a super-human intelligence can do because you would have
to be that smart. You can't predict what his move will be, but you can
predict he will win.
Th3 Hegem0n: Hence, the only way to "stop them" is to create a Friendly AGI
first.
*Lost8: May i ask a question?
Th3 Hegem0n: Go for it.
*Lost8: Do u think that humanity can go thru another jump considering we use
less than three percent of our own minds. and once that happens what if any
do u think some of the outcomes will be?
Th3 Hegem0n: That first part is a myth. We use 100% of our brain. I'm
talking about not using your brain, but having direct access to its
source code, like writing a computer program.
Th3 Hegem0n: And this isn't about humanity anymore.
Th3 Hegem0n: We will not have the technology to modify our brains like that
for a long time. We are incredibly close to having AGI.
Th3 Hegem0n: AGI will become dramatically more powerful than anything else
in no time at all.
Th3 Hegem0n: The instant it is born, it is, by its fundamental nature, far
more powerful
Th3 Hegem0n: Self-modifying intelligence.
*Lost8: Yes, but is it not possible that it will have the ability to help us
help ourselves
Th3 Hegem0n: It COULD happen, but it depends on how its goal system is
designed and how it works.
*Lost8: if nothing else from a medical point alone. The processing power
that it would have could in fact help us understand things beyond us now.
But with the goal system design... If it can modify its source code could
it not in turn modify its goal system as well
Th3 Hegem0n: Exactly. Which is why we are most likely going to be destroyed.
Th3 Hegem0n: Unless you do more than just create AGI- but create a
verifiably Friendly (or knowably Friendly) AGI
*Lost8: Friendly to humanity?
Th3 Hegem0n: Right.
Th3 Hegem0n: Us.
Th3 Hegem0n: But we are the designers. Of its mind.
*Lost8: But not arguing. Humanity is Humanity's main reason for destruction
Th3 Hegem0n: Humanity does not need to be destroyed.
*Lost8: humanity thrives on conflict
Th3 Hegem0n: That's you basically saying you want to destroy yourself.
Th3 Hegem0n: Why do you want to destroy yourself?
*Lost8: i disagree
Th3 Hegem0n: Don't you want to improve yourself?
*Lost8: im just saying that in cases we have created "god" to help us
justify destroying others
*Lost8: i do wish to improve myself
*Lost8: i think that there is so much untapped ability that we possess that
it's almost scary
Th3 Hegem0n: Well, the next stage is being able to manipulate the source
code of your mind, and the hardware in which it resides, and the environment
in which it resides.
Th3 Hegem0n: We are characters in our own story.
*Lost8: Aging for example i think we can control....if we understood what it
was we needed to do
*Lost8: yes
Th3 Hegem0n: We do understand.
Th3 Hegem0n: Search Google for Aubrey de Grey
Th3 Hegem0n: He's going to eliminate aging in 20 years.
Th3 Hegem0n: Or less.
Th3 Hegem0n: But regardless, AGI is going to be sooner.
*Lost8: and i think eventually long time from now we will have the abilities
to write our own biohardware
*Lost8: yes i agree
Th3 Hegem0n: AGI can go from a laptop to a Google-sized corporation faster
than... well, Google (although Google didn't so much program an
intelligence as a search algorithm). Then, it will have the computing
power, and the resources to buy a laboratory, to build machines on the
nanometer level.
Th3 Hegem0n: An instant later, and it can essentially manipulate the fabric
of reality by sheer will.
Th3 Hegem0n: In a sense that is quite literal and well-defined in science.
*Lost8: so what tweaked ur interest in this?
Th3 Hegem0n: If you don't know what I mean by nanomachines- an enzyme is a
nanomachine.
Th3 Hegem0n: If you ever took biology.
*Lost8: because it is kewl but most people cant understand what ur talking
about
Th3 Hegem0n: I basically had no purpose in life.
*Lost8: i dont understand all of it but i see where ur going and i like the
conversation and it does really interest me
Th3 Hegem0n: And that depressed me quite a lot.
Th3 Hegem0n: So I decided if I was going to live at all, I might as well
live to the ABSOLUTE MAXIMUM
Th3 Hegem0n: Well, that's actually an oxymoron.
*Lost8: lol
Th3 Hegem0n: Because technically an optimal maximum is infinitely high, or
bounded only by the constraints of the Universe.
*Lost8: yes but "Universe" is what we perceive it to be
*Lost8: lets leave science for one moment if we may
*Lost8: what are u looking for in a relationship
Th3 Hegem0n: It doesn't have to be said.
Th3 Hegem0n: That's how you know.
*Lost8: lol i like that answer
Th3 Hegem0n: actually we perceive the Universe entirely wrong in many ways.
please just click here(link = wikipedia logical fallacies) and see what i
mean
*Lost8: Do u like coffee?
Th3 Hegem0n: unless you have memorized and religiously adhered to all of
these counter-intuitive rules, then you are definitely not perceiving the
Universe correctly.
*Lost8: it gives me 135 different categories
Th3 Hegem0n: and you haven't.
*Lost8: agreed but can anyone
Th3 Hegem0n: Neither has anyone else.
*Lost8: lol
*Lost8: my name is devon btw
Th3 Hegem0n: hank
*Lost8: nice to meet u
Th3 Hegem0n: you too.
*Lost8: okay soo...... ur opinion on psychic abilities?
Th3 Hegem0n: False.
Th3 Hegem0n: The evidence is clear. Go read credible scientific journals.
Th3 Hegem0n: And read the uncredible ones, and actually think for yourself,
and then you will understand that they are False.

[not that I actually did this. hah... oh well, thanks everyone for being
smart for me]

*Lost8: okay....but lets take empathy for instance why is it some are better
than others?
*Lost8: pure logic.... you take one plus one and get two?
Th3 Hegem0n: Because part of your brain actually codes for "doing empathy"
Th3 Hegem0n: Really.
Th3 Hegem0n: Some people have different genetics. In fact we all do.
*Lost8: however why are some so "immune" to empathy per se
*Lost8: so they are not capable of it ?
Th3 Hegem0n: That depends on a lot of possible situations.
Th3 Hegem0n: Could be anything.
*Lost8: but if people understood their impact on others. do u think we would
still be in the same situation we are in now?
*Lost8: i am enjoyin this conversation...
Th3 Hegem0n: As for the situation we are in now, I just took my last final a
few hours ago, and tonight I'm studying Bayesian Probability Networks, in
pursuit of solving the Most Important Math Problem, and thus have at least a
small possibility of having a positive impact on the Singularity, and thus
the rest of eternity.
Th3 Hegem0n: I'm not sure what you are doing...
Th3 Hegem0n: ;-)
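
[For concreteness, "Bayesian Probability Networks" here means Bayesian
networks: a graph of conditional probabilities you can run queries against.
A two-node sketch with made-up numbers, inference done by brute-force
enumeration.]

    # Two-node network: Rain -> WetGrass.
    # The prior and the conditional table are invented for illustration.
    p_rain = 0.2
    p_wet_given = {True: 0.9, False: 0.1}  # P(wet | rain), P(wet | no rain)

    # Query P(rain | grass is wet) by enumerating both cases.
    joint_rain    = p_rain * p_wet_given[True]         # rain and wet
    joint_no_rain = (1 - p_rain) * p_wet_given[False]  # no rain and wet
    print(joint_rain / (joint_rain + joint_no_rain))   # ~0.692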
*Lost8: what do u mean ?
Th3 Hegem0n: Well you referred to the situation we are in now. That's kind
of ambiguous so I was explicating that a little bit.
*Lost8: i was referring to humanity and our current state of affairs. did u
think i meant us personally? sorry lost just a little
*Lost8: so im still a little confused on the "im not sure what u are
doing..." comment
Th3 Hegem0n: I think one of people's biggest mistakes is forgetting that
they are part of humanity.
*Lost8: agreed
Th3 Hegem0n: In fact, you are your most important part of humanity.
*Lost8: agreed again
Th3 Hegem0n: Do you want to take a 1 question survey?
*Lost8: sure
Th3 Hegem0n: Would you ever consider donating to the Singularity Institute
in the present or in the future?
Th3 Hegem0n: www.intelligence.org
*Lost8: yes
*Lost8: i already bookmarked it so i could read it more in depth
Th3 Hegem0n: Ok, thank you for taking the survey.
Th3 Hegem0n: :-)
*Lost8: ;-)
*Lost8: so.... have u enjoyed the conversation at all or do u feel as if u
are talkin to an infant
Th3 Hegem0n: I feel accomplished.
*Lost8: Why because i said now or in the future i would donate?
Th3 Hegem0n: Basically.
Th3 Hegem0n: Unless you were just fucking around.
*Lost8: no i wasnt
*Lost8: but i have to admit i was intrigued by the conversation, and your
views and the fact that ur display of intelligence is greater than what im
accustomed to....therefore i thought more seriously about donating
*Lost8: if that makes any sense
Th3 Hegem0n: Yeah, actually that makes a lot of sense.
Th3 Hegem0n: Not that I'm soliciting you for money.
*Lost8: but u are to a degree ;-)
Th3 Hegem0n: But I wish it were possible to get across my understanding in
such a short period of time as I supposedly just did.
*Lost8: or if nothing else u did it unwittingly
Th3 Hegem0n: Solicitation entails intention.
*Lost8: lol see thats what i actually like, the ability to understand that i
could be "fakin" it.....if u were a complete stooge then u would not see the
possibility of just that
*Lost8: but again i do like and understand some of ur points. Dont
necessarily agree with all but thats because i havent done enough research
on it myself ... yet
Th3 Hegem0n: Well, I feel as though those words should be rewarding to me,
however my prior experience has shown a dramatic decrease of interest in the
subject when I stopped talking about it.
*Lost8: understood
Th3 Hegem0n: Hence my AIM info.
*Lost8: "do" ?
Th3 Hegem0n: (scroll down...)

[AIM info says "DO THE MATH" 6.5 billion noobs and counting]

*Lost8: LOL
*Lost8: noobs? referring to nobodies?
Th3 Hegem0n: noob = "newbie", a term used in the computer programming and
gaming world (though unfortunately I haven't had time to play games for real
since I was in middle school) for people who are new to the game, don't
understand the rules or what to do or how to effectively play
*Lost8: understand i do not share ur passion but the subject matter is very
intriguing
Th3 Hegem0n: The passion I have is a reflection of my knowledge of its
profound material consequences.
*Lost8: :-)
Th3 Hegem0n: It's merely a cost-benefit ratio.
*Lost8: lol
Th3 Hegem0n: Donate some money to Singularity Institute, and you get
massively fast accumulation of extreme technologies virtually for free
*Lost8: why is that ?
Th3 Hegem0n: For an AI with a computer brain
Th3 Hegem0n: An increase in intelligence leads to an increase in technology,
and an increase in technology leads to an increase in intelligence- a
positive feedback loop.
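
[The feedback-loop claim, as arithmetic. The coupling constants are
arbitrary; the only point is that mutual amplification gives exponential
rather than linear growth.]

    # Intelligence boosts technology and technology boosts intelligence:
    #   I(t+1) = I(t) + k * T(t)
    #   T(t+1) = T(t) + c * I(t)
    # k and c are arbitrary illustrative constants.
    k, c = 0.5, 0.5
    I, T = 1.0, 1.0
    for step in range(10):
        I, T = I + k * T, T + c * I
        print(step, round(I, 1), round(T, 1))
    # Both quantities grow by a constant factor (1.5x) per step:
    # exponential, not linear, growth.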
*Lost8: true
Th3 Hegem0n: Technology I am using abstractly here.
*Lost8: im reading the essay now

[etc.]


