Re: Consciousness was Re: This is SL4. This is SL4 on drugs. Any questions?

From: Brian Phillips (deepbluehalo@earthlink.net)
Date: Mon Sep 09 2002 - 21:56:50 MDT



"Samantha Atkins" wrote

> It is not just an intuition of course. Psychedelics are one way of
> experiencing other states of consciousness. I think we make the world
> artificially small if we assume that altered states are all pathological
> or worthless or irrelevant to Singularity. Few states could be more
> "altered" than being uploaded. :-)
>
This is part of the "understanding the issues" I spoke of earlier.
Anyone who has experienced what is colloquially referred to
as "ego-death" under any of the less pleasant entheogens
understands the truly frightening implications of the classic
extropian catchphrase "an uploaded human mind able to
self-modify its own cognition and ascend into a posthuman
entity". Note I am not talking about runaway goo or psychopathic
posthuman entities or anything.. simply what internal mental
change of this order (even if successful) would be like *from
the inside*. Take a quick trip to the mental wards, choose the
long-term care facilities at a state hospital (where all the *throwaways*
get sent) and read something by Sacks. Altered states of consciousness
are *boring*? Try terrifying. Possibly necessary, often profitable, but
among the scariest experiences one can undergo in one's life.
And these are the "temporary changes"....
   (The value of extensive clinical exposure becoming evident yet?)


> The current self-image/personality/ego structure is indeed quite likely to
> dissolve when much of its ground of being changes significantly. Some
> traditions, such as Buddhism, have long held that what we normally see as
> "self" is an illusion. Meditators and others who have done much
> self-examination often have a good understanding of how much the self is
> a construct, not the true self or center of consciousness (the observer),
> and so on.
>
This is very true.. but it is not of especially great help when one is storming
the Bardo :)
> When *you* are not an advanced primate biological lifeform any longer, then
> much that is derived from that previous state will be superfluous. When
> scarcity is largely no more, then all kinds of beliefs, assumptions, and
> habits of mind will need serious overhaul. When we perhaps can choose to
> make the boundaries between our minds more porous selectively, even the
> notion of ourselves as individuals might come under modification.
>
> If we have the powers and abilities that we believe will be possible in
> future states then we also have great need to adjust and modify our
> emotional/reactive natures, to re-integrate on a quite different basis
> than normal.
>
> For all of these reasons and many more I believe that work on
> consciousness, on understanding and also on shifting and modifying
> states of consciousness, is quite crucial to Singularity.

I suspect that understanding the mechanisms whereby a biological frame
produces sentience is only the first step in getting a *useful*
machine intelligence.
I have raised this point before, with Anissimov and others.
"How do you get something that can or will talk to you?"
Apparently people have a real problem with discerning the difference
between a critique of Friendliness and a healthy appreciation of
the differences between *purely human states of mental function
within the SAME BRAIN* (normal vs. tripping), to say nothing
of what an AI's "thought processes" would be like.
I don't critique Friendliness; it sounds like a grand idea (you
want to give the AI something like a rationally derived
Buddha-nature, which sounds like a good thing for a functionally
omnipotent entity to have....). I do think the communication
issues are non-trivial (and boy are they EVER non-trivial).
(The worst part is people tell me I am "anthropomorphizing
AI".. feyh!)
This is a wildly anthropomorphic example but it should serve
to illustrate the point. (Just as a thought experiment)
  Upload a lungfish. Give it an IQ of 130 or so. From the point
you turn the computer on, make sure the lungfish is experiencing
something rather analogous to the effects of 600 micrograms
of LSD in a healthy 90 kg adult human. The rise-plateau-fall
pattern of intensity is, rather than linear, wildly chaotic,
or perhaps rhythmic in some virtually undetectable pattern (for
the lungfish, anyway). Give the lungfish the ability to moderate the
dissolving effects of the drug, with one difference.. the lungfish
doesn't know which way is *down* (i.e., even if it knew
what normal was like, it can't apply that knowledge to the
present state). Give this lungfish access to a file in which you
have placed an electronic version of the human Universal
Grammar. Talk to the lungfish.
  Was what you just did to the lungfish a nice thing to do?
Would you do it to a child? A pet? A seed AI?
Hmmm......


Samantha, I'm glad you chimed in; you and Ben seem
to at least have exposure to the issues, both in terms
of the substances and the non-drug-induced mental
tinkering of the meditative traditions. But most
people don't, and they simply don't know that they don't
know. Which is why talking about this issue on SL4 is almost,
but not quite, completely useless.

regards,
Brian
"This is a Gifted Uploaded lungfish.
 This is a Gifted Uploaded lungfish on drugs.
 Any questions?"






