From: Cliff Stabbert (email@example.com)
Date: Mon Dec 23 2002 - 10:39:41 MST

Monday, December 23, 2002, 9:46:41 AM, Ben Goertzel wrote:
BG> What I'm curious about is if anyone has clearly-articulable reasons or
BG> feelings leading them to prefer a different sort of choice -- e.g. "option 5
BG> only" or "option 2 only" etc. -- or a different option not on the list (e.g.
BG> "my brain transplanted into a chinchilla's body" -- but with a detailed
BG> explanation ;)

Thanks for the stimulating thoughts. Some comments on your choices:

BG> Ben3: A Ben in a humanoid robot body, with the ability to breathe in space,
BG> fly, etc.

I'd be interested in trying a variety of bodies, and a variety of
input systems. What would it be like if I could widen the spectrum of
light visible to me? What if I had eyes all around my head -- would
my (uploaded) brain's visual cortex be able to reconfigure itself to
deal with the increased input without radical alteration? What would
a dolphin's body and sensorium feel like -- and if I hung out in a
dolphin body with (real) dolphins for a while, would there be a
learning/adjustment period after which I could communicate with them?

I'd also be interested in the experience of breathing in space... ;)

BG> Ben5: an uploaded computer program allowed to evolve and self-modify in any
BG> direction ... quite probably it will soon become clever enough to be
BG> disinterested in Ben1, Ben2, Ben3 and Ben4 and effectively "disappear"...

Will this Ben5 be embodied / provided with the tools for embodiment
(sensorium, action on the outside world) or will it be self-contained
(i.e., in a virtual/mathematical mindspace of sorts)?

I also imagine that a Ben5 would soon disappear off our radar, in a
sense. I wonder whether this disappearing act would represent some
form of enlightenment, or some sort of transcension to a "higher
plane".

Would Ben5, after that point, be distinguishable from a Tom5,
Dick5, and Harry5 after they reach that point -- when individual
identity as we understand it is left behind, does consciousness
converge or diverge?

To be honest, I have some doubts about the Ben5 scenario myself. I'm
not at all certain yet that the kind of consciousness we are isn't
inherently complexity-limited in some ways. A paranoid personality
allowed to evolve and self-modify would quickly spin into some form of
self-destruction, I'd imagine... So given infinite self-modifiability,
wouldn't the smallest flaws in our personalities or mental health
spiral out of control? Perhaps we could modify our personality in
such a way as to avoid this, but that presumes we know how to handle
the self-modification tools and know what to change and how, which in
turn presumes we're "given" those tools and that knowledge by an
external process such as an FAI. I can't quite picture us
bootstrapping it all the way given just the "mental room"...

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:41 MDT