Re: Egan dismisses Singularity, maybe

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Apr 22 2002 - 14:49:36 MDT


Damien Broderick wrote:
>
> It's a capital mistake to confuse a writer with the fiction, let alone one
> character in a cast. Still this is a provocative passage in SCHILD'S
> LADDER. I'm still reading the book, so for all I know there's a fullblown
> Spike before the end, but I was struck by how Egan's moderately posthuman
> figures are placed 20,000 years hence with no mention of anyone having
> Sublimed. Then on p.55 [UK edition]:
>
> ==========
>
> `What do you think you're going to find in there [a new region of altered
> spacetime]? Some great shining light of transcendence?'
> `Hardly.' _Transcendence_ was a content-free word left over from religion,
> but in some moribund planetary cultures it had come to refer to a mythical
> process of mental restructuring that would result in vastly greater
> intelligence and a boundless cornucopia of hazy superpowers--if only the
> details could be perfected, preferably by someone else. It was probably an
> appealing notion if you were so lazy that you'd never actually learnt
> anything about the universe you inhabited, and couldn't quite conceive of
> putting in the effort to do so; this magical cargo of transmogrification
> was sure to come along eventually and render the need superfluous.
> Tchicaya said, `I already possess general intelligence, thanks. I don't
> need anything more.' It was a rigorous result in information theory that
> once you learn in a sufficiently flexible manner--something humanity had
> achieved in the Bronze Age--the only limits you faced were speed and
> storage; all other structural changes were just a matter of style.
> ==================================

Naturally, Greg Egan can't afford to have his characters Sublime off,
because then he wouldn't have anything to write about. I personally feel
that the best way to deal with this plot hole is to never mention it; I
thought Iain Banks's Culture novels were seriously wounded by his later
attempts to explain away the lack of a Singularity. Alternatively, the main
characters can be refuseniks or very junior sentients, although in neither
case are they likely to be dealing with the most difficult problems of
contemporary civilization.

However, Tchicaya's tone sounds too harsh to be just patching up a plot
hole. I strongly suspect that Violet Mosala of "Distress" is standing in
for Greg Egan when she attacks pseudoscience, and Tchicaya in the quoted
passage sounds a lot like Violet Mosala. I think Greg Egan probably
believes what's written above - although, since Amazon still says "Schild's
Ladder" isn't available in the US, all I know about the entire novel is the
one paragraph Damien posted. But it does seem consistent with the picture
painted in "Diaspora", or for that matter Peer carving table legs in
"Permutation City".

Naturally, I'm not going to take Egan's comments personally (Greg Egan being
my favorite SF author of all time may have something to do with this). I
expect Egan ran across one or two nontechnical writings on the Singularity
that took a worshipful tone and didn't back it up with specific arguments;
he thought that this was all there was to it; and he subsequently became
annoyed with the whole deal, or at least what he thought was the whole deal.

Nonetheless, as one of the "someone elses" Egan refers to as working out the
details, I think his characterization of the enormous possible variation
within the design space of minds-in-general as "matters of style" is yet
more fallout from Standard Social Science Model genericity. If you
break down general intelligence into a set of interacting, internally
specialized subsystems, then it becomes much clearer that pumping up the
computing resources available to a subsystem changes *what* you think and
not just how fast you think it or how well you remember it. Take the
decomposition from "Levels of Organization in General Intelligence" as an
example; even if you leave the general architecture completely constant and:

(1) Add computing resources to the subsystems handling categorization, so
that they can perceptually reify and perceptually match more complex
patterns within sensory modalities and abstract imagery (2.5.1).
(2) Pump up the number and kind of beliefs and expectations that are
automatically checked for resonance against mental imagery during the
resolution of a sequitur (2.6.3).
(3) Pump up the size of working memory: the size and resolution of
perceptual workspace in sensory modalities, the size and resolution of the
focus of attention, and the amount of mental imagery that can be
simultaneously compared against all other mental imagery (2.6.2).

Then the end result should be an intelligence that is immediately bored by a
very large class of problems that humans find interesting, and that has fun
solving an even larger class of problems that humans would find permanently
intractable at the intuitive level. Even if, given infinite time and storage, a human
could simulate by hand a Turing machine that understood the problem, the
human would simply be standing in for the CPU and would not necessarily have
the cognitive capacity to represent internally an understanding of the
higher levels of organization. And of course Part III of _Levels_ argues
directly that minds-in-general can access major cognitive advantages that
are not available to present-day evolved humans.
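To make this concrete with a deliberately crude illustration (my own toy
example, not anything from _Levels_ or from Egan): suppose a
pattern-recognition subsystem can only compare whatever fits into its
working memory at one time. Widening that memory doesn't just let it find
the same regularities faster; regularities on a larger scale go from
invisible to obvious. A minimal Python sketch, with the window size standing
in for working-memory capacity:

  # Toy model: a "perceiver" that can only inspect the first `window`
  # symbols of a stream at once, looking for a repeating pattern.
  # A pattern whose period exceeds the window is not merely slower to
  # find; it can never be chunked at all, no matter how long we run.

  def find_period(stream, window):
      """Return the smallest repeat period visible within `window` symbols,
      or None if no period fits inside working memory."""
      visible = stream[:window]          # all the perceiver can hold at once
      for period in range(1, len(visible) // 2 + 1):
          if all(visible[i] == visible[i + period]
                 for i in range(len(visible) - period)):
              return period
      return None

  stream = "abcdefgh" * 100              # period-8 pattern, arbitrarily long

  print(find_period(stream, window=6))   # None: period 8 can't fit in a 6-symbol memory
  print(find_period(stream, window=20))  # 8: a wider window sees a new kind of structure

The analogy is loose, of course, but the jump from None to 8 is the kind of
qualitative difference I mean: not the same thoughts run faster, but a
structure that simply was not there before.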

It is more than our instincts and our sensory modalities that mark
present-day humans as thinly modified primates; our basic cognitive
architecture bears the signature of incremental evolution as well. See
Terrence Deacon's "The Symbolic Species", for example. Or _Levels_, of
course.

It seems pretty clear that tomorrow's sophisticated citizen will instantly
recognize Greg Egan's most exotic characters as (a) human-level
intelligences, (b) human-level intelligences with a roughly anthropomorphic
balance of domain competencies, (c) evolved human-level intelligences, and
even (d) modified primates. One might excuse this on the grounds that
Egan's characters need basically human architectures in order to map onto
our own minds as sympathetic characters. In fact, Greg Egan's characters
are already so far afield by contemporary standards that the only
sympathetic character left in Konishi polis is Inoshiro. But of course
anthropomorphism is only excusable as a literary necessity if Greg Egan
knows it's anthropomorphism; I doubt that Egan would want to be held to
standards any lower than that.

So, even in terms of what we can figure out today, I would have to say that
Tchicaya is, in all probability, flat wrong, and that this probably reflects
an identical mistake by Egan.

-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence


