From: Eliezer S. Yudkowsky (email@example.com)
Date: Sun Jan 28 2001 - 12:29:20 MST
Ben Goertzel wrote:
> I think you sell evolution short... it's a much more general and powerful
> process than you give it credit for
I *know* evolution is a general and powerful process. It's the process
that produced me, and I acknowledge that. I just think that intelligence
is a *more* general and *more* powerful process. If evolution gets in the
way of my declarative goals, or the declarative goals of a Friendly AI,
then we'll *deal* with it, and without too much trouble. The emergent
forces of evolution are unintelligent. I can outsmart them; a Friendly
seed AI can outwit, outclass, and outgun them; a superintelligence can
simply crush the forces involved via brute brainpower.
So when people argue that the character of post-Singularity reality is
*still* going to be determined by evolution, on the grounds that evolution
is going to sneak in and stomp on the goal systems of the transhumans that
will enact the new world, I just don't buy it. Evolution has been
amazingly ineffective at stomping on my own goal systems when I don't want
it to... there are times when I want it to, which leads some people to
argue that it's all evolution anyway, but they're missing the point; there
are times when I *don't* want a specific evolutionary selection pressure
to determine my decisions, and at those times, I - the merely human
hardwired-brainware-guy! - can successfully resist those forces.
Maybe, in the final analysis, I'm an altruist because despite the wide
range of selfish *behaviors* and selfish thoughts and emotions that are
evolutionary advantages, the nature of our society is such that the
evolutionary advantage lies with individuals who believe themselves to be
altruistic. (As Ridley put it, there are advantages in issuing calls to
righteousness, but not in obeying them.) Maybe I'm an altruist because
I'm an organism whose evolved *behaviors* are skewed almost entirely
towards selfishness, but whose evolved *beliefs* are skewed almost
entirely towards altruism, and thus I slid steadily into altruism as my
declarative cognition gained knowledge about evolutionary psychology and
used it to adjust my choices... who cares? The upshot is, I've become
humanstyle Friendly, and my declarative cognition has successfully
outgunned all sorts of selection forces in the process.
> I tend to buy into those theories of the origins of the universe, like John
> Wheeler's, which argue that in the early universe, physical law itself evolved...
General Relativity looks to me suspiciously like a mutation from a
Newtonian ruleset. Likewise for Special Relativity as an
existing-functionality-preserving mutation from flat space. The
transition from quarks to baryons to atoms to molecules practically
screams of evolution, which is characterized by stochastic layers of
decreasing computational efficiency and increasing functional complexity.
And the sheer hackiness of the transition from the quantum to the
classical leads me to believe that there's some kind of functional
adaptation here as well, perhaps involving the transition from a Monte
Carlo algorithm to actual quantum superposition and randomness.
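(To make the "stochastic, computationally inefficient" characterization above concrete — a toy sketch of mine, nothing to do with the cosmology speculation — here is the minimal evolutionary loop: blind variation plus selection, a (1+1) evolutionary algorithm matching a hypothetical 64-bit target. It works, but it burns hundreds of blind trials on a problem an intelligent process could solve directly.)

```python
import random

def fitness(s, target):
    """Number of positions matching the target string."""
    return sum(a == b for a, b in zip(s, target))

def evolve(target, rng, max_gens=100_000):
    """(1+1) evolutionary algorithm: one parent, flip one random bit
    per generation, keep the child only if fitness does not drop."""
    n = len(target)
    parent = [rng.randrange(2) for _ in range(n)]
    for gen in range(1, max_gens + 1):
        child = parent[:]
        child[rng.randrange(n)] ^= 1           # blind, local variation
        if fitness(child, target) >= fitness(parent, target):
            parent = child                     # selection
        if parent == target:
            return gen                         # one evaluation per generation
    return None

rng = random.Random(0)
gens = evolve([1] * 64, rng)
print(gens)   # generation count; typically a few hundred for 64 bits
```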
> And there is much
> evidence that the human brain is itself evolutionary (see Edelman's work on
> Neural Darwinism; earlier
work by neuroscientists like Vernon Mountcastle and Szentágothai)
Sure. Hedonic neurons, directed evolution in the human immune system...
like I said, evolution is a very powerful method. Intelligence is *more*
powerful. You just don't get to see that unless you have access to
general intelligence - not algorithms, not classical-AI heuristics, but
true general intelligence. Wait until Webmind starts writing vis own
treatises on seed AI before you conclude that an evolutionary goal system
is the best possible method...
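(A deliberately toy illustration of the asymmetry asserted above — my sketch, not anything from the post: against the same hypothetical bit-matching fitness oracle that blind mutation-and-selection would grind away at, a process that models the problem's structure probes each bit once and reads off the answer in n + 1 evaluations.)

```python
def directed_solve(oracle, n):
    """'Intelligent' counterpart to blind mutation: start from all
    zeros, probe each bit once, and use the fitness oracle's response
    to infer that bit's correct value -- n + 1 evaluations total."""
    guess = [0] * n
    base = oracle(guess)
    for i in range(n):
        trial = guess[:]
        trial[i] = 1
        if oracle(trial) > base:   # fitness rose, so this bit is a 1
            guess[i] = 1
            base += 1
    return guess

secret = [1, 0, 1, 1, 0, 1, 0, 0]
oracle = lambda s: sum(a == b for a, b in zip(s, secret))
print(directed_solve(oracle, len(secret)) == secret)  # True
```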
-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence