From: Brian Phillips (email@example.com)
Date: Tue Jun 12 2001 - 16:37:24 MDT
Deferring to your greater expertise in these matters, I'll
confine my comment to an analysis of a single phrase.
"Once a Seed AI achieves (roughly) human-level intelligence,
it will quickly, and dramatically, outpace human augmentation.
Moreover, as the SI progresses, it will have less and less
need for human input and skills."
IF X THEN Y.
IF NOT (yet) X THEN NOT Y.
That's the only possible hole.
But until you have even canine-level AI
code... it's a large, large hole.
But the essay was otherwise noteworthy!
----- Original Message -----
From: Peter Voss <firstname.lastname@example.org>
Sent: Tuesday, June 12, 2001 5:21 PM
Subject: Extro5 talk - feedback requested
> I'd appreciate comments/ suggestions on these notes for the panel
> at Extro5 "Convergent or Divergent Super-Intelligence: Can we keep up with
> AIs by integrating with technology?"
> Advanced Intelligence: SI, IA, and the Global Brain
> Why Machines will become Hyper-Intelligent before Humans do
> Peter Voss
> www.optimal.org - Any and all feedback welcome: email@example.com
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:36 MDT