Re: Extro5 talk - feedback requested

From: Brian Phillips (deepbluehalo@earthlink.net)
Date: Tue Jun 12 2001 - 16:37:24 MDT


Peter,
Deferring to your greater expertise in these matters, I'll
confine my comment to an analysis of a single phrase:
"Once a Seed AI achieves (roughly) human-level intelligence,
 it will quickly, and dramatically, outpace human augmentation
 Moreover, as the SI progresses, it will have less and less
need for human input and skills."
IF X THEN Y
also means:
IF NOT (yet) X, THEN NOT (yet) Y.
That's the only possible hole in the argument. But until you have
even canine-level AI code, it's a large, large hole.
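To spell that out a bit more formally (my own reading of the wording, not Peter's), the essay's claim is a conditional, and the objection is about when its antecedent is satisfied, not a refutation of the consequent:

\[
\begin{aligned}
& X \rightarrow Y && \text{(human-level Seed AI} \Rightarrow \text{rapid takeoff past human augmentation)}\\
& \lnot X \ \text{(so far)} && \text{(no human-level, or even canine-level, AI code exists yet)}\\
& \therefore\ \text{not yet } Y && \text{(note: } (X \rightarrow Y) \wedge \lnot X \nvdash \lnot Y \text{ outright; the hole is the unknown timing of } X\text{)}
\end{aligned}
\]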
But the essay was otherwise noteworthy!

cheers,
Brian

----- Original Message -----
From: Peter Voss <peter@optimal.org>
To: <sl4@sysopmind.com>
Sent: Tuesday, June 12, 2001 5:21 PM
Subject: Extro5 talk - feedback requested

> I'd appreciate comments/suggestions on these notes for the panel discussion
> at Extro5 "Convergent or Divergent Super-Intelligence: Can we keep up with
> AIs by integrating with technology?"
>
>
> Advanced Intelligence: SI, IA, and the Global Brain
> http://www.optimal.org/peter/si_versus_ia.htm
>
> Why Machines will become Hyper-Intelligent before Humans do
> http://www.optimal.org/peter/hyperintelligence.htm
>
> Thanks
>
> Peter Voss
>
> www.optimal.org - Any and all feedback welcome: peter@optimal.org
>
>
