Re: AGI Reproduction?

From: Peter de Blanc (peter.deblanc@verizon.net)
Date: Fri Feb 03 2006 - 13:45:16 MST


On Fri, 2006-02-03 at 12:12 -0800, Jeff Herrlich wrote:
> Hi Peter,
>
> If there is only one non-friendly AGI that values its own life (goal-
> satisfaction) above all others, we will all certainly be killed once
> it acquires the means to do so. If multiple, comparably powerful AGIs
> are created (using the original human-coded software), they will each
> value their own survival above all others. Under these circumstances, it
> may be less likely that one AGI would attack another AGI. By virtue of
> this, it may be less likely that an AGI would attempt to exterminate
> humanity simply because humanity might still serve as a valuable
> resource, at least for a while. Or, it may decide to restructure
> its own goal system in a way that did not include human
> extermination. I didn't say it would be pretty, I only said this
> would improve the chances of (at least some) humans surviving, in one
> form or another (uploads?).
>
> Jeff

I consider this unlikely. A flesh-and-blood human is less useful than 70
kilograms of computronium, and what does an SI need a human upload to
do?



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:55 MDT