From: Aleksei Riikonen (email@example.com)
Date: Thu Mar 20 2003 - 03:53:00 MST
Robin Lee Powell wrote:
> I doubt any of this is original to me, and it's certainly not well
> written, but I wanted to share it anyways.
You seemed to hold the following views:
(1) Bostrom is drastically underestimating computational requirements, and
thus his simulation argument is incorrect.
(2) For ethical reasons, posthuman civilizations will probably refrain from
running ancestor simulations, and thus Bostrom's simulation argument is
incorrect.
A comment on view (2):
I don't think that Bostrom argued that posthuman civilizations necessarily
would run ancestor simulations. Rather, he argued that if we reach a
posthuman stage, and _if_ we end up running ancestor simulations, then it
is likely that we are currently living in a simulation.
Thus (2) is not a counterargument against Bostrom, but a counterargument
against a view not presented by Bostrom in "Are You Living In a Computer
Simulation?".
It is quite possible that no complete ancestor simulations will be run in a
posthuman future. The Sysop Scenario would be such a case, if a Sysop didn't
allow individuals to torment sentients in Holocaust simulations etc.
A comment on view (1):
> [...] an AI which is capable of observing all the humans in the
> simulation at once and feeding them only the data they need [...] We
> can't even imagine how much computing power such an AI would have to
> have, so I won't even try. I will suggest, however, that such an AI's
> computing needs would dwarf the computing needs of the human brains in
> the simulations by dozens of orders of magnitude.
You present no grounds for this suggestion, and I don't see why Bostrom
would be wrong in assuming that the computing power needed for the process
of selecting what to simulate precisely and what to "fudge" would be rather
small.
-- Aleksei Riikonen - firstname.lastname@example.org
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:42 MDT