From: Matt Mahoney (firstname.lastname@example.org)
Date: Thu Oct 08 2009 - 07:59:42 MDT
From: Mike Dougherty <email@example.com>
On Wed, Oct 7, 2009 at 9:58 PM, Matt Mahoney <firstname.lastname@example.org> wrote:
>> communicate a few bits per second. An AI could guess 90% to 99%
>> of what you know because that knowledge is shared by others. That is
>> only possible if the AI is connected to many people. Also, the
>> cheapest way to collect the remaining 1% to 10% is to monitor your
>> communication and actions. For this, it needs internet access because
>> that's where you do most of your communication.
> yeah, and early adopters will be waiting in line for the opportunity
> to install these monitors on themselves. (google Wave?)
Google and Yahoo already have copies of thousands of your emails, even the ones you deleted. AI will make surveillance easier. Imagine billions of high-resolution public webcams with face recognition and speech recognition, everything transcribed, indexed, and instantly searchable. It's what we want. If we cared about privacy we would be having this conversation by encrypted email instead of on a public forum.
>> And when I say "cheap" I mean on the order of US $100 trillion to $1 quadrillion. That's how much it costs to collect 10^17 to 10^18 bits of knowledge from 10^10 human brains at 150 words per minute, 1 bit per character compression, and a global average wage rate of $5 per hour. At least until we develop nanoscale brain scanners.
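For anyone who wants to check the arithmetic behind that estimate, here is a minimal back-of-envelope sketch. The 6-characters-per-word figure (5 letters plus a space) is my assumption; the 150 words per minute, 1 bit per character, and $5/hour wage come from the post.

```python
# Back-of-envelope check of the knowledge-collection cost estimate.
# Assumed: ~6 characters per word (5 letters + space).
WORDS_PER_MIN = 150
CHARS_PER_WORD = 6        # assumption, not from the post
BITS_PER_CHAR = 1         # roughly Shannon's estimate for English text
WAGE_PER_HOUR = 5.0       # global average wage, USD

bits_per_hour = WORDS_PER_MIN * CHARS_PER_WORD * BITS_PER_CHAR * 60
cost_per_bit = WAGE_PER_HOUR / bits_per_hour  # dollars per bit

for total_bits in (1e17, 1e18):
    print(f"{total_bits:.0e} bits -> ${cost_per_bit * total_bits:.2e}")
```

Under these assumptions, 10^18 bits works out to roughly $10^14, i.e. on the order of $100 trillion, in line with the low end of the quoted range.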
> So what happens then? You pay me $3 per human upload? At that rate,
> we'll see how quickly the 'non-essential' humans are liquidated.
So what? If a program simulates you so well that nobody can tell the difference, is it you? Nobody can answer that question, because the question itself is irrational. It only seems important because evolution programmed you to fear the things that can kill you. You'll upload if someone promises to wave a magic wand that transfers your soul.
> That's why I hope to have some useful skill even after the machines
> have displaced the mundane human workforce.
Forget it. An AI that models your mind could do anything you could.
>> An AI isolated from the internet would be *more* dangerous, for the simple reason that it would know less about people and people would know less about it. And that's assuming isolation is possible at all. And don't get me started on RSI voodoo. It's humanity, not a human, that creates AI. So that is the threshold you need to cross. Anything less is gray goo. An AI can't understand its own source code (Wolpert's theorem), so any improvement has to come from learning and hardware.
> Is DNA the "sourcecode" for biological life? Would that suggest that
> supposed-intelligence (that which we possess compared to so-called
> artificial intelligence we seek to create) is also bound by
Yes, because Wolpert's theorem still applies. You aren't smart enough to tell which of your children will be more successful than you, even if you sequence their DNA.
> fwiw - I'm confident that AI research by less than the
> sum of humanity can produce something 'better' than gray goo; there's
> also the paper-clip universe. :)
Depends what you mean by "better". You were created by evolution. It wasn't your idea to program yourself to fear death and then die. But evolution knows better.
-- Matt Mahoney, email@example.com
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:04 MDT