From: Charles D Hixson (firstname.lastname@example.org)
Date: Wed Jan 18 2006 - 17:18:39 MST
On Tuesday 17 January 2006 03:22 am, Brian Atkins wrote:
> Keith Henson wrote:
> > At 11:05 AM 1/16/2006 -0600, you wrote:
> >> Keith Henson wrote:
> >>> Corporations already have legal rights, a company can own the
> >>> hardware on which the AI is implemented. Corporations can own stock,
> >>> so a corporate AI that owns a controlling block of stock in itself is
> >>> a free agent with effectively the same rights as meat persons.
> >> No, I don't think a computer hardware/software system can own shares
> >> in a corporation. Neither can other objects like a vase or rock.
> > You missed the point. A corporation can own stock in corporations
> > including itself. The corporation can own anything including hardware.
> Your original post: "a corporate AI ... owns a controlling block of stock
> in itself".
> That doesn't make any sense as far as I can see. Legally, an AI, being
> considered akin to a vase or rock, cannot own any shares in any
> corporation. The term "corporate AI" as you're using it makes no sense.
> Can an AI attempt to maintain control over the actions of a corporation via
> various methods without actually itself owning the controlling stock?
> Perhaps, but as I mentioned in my first message such an arrangement seems
> rather pointless and overcomplicated. The AI in such a case is simply
> property of the corporation, and relies on its control mechanisms over the
> humans who actually perform the actions of the corporation.
I think that the idea is that the corporation would own the computer which
controlled the corporation, and that the corporation would also own itself.
Legally the computer would be a chattel of the corporation, as if your body
owned your brain. (Except that a body is physical, and a corporation is a
legal abstraction.)
I find it weird that corporations would be granted any legal rights at all,
but actuality forces me to admit that as a practical matter they have been.
Given that a corporation can own a controlling interest in itself (I'm not
sure about that one!), this seems like a perfectly reasonable approach
under the current legal system.
> See above. The corporation that owns the AI property would have rights, but
> exercising and/or defending those rights would be delegated to the human
> officers in control of the corporation. The AI itself has no direct rights
> at all.
Can corporations be corporate officers? On their own board? If so, then a
self-standing autonomous corporate entity might require three corporations
with interlocking directorates: one for president, one for secretary, and
one for treasurer. (I seem to recall that those three offices were required
and needed to be held by different "persons".) If they can't be directors
on their own board, then the scheme expands to four corporations per entity.
And if they can't be on a board at all (HAH!), then the scheme falls apart
at the hands of human greed.
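The counting argument above can be sketched as a toy model. This is purely
illustrative: it assumes exactly three required offices held by distinct
"persons", and the corporation names and the `officers` helper are invented
for the example, not any real legal rule or API.

```python
def officers(corps, allow_self):
    """Assign the three required officer slots for each corporation,
    drawing officers only from the corporations themselves (no humans).
    Raises ValueError when too few distinct 'persons' are available."""
    offices = ("president", "secretary", "treasurer")
    assignments = {}
    for corp in corps:
        # If a corporation may sit on its own board, it can fill one of
        # its own officer slots; otherwise only the others are eligible.
        pool = list(corps) if allow_self else [c for c in corps if c != corp]
        if len(pool) < len(offices):
            raise ValueError(f"{corp}: only {len(pool)} distinct officers available")
        assignments[corp] = dict(zip(offices, pool))
    return assignments

# Three corporations suffice if each may serve as its own officer:
print(officers(["A", "B", "C"], allow_self=True))

# If a corporation cannot be its own officer, three are too few
# and a fourth is needed, matching the expansion described above:
print(officers(["A", "B", "C", "D"], allow_self=False))
```

With `allow_self=False` and only three corporations, each one has just two
eligible officers for three distinct slots, so the assignment fails, which is
why the scheme grows to four.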
> > If Eliezer is right about the speed with which a seed AI goes way beyond
> > humans, the problem will be humans keeping any rights at all rather than
> > figuring out how AIs of human level or above should be treated at law.
> Of course, and I happen to agree. But I can't resist being pedantic on this
> little thread for some reason.
This archive was generated by hypermail 2.1.5 : Tue May 21 2013 - 04:00:49 MDT