Re: [sl4] A model of RSI

From: Matt Mahoney (matmahoney@yahoo.com)
Date: Fri Sep 26 2008 - 15:54:47 MDT


-- Matt Mahoney, matmahoney@yahoo.com

--- On Fri, 9/26/08, Nick Tarleton <nickptar@gmail.com> wrote:

> On Thu, Sep 25, 2008 at 4:27 PM, Matt Mahoney
> <matmahoney@yahoo.com> wrote:
>
> > Belief in consciousness is universal, as is the desire to preserve it.
> > Therefore you will make a copy of your mind, technology permitting.
> > Whether that copy actually contains your consciousness or just makes
> > that claim is irrelevant to any future observable events.
> >
> > (Also, how do the above articles relate to this position?)
>
>
> "Relevant" or not, I prefer that my consciousness persist. (The articles
> make the point that my preferences may involve non-ontologically
> primitive or non-natural categories, including ones I don't yet fully know
> how to define, like "contains my consciousness".)
>
> > Bostrom does not seem to offer any good alternatives.
>
>
> Sections 6-11?
>
>
> > In any case, he implicitly assumes that certain forms of intelligence,
> > what he calls eudaemonic (with human-like motivations and "conscious"),
> > are preferable to other types.
>
>
> Bostrom prefers eudaemonic agents, as do I, whether or not they're
> preferable in some universal sense.


