From: Olie L (firstname.lastname@example.org)
Date: Fri Nov 25 2005 - 23:40:32 MST
>From: "David Picon Alvarez" <email@example.com>
> > The following is quite an interesting opinion, and it is, of course,
> > your own - why preface it by attributing it to me?
>I believe that this kind of thinking, the "what would Eliezer do" thinking,
>no matter how well one knows Eliezer and how tractable Eliezer is about his
>thought processes, is one of the reasons why many people consider the SIAI
>and other SL4 groups as cultish.
Are you insinuating that just because I have a poster of Eliezer on my
bedroom wall, a "Yudkowsky is our saviour" poster in my lounge and a "What
would Eliezer do?" bumper sticker, I'm somehow subscribing to some sort
of cult? I find that personally offensive, as I do not believe in any sort
of religious dogma. My position that the Singularity is 50% likely to occur
in the next 20 years, freeing us from drudgery and giving us eternal
uploaded life within a few hours, is based purely on rational analysis.
Going back to Vassar's original comments, I find it a bit odd to suggest
that /any/ clever AI researcher would be preaching that all good AI
researchers ought to give up their hobbies, and focus every ounce of their
attention on the Big Problem At Hand.
To put it another way:
If you think that in order to achieve a better morality, it's worthwhile to
force suffering - even your own - you're sort of missing the point.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:53 MDT