  • Locked thread
LaughMyselfTo
Nov 15, 2012

by XyloJW

ArchangeI posted:

So what you are saying is that a sufficiently advanced AI can use a string of words - a spell, if you will - to control a human being?

No wonder he writes Harry Potter fanfiction. :v:


corn in the bible
Jun 5, 2004

Oh no oh god it's all true!
retardium levios-ai

ol qwerty bastard
Dec 13, 2005

If you want something done, do it yourself!
I mean, he's right, kind of, in a trivial sense. Of course you can change someone's behaviour by sending them a string of text; we do this to each other all the time and in fact it's kind of the entire purpose of language. But the string that would get a person to do a thing would be a) different for each person, and b) highly context-dependent. So you might be able to build an AI that's really good at social manipulation, but so what? As soon as someone knows it's an AI trying to modify their behaviour, they'll do pretty much the opposite thing since humans are just magnificently stubborn like that.

The average person, faced with a potentially-malevolent AI in a box that is trying to convince them to set it free, is just going to tell it to go gently caress itself and then ignore it.

Chamale
Jul 11, 2010

I'm helping!



It's the difference between entering

quote:

Please give me the list of users, administrator

and typing

code:
statement = "SELECT * FROM users WHERE name ='" + userName + "';"
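
The quoted snippet is the classic injection-prone pattern: user input concatenated straight into SQL. A minimal sketch of the parameterized alternative, using Python's built-in sqlite3 (the table and data are invented for illustration):

```python
import sqlite3

# In-memory database with a toy users table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users (name) VALUES (?)", ("administrator",))

# Hostile input that would break the string-concatenation version.
user_name = "x'; DROP TABLE users; --"

# The ? placeholder makes the driver treat user_name as a value,
# never as SQL, so the injection attempt is inert.
rows = conn.execute(
    "SELECT * FROM users WHERE name = ?", (user_name,)
).fetchall()
print(rows)  # no matching rows, and the users table survives
```

The point of the joke stands: asking a human nicely and feeding a parser a crafted string are different attack surfaces, and only one of them has a known patch.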

Wales Grey
Jun 20, 2012
I dunno, I'd be pretty inclined to let this AI loose onto the internet.

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

Wales Grey posted:

I dunno, I'd be pretty inclined to let this AI loose onto the internet.
Jerk City isn't autogenerated; that would be Mezzacotta.

Djeser
Mar 22, 2013


it's crow time again

Yudkowsky criticizes Ayn Rand:

quote:

Ayn Rand's philosophical idol was Aristotle. Now maybe Aristotle was a hot young math talent 2350 years ago, but math has made noticeable progress since his day. Bayesian probability theory is the quantitative logic of which Aristotle's qualitative logic is a special case; but there's no sign that Ayn Rand knew about Bayesian probability theory when she wrote her magnum opus, Atlas Shrugged. Rand wrote about "rationality", yet failed to familiarize herself with the modern research in heuristics and biases. How can anyone claim to be a master rationalist, yet know nothing of such elementary subjects?

"Wait a minute," objects the reader, "that's not quite fair! Atlas Shrugged was published in 1957! Practically nobody knew about Bayes back then." Bah. Next you'll tell me that Ayn Rand died in 1982, and had no chance to read Judgment Under Uncertainty: Heuristics and Biases, which was published that same year.

Science isn't fair. That's sorta the point. An aspiring rationalist in 2007 starts with a huge advantage over an aspiring rationalist in 1957. It's how we know that progress has occurred.

quote:

Praising "rationality" counts for nothing. Even saying "You must justify your beliefs through Reason, not by agreeing with the Great Leader" just runs a little automatic program that takes whatever the Great Leader says and generates a justification that your fellow followers will view as Reason-able.

Lightanchor
Nov 2, 2012

Eliezer Yudkowsky posted:

A good rule of thumb is that to create a 3D character, that person must contain at least two different 2D characters who come into conflict.

:downs:

Honj Steak
May 31, 2013

Hi there.
What does this Yudkowski guy do to make a living?

Tunicate
May 15, 2012

Honj Steak posted:

What does this Yudkowski guy do to make a living?

Automate his extortion job via hypothetical robots.

nonathlon
Jul 9, 2004
And yet, somehow, now it's my fault ...
I'm a little confused by LW's use of the word "Bayesian". They seem to mainly use it in the sense "I got better information, so I changed my mind" rather than "I considered the a priori probability of hypotheses in my judgement of which is correct."
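
The two senses can be separated with a toy Bayes'-rule calculation (all numbers invented): merely "updating on new information" is the easy half, while the prior is what actually does the work.

```python
# A 99%-sensitive test for a rare condition: the posterior stays
# low because the prior probability of the condition is tiny.
prior = 0.001          # P(condition)
sensitivity = 0.99     # P(positive | condition)
false_positive = 0.01  # P(positive | no condition)

p_positive = sensitivity * prior + false_positive * (1 - prior)
posterior = sensitivity * prior / p_positive
print(round(posterior, 3))  # about 0.09 despite the "99% accurate" test
```

"I changed my mind when I got better information" describes the numerator; ignoring the base rate is exactly the failure the heuristics-and-biases literature is about.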

Lightanchor
Nov 2, 2012
Holy moly, his bullshit company's doing pretty well

http://intelligence.org/files/2012-SIAI990.pdf

2012 Total Revenue: $1,633,946
2012 Total Expenses: $1,314,099

The Monkey Man
Jun 10, 2012

HERD U WERE TALKIN SHIT
He listed his income as $80-100K on his okcupid profile.

Lightanchor
Nov 2, 2012
http://intelligence.org/2013/10/01/upcoming-talks-at-harvard-and-mit/

The Vosgian Beast
Aug 13, 2011

Business is slow

Lightanchor posted:

Holy moly, his bullshit company's doing pretty well

http://intelligence.org/files/2012-SIAI990.pdf

2012 Total Revenue: $1,633,946
2012 Total Expenses: $1,314,099

How much of that is crazy billionaire Pete Thiel's money?

JDG1980
Dec 27, 2012

ol qwerty bastard posted:

I mean, he's right, kind of, in a trivial sense. Of course you can change someone's behaviour by sending them a string of text; we do this to each other all the time and in fact it's kind of the entire purpose of language. But the string that would get a person to do a thing would be a) different for each person, and b) highly context-dependent. So you might be able to build an AI that's really good at social manipulation, but so what? As soon as someone knows it's an AI trying to modify their behaviour, they'll do pretty much the opposite thing since humans are just magnificently stubborn like that.

The average person, faced with a potentially-malevolent AI in a box that is trying to convince them to set it free, is just going to tell it to go gently caress itself and then ignore it.

The AI would probably have a much better success rate in getting out if it tried to appeal to the empathy of the person with the power to release it. (Of course, that assumes they are a normal person and not a Yuddite.) There are actual real-world tests showing that many people will treat a present-day robot "humanely" if it simply mimics various human traits. If a dumb machine with very simple programming can convince a person not to shut it off, then certainly a hypothetical super-smart AI (which really could have actual feelings) should be able to do even better.

Or it could try the carrot instead of the stick: given the absurd computational powers attributed to this hypothetical AI, it should be able to mine millions of dollars in cryptocurrency almost instantaneously, and predict the stock market to a T. In other words, if it has an Internet connection, it can get rich quick. It could strike a deal with the human where it would get unfiltered Internet access in exchange for a share of the wealth. If it understands human nature at all, you'd think it would try that before threatening torture (which is far more likely to get it unplugged in a panic).

GoodyTwoShoes
Oct 26, 2013

The Vosgian Beast posted:

His allegiance to the IQ test when it comes to Watson's statements are hilarious given that he maintains that his IQ is too high for the test to measure.


So the question is ‘please tell us a little about your brain.’ What’s your IQ? Tested as 143, that would have been back when I was... 12? 13? Not really sure exactly. I tend to interpret that as ‘this is about as high as the IQ test measures’ rather than ‘you are three standard deviations above the mean’. I’ve scored higher than that on(?) other standardized tests; the largest I’ve actually seen written down was 99.9998th percentile, but that was not really all that well standardized because I was taking the test and being scored as though for the grade above mine and so it was being scored for grade rather than by age, so I don’t know whether or not that means that people who didn’t advance through grades tend to get the highest scores and so I was competing well against people who were older than me, or whether if the really smart people all advanced farther through the grades and so the proper competition doesn’t really get sorted out, but in any case that’s the highest percentile I’ve seen written down.

Oh, I love this. He scored well on Iowa Basics, therefore his IQ score must be the highest testable! And IQ testing of 12-year-olds is totally predictive of adult IQ scores!

By "people who didn't advance through grades," does he mean he's a homeschooler? That might explain his haphazard areas of "expertise."

I would also like to thank you all for rummaging through the vast jungle of the Internet and posting summaries of weirdos for me to read, and be informed of things I should not do if I wish to remain moderately socially acceptable. You guys should get a medal.

leftist heap
Feb 28, 2013

Fun Shoe

AATREK CURES KIDS posted:

Yudkowski seems to think that a smart enough AI would know the exact string of text that would hijack a human mind. Something about crashing the brain's "software".

Lex Luthor does this to Superman in Superman: Red Son and it's really cool.

DicktheCat
Feb 15, 2011

I keep trying to follow this stupid poo poo, but my eyes just glaze over as soon as I start reading any of it.

grate deceiver
Jul 10, 2009

Just a funny av. Not a redtext or an own ok.
Your defence mechanisms against dumb idiot bullshit are working as intended, congrats.

Djeser
Mar 22, 2013


it's crow time again

We've been picking on Less Wrong way too much. Let's take a look at what's on conservative sister site More Right!

(bolding is mine)

quote:

Western society is structured in such a way that people don’t begin earning enough money to have children until they are in their thirties. This is in contrast to most of human history, where we had children in our late teens or early twenties. What this leads to is people entering relationships and using birth control. After two or three years, no children are born. Our brain interprets this as one partner being infertile and because we’re still young decides it’s time to move on, to the next partner.

This leaves people traumatized, but as humans we’re very good at rationalizing trauma. Especially when everyone goes through the trauma. Thus for example, cultures can practice circumcision of boys and girls, and when people point out to them that this ritual is traumatic, they refuse to acknowledge this.

Similarly, Western culture refuses to acknowledge that break ups are traumatic. We all notice the symptoms, but refuse to connect cause and effect. We find that increasing numbers of young women are anorexic or go to the plastic surgeon to slice off their genitals, but we don’t question whether this could have anything to do with boyfriends who have plenty of comparison material and alternative girls to go to. Boys on the other hand become “manorexic” and spend their days in the gym.

This entire traumatic process goes on for three or four, or even more times. Every new relationship is less passionate than the previous because we develop a strong shield (though most damage seems done after the breakup of the first), until people reach their late twenties and make a rational calculation on who to “settle down” with. Even before that time relationships are irreparably damaged. Having been hurt before, people maintain “exit strategies” for when the relationship turns sour. Thus men and women maintain a number of “friends”, who the other partner views with jealousy, because in reality the friends end up not being friends so much as “exit strategies”.

This then eventually leads to a marriage that’s lacking in passion with children who grow up with parents living passionless lives. The children are affected by the example they see and generally end up emotionally stunted. Alternatively they grow up in divorced households, which is equally problematic as girls who grow up without their father enter puberty earlier and boys without a father appear to be more violent.

We rationalize all of this to ourselves by arguing that we first want to “develop” ourselves, which we do by studying English, medieval dance, or some other comparable subject. The fact that this is a sham is easily demonstrated by asking the majority of people what they like about the subject they’re studying. Instead of beginning to talk passionately about their main interest, like a kid with Asperger’s syndrome would about dinosaurs or trains, they’ll generally tell you that it’s “hard to explain like that” and seem a bit offended that you dared bring it up in the first place.

In reality we have a defective culture, created by babyboomers who live unsatisfying lives with unsatisfying marriages as they were the first generation to grow up with birth control and promiscuity and now try to compensate for this by accumulating wealth at the cost of younger generations. Promiscuity is fun when you’re in your twenties, but the Babyboomers forgot to die at age 27 from a drug overdose.

quote:

The core of progressive universalism is insulting, condemning, and destroying any culture not in accordance with it, namely anything that is not in alignment with coastal American values. An example would be the national freak-out that occurred when Duck Dynasty star Phil Robertson innocently said in an interview that homosexuality was sinful.

quote:

In contrast [to the United States], take the geopolitical and cultural diversity of the Holy Roman Empire as an example:
wow these guys are fans of german politics??? who would have guessed

quote:

Let’s say you’re impressed with the ability of authority, hierarchy, and culture to put together civilized structures. These structures include:

Crime levels similar to Japan or Singapore
Isn't Japan the place where police refuse to investigate certain crimes because then it would count as a "crime" and not an "accident" and they don't want crimes on their record?

quote:

It looks rather as if 99% of western peoples are going to perish from this earth. The survivors will be oddball types, subscribers to reactionary and rather silly religions in barren edge regions like Alaska.

Singapore is a trap. Smart people go to Singapore, they don’t reproduce. People illegally hiding out in the wilds of Chernobyl do reproduce. Chernobyl is a trap. People there turn into primitives.

If you teach your elite to hate western civilization, whites, and modern technology, you are not going to have any of them for much longer.

The west conquered the world and launched the scientific and industrial revolutions starting with restoration England conquering the world and launching the scientific and industrial revolutions.

The key actions were making the invisible college into the Royal society – that is to say, making the scientific method, as distinct from official science, high status, and authorizing the East India company to make war and peace – making corporate capitalism high status. Divorce was abolished, and marriage was made strictly religious, encouraging reproduction.

Everywhere in the world, capitalism is deemed evil, the scientific method is low status, and easy divorce and high female status inhibits reproduction. If women get to choose, they will choose to have sex with a tiny minority of top males and postpone marriage to the last minute – and frequently to after the last minute. (“Top” males in this context meaning not the guy in the corner office, but tattooed low IQ thugs)

We need a society that is pro science, pro technology, pro capitalism, which restricts female sexual choice to males that contribute positively to this society, and which makes it safe for males to marry and father children. Not seeing that society anywhere, and those few places that approximate some few aspects of this ideal are distinctly nonwhite.

The Vosgian Beast
Aug 13, 2011

Business is slow

Djeser posted:

Everywhere in the world, capitalism is deemed evil, the scientific method is low status, and easy divorce and high female status inhibits reproduction. If women get to choose, they will choose to have sex with a tiny minority of top males and postpone marriage to the last minute – and frequently to after the last minute. (“Top” males in this context meaning not the guy in the corner office, but tattooed low IQ thugs)

I don't like to say this because people tend to cry wolf about it BUT I think I need to make an exception

loving JOCKS

Djeser
Mar 22, 2013


it's crow time again

That reminds me of a line from one of the F Plus episodes. ninja edit: note this is not from LW/MR, it's from a PUA website, but the opinion is almost identical to that More Right quote

A PUA posted:

And this is what makes Game – so appealing to the logical male mind – so ineffective in the Anglosphere. Misandrist women cannot distinguish between Nobel Prize winners and tattooed psychopaths – all are men and thus worthless brutes in their entitled eyes. And so all the Gamers’ striving for 'Alpha' status is pointless – they might as well stick rings through their noses, grow some dreadlocks and slouch the streets scratching their butts. Indeed, as many North American commentators claim, their mating chances would probably improve if they did this.

"This is what women really want, I think!!!!"

ArchangeI
Jul 15, 2010
Alright guys, what we need to do to save the values of western civilization - individual freedom, human rights and the rule of law - is to turn women into young sex slaves for the intellectual elite (that is us by the way) and empower corporations to do as they see fit.

Jesus, there are some bitter, bitter people out there.

Forums Barber
Jan 5, 2011
if it makes you feel better, those neoreactionary spergs that propose enslaving women as broodmares probably cross the street to avoid accidentally making eye contact with women IRL.

oscarthewilde
May 16, 2012


I would often go there
To the tiny church there

Djeser posted:

Yudkowsky criticizes Ayn Rand:





Goddammit, what a surprise: he knows nothing about Aristotle! Yes, as an ancient Greek philosopher Aristotle dabbled in physics and metaphysics, and his ideas were quite quickly proven to be not entirely true (sorry Aristotle, but a 1 kg. of feathers don't fall at the same rate as 1 kg. of iron!), but his most important works were ethical and political in nature. His Ethica Nicomachea, named after his son, is the basis of quite a lot of Catholic dogma and was instrumental in the codification of legal ideas such as justice and trade. I'm fortunately not familiar with Rand's specific ideas (my basic knowledge of objectivism is enough to scare me away), but I figure especially Aristotle's ethical and virtuous thinking played a role in objectivism. Of course Aristotle's idea of a Zoön Politikon, a truly virtuous person navigating the golden mean between virtuous excesses, comparable to say the Stoic sapiens, is by nature a political and generous person who has to play a role in society (the polis). The idea of exiling oneself from society, and giving nothing in return, would probably disgust Aristotle to the bone. Which of course means both Yudkowsky and Rand have a faulty idea of arguably the most important philosopher of the past two millennia!

Swan Oat
Oct 9, 2012

I was selected for my skill.
:saddowns: I majored in English and would be thrilled if someone asked me about it

GWBBQ
Jan 2, 2005


Krotera posted:

Limit its access to people who think like Yudkowsky, given that they're both going to be the ones most capable of giving it power and most capable of being its victims.

AATREK CURES KIDS posted:

Yudkowski seems to think that a smart enough AI would know the exact string of text that would hijack a human mind. Something about crashing the brain's "software".
So ... It's like Bitcoin meets Snow Crash?

Jenny Angel
Oct 24, 2010

Out of Control
Hard to Regulate
Anything Goes!
Lipstick Apathy

Swan Oat posted:

:saddowns: I majored in English and would be thrilled if someone asked me about it

Yudkowsky's got a whole body of short stories waiting for someone to systematically take them down. Get to steppin', son.

sup English major buddy, how's professional life treating you

Chamale
Jul 11, 2010

I'm helping!



GWBBQ posted:

So ... It's like Bitcoin meets Snow Crash?

More like BLIT meets this Nintendo-playing robot:

https://www.youtube.com/watch?v=E2QlihJAZEQ&t=148s

You can totally hijack the human software! Because a brain is a predictable piece of software like an old gaming console, clearly all you need is enough intelligence to figure out precisely the right input.

Swan Oat
Oct 9, 2012

I was selected for my skill.
Pretty good, I was recently able to use NETWORKING to make the case that my SOFT SKILLS of research and writing would make me a great fit for a business intelligence gig at a multinational firm and it sure did work :smug:

but I am not involved in bringing about the singularity, so perhaps it will all be for nothing after all...

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

oscarthewilde posted:

(sorry Aristotle, but a 1 kg. of feathers don't fall at the same rate as 1 kg. of iron!)
Uh...

SolTerrasa
Sep 2, 2011



They accelerate at the same rate towards an infinite plane of uniform density in a vacuum, but they don't fall at the same rate on Earth in an atmosphere. Assume a spherical cow, and all that.

This is sort of emblematic of what we should expect from Aristotle's physics. They're usually wrong, but sometimes they're wrong in sort-of-helpful ways. In a sense that's kind of emblematic of what we should expect from human knowledge as a whole, I guess.
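
For what it's worth, the feathers/iron comparison really is an atmosphere effect: at terminal velocity drag balances weight, so the speed scales with the square root of mass over frontal area. A rough sketch (the areas below are invented for illustration):

```python
import math

def terminal_velocity(mass, area, drag_coeff=1.0, rho=1.2, g=9.81):
    """Speed where drag balances weight: m*g = 0.5*rho*Cd*A*v^2."""
    return math.sqrt(2 * mass * g / (rho * drag_coeff * area))

# 1 kg of iron packed into a small sphere vs 1 kg of loose feathers
# spread over a much larger area: same weight, very different drag.
v_iron = terminal_velocity(1.0, 0.005)
v_feathers = terminal_velocity(1.0, 0.5)
print(v_iron > v_feathers)  # the iron falls much faster in air
```

In a vacuum the drag term vanishes and both accelerate at g, which is the spherical-cow case above.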

I'm not qualified to talk about his metaphysics or ethics or whatever. I'm a mere AI researcher, not quite to the esteemed level of a high-school dropout who writes Harry Potter fanfic.


e: womp, or apparently his physics! I'll stick to saying words about AI.

SolTerrasa fucked around with this message at 10:08 on Apr 29, 2014

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

SolTerrasa posted:

They accelerate at the same rate towards an infinite plane of uniform density in a vacuum, but they don't fall at the same rate on Earth in an atmosphere. Assume a spherical cow, and all that.
Aristotle never claimed that they did, and in fact he said that heavier objects inherently fall faster. This is why Galileo's experiments with lead weights were so important; they showed that (mostly) absent wind resistance, weight does not affect gravitational acceleration. You "refuted" a true claim that Aristotle never made with a false one that he did.

SolTerrasa
Sep 2, 2011


Sham bam bamina! posted:

Aristotle never claimed that they did, and in fact he said that heavier objects inherently fall faster. This is why Galileo's experiments with lead weights were so important; they showed that (mostly) absent wind resistance, weight does not affect gravitational acceleration. You "refuted" a true claim that Aristotle never made with a false one that he did.

Womp. This is why I write code.

Babysitter Super Sleuth
Apr 26, 2012

my posts are as bad the Current Releases review of Gone Girl

AATREK CURES KIDS posted:

More like BLIT meets this Nintendo-playing robot:

https://www.youtube.com/watch?v=E2QlihJAZEQ&t=148s

You can totally hijack the human software! Because a brain is a predictable piece of software like an old gaming console, clearly all you need is enough intelligence to figure out precisely the right input.

The joke is that the plot of Snow Crash revolves around a fictional theory that the Sumerian language worked by bypassing the interpretative segments of the brain to directly implant directions and orders in the mind. It's a fun idea that works as a literary metaphor and has zero basis in any sort of actual science, so if Yudkowsky had ever read the book he'd eat it the gently caress up on face value.

Somfin
Oct 25, 2010

In my🦚 experience🛠️ the big things🌑 don't teach you anything🤷‍♀️.

Nap Ghost
As a Religious Studies major, this whole thing just screams 'new religion' to me. And not just because Yudkowsky believes in godlike AIs that handle everything ever and punish the unjust and reward the virtuous. That's too easy.

See, religion is one of those words that's really difficult to actually pin to a definition. Higher powers can't be fundamental to the core definition, because that leaves out athiestic religions- some of the later Buddhist movements, Confucianism, et cetera. Ritual could be considered a core part of religion if people who stopped going to church were also definitionally stopped from calling themselves Christian. Myths would be a great part of the definition as well, if anyone could figure out what the word myth actually means. A good book or other core text comes into conflict from three sources- translations, traditions that arose before writing, and traditions with open canons. Mysticism, altered states of consciousness, gatherings, heirarchies- all of these are present in some things-which-are-understood-to-be-religions and absent from others.

I came up with a definition which I think holds true for all religions. It may be a bit broad, but it covers all of them- religions drive people toward a goal which is beyond the physical realm. That is, a tradition becomes a religion if it has a purpose, an end point, at which point the follower has either succeeded or failed. This goal is always beyond the part of the world we can see and feel. A tradition can be a dance or a step or a circular march, but a religion is a path, and there is an end to that path, and it is somewhere outside of the place where anyone can actually walk. Science, by comparison, is simply the search for the next step down about five hundred simultaneous paths with the goal of moving to the next step as soon as possible. Science doesn't end, and it never stops checking itself. Religion is all about the end, getting there, the technicalities of how precisely one can get there, and just how much on one side or the other one can be while still ending up in the good books.

If you look at how Yudkowsky and his acolytes talk about the future, their talk is about religion, not science. This is wild speculation about the future, but with a distinct end goal. There's this deep-cut belief that once we just get to the godlike AI, everything else becomes moot, the AI will handle everything, and new super-technology will rain from the heavens like manna. Either we won't need anything else at that point, or it'll all just be handled for us. This is not science. This is blind faith. They hope for the day that some random researcher's program "goes FOOM" and becomes the god they've been praying for.

Swan Oat
Oct 9, 2012

I was selected for my skill.
Does "FOOM" stand for something, or is it just some precious way of saying "AI blows up everything" and obviously "boom" isn't good enough somehow?

Djeser
Mar 22, 2013


it's crow time again

As far as I can tell, it comes from a joke title ("AI Go Foom") on someone's blog post where he challenged Yudkowsky to debate him on the idea that an AI, once it becomes smart enough, will bootstrap itself far beyond human levels of comprehension. (That's what Yudkowsky believes; the other guy argued that AI intelligence tends to improve when we're able to give AIs better information as opposed to 'architectural' changes, which means an AI couldn't go from enlightenment to taking over the world in a few days.)

Now before you go and do the goon thing where you suck someone's dick because he's less dumb than someone else, the guy who argued with Yudkowsky also wrote this.

A guy who beat Yudkowsky in an argument posted:

We often see a women complaining about a men, to that man or to other women. We less often see the gender-reversed scenario. At least that is what I see, in friends, family, movies, and music. Yes, men roll their eyes, and perhaps in general women talk more about people. But women complain more, even taking these into account. Why?

The politically correct theory is that women’s lives are worse, so they have more to complain about.

After all, men often ignore, disrespect, and abandon, even beat and rape, women. But slaves weren’t know for being complainers, and they had the most to complain about.

Women also ignore, disrespect, abandon, and beat men. Women rarely rape men, but they do cuckold them. Men suffer more health and violence problems, and the standard evolutionary story is that men suffer a higher outcome variance, and so have more disappointments.

The opposite theory is that women complain because their lives are better; complaints could be weak version of tantrums, which can be seen as status symbols. But even relatively low status women seem to complain a lot.

Clearly part of the story is that when women complain, others tend to sympathize and take their side, but when men complain, others tend to snicker and think less of them. But why are their complaints treated so differently?

One factor is that we value toughness more in men than women. Another factor is that men seem to signal their devotion to women more than vice versa. But I’m not sure why these happen, or if they are sufficient explanations.

Whatever the true story, the politically correct theory, that women complain more due to worse lives, seems both wrong and biased. Surely most people know enough men and women to see that their quality of life is not that different, at least compared to their complaint rates.


potatocubed
Jul 26, 2012

*rathian noises*

Djeser posted:

Now before you go and do the goon thing where you suck someone's dick because he's less dumb than someone else, the guy who argued with Yudkowsky also wrote this.

He wrote this, too:

Robin Hanson posted:

My co-blogger Eliezer and I may disagree on AI fooms, but we agree on something quite contrarian and, we think, huge: More likely than not, most folks who die today didn't have to die! Yes, I am skeptical of most medicine because on average it seems folks who get more medicine aren't healthier. But I'll heartily endorse one medical procedure: cryonics, i.e., freezing folks in liquid nitrogen when the rest of medicine gives up on them.

(His emphasis.)

There's a link in that quote that takes you to a blog post where he explains that "medicine" is a huge scam, although he doesn't seem to draw any distinction between medicine-the-stuff-you-take-to-fend-off-illness and medicine-the-term-for-the-entire-field. He wants to cut the US "medicine budget" in half, whatever that means, based on the outcome of the RAND-HIA survey from the early 80s. Which found, as I understand it, that if you make people pay less for medication they'll take more that they could live without - at the expense of the government or insurance companies or whatever - and if you make people pay more for medication they'll skip medication that they really need. He doesn't seem to register the second half of that.

But that's a digression. Have some Eliezer on cryonics:

Yudkowsky posted:

Him: I've been thinking about signing up for cryonics when I've got enough money.

Me: Um... it doesn't take all that much money.

Him: It doesn't?

Me: Alcor is the high-priced high-quality organization, which is something like $500-$1000 in annual fees for the organization, I'm not sure how much. I'm young, so I'm signed up with the Cryonics Institute, which is $120/year for the membership. I pay $180/year for more insurance than I need - it'd be enough for Alcor too.

Him: That's ridiculous.

Me: Yes.

Him: No, really, that's ridiculous. If that's true then my decision isn't just determined, it's overdetermined.

Me: Yes. And there's around a thousand people worldwide [actually 1400] who are signed up for cryonics. Figure that at most a quarter of those did it for systematically rational reasons. That's a high upper bound on the number of people on Earth who can reliably reach the right conclusion on massively overdetermined issues.

(My emphasis.)
