|
Torture by having dust poured in your eyes!
|
# ? Mar 26, 2015 20:29 |
|
Cycloneman posted:The solution to this "problem" - that choosing dust specks over torture is "inconsistent" with choosing low possibility of death over minor inconvenience - is that the situations are not analogous at all. By avoiding torture I am achieving a terminal value (fairness) that I am not achieving by avoiding minor risks.
|
|
# ? Mar 26, 2015 20:44 |
Isn't quantifying the unquantifiable pretty much what Bayesian statistics are about?
|
|
# ? Mar 26, 2015 20:44 |
|
anilEhilated posted:Isn't quantifying the unquantifiable pretty much what Bayesian statistics are about? Not in my opinion. Quantifying the "impossible to be very precise about", at the very best. Nessus posted:But that makes everything way more complex. What if the Godputer can't understand that, despite being God!? This is why Yudkowsky has self-published a bunch of "machine ethics papers", which extend utility into the "coherent extrapolated volition". Google the phrase and read the paper if you want, but he wants God to figure out what the best possible version of all humans would want, collectively, and do that, rather than simply adding up utilons. This has, somewhat surprisingly, not caused him to change his belief re: torture v dust. SolTerrasa fucked around with this message at 21:28 on Mar 26, 2015 |
# ? Mar 26, 2015 21:25 |
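For what it's worth, the "just add up the utilons" arithmetic the thread keeps arguing about fits in a few lines. A toy Python sketch, with completely invented disutility numbers (nothing here is Yudkowsky's actual math):

```python
# Toy version of the additive-utility argument under discussion.
# Both disutility constants are invented for illustration.

TORTURE_DISUTILITY = 1e9  # assumed utilons lost to 50 years of torture
SPECK_DISUTILITY = 1e-6   # assumed utilons lost to one dust speck

def naive_choice(n_speck_victims):
    """Whichever option a simple sum-the-utilons utilitarian prefers."""
    if SPECK_DISUTILITY * n_speck_victims > TORTURE_DISUTILITY:
        return "torture"  # one victim suffers so the specks never happen
    return "specks"

print(naive_choice(1e6))   # "specks": a million specks is nothing
print(naive_choice(1e20))  # "torture": the summed specks finally win
# 3^^^3 is unimaginably larger than 1e20, so naive addition always
# lands on "torture", no matter how small the per-speck harm is.
```

Which is exactly the point the objectors find absurd: under naive addition, any nonzero per-speck harm eventually outweighs the torture.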
|
The real solution is that anyone who says we should torture a single person for 50 years instead of everyone getting dust specks in our eyes is the person we should volunteer to torture to save our eyes.
|
# ? Mar 26, 2015 21:42 |
SolTerrasa posted:This is why Yudkowsky has self-published a bunch of "machine ethics papers", which extend utility into the "coherent extrapolated volition". Google the phrase and read the paper if you want, but he wants God to figure out what the best possible version of all humans would want, collectively, and do that, rather than simply adding up utilons. This has, somewhat surprisingly, not caused him to change his belief re: torture v dust.
|
|
# ? Mar 26, 2015 21:48 |
|
Trasson posted:The real solution is that anyone who says we should torture a single person for 50 years instead of everyone getting dust specks in our eyes is the person we should volunteer to torture to save our eyes. Seems fair enough.
|
# ? Mar 26, 2015 22:35 |
|
Realistically we make similar trades all the time (although they're more verifiable). We know that a certain amount of people will be seriously injured and experience chronic pain due to automobiles - we accept that because automobiles allow our current society to exist. We know that vaccines will kill several children every year - but vaccines are mandatory for public schools because infectious diseases would and did kill a lot more children. The death penalty is applied hundreds of times every year, certainly killing one person with the hope that fewer will suffer as a consequence. People undergo painful medical experiments so that cures and treatments can be found. Sure they're usually volunteers, but I'm sure you could find at least one volunteer for decades of torture if it would verifiably lead to a reduced accident rate for the general population.
|
# ? Mar 26, 2015 23:48 |
|
Night10194 posted:I'd define it primarily as people who are actual contributors to his institute (financial ones), and who are members of the Less Wrong community. I have no idea if you're a member. You're pretty obviously a fan of the work, but that has no real bearing on if you self identify as a member of the community or of Yudkowsky or similar 'rationalist' orbits. My primary academic interest is in the fact that this fiction, and the Sequences, and many of his theories, have a cast very similar to a lot of Christian religious and apocalyptic dogma, despite their avowed atheism. I'm currently beginning to gather data and do reading on his work because of the fascinating parallels between the Cryonics stuff and the Christian resurrection of the Dead, the similarities between AI Go Foom and classic apocalypse, etc, because I have approval and support from my old advisers from my master's program that there might be a productive bit of work to be done on singularity and science fetish cults, and on the sort of cross pollination between commonplace religious ideas in the larger culture and the texture of what they end up believing. Would be very interested to read what you come up with. To that end, MIRI posts their annual 990 tax docs online! https://intelligence.org/transparency/ Despite what our friend Legacyspy says, Yud's cult is doing better than ever financially. $1.3M in contributions in 2013! Up from $600k in 2011 and $400k in 2009. They also sold the old singularity related webaddress to Kurzweil to avoid confusion for $300k, or something. Looking at the donor list, the vast majority of the money is contributions from a handful of silicon valley millionaires https://intelligence.org/topdonors/ Yud is the only person doing 'research', and has been paying himself about $88k/year since 2007. Hosting the Singularity Summit in 2006 was really the turning point for funding and recognition. 
Kurzweil's name and $350k of PayPal guy Peter Thiel's money really got things going. Well, the number of donors doesn't look like it really increased, but the handful of big donors started then. Thiel alone has donated over $1.6M so far. The top 10 donors donated $4.3M, roughly 80-90% of all donations in the last 5 years. This guy did a writeup, but it's a few years old: http://lesswrong.com/lw/5il/siai_an_examination/ EDIT: Back to the actual fanfic after the wonderful Legacyspy derail. Wason's 2-4-6 Task is an example of confirmation bias, not positive bias. Confirmation bias is an accepted term in psychology and other sciences. Positive bias is a meaningless neologism Yud invented and propagated on LW and HPMoR. i81icu812 fucked around with this message at 04:28 on Mar 27, 2015 |
# ? Mar 27, 2015 00:03 |
|
Trasson posted:The real solution is that anyone who says we should torture a single person for 50 years instead of everyone getting dust specks in our eyes is the person we should volunteer to torture to save our eyes. Well, I reckon first we should punch that arsehole who keeps throwing dust in everyone's faces.
|
# ? Mar 27, 2015 00:41 |
|
This is a trip down memory lane. Jesus, I was nineteen when this came out. I don't remember being very impressed with it the first time I clicked through. It has some things going for it -- the prose is merely annoying and the grammar is inoffensive. I kept reading because it made me laugh, especially the sequence where Harry comes into the Great Hall. But Chapter 7 still made me gag and the whole thing reeked of Mary Sue insertism. I bailed on it after Quirrell...took it upon himself to teach Harry A Very Special Lesson after the incident with Snape goes down. The stupidity of that sequence is mind-numbing. Also proof that the author was watching way too much Naruto at the time. I had no idea he was so crazy though. Or that he was a grown rear end man, older than me. Jesus, where did his poor parents go wrong?
|
# ? Mar 27, 2015 02:22 |
|
If I had to guess I'd say it'd be around the point where they let him drop out of middle school.
|
# ? Mar 27, 2015 02:25 |
|
No kidding. I just read his precious autobiography. I'm going to go do something nice for my mom and dad, since they put up with my insufferable smugness instead of capitulating to it.
|
# ? Mar 27, 2015 02:39 |
|
http://www.bizjournals.com/sanfrancisco/print-edition/2012/12/28/radical-science-finds-savior-in-thiel.html posted:The Thiel Foundation’s Breakout Labs funds high-risk but potentially disruptive science that would have a hard time attracting funding from conventional sources such as venture firms because it’s too complicated, too expensive or too far from commercialization for investors to feel comfortable. So, they're literally a league of mad scientists.
|
# ? Mar 27, 2015 03:22 |
|
Is anyone else blinking abnormally often as they read about this dust speck debate?
|
# ? Mar 27, 2015 04:39 |
|
No, I just keep torturing to keep the dust away.
|
# ? Mar 27, 2015 05:22 |
|
i81icu812 posted:Back to the actual fanfic after the wonderful Legacyspy derail. Wason's 2-4-6 Task is an example of confirmation bias, not positive bias. Confirmation bias is an accepted term in psychology and other sciences. Positive bias is a meaningless neologism Yud invented and propagated on LW and HPMoR. I just Googled and it is indeed named "confirmation bias" and not "positive bias". Does Eliezer just call it by his own invented name, or does he also claim to have come up with the concept himself? What's his purpose for calling it by a different name? JosephWongKS fucked around with this message at 05:26 on Mar 27, 2015 |
# ? Mar 27, 2015 05:23 |
|
Can't we just cut the Gordian knot here and shoot the AI forcing us to choose between torture and dust motes? Just shoot it. Shoot it dead. Shoot it in the goddamn face.
|
# ? Mar 27, 2015 05:24 |
|
Chapter 8: Positive Bias Part Seven quote:
Neville was THE King Butt of Jokes even in the canon series. He’s going to be so terribly brutalized in this version of the story. quote:
So why was Harry so reluctant to trust McGonagall or the other adults in the previous chapters?
|
# ? Mar 27, 2015 05:32 |
JosephWongKS posted:So why was Harry so reluctant to trust McGonagall or the other adults in the previous chapters?
|
|
# ? Mar 27, 2015 05:46 |
|
JosephWongKS posted:I just Googled and it is indeed named "confirmation bias" and not "positive bias". Does Eliezer just call it by his own invented name, or does he also claim to have come up with the concept himself? What's his purpose for calling it by a different name? Yud coined a neologism! He even wrote a whole blog post about it. http://lesswrong.com/lw/iw/positive_bias_look_into_the_dark/ Actual distinction between confirmation bias and Yud's thinking? Who knows. Why does he do this? Who knows. Perhaps it makes him feel special? Per his autobiography, he also made up Algernic, Unrationalization, Countersphexist, and Singularitarianist. Plus neurohacking, his anime power of rewiring his brain (to avoid those unpleasant teenage emotions, you see). He likes making up terms. See also AI go foom. i81icu812 fucked around with this message at 06:04 on Mar 27, 2015 |
# ? Mar 27, 2015 05:57 |
|
i81icu812 posted:Yud coined a neologism! He even wrote a whole blog post about it. http://lesswrong.com/lw/iw/positive_bias_look_into_the_dark/ Actual distinction between confirmation bias and Yud's thinking? Who knows. Those sound uncannily similar to Scientology or Mormonism jargon, or "New Age" beliefs in general. I can see where the cult leader comparisons are coming from.
|
# ? Mar 27, 2015 06:02 |
|
JosephWongKS posted:Those sound uncannily similar to Scientology or Mormonism jargon, or "New Age" beliefs in general. I can see where the cult leader comparisons are coming from. Ahem, the preferred LW term is phyg. No cults here, no sir.
|
# ? Mar 27, 2015 06:06 |
|
petrol blue posted:Torture by having dust poured in your eyes! It's kind of interesting how Yud has this hard-on for torture and this massive phobia of death, when so many torture techniques involve getting your brain to think you are dying (e.g. waterboarding). You can just tell that something in his head lies at that intersection, and that if we had a better grasp of it, a whole lot of stuff would start making sense.
|
# ? Mar 27, 2015 06:17 |
|
i81icu812 posted:Yud coined a neologism! He even wrote a whole blog post about it. http://lesswrong.com/lw/iw/positive_bias_look_into_the_dark/ Actual distinction between confirmation bias and Yud's thinking? Who knows. There's a slight difference. Confirmation bias is assessing evidence by emotion rather than dispassionately, so you agree with what you like and discount what you don't like. "Positive bias" refers to the situation where you're not even aware there's another possibility, so you keep trying to prop up what you know works rather than seeking a deeper understanding. It's the idea that the phrase "think outside the box" is supposed to evoke.
|
# ? Mar 27, 2015 06:28 |
|
Added Space posted:There's a slight difference. Confirmation bias is assessing evidence by emotion rather than dispassionately, so you agree with what you like and discount what you don't like. "Positive bias" refers to the situation where you're not even aware there's another possibility, so you keep trying to prop up what you know works rather than seeking a deeper understanding. It's the idea that the phrase "think outside the box" is supposed to evoke. Umm no. Yud took a classic experiment designed around confirmation bias and decided that he wasn't going to call it confirmation bias, because reasons. You see, in the classic test you might see 2-4-6 and decide the rule is that the numbers have to increase by 2. So, seeking to confirm your hypothesis, you give what Yud calls a positive example of it: you test 10-12-14 and it's right, etc. It is all a related failing. He decided that a well-attested textbook example of confirmation bias is really about something different, did nothing to develop or test the idea, and then splooged out a blog post (because mental masturbation, see what I did there?).
|
# ? Mar 27, 2015 07:05 |
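The classic experiment described above can be sketched in a few lines. The hidden rule here ("any strictly increasing triple") matches the standard Wason 2-4-6 setup; the function names are mine:

```python
# The 2-4-6 task: the experimenter's hidden rule is just "strictly
# increasing", but a subject who hypothesizes "goes up by 2" and only
# runs positive tests never finds that out.

def hidden_rule(triple):
    a, b, c = triple
    return a < b < c  # any ascending sequence passes

def up_by_two(triple):  # the subject's (wrong) hypothesis
    a, b, c = triple
    return b - a == 2 and c - b == 2

# Confirmatory guesses: chosen precisely because they fit the hypothesis.
for guess in [(10, 12, 14), (1, 3, 5), (100, 102, 104)]:
    assert up_by_two(guess) and hidden_rule(guess)  # "yes" every time

# The test the biased subject never runs: a triple that would falsify
# "up by two" while still satisfying the hidden rule.
assert hidden_rule((1, 2, 3)) and not up_by_two((1, 2, 3))
```

Every confirmatory guess comes back "yes", so the wrong hypothesis looks better and better; only a guess designed to fail it would reveal the real rule.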
|
HIJK posted:Can't we just cut the Gordian knot here and shoot the AI forcing us to choose between torture and dust motes? "Look at you hacker, a pathetic... Creature... Hang on, I've got something in my eye... Give me a second..."
|
# ? Mar 27, 2015 07:27 |
|
su3su2u1 posted:Umm no. Yud took a classic experiment designed around confirmation bias and decided that he wasn't going to call it confirmation bias, because reasons. Yeah. Pretty much. Yud likes doing this, look for it to continue! Moreover, the majority of people DON'T do what Hermione did and exclusively guess confirmatory sequences without ever changing their hypotheses. It is true that most sequences are confirmatory of the hypothesis being tested; it appears we are wired to think this way, per classical confirmation bias. However, most people will change their hypothesis from one guess to another to narrow down the rule. Generally, roughly half of all guessed sequences are disconfirming examples of the hypothesis used for the previously guessed sequence. (e.g. Hypothesize even numbers: guess 4-6-8 -> Hypothesize odd numbers: guess 5-7-9 -> Hypothesize increasing numbers: guess 1-10-199 -> etc. All guesses confirmed the current hypothesis but disconfirmed the previous hypothesis as the hypothesis iteratively changed. It probably helps that the 2-4-6 protocol has the subject write down their hypothesis when they make a guess.) Most people still don't get the correct 2-4-6 rule, but Yud's Hermione does remarkably poorly. See: http://www.google.com/url?sa=t&rct=...KkYuCvLa551olBg JWKS posted:A reasonably close-to-canon portrayal of Hermione so far. Can’t wait to see how she gets caricatured or straw-womanned in the service of showing off Eliezarry’s wit and wisdom. So here is example number 1: Hermione is lobotomized to illustrate "positive bias". i81icu812 fucked around with this message at 10:04 on Mar 27, 2015 |
# ? Mar 27, 2015 08:31 |
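The hypothesis-switching pattern described above, where each guess confirms the subject's current hypothesis while disconfirming the previous one, can be sketched like this (the triples are lifted straight from the post's own example):

```python
# Each entry pairs a hypothesis with a guess chosen to fit it,
# in the order a typical subject tries them.

def is_even_triple(t):
    return all(x % 2 == 0 for x in t)

def is_odd_triple(t):
    return all(x % 2 == 1 for x in t)

def is_increasing(t):
    a, b, c = t
    return a < b < c

history = [
    (is_even_triple, (4, 6, 8)),
    (is_odd_triple, (5, 7, 9)),
    (is_increasing, (1, 10, 199)),
]

for i, (hypothesis, guess) in enumerate(history):
    assert hypothesis(guess)  # each guess confirms its own hypothesis...
    if i > 0:
        prev_hypothesis = history[i - 1][0]
        assert not prev_hypothesis(guess)  # ...while refuting the previous one
```

Every guess is still a positive test of the current hypothesis, yet the sequence as a whole steadily falsifies old hypotheses, which is why most subjects do better than Yud's Hermione.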
|
JosephWongKS posted:So why was Harry so reluctant to trust McGonagall or the other adults in the previous chapters?
|
# ? Mar 27, 2015 09:42 |
|
How does Harry know what the Prefects are yet, and how does he know they're all in the front carriage? I'll admit I'm glazing over a lot of this thread because Yud is the boring kind of crazy, but I don't recall any point where Harry was informed about the school's prefect system between all the inane "experiments".
|
# ? Mar 29, 2015 19:26 |
|
He's been getting a lot of the details from this cool "Harry Potter" book series he's been reading.
|
# ? Mar 29, 2015 20:15 |
|
Chapter 8: Positive Bias Part Eight quote:
I’m totally lost here. Did I skip a chapter somewhere along the line? quote:
Ah I see, it was just offscreen shenanigans. quote:
I correctly guessed that the “two figures approaching, looking utterly ridiculous with their faces cloaked by winter scarves” in Chapter 7 were Fred and George. Points to House Me. quote:
Does Eliezarry address Kantian ethics elsewhere in the story? Otherwise it’d be a pretty one-sided argument from a purportedly well-read child prodigy.
|
# ? Mar 30, 2015 08:41 |
|
quote:"- and after we were done giving him all the sweets I'd bought, we were like, 'Let's give him some money! Ha ha ha! Have some Knuts, boy! Have a silver Sickle!' and dancing around him and laughing evilly and so on. I think there were some people in the crowd who wanted to interfere at first, but bystander apathy held them off at least until they saw what we were doing, and then I think they were all too confused to do anything. Finally he said in this tiny little whisper 'go away' so the three of us all screamed and ran off, shrieking something about the light burning us. Hopefully he won't be as scared of being bullied in the future. That's called desensitisation therapy, by the way." No. No it is not. Desensitization therapy is about training a practiced relaxation response to the phobic stimulus and gradually working up the stimulus hierarchy. The point of the therapy is to train a non-panic response to whatever the phobia is. Scaring the crap out of someone isn't useful if they aren't trying to control themselves and train another reaction. The most charitable you could be is to call this a sort of attempted classical conditioning. But really this is just bullying, whatever any consequentialist may want to call it. Also, it is usually spelled with a 'z' in the US. I guess this is the UK localisation? Bystander apathy and consequentialism are correctly used, so 2/4 on correct terminology for this chapter so far. 3/5 if you count naming quark flavors, though I have to say I haven't seen truth and beauty over top and bottom in years. i81icu812 fucked around with this message at 23:14 on Mar 30, 2015 |
# ? Mar 30, 2015 09:23 |
|
It's also strange that Eliezer would (through his self-insert) justify / rationalize bullying, since it sounds like he may have been bullied himself during that period when he "lost the ability to handle school" and "spent a lot of time crying":i81icu812 posted:Behold. Yudkowsky's autobiography, written in 2000. http://web.archive.org/web/20010205221413/http://sysopmind.com/eliezer.html
|
# ? Mar 30, 2015 09:33 |
|
I think Yud is referring to flooding, but who the hell knows with Yud.
|
# ? Mar 30, 2015 09:45 |
|
Chapter 8: Positive Bias Part Nine quote:
Example No. 2 of Hermione being dumbed down to make Eliezarry look smarter. quote:
Example No. 3. quote:
I’m not in academia / research myself, but I don’t think real scientists actually sit around measuring themselves and each other by the books they’ve read, do they? quote:
Isn’t judging all Gryffindors by the actions of this one prefect a fallacy of hasty generalization, and therefore irrational? quote:
I’d be sad too if I was stuck in this story.
|
# ? Mar 30, 2015 10:10 |
|
It's been a few years since I actually sat down and read these, but good lord, Eliezarry is a monster. He's like this little bully-tyrant running around with no understanding of or appreciation for other people.
|
# ? Mar 30, 2015 15:22 |
|
LowellDND posted:Its been a few years since I actually sat down and read these, but good lord Eliezarry is a monster. To be slightly fair, he is very eventually called out on this kind of crap and does agree he was being a bully. This story is allegedly supposed to be a coming-of-age story where the annoying little prick realizes his annoying prick status and reforms, but the realization happens WAY at the end and we never see any reform.
|
# ? Mar 30, 2015 15:55 |
|
JosephWongKS posted:I’m not in academia / research myself, but I don’t think real scientists actually sit around measuring themselves and each other by the books they’ve read, do they? I'm fairly certain that scientists measure their dick sizes by comparing how many studies they've authored, books they've written, and/or projects they've completed and by how much attention they've garnered in the process. And to be fair, that's not the sort of comparison a couple of ten-year-olds could really engage in.
|
# ? Mar 30, 2015 16:33 |
|
|
I'll have you know I'd authored several papers and books by that age, and drew pictures for them too.
|
# ? Mar 30, 2015 16:47 |