|
anilEhilated posted:Jesus H. Irrational Christ. The best part is that he's never actually read Harry Potter, only fanfic of it.
|
# ¿ Feb 22, 2015 09:13 |
|
Yud is cramming his whole story into just the first year, so a few events from The Philosopher's Stone show up, but nothing beyond that.
|
# ¿ Feb 23, 2015 01:45 |
|
When Yud!Harry finds out about Parseltongue/talking to animals, he is aghast that he might accidentally eat a sentient creature. When he finds out about unicorn blood, he is aghast that no one is breeding and slaughtering these sentient creatures to prolong human lives.
|
# ¿ Feb 23, 2015 02:37 |
|
The Unholy Ghost posted:I guess to summarize, I doubt any of you would be desperate to hate this story so much if the author didn't have such a big head. This is correct. There's so much bad writing out there that if Yud wasn't head of a rationality
|
# ¿ Feb 23, 2015 03:15 |
|
It'd still be a story featuring an insufferable, egotistical prick as the hero.
|
# ¿ Feb 26, 2015 05:59 |
|
If your foreshadowing just comes off as you being a lovely writer, you're probably a lovely writer, hth. And that's assuming Yud actually planned that twist from the start. It's really obvious from how the first few chapters are written that HP:MoR started as just some side project, and that Yud only started taking it seriously after it received so much attention. Telarra fucked around with this message at 06:16 on Feb 26, 2015 |
# ¿ Feb 26, 2015 06:13 |
|
Pretty sure Yud has literally said Harry is based on himself when he was younger. Plus, y'know, all the giant infodumps about science and rationality Yud forces through the mouthpiece that is Harry.
|
# ¿ Mar 3, 2015 22:01 |
|
How is reinterpreting the grim reaper look-alikes as personifications of death 'very clever'? Especially when they can't actually kill anyone?
|
# ¿ Mar 7, 2015 00:38 |
|
It was edited, but he was unwilling to compromise on certain points, so it will still be that chapter.
|
# ¿ Mar 16, 2015 19:32 |
|
Nessus posted:Also, "surprise sex." Really. It should come as no surprise that Yud is a huge fan.
|
# ¿ Mar 19, 2015 04:31 |
|
su3su2u1 posted:They might hate stilted writing, or badly paced plots. The story is pretty much a disaster, the only way I can think of people enjoying it is if they are "riding along" with Harry as a power fantasy. This is why I enjoyed it on my first reading.
|
# ¿ Mar 19, 2015 17:00 |
|
No, it's him being a smartass and letting her think he did something when the drink did it itself.
|
# ¿ Mar 24, 2015 05:56 |
|
Ohhhh, it's 3^^^3 dust specks, not 3^3^3^3? In that case I agree, the torture option is obviously the morally correct choice.
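(For anyone lost on the notation: those are Knuth's up-arrows, not ordinary exponents. Here's a quick sketch in Python; the function name is my own, and anything past the second arrow is hopeless to actually compute.)

```python
def up_arrow(a, n, b):
    """Knuth's up-arrow notation: a ↑^n b.

    n=1 is plain exponentiation; each extra arrow iterates the
    previous operator, so a ↑↑ b is a power tower of b copies of a.
    """
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

# 3^^3 = 3^(3^3) = 3^27 = 7,625,597,484,987
assert up_arrow(3, 2, 3) == 3 ** 27

# By contrast, 3^3^3^3 (right-associative) = 3^(3^27), a number with
# roughly 3.6 trillion digits -- that's "only" 3^^4. Whereas 3^^^3 is
# a power tower of 3s that is 3^^3 = 7,625,597,484,987 levels tall,
# which is why nobody sane tries to evaluate it.
```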
|
# ¿ Mar 24, 2015 07:32 |
|
More seriously, yes, friendly AI isn't some crazy idea. Science fiction has toyed with the idea of artificial minds overthrowing their creators for probably a century by now, to the point where everyone is aware of it. The problem is that Yud claims to have more than science fiction to contribute here, and he doesn't. He has unitless utilitarianism and a fetish for Bayes' theorem, and that's about it.
|
# ¿ Mar 24, 2015 07:52 |
|
Specifically, Yud's research team, the Machine Intelligence Research Institute.
|
# ¿ Mar 25, 2015 23:42 |
|
It's also unbelievably stupid because it's the "well what if I multiply it by INFINITY?" trick they love so much. Hell, that's the entire sum of Yud's arguments about cryonics right there.
|
# ¿ Mar 26, 2015 16:10 |
|
If I had to guess I'd say it'd be around the point where they let him drop out of middle school.
|
# ¿ Mar 27, 2015 02:25 |
|
It's someone else. Yud is merely dumb enough to contact Daniel Radcliffe to try to get permission to publish it. e: And this is why I have such a beef with this fanfic. There's a lot of dumb fanfics, but I draw the line at spreading misinformation and fostering a fanbase of smug, self-righteous "rationalists". Telarra fucked around with this message at 01:35 on Apr 11, 2015 |
# ¿ Apr 11, 2015 01:22 |
|
No, no leaks. We know he lost a bunch and called it quits because he posted so himself, albeit after a hundred paragraphs of patting himself on the back for the times that he won, and holding himself up as a paragon of motivation.

Big Yud posted:There were three more AI-Box experiments besides the ones described on the linked page, which I never got around to adding in. People started offering me thousands of dollars as stakes—"I'll pay you $5000 if you can convince me to let you out of the box." They didn't seem sincerely convinced that not even a transhuman AI could make them let it out—they were just curious—but I was tempted by the money. So, after investigating to make sure they could afford to lose it, I played another three AI-Box experiments. I won the first, and then lost the next two. And then I called a halt to it. I didn't like the person I turned into when I started to lose.

"An experiment is totally still valid if you cut off the results once you stop getting ones you like, right?"
|
# ¿ Oct 6, 2015 20:21 |
|
He used lojban in one of his short stories, and yes, he hosed it up pretty royally. I wouldn't be surprised if there are some lojban-derived spells mangled beyond recognition buried in MoR somewhere.
|
# ¿ Oct 16, 2015 05:09 |
|
Does Dumbledore's voice just sound really off to anyone else?
|
# ¿ Dec 30, 2015 10:07 |
|
There's no way the italicized introduction was there from the start, but archive.org doesn't cache fanfiction.net, and the cache of hpmor.com only goes back to early 2012, with 70+ chapters.
|
# ¿ Jan 17, 2016 23:03 |
|
Night10194 posted:Don't forget he believes we'll find a way around thermodynamics solely because he really, really wants to
|
# ¿ Mar 21, 2016 16:27 |
|
Trasson posted:All of this crap speaks to Harry Potter magic being a learned discipline where someone needs practice, focus, and dedication in order to accomplish great things. So the complete, total opposite of how Yudkowsky sees the world.
|
# ¿ Mar 1, 2017 22:30 |
|
He was upset that his holy Bayes Theorem gave him an answer he didn't like, so he decided that causality must be wrong.
|
# ¿ Mar 5, 2017 09:45 |
|
Milky Moor posted:I think it's somewhat childish to take someone's work and basically misinterpret it to argue your own points. You can get away with it if you have a deep knowledge and appreciation of the source material, I think, but Yud clearly doesn't. Probably the best example of this is how he makes Dementors the embodiment of death.
|
# ¿ Mar 18, 2017 06:05 |
|
No one is arguing for that. But don't let that stop you from proclaiming your superior maturity, and projecting your specific fears of death onto others.
|
# ¿ Apr 10, 2017 07:08 |
|
And on a more snide note, perhaps granting rich white guys immortality would be how we can make them give a poo poo about the fate of the planet.
|
# ¿ Apr 10, 2017 07:12 |
|
Tiggum posted:I don't know where you're getting this "magic solution" idea from because no one (not even Yudkowsky) is suggesting that that will be a thing. Yud totally does, it's one of the fundamental tenets of his organization. Both that it will magically happen (better software/ai makes it progressively easier to make even better software/ai -> at some point "the AI goes foom" as the graph asymptotes), and that it will magically solve everything (a god-AI able to reason from first principles the best solution for absolutely everything, and able to manipulate humans into carrying out its plans). He imagines their role as ensuring that this AI god comes to pass, and that it is "friendly". Telarra fucked around with this message at 09:01 on Apr 10, 2017 |
# ¿ Apr 10, 2017 08:37 |
|
Tunicate posted:Case in point; Yud believes that sufficiently powerful AI will overcome thermodynamics, so the heat death of the universe is not an endpoint for his hyperimmortality afterlife It will also overcome entropy, so he can see his father again.
|
# ¿ Apr 10, 2017 09:17 |