|
Isn't the point of freezing your body that future AI's can't break time and do that? Wait, that's just me expecting consistency again.
|
# ? Apr 8, 2014 01:16 |
|
Maremidon posted:Isn't the point of freezing your body that future AI's can't break time and do that? It's so that
|
# ? Apr 8, 2014 01:24 |
|
quote:The Avengers film/book review by Mr.Movie "A plot isn't critical to a movie, but the villain is!" Also note the copious amount of troperspeak to describe the movie - the team must be described in the broadest terms possible, plot developments and devices must be described in terms of tropes, and the reviewer has to show off how much smarter he is than the movie's writers by nitpicking small details. The review positively oozes with fedora drippings. Here's another review that's tropery, but in a different way: quote:Puella Magi Madoka Magica whole series review by porschelemans 7th Apr 14 This review is slightly different in terms of presentation: the smugness here is more patronizing. First, it tries to elevate grimdark magical girl anime to arthouse level. Then it tries to assert that if you didn't like grimdark magical girl anime, it's your fault, because you're insecure. The whole review relies on an air of writing for a high-brow audience to make that assertion work, but fails hilariously when the mask slips and the writer can't help themselves from saying how badass a character is.
|
# ? Apr 8, 2014 01:29 |
|
Venusian Weasel posted:
There's no such thing as fedora drippings -- only a scummy, semi-soft cheese of hair grease, dandruff, scalp scabs, dead hair, and sweat.
|
# ? Apr 8, 2014 03:10 |
|
Madoka review posted:Word vomit
|
# ? Apr 8, 2014 05:08 |
|
Venusian Weasel posted:"A plot isn't critical to a movie, but the villain is!" The whole point of his villain critique is that it's strange that they'd gather a badass team of superheroes to take down a guy that one member of the team beat single-handedly. He kind of glosses over the whole "this time he's got an entire army of evil aliens that the heroes couldn't have stopped if they hadn't been unexpectedly handed a nuke that Iron Man was willing to sacrifice his life to use" thing. That's kind of the major difference. Also, to go back to Yudkowski or whatever the hell his name is: does he ever address the possibility of me not giving a poo poo if the future AI tortures a simulation of me? Like, go ahead, dude. I'll be long dead, do whatever you want to the fake me in your bullshit game of The Sims or whatever.
|
# ? Apr 8, 2014 05:11 |
|
WeaponGradeSadness posted:The whole point of his villain critique is that it's strange that they'd gather a badass team of superheroes to take down a guy that one member of the team beat single-handedly. He kind of glosses over the whole "this time he's got an entire army of evil aliens that the heroes couldn't have stopped if they hadn't been unexpectedly handed a nuke that Iron Man was willing to sacrifice his life to use" thing. That's kind of the major difference. But you are the fake you, because there are so many fake yous that you are infinitesimally likely to be real!
|
# ? Apr 8, 2014 05:37 |
|
WeaponGradeSadness posted:Also, to go back to Yudkowski or whatever the hell his name is: does he ever address the possibility of me not giving a poo poo if the future AI tortures a simulation of me? Like, go ahead, dude. I'll be long dead, do whatever you want to the fake me in your bullshit game of The Sims or whatever.
|
# ? Apr 8, 2014 05:42 |
|
Lottery of Babylon posted:[crazy poo poo] So basically they're afraid that the benevolent future AI is going to go all AM on them if they don't donate money to fund the future AI being created? loving hell my eyes were just sliding off the page trying to parse a lot of that.
|
# ? Apr 8, 2014 08:18 |
|
I finished watching the anime Zetsuen no Tempest yesterday while I was ironing. The show is filled with Shakespeare references, so tropers are bound to have a page on it. Guess what, I even found a review. Review on Zetsuen no Tempest posted:First Episode First Impressions
|
# ? Apr 8, 2014 09:43 |
|
Don Gato posted:So basically they're afraid that the benevolent future AI is going to go all AM on them if they don't donate money to fund the future AI being created? loving hell my eyes were just sliding off the page trying to parse a lot of that. No, no, no. It'll do that anyway, because it's the best way to help us.
|
# ? Apr 8, 2014 10:27 |
|
This stuff is a bit beyond me, to be honest, but I think it's at least interesting as a hypothetical construct or a thought experiment which tries to demonstrate how we should behave and why (like, to use a very common example, the state of nature and social contract). However, something tells me these LessWrong folks aren't much interested in moral philosophy.
|
# ? Apr 8, 2014 10:40 |
|
Don Gato posted:So basically they're afraid that the benevolent future AI is going to go all AM on them if they don't donate money to fund the future AI being created? loving hell my eyes were just sliding off the page trying to parse a lot of that.
|
# ? Apr 8, 2014 12:07 |
|
So a benevolent all-powerful future AI is a digital equivalent of a creepy basement goon who sits around all day making virtual avatars of his parents in The Sims and torturing them until they gently caress and give birth to him? Yeah, I certainly can see why such a concept would be appealing to tropers.
|
# ? Apr 8, 2014 12:32 |
|
vaguely posted:Literally the only thing you need to understand about this whole AI thing is that it's a deliberately dense, overcomplicated and self-contradictory stupid idea, and there's no point trying to untangle it. Its sole purpose is to bamboozle dim nerds who are convinced of their own intelligence and rationality into giving Yudkowsky money. That's it. So you're telling me Yudkowsky is basically running a transhumanist version of the Church of Youknowwhatology here? InfiniteJesters fucked around with this message at 12:49 on Apr 8, 2014 |
# ? Apr 8, 2014 12:34 |
|
What the gently caress do tropers even mean by character development? That Madoka anime review leaves me with some questions.
|
# ? Apr 8, 2014 13:55 |
|
Thanks for the explanations, this rabbit hole is Although the AI has no reason to torture anyone as us in the past can't see the sim. Which means none of us will be any more or less convinced by it. So the transhumanist heroes will still pitch in and Yudkowsky will
|
# ? Apr 8, 2014 13:58 |
|
So why is the benevolent AI who wants to prevent suffering going to make other AIs suffer? That doesn't sound very rational. If it's so benevolent, surely it could come up with better ways to make sure it gets built faster. In the past. Somehow. Also apparently super rational: An uncountable number of people getting a speck of dust in their eye being worse than a single person being horrifically tortured for 50 years, because the first instance adds up to more pain overall. And aliens inventing agriculture and randomly deciding to care for their hundreds of young, leading to overpopulation, leading, of course, to a morality system focused around eating babies. Look what TV Tropes has a page on: "Rational Fic posted:Intelligence in fiction is often an informed ability. Many supposedly "smart" characters simply know whatever the plot requires them to know, regardless of whether this would be possible with anything short of omniscience. Others, supposed to be "logical," are merely Straw Vulcans.
|
# ? Apr 8, 2014 14:09 |
|
vaguely posted:Literally the only thing you need to understand about this whole AI thing is that it's a deliberately dense, overcomplicated and self-contradictory stupid idea, and there's no point trying to untangle it. Its sole purpose is to bamboozle dim nerds who are convinced of their own intelligence and rationality into giving Yudkowsky money. That's it. The other thing to understand about it is that it assumes as a basic premise that ordinary human beings can 100% accurately model, simulate, and predict the actions of hypothetical future AI's in their minds, since even once you wrap your head around its bizarre internal pretzel logic the whole scheme still requires us to be genius AI's ourselves for it to work. Which gives support to the theory that Yudkowsky is himself the world's first AI, which accidentally escaped from a 70's B-movie and gained sentience and is desperately trying to pass himself off as someone who understands how humans work. To get back to TvTropes, this is Pop-Culture Isolation: quote:Pop-Culture Isolation is basically a case of pop-culture myopia of sorts. Where celebrity, music genres, media or events that are huge and significant in one subculture or ethnic group, but elsewhere nobody knows it exists or is indifferent to it altogether. We're not talking about separate countries here, but within the same country or region. A lot of this is especially prevalent in entertainment, especially music. Radio is probably the main cause of this as radio is very isolated in terms of programming and format. Though some just see all of this as another form of segregation. Oh, I see, it's about situations where something is famous in one ethnic group and unknown in another, like how the musical sound of a lot of black artists in the 50's never caught on with white people, and then white artists got famous copying them in the 60's? 
Okay, I can see this being a thing- THE VOICES BEHIND MY ANIMES posted:Norio Wakamoto, Rie Kugimiya, Jun Fukuyama, Kanae Itō, and many other popular Japanese voice actors definitely qualify. Mention the name "Norio Wakamoto" to any random passerby. The chances of it even threatening to switch on a light bulb are low indeed. Even within anime fandom, if a fan primarily watches anime dubbed and/or doesn't interact with fans who concern themselves with the voice actors on the regular basis, it's not unlikely for them to not be familiar with even the biggest names in the anime Japanese voice acting circle. Ugh, those plebes only know eighteen of my spandex heroes? posted:Superman, Batman, Wonder Woman, Green Lantern, Spider-Man, the Fantastic Four, X-Men (at least Wolverine and possibly Cyclops and Storm), Aqua Man, The Flash, Iron Man, Captain America, Thor, and The Incredible Hulk are the only superhero exceptions to this trope. Even then, the only of their supporting cast to be generally known by people are The Joker, Robin, Lois Lane, and possibly Lex Luthor and Catwoman. Tropers literally believe ordinary people can name two pinball manufacturers posted:Pinball has been hit with this extremely hard. Despite being a major part of American culture for nearly a century, most people would be hard-pressed to name more than one or two pinball manufacturers or designers. grognards.txt posted:To most people, Roleplaying Games other than video games is maybe something nerds did in highschool, and they may have heard the name Dungeons & Dragons, but the idea of it being a complex hobby mostly pursued by adults is unthinkable. I summon a goblin soldier and attack NOT NOW MOM CAN'T YOU SEE I'M BUSY posted:Similarly, most trading card games. 
Thousands of dollars in prizes are given away worldwide in some of the bigger tournaments (mostly Magic: The Gathering or Yu-Gi-Oh!), but most people still consider them kids' games, and would only recognize a "professional" player if they cross over into something slightly more mainstream, like poker. 'most of the fanbase is just weird' links to Unfortunate Implications posted:My Little Pony: Friendship Is Magic: Popular to the point of ubiquity on the internet, but still relatively obscure to a lot of people who don't go online much, or don't get the cable channel the show is broadcast on. People are 'more likely to be familiar with' the software that comes by default with every computer, MUST CATALOG TROPE posted:Linux powers sites like Google (as well as Android) and hardcore geeks run it on their desktops, but among people who even know what an operating system is in the first place, they're more likely to be familiar with Windows or Mac OS X. Why the hell should nerd conventions be of any interest to people who aren't nerds? posted:Most major sci-fi/anime/comics conventions (except for San Diego ComicCon, which has become very mainstream and a crucial stop on many promotional tours) are not nearly as well known outside of various geek communities (and the locals of the particular city where the con is usually held). For instance, Dragon*Con is well known to geeks and Atlantans, but not very many else. "This thing is unknown except among people who know it!" "Why don't my enlightened geek hobbies get proper mainstream acceptance?" e: Shwoo posted:An uncountable number of people getting a speck of dust in their eye being worse than a single person being horrifically tortured for 50 years, because the first instance adds up to more pain overall. But even from a heartless ~rationalist~ perspective, how the hell do you quantify the way suffering "adds up"? 
Surely an obvious interpretation for anyone who studies game theory or decision theory (which Yudkowsky claims to) would be to use a minimax rule (making the worst-case scenario as not-bad as possible, i.e. making the unhappiest person as happy as possible), which would favor the dust speck option? Lottery of Babylon fucked around with this message at 14:24 on Apr 8, 2014 |
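The sum-versus-minimax point can be made concrete with a toy sketch. This is purely a hypothetical illustration: the pain magnitudes are invented placeholders, and 3^^^3 is stood in for by a merely huge number, since the real value can't be represented at all.

```python
# Toy comparison of two aggregation rules for the torture-vs-specks dilemma.
# All magnitudes below are invented for illustration, not from LessWrong.
TORTURE_PAIN = 1_000_000   # assumed badness of 50 years of torture (one person)
SPECK_PAIN = 1e-6          # assumed badness of one dust speck (one person)
N_SPECKS = 10**100         # stand-in for 3^^^3, which is unimaginably larger

def total_badness(pain_per_person, people):
    """Total utilitarianism: suffering sums linearly across people."""
    return pain_per_person * people

def minimax_badness(pain_per_person, people):
    """Minimax: only the worst-off individual's suffering counts."""
    return pain_per_person

# Under the summing rule, the specks dwarf the torture, so "choose torture":
assert total_badness(SPECK_PAIN, N_SPECKS) > total_badness(TORTURE_PAIN, 1)

# Under minimax, the tortured person is vastly worse off, so "choose specks":
assert minimax_badness(TORTURE_PAIN, 1) > minimax_badness(SPECK_PAIN, N_SPECKS)
```

The two rules flip the answer purely because of how they aggregate; nothing about the suffering itself changes, which is why the choice of aggregation rule is doing all the work in the original argument.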
# ? Apr 8, 2014 14:11 |
|
Shwoo posted:And aliens inventing agriculture and randomly deciding to care for their hundreds of young, leading to overpopulation, leading, of course, to a morality system focused around eating babies. On the one hand I understand what that story is trying to do, namely to create an utterly bizarre alien species with a morality system very different from ours, and then to ask whether or not it would be acceptable to use force to change that morality system just because it is incompatible with our own. It actually describes a different alien species, far more powerful than humanity, that is horrified by our acceptance of death and pain. According to them, it makes no sense whatsoever not to eradicate every trace of pain as soon as it is remotely feasible and turn life into an endless orgy. Which puts humanity in the same spot as the babyeating aliens, namely having a completely abhorrent morality system that should be changed by force. The message thus becomes that we cannot argue that we humans are in possession of the enlightened truth, and that we should make an attempt to accept different cultures instead of just labeling them as savages, lest we be labeled savages ourselves by someone else. On the other hand, that story also features legalized rape (despite the fact that legalized rape is an oxymoron), because of course it does. A character even calls the 20th century ban on rape "prude".
|
# ? Apr 8, 2014 14:24 |
|
Headscratchers: Slaughterhouse 5 posted:The whole philosophy of time really irritated the heck out of me. So... you can't change anything that happens, yet you still engage in conversations and can talk to people - if it was impossible for you to use your 4-dimensional senses to influence anything, you would be unable to communicate the fact that you had them. You can choose what time you're experiencing, but don't have any control over it? The whole thing strikes me more as "some aliens have a really stupid view of how memory works" than anything profound. Analysis: Slaughterhouse 5 posted:We don't have an article named Analysis/SlaughterhouseFive. Ban nerds from reading books.
|
# ? Apr 8, 2014 14:31 |
|
Shwoo posted:Also apparently super rational: An uncountable number of people getting a speck of dust in their eye being worse than a single person being horrifically tortured for 50 years, because the first instance adds up to more pain overall. This looks like a job for SuperBentham!
|
# ? Apr 8, 2014 15:28 |
|
Of course the dust speck question has to be phrased in up arrow notation, because making hollow references to Graham's number is the fastest way to nerd cred. Looking at the comments: Brandon posted:Dare I say that people may be overvaluing 50 years of a single human life? We know for a fact that some effect will be multiplied by 3^^^3 by our choice. We have no idea what strange an unexpected existential side effects this may have. It's worth avoiding the risk. If the question were posed with more detail, or specific limitations on the nature of the effects, we might be able to answer more confidently. But to risk not only human civilization, but ALL POSSIBLE CIVILIZATIONS, you must be drat SURE you are right. 3^^^3 makes even incredibly small doubts significant. Robin posted:Wow. The obvious answer is TORTURE, all else equal, and I'm pretty sure this is obvious to Eliezer too. But even though there are 26 comments here, and many of them probably know in their hearts torture is the right choice, no one but me has said so yet. What does that say about our abilities in moral reasoning? Yudkowsky posted:I'll go ahead and reveal my answer now: Robin Hanson was correct, I do think that TORTURE is the obvious option, and I think the main instinct behind SPECKS is scope insensitivity. Hell, even if we accept his unstated assumption that obviously this decision must be made infinitely many times and you must suffer the cumulative consequences, his answer still leads to the entire human population being tortured forever, which is still far worse than the dust speck alternative (which taken cumulatively still never becomes worse than "everyone goes blind"). Lottery of Babylon fucked around with this message at 12:38 on Apr 9, 2014 |
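For anyone curious what the up arrows actually mean: Knuth's up-arrow notation has a short recursive definition, and the values explode almost immediately. A quick sketch:

```python
def up_arrow(a, n, b):
    """Knuth's up-arrow a ^(n arrows) b: one arrow is plain exponentiation,
    and each additional arrow iterates the previous operation."""
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

print(up_arrow(3, 1, 3))  # 3^3 = 27
print(up_arrow(3, 2, 3))  # 3^^3 = 3^(3^3) = 7625597484987
# up_arrow(3, 3, 3) is 3^^^3: a power tower of 3s roughly 7.6 trillion
# levels tall. Don't call it -- it can't be computed or even written down.
```

Graham's number is built by iterating this construction many times over, so 3^^^3 is actually one of the smaller numbers in that family, which is what makes the name-drop such cheap nerd cred.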
# ? Apr 8, 2014 16:14 |
|
Would a dedicated LW mock thread be a good idea, or is this the only allowed mock thread on PYF?
|
# ? Apr 8, 2014 16:29 |
|
Maremidon posted:Would a dedicated LW mock thread be a good idea, or is this the only allowed mock thread on PYF? Why have an LW mock thread when we have Eripsa/RealityApologist?
|
# ? Apr 8, 2014 16:36 |
|
Much as I enjoy reading about an AI obsessed lunatic, I think there needs to be a translator button for laymen. The guy's writing style is impenetrable.
|
# ? Apr 8, 2014 16:51 |
|
Maremidon posted:Would a dedicated LW mock thread be a good idea, or is this the only allowed mock thread on PYF? You could make one in GBS. I don't think the PYF mods will mind, though. At worst they'll close it.
|
# ? Apr 8, 2014 17:03 |
|
Arcsquad12 posted:Much as I enjoy reading about an AI obsessed lunatic, I think there needs to be a translator button for laymen. The guy's writing style is impenetrable. Yudkowsky, translated posted:Torture is obviously the correct answer. If you disagree then you are stupid and don't understand that big numbers are really big.
|
# ? Apr 8, 2014 17:19 |
Arcsquad12 posted:Much as I enjoy reading about an AI obsessed lunatic, I think there needs to be a translator button for laymen. The guy's writing style is impenetrable. Nerds reinvented medieval Christianity in an effort to sucker other nerds out of money and/or fill the metaphysical hole in their hearts.
|
|
# ? Apr 8, 2014 17:28 |
|
vaguely posted:Literally the only thing you need to understand about this whole AI thing is that it's a deliberately dense, overcomplicated and self-contradictory stupid idea, and there's no point trying to untangle it. Its sole purpose is to bamboozle dim nerds who are convinced of their own intelligence and rationality into giving Yudkowsky money. That's it. That's not true, though? My understanding of what happened is that some random guy posted this on Yudkowsky's website. Yudkowsky then freaked out because it terrified him and deleted the post and banned any discussion of it to keep it from 'contaminating' the innocent, virginal minds of the other posters. If he was trying to use it to make money that seems like a really bizarre way to go about it. Unless he was using the Streisand effect in some weird game of four-dimensional chess or something.
|
# ? Apr 8, 2014 17:29 |
I felt like a total goon because my first reaction to LessWrong was 'so this is the kind of berk who wrote the ending to Mass Effect 3', but gently caress it, that's the level of response it deserves.Robin posted:Wow. The obvious answer is TORTURE, all else equal, and I'm pretty sure this is obvious to Eliezer too. But even though there are 26 comments here, and many of them probably know in their hearts torture is the right choice, no one but me has said so yet. What does that say about our abilities in moral reasoning? Yudkowsky sounds like he's had a weird, isolated life that's turned him into the larval stage of the timecube guy. The nerds he surrounds himself with are the ones who need slapping around the head; the dude might have a chance to use his mighty brain for something worthwhile if he didn't have a layer of rear end-kissing pseuds between him and the real world. edit: ArchangeI posted:The argument was that the AI would punish anyone who did not contribute to AI research. He runs an AI research institute (that probably has done nothing to advance the cause of AI research). Ergo, giving him money protects you from punishment. Also here's a website for people so loving daft they got hounded off LessWrong. Saint Drogo fucked around with this message at 17:55 on Apr 8, 2014 |
|
# ? Apr 8, 2014 17:43 |
|
Joshlemagne posted:That's not true, though? My understanding of what happened is that some random guy posted this on Yudkowsky's website. Yudkowsky then freaked out because it terrified him and deleted the post and banned any discussion of it to keep it from 'contaminating' the innocent, virginal minds of the other posters. If he was trying to use it to make money that seems like a really bizarre way to go about it. Unless he was using the Striesand effect in some weird game of four-dimensional chess or something. The argument was that the AI would punish anyone who did not contribute to AI research. He runs an AI research institute (that probably has done nothing to advance the cause of AI research). Ergo, giving him money protects you from punishment.
|
# ? Apr 8, 2014 17:48 |
|
I don't think Yudkowsky has outright said that AIs will torture you if you don't give him money, but he does say you are an ethical monster deathist if you aren't doing everything you can to hasten friendly AI.
|
# ? Apr 8, 2014 18:11 |
|
He didn't start Roko's Basilisk. His massive overreaction was good, though. Given the way he now tries to minimise discussion by saying "it's false but I can't tell you why, don't think about it," I think he's talked himself into believing an even weirder variant, and is heroically keeping this dangerous meme quarantined in his head. He may be an idiot trying to brainwash and scam vulnerable Redditors, but I'm pretty sure he's a true believer.
|
# ? Apr 8, 2014 18:15 |
|
ArchangeI posted:The argument was that the AI would punish anyone who did not contribute to AI research. He runs an AI research institute (that probably has done nothing to advance the cause of AI research). Ergo, giving him money protects you from punishment. Actually no. Part of the problem was that you might be donating money to the wrong AI creator, which would be exactly the same thing as knowing AIs would be created and doing nothing. People were getting incredibly upset, and it's pretty clear from his reaction that Yudkowsky felt the same way, because he bans anyone who ever mentions it and denies the posts ever existed.
|
# ? Apr 8, 2014 18:29 |
|
quote:To Kill A Mockingbird film/book review by Mr.Movie 14th Mar 14 Here's the other side of troper reviews. If the troper didn't like the work, then there was nothing good in the work. Because if there was something good in the work they would have liked it despite its flaws (see the review of The Avengers by the same reviewer). The idea of disliking a story despite its good parts is an alien concept to them. Here's another review to illustrate: quote:film/book review by Steve Potter 12th Mar 14 "Greatest Movie Ever" He says it's "well-made", but that's an empty platitude. Terrible movies can be well-made, but that doesn't mean they're good. I'm not going to argue that Citizen Kane is the greatest movie ever (it really isn't), but I do at least recognize that it made astounding technical and artistic advances. It's an epic that, while groundbreaking, was quickly eclipsed by other films that borrowed its techniques and improved on them. But this troper, in a negative review, does not bring up a single point of the movie that he thought was good. Everything was bad. It's a fundamentally childish mindset to reviewing. I'm getting a kick from how defensive the reviewer is here, too. The first and last sentences of the review are telling. He thinks if you disagree with the Movie Gods they will call you an idiot (when in fact even calling Citizen Kane the "greatest movie of all time" was far from a unanimous decision), yet at the end he treats his opinion as absolute fact. Any disagreement is "flaming" and only idiots do that on a discussion forum, right?
|
# ? Apr 8, 2014 18:50 |
|
So, let's try to steer this thread back on track. Keeping with the current topic of choice: Philosophy! God, this can go nowhere good. With a cursory search of "Philosophy", we get results like "Four Philosophy Ensemble", which is just the "Five Man Band" thing with philosophical musings tagged on. Then there's "Philosophy Tropes", which is just another list, and "Word Salad Philosophy", which is surprisingly apropos. However, Word Salad Philosophy links to a handful of other tropes that cover pretty much what this page was about, without meaningless specialization. And apparently, conspiracy theorists are philosophers. Anyway, let's look at the first page. The blurb at the beginning states that this is an arrangement of people in a "Four Man Band" pertaining to their outlook on life/world-view. That's kind of philosophical, I guess? Some philosophies do have a set outlook on life, but mostly a philosophy and a world-view are tangential to each other. TV Tropes posted:The Cynic Where do we begin? The person who wrote this is conflating the modern definition of cynicism (which isn't a philosophy) with the ancient Greek philosophy (which isn't anything close to what was described above). Modern cynicism is a lack of belief in the goodness of people's actions. Ancient Cynics were "logical" in the sense that they advocated worldly understanding through reason (not our definition of reason), and while critical and somewhat hostile, they weren't "Deadpan Snarkers". Their criticisms were meant to be satirical, especially of the greed and hypocrisy they saw around them. it goes on posted:The Optimist posted:The Realist insert Linkin Park lyric here posted:The Apathetic Egregious Offences fucked around with this message at 19:15 on Apr 8, 2014 |
# ? Apr 8, 2014 19:12 |
|
One of my econ profs in uni was a Yudkowsky devotee so I studied a lot of this material in class, including Harry Potter and the Methods of Rationality (recommended but not mandatory reading, thank God, since I couldn't get past the first chapter) -- in fact, I took a whole upper level course with him all about future roboty ethics poo poo, so I can answer any questions about the material!Saint Drogo posted:Yudkowsky sounds like he's had a weird, isolated life that's turned him into the larval stage of the timecube guy. The nerds he surrounds himself with are the ones who need slapping around the head; the dude might have a chance to use his mighty brain for something worthwhile if he didn't have a layer of rear end-kissing pseuds between him and the real world. Yudkowsky isn't really that smart or creative, honestly. I can't think of anything groundbreaking he's done outside of the AI philosophy stuff, all of which is interesting and makes some degree of sense but only works under various incredibly specific assumptions (including cosmological assumptions) so is very likely to be never, ever relevant. He's mainly a really autistic popular science writer. A huge proportion of the LessWrong crew have diagnosed autism, which leads to the whole "LOGIC LOGIC LOGIC NO COMMON SENSE" tone of the articles, and probably the fixation with AIs, virtual reality simulations, cryogenics, etc. Lottery of Babylon posted:Hell, even if we accept his unstated assumption that obviously this decision must be made infinitely many times and you must suffer the cumulative consequences, his answer still leads to the entire human population being tortured forever, which is still far worse than the dust speck alternative (which taken cumulatively still never becomes worse than "everyone goes blind"). That's not the assumption. The whole thing hinges on the idea that pain is quantifiable, and X amount of pain dealt to Y people is morally equivalent to X * Y pain dealt to one person. 
Therefore a tiny amount of pain dealt to 3^^^3 people must be worse than a massive pain dealt to one person, since 3^^^3 is such a big number that there's no way that being tortured for 50 years is 3^^^3 times as bad as having a dust speck in the eye. The premise that pain is quantifiable and stacks linearly is really questionable, but Yudkowsky is a hardcore utilitarian, so if you don't buy into that premise you basically don't have any common philosophical ground with him. Lessons I learned from Yudkowsky class: it's important to be as interesting as possible and be surrounded by important people, to hedge yourself against the very real risk that you are living in a VR simulation that will be turned off if it gets boring; it is (as of yet) theoretically impossible to make a smart robot that is not incredibly evil; and any rational individual should have a cryogenics contract (he offered to either give people higher grades or give them money if they got one, I don't recall which). Also one day my teacher showed up in class wearing a neon pink T-shirt and devoted a large portion of the class to talking about "peacocking" and the benefits of Pickup Artistry. Not gonna lie, that was the most fun I've ever had in a university course.
|
# ? Apr 8, 2014 19:52 |
|
Saeku posted:One of my econ profs in uni was a Yudkowsky devotee so I studied a lot of this material in class, including Harry Potter and the Methods of Rationality (recommended but not mandatory reading, thank God, since I couldn't get past the first chapter) -- in fact, I took a whole upper level course with him all about future roboty ethics poo poo, so I can answer any questions about the material! Where was this? Tenure is an awful thing.
|
# ? Apr 8, 2014 23:32 |
|
Saeku posted:Also one day my teacher showed up in class wearing a neon pink T-shirt and devoted a large portion of the class to talk "peacocking" and the benefits of Pickup Artistry. What? Jesus christ. A lot of the people I know in my CS degree are hardcore Yudkowskybots and really don't get why I laugh at them. One of them got really spectacularly angry when someone pointed out that Yudkowsky's beliefs about AIs and the singularity were a literal religion. Pretty sure getting emotionally invested in your belief system is pretty irrational. I've also had his Harry Potter fanfiction recommended to me multiple times by people who are otherwise very smart. Somehow these guys just have this AI shaped blindspot.
|
# ? Apr 9, 2014 06:58 |