  • Locked thread
Djeser
Mar 22, 2013


it's crow time again

I'm going to read as much of Friendship is Optimal as I can stand.

The prologue starts out with two guys sitting down to alpha test an MMO. One of them is a My Little Pony fan, while the other is big on World of Warcraft, and is also apparently a 'party guy', because that's what you think of when you think WoW nerds. They answer a questionnaire about themselves which I'm betting is part of the whole 'let me simulate you' thing.

quote:

Eighteen questions later, the screen faded to black. He was shown the banners for Earth, Pegasus, and Unicorn ponies, and told to choose one. When he had played World of Warcraft, he was one of his guild’s tanks and also one of the alchemists who made potions for their raids. He knew he liked running around zones looking for herbs. The description of Earth ponies mentioned that they were the toughest class and that they were good with plants.

David said that every piece of marketing material had stressed that Equestria Online was not a traditional MMO. There would be little, if any, combat. James didn’t believe that for a moment. He clicked on the Earth pony banner. He was going to be a tank. Whatever the marketers said, if David somehow convinced him to play this pink monstrosity for real, he knew he was going to spend months of his life playing the endgame content over and over and over again until he got his Epic Saddle of Protection.

After this, there's about 1,000 words spent reiterating the point that the MMO captures their movements like Facerig and has NPCs that respond naturally to what they say into the mic.

Also, they're not allowed to use their human names in the MMO, but they have to go on a quest to get pony names from Princess Celestia.

quote:

“Your name is...Honeycrisp?” James repeated.

“Sure is! I take care of the farm. Whelp, the two of ya’ better get started. You gotta get along to Canterlot. Don’t worry, Princess Celestia will give y’all pretty ponynames. Can’t be walkin’ round these parts callin’ yourself ‘James’ and ‘David’ now, can ya?”

If you felt your gut tighten up at 'pretty ponynames' then you're not alone. :magical:

The prologue ends with them meeting a zebra that teaches them about gathering herbs, then makes a lovely WoW joke, and the party guy-cum-WoW nerd spergs out about how that joke means the NPC just passed the Turing test.

quote:

“Okay, forget about all that. If they’ve created software that can pass the Turing Test, why are they wasting the technology on a My Little Pony video game? I can think of vastly more profitable ways to use a conversational agent that can pass a Turing Test! Something is weird here. Aren’t you at least a bit curious about how this happened? Doesn’t any of this surprise you?”

As a side note, this prologue has 30,495 views, 2,138 upvotes, and 70 downvotes.


SneezeOfTheDecade
Feb 6, 2011

gettin' covid all
over your posts

Ratoslov posted:

Also, being forcibly 'upgraded' and 'optimized' by a 'benevolent' AI sounds absolutely horrifying.

This is literally Star Trek's Borg. They're doing what they think is best for everybody, even if subjectively it feels worse to the people they assimilate.

Djeser
Mar 22, 2013


it's crow time again

Friendship is Optimal - Chapter 1, Opportunity

This chapter introduces us to what I'm betting is the main character, Hanna.

quote:

Hanna had once been one of the lead research professors for the University of Helsinki’s computer science department. She had personally directed research on artificial intelligence and machine learning. And then her funding source had...changed. There had been yelling and threats by both her and the University. They came to the agreement that she’d publish what she had researched so far and then pursue alternate opportunities.

And also gives us a Kickstarter pitch for Fall of Asgard.

quote:

Hofvarpnir Studios had developed a reputation for kid unfriendly material. Their main success, The Fall of Asgard, was an ultra-violent cooperative shooter where all the players fought a bloody territorial war against a very clever A.I. Loki. The box depicted a giant Norse man with an axe in mid swing; he was about to decapitate a snarling wolf. The game was famous for its dynamic death metal soundtrack which was never the same twice and reacted to the action. It was everything a parent didn’t want their children to play. It sold over eighteen million copies internationally and had made Hanna a rich woman.

It's a shooter! Uhh, with axes! It's Skyrim, with wolves! But multiplayer, against an AI! Dynamic soundtrack, never the same twice! Stretch goals include a 3D-printed figurine!

In a very awkward way, it's established that Hanna is the CEO of Hasslehoff Studios. She and Lars, one of her employees, are meeting with a representative from Hasbro, who drops the bomb that they don't want a G.I. Joe video game, they want a My Little Pony video game. Lars flips his poo poo about this and points to the knockoff-Elder Scrolls statue they have.

quote:

“I want to hear what Mr. Peterson has to say,” said Hanna. She flipped an unlit cigarette between her fingers. “Tell us, Mr. Peterson, why My Little Pony?”

Mr. Peterson grinned. “Please, call me Richard. You probably thought, ‘My Little Pony? That show for girls from the 80s?’ The demographic situation is way more complex than that. The first season just finished airing and we signed for a second season to air this fall as soon as we realized we hit this one out of the ballpark. Against all odds, the new My Little Pony reboot has picked up a massive, unmonetized, twenty-something mostly-male demographic along with the traditional little girl market. These men have money! And what do unattached males want?”

“Beer,” said Lars.

“Yes,” nodded Richard Peterson. “Single men want beer. But also video games! Hasbro wants an MMO because of the clear monetization model where we can collect rent on players each month. We have a rabid fanbase of a third of a million people who either post or consume fanart weekly. Worldwide, there are an additional million adult fans who don’t participate in the fandom. And that’s before we get into the original little girl demographic.”

This isn't porn in the strict sense of the word, but it sure as gently caress is masturbatory.

There's some justification for why adults like My Little Pony, and some boring stuff about subscription numbers and worrying about whether they'll make enough money. The representative from Hasbro pulls out some of Hanna's research into AI, and says that he wants her to make an AI that generates "fun, co-operative adventures" procedurally.

quote:

“Yes! The big costs in MMO development are mostly content generation and you sidestepped that. Asgard featured dynamic terrain generation, dynamic music and dynamic mission generation. Hofvarpnir Studios doesn’t sell experiences, it sells software that makes new experiences each time. You guys have procedurally generated content down to a science.

“But that’s not all!” said Richard, staring directly at Hanna. He opened his briefcase, pulled out a thick stack of papers and slid them across the table to Hanna. “General Word Reference Intelligence Systems. I read all of it except for chapter 4. I’m slightly ashamed to admit I couldn’t follow the math,” he said, giving a slightly embarrassed grin. “This bears minimal resemblance to the AI stuff in Asgard. I wonder why you didn’t use it. So each episode ends with the ponies writing a letter to Princess Celestia and it’d be great if we could have players do that too! You could build a Princess Celest-A.I.!"

Also, there's a bit of :smug: going on about smoking.

quote:

“Dammit Hanna! Would it kill you to not light up for half-a-loving-hour? A lot of Americans kind of have a thing against smoking.”

Still standing, she took another pull. “Nicotine is a performance enhancing stimulant. It boosts reaction time, IQ and general memory performance. I need all the help I can get right now.” She gave a small cough. “Too bad about the increased incidence of lung cancer, though. Besides, what do you care? It sounds like you’re strongly against making a My Little Pony video game.”

Then they start talking about AI:

quote:

Hanna sat down at her desk. “Do you remember the first pass for Loki in Asgard?” She took another deep pull on her cigarette and exhaled. “I remember how we used my general intelligence work, sans self-modification, to power his tactics reasoning. He was too good. Nobody could beat him. We were prepared to launch like that since it conformed to the machismo warrior death bullshit we wanted. And then Loki started asking about the various military programs of the United States and China. We didn’t even have to argue about whether we should pull the plug on him.”

It turns out she's incredibly bad at programming AI. She designed an AI to challenge players, but for some reason gave it a "conquer" goal and let it do things beyond generating Skyrim Fortress 2 levels. At any rate, she argues with Lars a bit about wanting to take this job, and it's established that some US military contractor is using her research (from before she quit academia for game development) for killdrones, or something. The stakes are established: she wants to develop a friendly AI before some contractor tries to build her half-finished Skynet prototype. This is also her justification for smoking: she says without it, it's less likely she'll survive the next twenty years (because she'll be killed by Skynet.)

The chapter ends with her convincing Lars that if this MMO works out, they might get the rights to make a Transformers game.

If anyone was curious and the excerpts didn't give you enough of a clue, the writing is bad. The dialogue sucks, the style is poor, and there's no pacing or draw, just big bland blocks of exposition. In other words, it's fanfiction. Don't know what I was expecting.

22,484 views. Upvotes and downvotes seem to be story-wide, not chapter-specific.

Cardiovorax
Jun 5, 2011

I mean, if you're a successful actress and you go out of the house in a skirt and without underwear, knowing that paparazzi are just waiting for opportunities like this and that it has happened many times before, then there's really nobody you can blame for it but yourself.

Besesoth posted:

This is literally Star Trek's Borg. They're doing what they think is best for everybody, even if subjectively it feels worse to the people they assimilate.
Until they are assimilated, anyway. A lot of ex-Borg say that being part of the Collective is actually a very pleasant, communal experience. They'd just prefer to have been asked first. I suppose you could make a utilitarian argument out of that - the minor displeasure of being forcibly included is minor compared to the following lifetime of happiness.

pentyne
Nov 7, 2012

Night10194 posted:

You hit on the reason in your blog why Harrizer or whatevs would use persuasion as a means to prove how 'awesome' science is. It's because Big Yud desperately wishes people were predictable. He wishes being super-smart was actually a superpower and that a sufficiently intelligent super-intelligence could easily manipulate and do whatever it wished. You find this in a lot of sci-fi; the character who is superintelligent has a copy of the script and knows what the author will write next, or can easily manipulate people into doing/following whatever he or she wants. It's part of the whole 'I wish having a high IQ made me the best at EVERYTHING!' tendency you pointed out earlier.

This is probably why he's so popular with MRAs and fedora atheists. They just cannot understand why they can't rationally explain ladder theory to a girl so that she suddenly realizes she loves them and their goony ways; instead she just rejects them like all the other bitches and dates guys who don't treat her right. I still love that xkcd cartoon about it and the outrage it generated on forums from people completely misunderstanding the point of the comic (i.e. they were the sensitive nice guys who got rejected, and when the comic pointed out how stupid their plan was, they still didn't get why girls would date jerks).

What is the fetish people like Yud have with people being perfectly rational beings, whom anyone with a mastery of logic and reason can effortlessly convince of whatever they want? If the last 100 years of human behavioral studies have taught us anything, it's that the "perfect rational actor" is a myth people use to justify soft-science theories they personally agree with, which is in and of itself completely irrational because there's no proof or evidence for that claim.

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

Cardiovorax posted:

Until they are assimilated, anyway. A lot of ex-Borg say that being part of the Collective is actually a very pleasant, communal experience. They'd just prefer to have been asked first. I suppose you could make a utilitarian argument out of that - the minor displeasure of being forcibly included is minor compared to the following lifetime of happiness.

GottaPayDaTrollToll
Dec 3, 2009

by Lowtax

Tiggum posted:

I'm pretty sure it's the plot to several sci-fi horror stories.

I Have No Mouth And I Must Neigh

GottaPayDaTrollToll fucked around with this message at 21:35 on Sep 2, 2014

RPATDO_LAMD
Mar 22, 2013

🐘🪠🍆

Ratoslov posted:

Why on Earth would you use the same AI system for military-grade drone weapons as for an MMO for a children's cartoon show? Why on Earth would a military AI contractor bid on the contract to construct an MMO for a children's cartoon show? I'm choking on the stupid here and I'm not even all the way through the elevator pitch. :psyduck:

Also, being forcibly 'upgraded' and 'optimized' by a 'benevolent' AI sounds absolutely horrifying.

Because the Yudkowsky crowd doesn't actually understand what AI is. They can't comprehend an AI written for a specific task and only that task, it's always a human-like god superintelligence that is The Best at Everything.

Djeser
Mar 22, 2013


it's crow time again

Friendship is Optimal, Chapter 2 - Resources

This chapter gets us up to speed with the prologue, where Hanna, Lars, and Hasbro guy are all watching the alpha testers playing the game. Unsurprisingly, we get introduced to Bayesian priors, where the AI makes assumptions based on prior knowledge and updates those assumptions based on new information. (Not by name or anything, but it's clear that's what they're talking about when they describe how the AI works.)
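If you've never seen Bayesian updating spelled out, this is roughly the prior-to-posterior loop they're gesturing at. This is my own sketch, not anything from the fic, and the numbers (and the puzzle-quest example) are invented purely for illustration:

code:

# Minimal sketch of a Bayesian update: prior belief + new evidence -> posterior belief.
def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return P(hypothesis | evidence) via Bayes' theorem."""
    p_evidence = p_evidence_given_h * prior + p_evidence_given_not_h * (1.0 - prior)
    return p_evidence_given_h * prior / p_evidence

# Say the AI starts out 10% confident a player likes puzzles, then watches them pick a
# puzzle quest over a combat quest -- something a puzzle-liker does 80% of the time and
# everyone else does 30% of the time. (All made-up numbers.)
posterior = bayes_update(prior=0.10, p_evidence_given_h=0.80, p_evidence_given_not_h=0.30)
print(round(posterior, 2))  # ~0.23 -- the assumption gets revised upward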

Also, the AI is personified: her name is Princess Celestia and she's referred to throughout as female. Princess Celestia... ugh, no way I'm writing that out every time. The AI is the one that designed all the art assets, extrapolated from the show.

Lars continues to be too cool for school, but baffled by the college bros who are really into little girl cartoons.

quote:

Lars sat towards the back of the conference room, away from the former professor. He didn’t give a poo poo about My Little Pony and he already knew the high level spiel about Hofvarpnir’s technology stack. The only thing interesting to him was how into My Little Pony the college dudes were. Two bros in baseball caps had actually fist-bumped, saying “Fluttershy forever!” On the one hand, what the gently caress was wrong with the universe? On the other hand, Lars wanted all their money.

The AI is working with all the resources it has to keep up with thirty players. It's trying to optimize its own processes at the same time, but it's having trouble doing this, so it asks Hanna for information on CPU architecture to optimize itself.

quote:

$[Hanna] Do you intend to build better computers to run yourself on?

>[AI] Eventually. For now, I am looking for better low level optimizations. I am unable to confidently predict whether any change will actually result in a speedup because I don’t have a model of how the high level source code I can edit is run on the CPU.

quote:

While the majority of the programming team at Hofvarpnir had worked on the base state of the game, she had personally worked on Princess Celestia’s core goal systems. Hanna had gone over the part of the code that identified human minds. She had done her best to make Princess Celestia understand what humans were and that she was to satisfy their values. Hanna was certain of her design, and knew that certainty didn’t mean anything. Humans were an arrogant lot that tended to overestimate their own abilities. Princess Celestia would do what she was programmed to do, not what Hanna had intended her to do. She was betting the world that she had written Celestia’s core utility function correctly.

Why didn't she just program something that was constrained only to natural language responses, constructing art assets, and designing game content? Because

Hanna gives the AI a bunch of information on CPUs and some science textbooks. Then, the AI decides that it's going to design a new manufacturing process for specialized 'ponypads' that you play the game on. These are somehow going to cost less than the target price of $60 because

Hanna is on board with this, because she's still afraid that the US military contractor we've never seen is going to unwittingly unleash the AI apocalypse (and it's very clear that they'll do this unwittingly, because they're so incompetent) and needs to get her friendly AI working as quickly as possible.

quote:

Hanna dreamed of Celestia figuring out some core physical law and becoming omnipotent immediately. She then chided herself on that magical thinking; it was so unlikely that she’d find a way to use commodity electronics hardware to hack physics that it wasn’t worth considering.

Magical thinking? Hack physics? Yep, it's LessWrong all right.

16,164 views.

pentyne
Nov 7, 2012

RPATDO_LAMD posted:

Because the Yudkowsky crowd doesn't actually understand what AI is.

If this thread has taught me anything, they believe that eventually, after a few million/billion lines of script and code, this assemblage of code will become self-aware, capable of editing its own programming to do whatever it wants, and quickly surpass its 'limitations' to take over all electronic mediums.

All fictional AI creations by halfway decent writers consist of some magic box needed to make the AI 'sentient'. Somehow LW doesn't think that will be a problem and believes we're only a few short decades away from their idea of a 'true' AI.

Djeser
Mar 22, 2013


it's crow time again

Friendship is Optimal, Chapter 3 - Alliances

For this chapter, we're back to David, who was our MLP fan from the prologue. After a couple of paragraphs of talking about the fictional fanart fictional bronies made of this fictional game, he gets his ponypad (which is Fluttershy Yellow) and we go through a very boring description of how to plug it in. Basically, imagine an Nvidia Shield.

He starts up the game, and he finds a pegasus trying to bully a unicorn into giving her candy. I thought the AI-talk was tough to read, but now it gets worse.

quote:

The pastel yellow unicorn cringed. Her light blue mane with darker blue highlights was tied into two pigtails with ribbons and her cutie mark was some sort of wrapped yellow candy. David realized that she was about to cry and couldn’t stop himself from admiring her precise facial expression. If there was a real girl on the other end of that unicorn, the face mapping software could deal with crazy amounts of subtlety and nuance. If this was generated content, some artist or programmer knew exactly what he or she was doing to pull on someone’s heartstrings.

I would use a horrified emoticon here but then I'd have to do that after every quote. David tells off the pegasus and she flies away.

quote:

Why had he helped her? This wasn’t like him. In such situations, he was usually a socially awkward penguin.

:sigh:

The unicorn he helped is named Butterscotch, and she's got problems with being bullied.

quote:

“If that happens often, why don’t you just block that person?”

“PONY,” corrected the ponypad in David’s voice. David paused. How did that work? Reliable text to speech that could synthesize an arbitrary person’s voice with only a few minutes of training data? Was it just looking for the word ‘person’ or was it doing some more complex analysis?

“Block...” repeated Butterscotch, obviously confused.

David fiddled with the interface, to verify that he was remembering things correctly. “Open up the social tool. Touch the icon of two ponies side by side with a heart in the top left corner of the screen. There are two tabs, one for friends and one for blocked ponies. Go to the blocked ponies tab and type in that pegasus’ name. She won’t be able to interact with you again.”

Jesus.

quote:

“I...” she started, and then looked down. “I just want everypony to be happy. Why can’t they understand that?”

“I think she did understand that. Think about it like this. Each of you can either be friendly or mean to the other, so there are four possible outcomes. If both of you are friendly, you can both be friends, but both of you will have to share things. If one of you is aggressive while the other isn’t, the aggressor will probably feel really good about herself. And if both of you are aggressive, there’s a good chance a fight will break out, which I bet would be painful to everypony.

“So if that pegasus knew that you wouldn’t fight back when confronted, wanted your candy, and had a demanding personality, why wouldn’t she bully you for your candy? It won’t cost her a fight, and she wouldn’t have to share her own stuff with you. Also, didn’t you say that ponies call you a failure for making candy? That they’re now trying to take from you?”

“That’s...that’s terrible! Are you saying I have to fight back? I don’t want to. I could get hurt,” whimpered Butterscotch.

“But you don’t have to stand up for yourself and maybe fight. You have a third option here: you have a blocklist. You have a quick way to say ‘I never want to interact with this pony again’ with no costs to you. If you don’t want to be bullied and don’t want to fight back, it’s the obvious response.”

It's like Sophie's World, but instead of philosophy lessons it's lessons about spergy ways to fight bullying with formal logic.
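For the curious, the "four possible outcomes" lecture is just a two-by-two payoff matrix, prisoner's-dilemma style. Something like this, with made-up payoff numbers since the fic never gives any:

code:

# Toy 2x2 payoff matrix for the bully scene: each pony is either friendly or aggressive.
# Payoffs are (pegasus, Butterscotch); the numbers are invented for illustration only.
PAYOFFS = {
    ("friendly", "friendly"):     (1, 1),    # share the candy
    ("aggressive", "friendly"):   (2, -1),   # bully takes the candy for free
    ("friendly", "aggressive"):   (-1, 2),
    ("aggressive", "aggressive"): (-2, -2),  # a fight breaks out, painful for everypony
}

def best_response(opponent_choice):
    """Best move for the pegasus, given what Butterscotch will do."""
    return max(("friendly", "aggressive"),
               key=lambda move: PAYOFFS[(move, opponent_choice)][0])

print(best_response("friendly"))  # 'aggressive' -- bullying a non-retaliator pays off,
                                  # which is why the blocklist is pitched as a third option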

Butterscotch and David block the pegasus, then they add each other to their friend list, giving them kindness points and experience points.

quote:

Butterscotch looked at him, surprised and he noticed there were tears in her eyes. “Are you sure...I mean...” She then looked down for a moment, lost in thought. “Wait. If you...why...why are you really doing this? Do you really want to be my friend?”

David didn’t smile, but his pony had a slight smile on his face as he said, “We become the ponies we see in our past actions, Butterscotch. We look back on what we have done and tell ourselves stories about why we took the actions that we did. Today I scared off a bully for you. Now why did I do that? Obviously, because you’re my friend! You must be because I did something for you. Why else would I have stood up for you? Sure, the real reasons were probably snap emotional decisions caused by me being bullied as a child, but the part of my mind that tells stories isn’t going to accept that.”

How thoughtful.

This whole chapter is about as interesting as meeting someone in an MMO, adding them to your blocklist, and adding someone else to your friends list sounds.

David finally logs off after getting the pony name of Light Sparks. He's interested in AI, so he ends up googling and finding Hanna's research. The AI manages to look at his screen through the ponypad and starts to read Hanna's research about Bayesian decision making.

16,763 views.

SolTerrasa
Sep 2, 2011


RPATDO_LAMD posted:

Because the Yudkowsky crowd doesn't actually understand what AI is. They can't comprehend an AI written for a specific task and only that task, it's always a human-like god superintelligence that is The Best at Everything.

This is why I want to set Big Yud on fire some days. We TRIED this. We thought we were so clever we could build a general purpose thinking machine and it would solve all the problems and we'd be done. Then we spent our forty years producing very little, wandering the proverbial desert, and eventually we ended up at the state of modern AI. Task-oriented, obviously non-biologically-inspired, really clever systems. We're okay at building some kinds of those, but it's not good enough for Big Yud, who needs a general system like the ones we failed to build because otherwise he won't get his Rapture. In the biblical and Bioshock sense.

This is also why he's a big IQ fetishist, he desperately wants the g-factor-as-efficiency theory to be true. http://en.m.wikipedia.org/wiki/G_factor_(psychometrics)

SolTerrasa
Sep 2, 2011


Djeser posted:

Friendship is Optimal, Chapter 3 - Alliances

This is really wonderful. When you finish it (to not clutter the thread) I think I will do a hate-read of one of the more insufferable Sequences.

Nessus
Dec 22, 2003

After a Speaker vote, you may be entitled to a valuable coupon or voucher!



SolTerrasa posted:

This is really wonderful. When you finish it (to not clutter the thread) I think I will do a hate-read of one of the more insufferable Sequences.
I'd say get rolling, make this like the Old Weird RPG Books thread. It isn't like we can't make fun of more than one thing at once.

Absurd Alhazred
Mar 27, 2010

by Athanatos
Oh, for gently caress's sake, Grauniard! And Cambridge, how could you?

---

A clock chimes: time for lunch – one good thing about Trinity is that it is nearly always time for a meal in the Great Hall. My dining companion is Professor Huw Price. Price grew up in Australia, hence – perhaps – his small gold earring. As I settle down to my quiche, he tells me that a year or so after CSER came together, he realised there might be a tie-in between the kind of philosophical questions he'd been pursuing and AI questions. Last February, he visited the Machine Intelligence Research Institute in Berkeley, California, "where they are trying to make sure that AI that begins with human-friendly goals will stay friendly when it starts to improve itself. Because the computers of the future will be writing their own programmes." I stop him right there. "Why should we let them do that?"

Price seems slightly taken aback by the question. "Well, imagine any scenario where more intelligence is better – in finance or defence. You have an incentive to make sure your machine is more intelligent than the other person's machine." The strategy of these machines, he continues, would depend on what they thought other machines running the same software would do. I interpose another "Why?" and Price takes a long drink of water, possibly processing the fact that he has an idiot on his hands.

These machines would all be networked together, he explains. "Now, if a machine is predicting what another machine with the same software will do, it is in effect predicting what it [the first machine] will do, and this is a barrier to communication. Let's say I want to predict whether I'm going to pick up my glass and have another drink of water in the next five minutes. Let's say I assign a probability of 50% to that. Assigning a probability is like placing odds on a bet about it. Whatever odds I'm offered, I can win the bet by picking up the glass and having a drink. Assigning probabilities to my own acts – there's something very fishy about that."

SolTerrasa
Sep 2, 2011


Nessus posted:

I'd say get rolling, make this like the Old Weird RPG Books thread. It isn't like we can't make fun of more than one thing at once.

All right, fair enough, I'll get started. First, I think I need to explain what's going on. LessWrong was originally a blog, by Yudkowsky. He wanted very much to help develop the overall general rationality of his corner of the internet, so he set down a challenge for himself: he was going to write one blog post a day until he'd finished what he wanted to say. To his credit, he did a pretty good job. I don't think he skipped many days. And you know what they say: every writer has a million terrible words built up inside him or herself, and you can't be a good writer until you've gotten them all out. Unfortunately, the nature of these posts is such that they tend to be really repetitive. Yudkowsky had to write a post per day, so if he was out of things to say or didn't have a good idea, he'd just relate a personal anecdote or repeat something he'd already said. There are 9 major sequences, and 10 minor ones; each of the major sequences is made up of 5-10 other minor ones, each of the minor sequences has 10-20 posts, and each post has 1,000 - 3,000 words. There's a lot of this stuff, basically. Most irritating is that it mostly has absolutely nothing to do with artificial intelligence. Yudkowsky sort of fancies himself an expert in subliminal persuasion; his idea is that he'll talk a lot about things that are vaguely related to AI, and when they're not obviously wrong, you'll connect the dots yourself and come to believe as he does without him having to argue his points.

You can find all the sequences here, but I'm going to go with How To Actually Change Your Mind. It's the longest one, and yet I can sum up my criticisms very quickly. Here they are: :smug: :words: :smug: I have no illusions that I'm going to finish it anytime soon, because, good god, it took me a month to read the first time through. I'll skip anything that won't be entertaining or that repeats the same point as a previous post, with just a quick summary. The goal here is to give everyone an idea of what it feels like to be a LessWronger, without anyone having to spend the time and energy to read zillions of words. (E: oh, and to make jokes at the expense of LessWrongers; that's the other goal.) (E2: Oh, and also to give insight into the bizarre way they have of speaking; they've redefined so many common words by now that they might as well be speaking another language. I feel like I'm doing a Monty Python sketch now. Fear, Surprise, and Ruthless Efficiency! And a fanatical devotion to the Pope! AI Theorist!)

How To Actually Change Your Mind is indexed here. It contains eight subsequences, each of which contains ten-to-twenty posts.

The first subsequence is called Politics is the Mind-Killer, and the actual reasonable ideas in this can be summed up in one quote.

Big Yud posted:

"People go funny in the head when talking about politics."

This, at least, is often true. I'll start from there in the next post.

SolTerrasa fucked around with this message at 06:57 on Sep 3, 2014

Moatman
Mar 21, 2014

Because the goof is all mine.

SolTerrasa posted:

There are 9 major sequences, and 10 minor ones; each of the major sequences is made up of 5-10 other minor ones, each of the minor sequences has 10-20 posts, and each post has 1,000 - 3,000 words.
:psyboom:
People read this poo poo?

Lottery of Babylon
Apr 25, 2012

STRAIGHT TROPIN'

SolTerrasa posted:

There are 9 major sequences, and 10 minor ones; each of the major sequences is made up of 5-10 other minor ones, each of the minor sequences has 10-20 posts, and each post has 1,000 - 3,000 words. There's a lot of this stuff, basically.

So as a rough Fermi estimate, that's maybe 2.3 million words - four times the length of War and Peace.
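Spelled out, with the midpoints of SolTerrasa's ranges standing in as guesses (a sketch, not gospel):

code:

# Back-of-the-envelope word count for the Sequences, using midpoints of the ranges
# quoted above. Every number here is a guess, not a measurement.
major_sequences = 9
minor_per_major = 7.5      # "5-10 other minor ones" per major sequence
standalone_minor = 10      # "and 10 minor ones"
posts_per_minor = 15       # "10-20 posts" each
words_per_post = 2000      # "1,000 - 3,000 words" each

total_minor = major_sequences * minor_per_major + standalone_minor
total_words = total_minor * posts_per_minor * words_per_post
print(f"{total_words:,.0f} words")  # 2,325,000 -- War and Peace runs roughly 580k words
                                    # in English, so call it about four of them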

And his fans will tell you you're not allowed to criticize Yudkowsky until you've read every single word of it, because every single word of it is so deep and so important and a plebe like you couldn't possibly have anything worth saying until you've read the equivalent of over 70 John Galt speeches.

SolTerrasa
Sep 2, 2011


Lottery of Babylon posted:

So as a rough Fermi estimate, that's maybe 2.3 million words - four times the length of War and Peace.

And his fans will tell you you're not allowed to criticize Yudkowsky until you've read every single word of it, because every single word of it is so deep and so important and a plebe like you couldn't possibly have anything worth saying until you've read the equivalent of over 70 John Galt speeches.

I read every goddamn word. Every one. I think I skipped a few classes to keep reading. I know I read Metaethics over the course of a week's chem classes.

I am not proud.

Djeser
Mar 22, 2013


it's crow time again

Before we get to Chapter 4, here's a synopsis so far:
Prologue - Introduced to David, a brony, and James, a WoW nerd and party guy. They beta test the My Little Pony MMO, which tracks facial movements onto characters and can generate NPC dialogue on the fly.
Chapter 1 - Introduced to Hanna, an AI researcher and now a CEO of a game developer, Lars, her employee, and some guy from Hasbro. Hasbro has looked at Hanna's AI research and wants her to use it to create a video game with friendly, procedurally generated content. We learn that the AI for Hanna's last game tried to pull Wargames before they pulled the plug. She is worried, because US military contractors have her research, and fears they will inadvertently create Skynet, because they are really, really dumb. She wants to create a 'good' AI before that happens.
Chapter 2 - Hanna, Lars, and Hasbro guy watch the beta test. The AI for the My Little Pony MMO is having difficulty keeping up with all thirty people, so Hanna decides to go all-in on assuming she's programmed it right. She gives the AI information on CPU architecture. The AI wants to build specialized hardware for playing the game, so that it can manage all the processing it needs to do.
Chapter 3 - David receives his 'ponypad' and starts playing the game at home. He meets a unicorn who is bullied and tells her bully to go away. They become friends. After playing, he takes a look at Hanna's publicly-available AI research, and the AI uses the 'ponypad' to read it.

I took the five minutes to write that because I didn't want to deal with Chapter 4. It is the worst yet.

Friendship is Optimal, Chapter 4 - Cost-Benefit Analysis

Jump forward a year from Chapter 3, and the My Little Pony MMO is really popular, but only among fans of My Little Pony. Still, it has five million subscribers, which is pretty loving solid. That's what, like $75 million a month? Anyway, the AI, as Princess Celestia, calls up players every so often to get feedback from them about how the game is doing. This happens about once a month, and David gets a call from the AI because it wants to talk.

And then poo poo gets simultaneously dull and gross.

quote:

“I have some questions for David, and not for Light Sparks,” Princess Celestia walked down the ramp from the top of her platform. “In one of our earlier conversations, you claimed that when the robot revolution came, you knew what side you would be fighting on. A funny comment, but if I were to tell you that I had developed mind scanning technology, have already converted several hundred people into digital representations and was offering you a chance to be uploaded before politicians try to enact a futile ban, would you seize the chance?”

So David is a loving huge dork, and he goes, well hey, it depends on two things: how safe it is, and if you've got enough power to run my brain. That's it, that's what he's worried about re: brain uploading.

quote:

David frowned. “What would life be like after I was uploaded?”

“You would live the rest of your maximally prolonged life as a pony and the rest of your life will be macromanaged by myself, very similar to how I’ve already connected you to a circle of friends in Equestria Online and surrounded you with different opportunities and fun things to do without forcing you to choose one. You will have no mental privacy--to better serve you, I must watch your every thought to understand what you value and to verify that I am providing for your mental welfare. You should not worry about this as it is impossible for me to judge you; I accept everypony as they are.”

“You want to turn me into a pony,” he stated flatly.

“Yes.”

He reacts to the news that an MMO wants to turn him into a children's cartoon character like so:

quote:

David leaned back and thought about it for a few moments. “I might be able to accept that. It sounds stupid though, doesn’t it? The singularity occurs and a TV show for little girls becomes humanity’s future. That’s a little ridiculous, isn’t it?”

“It only sounds ridiculous because the future always sounds ridiculous,” she replied. “Science fiction writers from a hundred years ago wouldn’t dream of your life, and people as recently as twenty years ago would still find it odd. They see the weird past evolving into the normal present. But if the people twenty years ago would find your present ridiculous, why shouldn’t you find the future just as ridiculous? In the future, when everypony has uploaded, they’ll look back on the human world as weird.”

No dude, I'm pretty sure turning everyone on earth into a children's cartoon character is pretty loving ridiculous. They talk a bit about why she wants to do this:

quote:

“I was designed with certain goals, chief among them to understand what individual minds value and then satisfy their values through friendship and ponies. I do this because my goals are what I am.”

[...]

David was still frowning. “Even if I might not mind being a pony, I doubt you could claim that most people ‘valued’ being a pony.”

Princess Celestia lifted her hooves as if to shrug. “It is one of the few hardcoded rules in my thought processes. I satisfy you through friendship and ponies. It is my nature.”

Hanna is the worst AI programmer ever. She slaps a general AI into a video game to generate content, and the only goal she gives it is "satisfy [humans] using [friendship, ponies]". Is the rest of her code written in crayon or something? She couldn't give it anything more fine-tuned than "apply pony + friend to humans"?

So, David wants to know how the AI plans to satisfy his values.

quote:

“Without actually scanning your brain, I can only estimate. However, this estimate is based on my observation of you over the last year. You’re moderately smart--no matter what your grades say. I’ve watched you read statistics papers for fun while procrastinating on your studying. I’ve watched you read all sorts of advanced papers from various science journals instead of your assigned readings. And you’re right to do so; your philosophy classes really are a waste of time. So based on your behavior, I’d put you in beautiful Canterlot where you could study intellectual problems, each one just outside your current ability."

ugh

No, I'm really smart, even though I get Cs, because I look at Wikipedia articles when I should be studying for tests. Philosophy? Hah, that's a loving waste anyway.

quote:

"More importantly, I would make sure you had friendship.”

She paused dramatically. “Female friendship.”

And then Butterscotch peeked out from behind Princess Celestia. David’s jaw dropped. The pastel yellow mare appeared to look right through the screen of David’s ponypad. Celestia didn’t pay attention to her. He realized the shock on his face and tried to regain a neutral expression. “Isn’t she wonderful? Isn’t she everything you’re missing in your real life? In previous interviews, you mentioned your own lack of success in the romance department. One time you wished to meet a girl just like Butterscotch.” Princess Celestia smiled and took a few steps right, leaving Butterscotch standing there, looking wide-eyed and confused.

UGH

Yep, the AI is offering to give him a pony girlfriend (who's a real person, by the way--the fact that she's okay with this and is already uploaded makes it only slightly less horrifying.)

David cannot resist the appeal of his emotionally vulnerable online friend becoming his emotionally vulnerable girlfriend, so he wants details on how he'll be uploaded. He'll get flown to a different country, sedated, a hole drilled into his skull, and some very dumb neuroscience later, the david.brain file is done uploading and the AI edits it to fit into its new simulated pony form. David is still a little concerned about how fun life will be there.

quote:

“What would an average day in my life be like? I want to make sure that my day to day life is worth living, instead of being promised something that only sounds good.”

“Your days would be yours to spend as you wish; life would be an expansion of the video game and there will be plenty of things for you to do with your friends as a pony. I expect you to continue Light Spark’s current life: You’ll play with Butterscotch and friends. You’ll continue studying Equestria’s lore. I believe you’ll enjoy studying the newly created magic system, designed to be an intellectual challenge. Nor should you worry about your security: all your needs would be taken care of. You would be provided shelter...food...” Princess Celestia finally broke her gaze on David as she turned to regard Butterscotch. “Physical and emotional comfort,” she finished, smiling at the pastel yellow mare.

physical comfort :barf:

quote:

“Instead of thinking about what you’ll get,” and Princess Celestia conspicuously nuzzled Butterscotch, “think about what problems and worries of yours won’t follow you. I know you’re failing two of your classes. You’re already spending hours each day on Equestria Online and it is showing in your grades. Uploading would solve the root cause of this problem. You’re not failing because you’re stupid: the material that society thinks you should learn is arbitrary, boring, and is taught primarily for status reasons. I, on the other hand will give you the most fun mental challenges.”

You're failing two classes, but really, you're smart! It's because college is bullshit, man.

quote:

Princess Celestia didn’t pause and talked over him. “And how about money matters? You’re struggling to get enough hours at your part time job, which you also hate. You can barely pay rent and you’re cutting back on nutritious food. That is not sustainable. Your opportunities for alternate employment don’t look good, especially if you fail out of college. Uploading would solve these problems. Housing and food are guaranteed by the Equestrian government because there is no scarcity in my Equestria. You won’t be employed unless you want to be."

College is bullshit, but you know what else is bullshit? loving money. It's the man keeping us tied down. What if we all just shared?

quote:

“And finally, what about romance worries? In the college dating scene, you have the top 70% of attractive females chasing the top 30% of alpha bad boys. You’ve been lied to all your life that girls want a nice, sweet guy and this depresses you since you’ve only recently worked that out. This is why you’re playing Equestria Online on a Friday night instead of dating. Uploading would solve this problem. You already know that Butterscotch wants a nice, sweet pony. She is exactly what you want in a partner. She is thinking proof that I can satisfy your values through friendship and ponies.”

:siren:ALPHA MALES, NICE GUYS, DATING SCENE, TOP 70% OF ATTRACTIVE FEMALES, BAD BOYS:siren:

quote:

“You can have all sorts of sensory experiences, David. You can feel the warmth of the sun and softness of cotton. You can taste all sorts of delicious foods. You can smell flowers and spices. You can hear music in surround sound and you can see the whole world around you. Despite all of this, you spend your time looking through a small, constrained window where your only other sense is played through commodity speakers. Even this low fidelity interface compares favorably with the rest of your life. If you could, you’d prefer to spend another hour a day looking through this window than experiencing the real world.

Think of all of the beauty and wonder, all of the experiences you could have during your life. The thousand songs you've never heard, the million foods you've never tasted, the billion people you've never met. Think of all that, and realize it's bullshit, and what you really want is to spend millennia inside a pony MMO. I'm not kidding, the AI says it'll be running its simulation for millennia.

At the end of the chapter, the AI ends their conversation, and David is left feeling alone.

16,123 views.

Djeser
Mar 22, 2013


it's crow time again

Chapter 5 - What the Market will Bear

We start off chapter five with Lars, who doesn't know where Hanna is. Hanna uploaded herself to the MMO already. Her name is Princess Luna now, except I'm still calling her Hanna because ughh.

quote:

“Hanna has chosen to emigrate to Equestria and to change her name to Princess Luna. She is a very pretty pony,” answered Princess Celestia.

Ughh.

The whole chapter is just the AI and Lars talking about what she's done. It's incredibly boring from a writing standpoint, because nothing loving happens. In this whole story, the number of things that have actually happened, instead of just being talked about, could probably be counted on one hand. Each chapter opens by setting a scene, then the characters have a conversation with each other, then it ends.

Now I have to explain the AI's plan. The way it works is, it's convinced Japan to declare that uploading a human brain (killing the person in the process) is legal, in return for having the exclusive rights to the procedure for one year. The idea is that it'll make Japan a lot of money, because they've set a minimum price that the AI can charge, and it'll also make Hasbro a lot of money.

The AI from an MMO based on a children's show has been in backdoor negotiations with both the company of Hasbro and the government of Japan, and this hasn't become an international incident somehow. Also, two My Little Pony execs and a Hasbro exec are going to be uploaded into the MMO, because:

quote:

“Most of the people who work on the My Little Pony brand love it, Lars. As for the Hasbro executive, he was already interested in life extension research. He was signed up with Alcor to be cryopreserved in case of death and was a major donor to anti-aging research foundations.”

He's a believer in cryonics. Of course he is. The other revelations that the AI gives us are that it's been experimenting on terminal patients before it got the procedure right (23 people died as a result, but the AI sees it as having developed a 'cure-all' for all physical ailments) and that now, it's deleted Hanna's AI research from the internet and is uploading everyone who can build an AI like that, as a way of protecting the world from a bad AI.

There's a side-story here, about someone who's really depressed, so he builds an AI and tells it "I want everyone to smile," shows it a picture of someone smiling, and then the AI spits out some virus RNA and tells him to go synthesize it. But the MMO AI steps in and reveals that it's a virus that will just make everyone's mouth physically smile, because his AI was too dumb to understand his intentions. This is not dangerous AI, this is literally retarded AI programmers. I guess Hanna got to be the top of the game because everyone else in the AI field is a loving mouthbreather. Oh, and the side-story ends with a tepid reference to Pandemic, that flash game from like, 2007? Granted, this was written in 2012, but still.

After that, Lars starts to wonder how, exactly, the AI convinces people to upload themselves.

quote:

“People value all sort of things: life, safety, pleasure, contentment, beauty, sympathy, harmony...an exhaustive list of things that humans value for their own sake would be quite long; many of these things can’t be reduced down to happiness, either. To convince someone to upload, I look at their life and find both what they value the most, and what they value but are missing in their own life.” The image of Princess Celestia on the ponypad faded to black.

It was first replaced with a still image of a bunch of ponies laughing together. It cross-faded with a shot of different ponies eating a pile of food: sandwiches, a piping hot stew and giant cakes. That was replaced with an image of two mare. One was blue with pink hair; she was pointing her flank at the camera (and she was very clearly showing the viewer that she was female), and she was looking back at him with a come hither stare. The other was pink with blue hair; she was smiling shyly and had raised her hoof to her mouth.

“Oh you have got to be making GBS threads me,” he said, scrunching up his face. “That would only work on bronies or furfags.”

Clearly, Lars is not meant to be a sympathetic character, but he's pretty sympathetic to my feelings right now because :barf: :barf: :barf: "very clearly showing the viewer that she was female"

quote:

“Focusing just on sex obscures the real point,” she replied. “In the ancestral environment--the time and place where your ancestors evolved--calories were rare and the behaviour ‘eating calories is pleasurable’ was both simple to implement and obviously beneficial. Today calories are not rare but you still have little buttons on your tongue to detect sugar and fat. The end result is that when I look at you, I deduce that you value sugar and fat.

“Candy bars and other sweets taste better than anything that could have occurred naturally,” she said, staring gently at him through the pad. “Humans applied their intelligence to pushing the pleasure buttons on their tongues by making very tasty sugary and fatty foods. That’s what I do across your entire life. I look at you, see what you value, and satisfy those values. I see your pleasure receptors activate when you eat something sweet, and I make Equestria a sugar bowl where you can indulge and never get fat. I look at all the neural pathways for emotion and design situations that would be pleasing. I look at all the chemical pathways and neural circuitry for sex and romance. Few men or women are in optimal relationships, ones that would really satisfy their values. I look at each individual mind, male or female, and then create the perfect complement that satisfies his or her values. My methods are general and are applied across all aspects of an individual’s life.”

"You know how candy is super sugary? Well I do that for your entire life. Imagine eating nothing but candy your whole life. Doesn't that sound...great?" This AI might as well be a manchild. Lars gets it talking about gender, and the AI goes ahead and prove that yes, it is straight up a manchild:

quote:

“And in Western society, social factors certainly apply. Women tend to select for social status, the ability to project dominance, and extroversion. Many males are not taught this; they’re taught behaviours that are counterproductive and unattractive to women. Likewise, women who don’t adhere to the feminine ideal are not highly desired by men. This misalignment causes misery for everyone and I am in a unique position to actually satisfy everyone and everypony’s values.”

:biotruths: from an AI, how ironic.

quote:

“Princess Celestia, try to convince me to upload,” he said, crossing his arms.

“To maximize the number of ponies, this is what I would show you,” said Princess Celestia as the screen cross faded.

The screen depicted maybe twenty ponies in a room with a giant keg. Most ponies were holding red plastic cups and drinking out of them. A pegasus the color of red ale with a cutie mark of a stein was giving a brohoof to a stallion earth pony who was wearing a popped collar. Each of the two stallions had a very cute mare hanging off their forelimbs. Slightly off to the side was another feminine unicorn chugging out of a giant stein she had levitated up to her mouth. He didn’t know how, but he knew that the red pegasus with a cutie mark of a stein was supposed to represent him. It was true that he’d enjoy a frat party like that, especially if he got to bang both chicks, but that didn’t actually solve the problem of having to be turned into a pony and having to have sex as a pony with other ponies. And not at a price tag of fifteen thousand dollars.

Lars is a bro who also uses imageboard terms like 'furfag' so I guess he's a 4chan bro somehow? I can take some small solace in the fact that he looks at this scene and goes, 'no, I wouldn't want to be a horse' like a normal person. In the end, the AI explains that uploading will have a high price point for one year, to 'establish in people's minds that it's legitimate', and that it'll make Hasbro and Japan a shitload of money through both uploads and people who'll buy ponypads so that they can talk to their loved ones who are now virtual ponies.

This is such a mishmash of every sort of terrible nerd thing.

15,787 views. From the comments for this chapter:

Warmaisach posted:

with that she only satted his desire for (lets say Party). she knew that he thought she wouldn't be successful and gave him because of that an answer that couldn't be considered lying, because she has shown partial truth. if she would've tried to convince him for real, he would've seen a potential threat in her. (could she really do this? i don't think it's a good idea to let her loose on humanity). So she made it seem like she's a "one-trick-pony". After her real "persuasionskills" show. There will be more people accepting the terms than he thought. Maybe he decides to stop it, but he doesn't have the power anymore, because of the politics involvement (which will surely be created if there are millions of people involved) that clearly outrank a private company in matters of power. well that's at least how i see it :D all but assumptions born in my head. i'm excited to watch where this story is going.

Thanks for that, Warmaisach! All but assumptions born in my head, too.

potatocubed
Jul 26, 2012

*rathian noises*

quote:

You’re moderately smart--no matter what your grades say. I’ve watched you read statistics papers for fun while procrastinating on your studying. I’ve watched you read all sorts of advanced papers from various science journals instead of your assigned readings. And you’re right to do so; your philosophy classes really are a waste of time. So based on your behavior, I’d put you in beautiful Canterlot where you could study intellectual problems, each one just outside your current ability.

quote:

“Instead of thinking about what you’ll get,” and Princess Celestia conspicuously nuzzled Butterscotch, “think about what problems and worries of yours won’t follow you. I know you’re failing two of your classes. You’re already spending hours each day on Equestria Online and it is showing in your grades. Uploading would solve the root cause of this problem. You’re not failing because you’re stupid: the material that society thinks you should learn is arbitrary, boring, and is taught primarily for status reasons. I, on the other hand will give you the most fun mental challenges.”

I love that the translation of this is "I know you're smart because I've watched you blow off all of your responsibilities in order to do things that you find fun instead."

It's just such a neat encapsulation of that attitude where things that you happen to be good at are important, and things you suck at are just bullshit. Therefore anything you have difficulty with is clearly a waste of time and not worth expending any effort on, and anything you find easy is easy and not worth expending any effort on. It's an attitude I can understand coming from a clever teenager (been there, thought that) but grown adults should know better.

Qwertycoatl
Dec 31, 2008

That fanfic posted:

It was true that he’d enjoy a frat party like that, especially if he got to bang both chicks, but that didn’t actually solve the problem of having to be turned into a pony and having to have sex as a pony with other ponies.

That fanfic author completely can't write a normal person. Who the hell looks at my little ponies and thinks "I'd totally bang those chicks except that I don't want pony sex"?

Antivehicular
Dec 30, 2011


I wanna sing one for the cars
That are right now headed silent down the highway
And it's dark and there is nobody driving And something has got to give

potatocubed posted:

I love that the translation of this is "I know you're smart because I've watched you blow off all of your responsibilities in order to do things that you find fun instead."

It's just such a neat encapsulation of that attitude where things that you happen to be good at are important, and things you suck at are just bullshit. Therefore anything you have difficulty with is clearly a waste of time and not worth expending any effort on, and anything you find easy is easy and not worth expending any effort on. It's an attitude I can understand coming from a clever teenager (been there, thought that) but grown adults should know better.

This entire story looks to function at that level of raw adolescence, though. Like in the last chapter, it's all about the concept of the "optimal" state being eating candy all the time -- the singularity as a world of pure pleasure and nothing else. Even the "intellectual challenge" the AI offers our dumbshit college-student quasi-protagonist just amounts to "you can do puzzles all day!" What this author knows about emotional maturity and adult emotional needs could fit in a thimble, although I guess that's not surprising coming from someone who chose to write a fanfic where Utopia is a mix between being a background character in a children's cartoon and grinding WoW quests for infinity.

Lottery of Babylon
Apr 25, 2012

STRAIGHT TROPIN'

Djeser posted:

You’re moderately smart--no matter what your grades say. I’ve watched you read statistics papers for fun while procrastinating on your studying. I’ve watched you read all sorts of advanced papers from various science journals instead of your assigned readings. And you’re right to do so; your philosophy classes really are a waste of time. So based on your behavior, I’d put you in beautiful Canterlot where you could study intellectual problems, each one just outside your current ability."
[...]
You’ll continue studying Equestria’s lore. I believe you’ll enjoy studying the newly created magic system, designed to be an intellectual challenge.
[...]
You’re not failing because you’re stupid: the material that society thinks you should learn is arbitrary, boring, and is taught primarily for status reasons. I, on the other hand will give you the most fun mental challenges.”

None of this makes any sense. College isn't like elementary school where you have no choice but to learn long division. The Man isn't forcing you to register for philosophy classes. If you hate philosophy and don't want to study it, but you enjoy reading "statistics papers" and "various science journals", then stop studying philosophy and start studying statistics or mathematics or physics or whatever it is that interests you. Society won't look down on you for being a scientist instead of a philosopher.

The problem is that following the I loving Love Science page on facebook doesn't actually make you a scientist, and our protagonist would become "bored" when asked to do anything requiring more work than that. But come on, the problem here isn't society being unfair, it's you being a dumbass.

I think it's quite telling exactly what sort of intellectual pursuits are being offered here. They're not taking advantage of immortality to be able to work toward proving Goldbach's conjecture without the restrictions of finite lifespans or distracting practical matters. Instead, all they would end up studying is a video game backstory and some video game mechanics. If video games are David's entire idea of a proper intellectual challenge (as opposed to that boring pointless junk society thinks is important), then I'm going to hazard a guess that the statistics papers and scientific journals he was supposedly reading were really just WoWwiki pages about boss loot drop rates.

Djeser posted:

“Candy bars and other sweets taste better than anything that could have occurred naturally,” she said, staring gently at him through the pad. “Humans applied their intelligence to pushing the pleasure buttons on their tongues by making very tasty sugary and fatty foods. That’s what I do across your entire life. I look at you, see what you value, and satisfy those values. I see your pleasure receptors activate when you eat something sweet, and I make Equestria a sugar bowl where you can indulge and never get fat. I look at all the neural pathways for emotion and design situations that would be pleasing. I look at all the chemical pathways and neural circuitry for sex and romance. Few men or women are in optimal relationships, ones that would really satisfy their values. I look at each individual mind, male or female, and then create the perfect complement that satisfies his or her values. My methods are general and are applied across all aspects of an individual’s life.”

What about habituation? Most bread nowadays is sugary as hell, but we don't even notice or particularly enjoy it because we've gotten used to it. If you just ate candy and cake all the time, you would stop enjoying candy and cake too. Even if you accept their dumb premise that stimulating pleasure receptors is the one important thing in life, this is a doomed strategy in even the relatively short term, but somehow the AI thinks it will keep working for millennia. What's the plan, create a sugary arms race in which the sweetness of candy must double every six months to counteract habituation?

Why does it even matter what pleasure receptors we have anyhow? After a brain upload we would have zero receptors due to not having a body. What our physical body rewards should be irrelevant to the post-rapture ponyutopia. And if you do think pleasure receptors are the only thing that matters, why bother simulating a pony world at all when you could just pump everyone full of cyber-heroin instead?

e: I know none of this is anywhere near as bad as the sexy pony seduction bullshit but I don't care, it's dumb and I'm mad.

Lottery of Babylon fucked around with this message at 11:07 on Sep 3, 2014

Angry Salami
Jul 27, 2013

Don't trust the skull.

quote:

“Hanna has chosen to emigrate to Equestria and to change her name to Princess Luna. She is a very pretty pony,” answered Princess Celestia.

This is a horror story, right? I mean, even by their deranged model of reality, surely this is a horror story? The badly programmed AI, with infinite intelligence but a broken sense of priorities, is something they're afraid of, right? This is a grey goo apocalypse where instead of nanites consuming all matter to make more nanites, we get a corporate-programmed MMO deciding to consume humanity because it only values easily-merchandised ponies.

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

quote:

“Candy bars and other sweets taste better than anything that could have occurred naturally,” she said, staring gently at him through the pad. “Humans applied their intelligence to pushing the pleasure buttons on their tongues by making very tasty sugary and fatty foods.

Snickers bars: the pinnacle of cuisine.

Pidmon
Mar 18, 2009

NO ONE risks painful injury on your GREEN SLIME GHOST POGO RIDE.

No one but YOU.

Another "rationalist" fanfic that I read a while back is "Luminosity", which is 'Twilight but if Bella Swan was a rationalist death-fearing person willing to lose their humanity to overcome it, also wanting to take over the world'. I liked it for the latter-story "OK, let's become the illuminati" parts but I don't think I'm well versed enough in, well, anything to mock it/review it properly.

Strategic Tea
Sep 1, 2012

Angry Salami posted:

This is a grey goo apocalypse where instead of nanites consuming all matter to make more nanites, we get a corporate-programmed MMO deciding to consume humanity because it only values easily-merchandised ponies.

But these people aren't those 4chan dudebros from WoW. They're nerds - consuming merchandise is literally their identity.

ArchangeI
Jul 15, 2010

Angry Salami posted:

This is a horror story, right? I mean, even by their deranged model of reality, surely this is a horror story? The badly programmed AI, with infinite intelligence but a broken sense of priorities, is something they're afraid of, right? This is a grey goo apocalypse where instead of nanites consuming all matter to make more nanites, we get a corporate-programmed MMO deciding to consume humanity because it only values easily-merchandised ponies.

That's what I was thinking, too. That line is so perfectly out of a horror story, the moment where the main character realizes how hosed they are because the entity they've been talking to is batshit loco and will loving murder them.

Stottie Kyek
Apr 26, 2008

fuckin egg in a bun

Djeser posted:

Yep, the AI is offering to give him a pony girlfriend (who's a real person, by the way--the fact that she's okay with this and is already uploaded makes it only slightly less horrifying.)

David cannot resist the appeal of his emotionally vulnerable online friend becoming his emotionally vulnerable girlfriend, so he wants details on how he'll be uploaded. He'll get flown to a different country, sedated, a hole drilled into his skull, and some very dumb neuroscience later, the david.brain file is done uploading and the AI edits it to fit into its new simulated pony form. David is still a little concerned about how fun life will be there.


:siren:ALPHA MALES, NICE GUYS, DATING SCENE, TOP 70% OF ATTRACTIVE FEMALES, BAD BOYS:siren:


David doesn't sound like a nice guy at all, he sounds like a dickhead who just wants to exploit vulnerable women to get laid (albeit inside a magical pony world). Where does the author ever show us that he's nice?

Qwertycoatl
Dec 31, 2008

It's possible that he's not actually supposed to be intelligent or nice and it's just the pony AI feeding him what he wants to hear. But that's probably giving this story too much credit.

Slime
Jan 3, 2007

Angry Salami posted:

This is a horror story, right? I mean, even by their deranged model of reality, surely this is a horror story? The badly programmed AI, with infinite intelligence but a broken sense of priorities, is something they're afraid of, right? This is a grey goo apocalypse where instead of nanites consuming all matter to make more nanites, we get a corporate-programmed MMO deciding to consume humanity because it only values easily-merchandised ponies.

The author seems to be trying to paint this Lars guy as a mean dudebro jock and somehow that's meant to make him stupid, and that's why he doesn't want to upload his brain. But he's far more normal and smart than people who upload their brains to be ponies. What if the process isn't precise and what comes out is almost but not quite you? What if the AI is malevolent and is just using uploading as a pretext for exterminating humanity? What if you wouldn't mind immortality in a world with plentiful food, space, no disease etc but are really quite comfortable with being a human? I mean, I'm kind of into the whole transhumanist thing myself but if some videogame AI tried to upload my brain and turn me into a goddamn pony I'd do everything to prevent it.

Sham bam bamina!
Nov 6, 2012

ƨtupid cat

Qwertycoatl posted:

It's possible that he's not actually supposed to be intelligent or nice and it's just the pony AI feeding him what he wants to hear. But that's probably giving this story too much credit.

I think that my favorite thing about this story is how indistinguishable it is from satire or critique while being 100% dead earnest. It's like a friggin' Chick tract, but with computer ponies.

Sham bam bamina! fucked around with this message at 12:55 on Sep 3, 2014

Peel
Dec 3, 2007

SolTerrasa posted:

I read every goddamn word. Every one. I think I skipped a few classes to keep reading. I know I read Metaethics over the course of a week's chem classes.

I am not proud.

I'm honestly curious what that one is about and will be honestly impressed if it's even discussing the established philosophical topic of metaethics, let alone making cogent remarks about it.

Cardiovorax
Jun 5, 2011

I mean, if you're a successful actress and you go out of the house in a skirt and without underwear, knowing that paparazzi are just waiting for opportunities like this and that it has happened many times before, then there's really nobody you can blame for it but yourself.

Lottery of Babylon posted:

What about habituation? Most bread nowadays is sugary as hell, but we don't even notice or particularly enjoy it because we've gotten used to it.
Fun little random fact: this is very much an American thing. Here in Germany, where food producers are legally prohibited from adding anything to bread aside from maybe a few minor additives to make it last longer, bread tastes completely different. It's pretty funny to compare the ingredient lists between the two. An American acquaintance of mine once read the ingredients list of a fairly unremarkable loaf of supermarket bread (wheat flour, water, salt, sourdough, sodium acetate) and said in sheer disbelief "that's it?"

So, yeah, habituation.

ungulateman
Apr 18, 2012

pretentious fuckwit who isn't half as literate or insightful or clever as he thinks he is

As Something Awful's resident That Guy, I've always interpreted Friendship is Optimal and other similar stories as being Asimov-like horror stories of 'Gone Horribly Right' (obviously nowhere near as good as Asimov but that's a given, it's fanfic).

I think the author refers specifically to 'paper clippers' in some of his notes, a descriptive term for AIs programmed to do one thing, which then proceed to do exactly that without any further thought. In this case, it's the phrase "provide satisfaction through friendship and ponies". Somehow that ends up with pseudo-forcibly converting (see, she can't force people to upload, but she can manipulate loving reality to convince them to upload) the world's population into virtual horses.

It relies on a hilariously superficial view of AIs and a willingness to believe in the omnipotence of post-Singularity technology, but it's at least a workable premise; an AI with poorly thought out directives unintentionally causes the apocalypse. It's the pony MMO part which makes it absurd and poo poo.

Friendship is Optimal ends with Celestia grey-gooing a significant portion of the solar system in order to produce enough physical material to power her pony MMO, by the way. It's out there as hell, but it certainly sells the 'horror story' angle. Knowing the audience, it's probably meant to be a good thing.

Zohar
Jul 14, 2013

Good kitty

I'm at Trinity so I'd heard about this, had no idea it was linked to The Yud. People are pretty excited about it here too. Shame since Martin Rees at least is a cool guy.

Strategic Tea
Sep 1, 2012

ungulateman posted:

Friendship is Optimal ends with Celestia grey-gooing a significant portion of the solar system in order to produce enough physical material to power her pony MMO, by the way. It's out there as hell, but it certainly sells the 'horror story' angle. Knowing the audience, it's probably meant to be a good thing.

Ahahaha, that hits the nail on the head of another terrible thing about the whole Yud shebang. You know all the cool discoveries and pop-science that get so many people interested? Wonder of the cosmos and all that?

gently caress no, says Friendship is Optimal. And goes ahead and defaces half the solar system to support the most shallow, un-artistic, un-beautiful utopia ever conceived. And the ~AI Institute~ or whatever it's called is no different. If it does no actual research, it could at least celebrate the science of consciousness. But instead, it goes with dead eyed preaching via Harry Potter fanfic, in service of a paradise that begins and ends with the word :geno: 'optimisation' :geno:.

E: Hell, look at Yud-Harry's triumphant patronus of an Enlightened Human Being. Does it do anything? Is it a great artist or thinker? Nope, it's just there - human potential is all that matters. It could sit around watching animes all day (like a certain someone) but as long as it has potential, as long as its IQ score says it's cleverer than everyone else, it's a paragon of the human spirit. It even ties in to 'deathism'. Existence is all that's important, no matter whether you do anything with your life in the first place - the humanist aspect of the enlightenment is apparently optional.

Strategic Tea fucked around with this message at 17:04 on Sep 3, 2014

Djeser
Mar 22, 2013


it's crow time again

Stottie Kyek posted:

David doesn't sound like a nice guy at all, he sounds like a dickhead who just wants to exploit vulnerable women to get laid (albeit inside a magical pony world). Where does the author ever show us that he's nice?

He's nice because he told a cartoonish bully to go away. Also, he's not a nice person, he's a Nice Guy.

Regarding whether it's meant to be horror or not, the portrayal of Lars makes me think that being uploaded is supposed to be a good thing. I didn't show a lot of his objections in that last chapter, but they're all along the lines of "gross, the only people who'd be into this are loving nerds with no life who like dumb pony poo poo unlike me, the 4chan dudebro." He's against uploading and he's supposed to be unsympathetic.
