Djeser
Mar 22, 2013


it's crow time again

rrrrrrrrrrrt posted:

So... are these people admitting that they don't read or are afraid of fiction because they essentially can't discern it from reality? I mean, it's couched in the argument that too much fiction could subconsciously impart or affect existing cognitive biases and render them unable -- or at least less able -- to recognize ~*The Truth*~, but it sure sounds like they're afraid of fiction. But hey, Buffy the Vampire Slayer is A-OK because...

Because Buffy is so realistic (given certain priors e.g. vampires) that it's basically nonfiction.

I think what they're saying is that reading a lot of fiction makes you want to ascribe fictional elements to real life. You know, like assuming that because you're made fun of like the nerds in movies, you must also be smart like the nerds in movies. Or that because [trope] exists in fiction, [trope] must exist in reality.

Wait, are we talking about Less Wrong or about TVTropes' Truth In Television page?

there's no difference :ssh:

90s Cringe Rock
Nov 29, 2006
:gay:
Stross' singularity also resurrected a lot of James Bonds, not just real people.

I don't think the Less Wrong types like Stross much though, judging by the comments he gets when he posts about how dumb they and libertarians in general are.

Olanphonia
Jul 27, 2006

I'm open to suggestions~

Darth Walrus posted:

This deserves to be quoted in full, because it's a glorious train o' crazy:


Another impressively deep, windy rabbit-hole in which Yudkowsky uses James Watson being a racist poo poo as a springboard to talk about racial differences in IQ:

I realize this is like 8 pages back, but reading The Mismeasure of Man and then seeing this post is like getting kicked in the loving nuts. Jesus loving christ it's literally just another version of the same attempts to rank the races.

The Vosgian Beast
Aug 13, 2011

Business is slow

Olanphonia posted:

I realize this is like 8 pages back, but reading The Mismeasure of Man and then seeing this post is like getting kicked in the loving nuts. Jesus loving christ it's literally just another version of the same attempts to rank the races.

Yudkowsky hates Gould with a fiery intensity

ikanreed
Sep 25, 2009

I honestly I have no idea who cannibal[SIC] is and I do not know why I should know.

syq dude, just syq!
I wish I could start a website on a shaky philosophical premise and accidentally develop a cult of personality that would start accepting all my worst ideas uncritically.

Hate Fibration
Apr 8, 2013

FLÄSHYN!
Haha. April fools.


quote:

There as here, I’d never heard the slightest hint that this sort of thing happens to people (and I still don’t know how, or why, or anything). I didn’t prepare to come here, and nobody else tried to prepare me for it. I wasn’t on a… career track, I guess you’d say… to be a scientist. Maybe a writer, though I have no idea how to calibrate my reception in this world against my probable reception at home. The character of Harry James Potter-Evans-Verres isn’t original to me, he’s the result of my attempt to set a completely cliche protagonist of the you-don’t-have-a-word genre in the setting of J. K. Rowling, and Harry has a cliche relationship with his cliche you-don’t-have-a-word Defense Professor, even if it all seems new enough here that people call it original. It’s not like I was reading highbrow literature where I came from. I was only a couple of years older than Eliezer Yudkowsky was, when I woke up in his body and with access to his previous memory.

And at the end

quote:

No, I’m actually from this Earth’s future, not from a different Earth. Ha ha! April Fools!

I miss my home.

:confuoot:

Edit:

Someone made this.

Hate Fibration fucked around with this message at 09:02 on May 1, 2014

FANSean
Nov 9, 2010
The fool is the one who actually bothers to read all that poo poo.

Aerial Tollhouse
Feb 17, 2011
Looking at that page led me to more of his fiction writing advice.

quote:

I was just struggling with this problem myself recently (while considering what could come after HPMOR) - the original protagonists I was envisioning didn’t seem to occupy the same tier of awesome as the HPMOR characters, and the premise of the story forbade that they be given larger problems to solve. (I couldn’t just demand that they save the world from Cthulhu, which would be the standard way of amplifying characters; it wouldn’t have fit in the story.)

Sham bam bamina!
Nov 6, 2012

ƨtupid cat
I'm beginning to question the value of splitting this discussion from the TV Tropes thread.

Shangri-Law School
Feb 19, 2013

I think Yudkowsky justified his Harry Potter having a Batman-level intellect by saying that an overpowered God-man isn't a Mary Sue if his opponent is equally powerful.

90s Cringe Rock
Nov 29, 2006
:gay:

Cruel and Unusual posted:

I think Yudkowsky justified his Harry Potter having a Batman-level intellect by saying that an overpowered God-man isn't a Mary Sue if his opponent is equally powerful.
So if you have a lovely mental image of Frodo with a lightsaber:

1. Figure out how to make his life more difficult, to make up for the lightsaber.

2. Decide what's going to happen differently in your fanfiction than in the other ones you've read.

3. Know what Frodo wants and what's going to get in his way, and have a plan for how it will all end.

potatocubed
Jul 26, 2012

*rathian noises*

chrisoya posted:

So if you have a lovely mental image of Frodo with a lightsaber:

1. Figure out how to make his life more difficult, to make up for the lightsaber.

Sauron also has a light saber! :aaa:

90s Cringe Rock
Nov 29, 2006
:gay:

potatocubed posted:

Sauron also has a light saber! :aaa:

Don't be silly.

quote:

Rule One: If you do anything to increase the protagonist's power, or make their life easier, you must also amplify their opponent or add extra difficulties to their life. You can't make Frodo a Jedi unless you give Sauron the Death Star.

E: that's from his fanfiction.net profile. I had a look at his favourite stories and found the Least Wrong thing:

quote:

Kyle Reese has traveled backwards in time, not to save Sarah Connor, but to help her rewrite the faulty utility function of Skynet. Together, it's possible that they might avert Judgment Day and save the world from nuclear Armageddon - and hopefully create a utopia ruled over by an AI god in the process. Fully completed. Diverges wildly from canon.
Come with me if you value Bayes.

E2: I'm serious. I've shot men before. this is amazing

90s Cringe Rock fucked around with this message at 08:01 on May 1, 2014

SerialKilldeer
Apr 25, 2014

FANSean posted:

The fool is the one who actually bothers to read all that poo poo.

It's worth skimming through for gems like these:

quote:

It’s not the evidence-based massage therapists who’ve been iterating their art with randomized experiments and competitions for 350 years (I was too young to have had sex-with-an-expert in dath ilan, which is a bittersweet thing to be grateful for).

Why am I not surprised that his utopia features master prostitutes?

quote:

We had laser zappers and other measures that destroyed bugs and mosquitos and wasps and bees - these were considered far more annoying in dath ilan than Earth, and our civilization put a lot of effort and technology into rooting them out, or preventing them from getting a foothold within the great city. On the “beware of trivial inconveniences” scale, I suspect that an absence of little flying bugs, to say nothing of bugs that bit and stung and made noises, might be part of why people did their daily work beneath sunlight, in open air. I think there was a variety of butterfly that was bred to pollinate flowers and such within cities, in place of bees - at least I know that we weren’t supposed to crush butterflies.

To hell with all the birds, amphibians, and other animals that eat insects. Also, I'm not an entomologist, but I doubt that butterflies would be nearly as efficient as bees for pollination.

quote:

So that you don’t think I’m being completely prejudiced or remembering home through rose-colored glasses, I’ll mention two places where Earth did far better than dath ilan, namely BDSM and macroeconomics.

That's, um, an interesting juxtaposition.

Somfin
Oct 25, 2010

In my🦚 experience🛠️ the big things🌑 don't teach you anything🤷‍♀️.

Nap Ghost

chrisoya posted:

Frodo with a lightsaber

Anyone who suggests this needs to read Tolkien and actually think about why they gave the ring to the hobbits, and why they brought along all of those backup hobbits. Here's a hint: It's not because hobbits can't be corrupted.

It's because if a hobbit gets corrupted by Sauron's influence, they end up as Gollum. Humans end up as ringwraiths, elves end up as pretty much another Sauron, and no-one knows what Dwarves end up like, but it's not going to be easy to put down. If Frodo went full evil, any duder on that quest could kill his rear end and hand the ring off to the next bearer.

Yudkowsky doesn't understand an under-informed protagonist. He only understands making sure that the main character is up against enough of a threat to still scare irritate him when he inevitably knows everything about everything.

Zore
Sep 21, 2010
willfully illiterate, aggressively miserable sourpuss whose sole raison d’etre is to put other people down for liking the wrong things

Somfin posted:

Anyone who suggests this needs to read Tolkien and actually think about why they gave the ring to the hobbits, and why they brought along all of those backup hobbits. Here's a hint: It's not because hobbits can't be corrupted.

It's because if a hobbit gets corrupted by Sauron's influence, they end up as Gollum. Humans end up as ringwraiths, elves end up as pretty much another Sauron, and no-one knows what Dwarves end up like, but it's not going to be easy to put down. If Frodo went full evil, any duder on that quest could kill his rear end and hand the ring off to the next bearer.

Yudkowsky doesn't understand an under-informed protagonist. He only understands making sure that the main character is up against enough of a threat to still scare irritate him when he inevitably knows everything about everything.

You really don't seem to understand Tolkien any more than Yudkowsky does if that's your take.

You're even doing the same drat thing of throwing away thematic and emotional reasoning in favor of spergy 'Optimal choice in this situation' bullshit.

Jesus, this recent trend of criticizing characters or authors for people not behaving like omnipotent emotionless robots is really loving irritating.

Djeser
Mar 22, 2013


it's crow time again

Yudkowsky's writing advice is more evidence for the synchronicity between this thread and the TV Tropes thread.

quote:

I find that fiction writing in general is easier for me when the characters I’m working with are awesome. The most important lesson I learned from reading Shinji and Warhammer 40K is that every character has the potential to be awesome - I’m thinking particularly here of when a random bridge bunny ends up holding off a riot using a fake movie-prop chainsword and Kensuke being turned into a Techpriest. Awesome characters are just more fun to write about, more fun to read, and you’re rarely at a loss to figure out how they can react in a story-suitable way to any situation you throw at them.

But what if you’re having trouble thinking of a sufficiently awesome character? Worse, what if your story doesn’t seem to call for one?
"Why do people even write characters that aren't Rule Of Awesome?"

Somfin
Oct 25, 2010

In my🦚 experience🛠️ the big things🌑 don't teach you anything🤷‍♀️.

Nap Ghost

Zore posted:

Jesus, this recent trend of criticizing characters or authors for people not behaving like omnipotent emotionless robots is really loving irritating.

I didn't think I was being critical of Tolkien. Everyone at the Council of Elrond lived through the years of Sauron ruining the world, apart from the hobbits. Elrond himself watched as the first human ringbearer failed to destroy Sauron, and put the almost-saved world back in peril. When Frodo puts his hand up, what do you think runs through Elrond's mind?

Mors Rattus
Oct 25, 2007

FATAL & Friends
Walls of Text
#1 Builder
2014-2018

Somfin posted:

I didn't think I was being critical of Tolkien. Everyone at the Council of Elrond lived through the years of Sauron ruining the world, apart from the hobbits. Elrond himself watched as the first human ringbearer failed to destroy Sauron, and put the almost-saved world back in peril. When Frodo puts his hand up, what do you think runs through Elrond's mind?

Here's a hint for you: Tolkien was writing mythology. It's about the bravery and fortitude of Frodo in volunteering, not 'oh this is the optimal person to pick.'

Caphi
Jan 6, 2012

INCREDIBLE
Hobbits are harder to corrupt. They're not immune, but the corruption works more slowly on them because they are straight up the simple, humble folk of the setting. Everyone else is always thinking about power and how to use it, even (as Galadriel demonstrates) if they tell themselves it's for the greater good. Hobbits, being metaphors for humble pastoral types, have something fundamental and pure buried in them that resists temptation. Gandalf even says this, how could anyone miss it?

neongrey
Feb 28, 2007

Plaguing your posts with incidental music.
I suppose one could charitably interpret the Death Star thing to mean that the challenges a character faces need to be credible threats against what they're intending to do.

But that's a lot less pat than the 'level 10 character must fight level 30 boss' idea I'm sure he's actually referring to.

Somfin
Oct 25, 2010

In my🦚 experience🛠️ the big things🌑 don't teach you anything🤷‍♀️.

Nap Ghost

Mors Rattus posted:

Here's a hint for you: Tolkien was writing mythology. It's about the bravery and fortitude of Frodo in volunteering, not 'oh this is the optimal person to pick.'

If Gandalf made the decision, I'd agree. Gandalf is full of hope and trust. But Gandalf isn't the one who makes that decision, it's Elrond. His opinions are that sort of calculated bullshit. Don't marry that man you love, daughter, because he's gonna die first, because you're an elf, and I don't want to think about you being sad. Instead, you should come to the undying lands and regret it forever!

I'm not saying that the character is inconsistent, or bad. Elrond's goals match Gandalf's, but his motivation is different.

Caphi posted:

Hobbits are harder to corrupt. They're not immune, but the corruption works more slowly on them because they are straight up the simple, humble folk of the setting. Everyone else is always thinking about power and how to use it, even (as Galadriel demonstrates) if they tell themselves it's for the greater good. Hobbits, being metaphors for humble pastoral types, have something fundamental and pure buried in them that resists temptation. Gandalf even says this, how could anyone miss it?

Gandalf doesn't know this. He hopes this. He's not omniscient, he doesn't even recognise the ring when he first sees it.

E: But that point about power is very true, now that I think about it a bit more.

Somfin fucked around with this message at 00:09 on May 2, 2014

Mors Rattus
Oct 25, 2007

FATAL & Friends
Walls of Text
#1 Builder
2014-2018

Again, it's not about 'this is the optimal choice.' It is about the mythic theme and metaphor here - it isn't about what Gandalf thinks or Elrond thinks. It's about what Tolkien thought. It's a book, there is no real Middle-Earth.

DStecks
Feb 6, 2012

Mors Rattus posted:

Again, it's not about 'this is the optimal choice.' It is about the mythic theme and metaphor here - it isn't about what Gandalf thinks or Elrond thinks. It's about what Tolkien thought. It's a book, there is no real Middle-Earth.

Yeah, because it's not like writing in-character justifications for actions is basic storytelling or anything. :shepface:

Mors Rattus
Oct 25, 2007

FATAL & Friends
Walls of Text
#1 Builder
2014-2018

DStecks posted:

Yeah, because it's not like writing in-character justifications for actions is basic storytelling or anything. :shepface:

Oh, sure.

Except that, in this case, we're arguing about character motivations when the truth is that authors control characters, how they are written and what they do. And Tolkien had a very clear theme and arc in mind. He shaped his story around that theme and the type of story he wanted to tell.

If you think the point of the Rivendell scenes was to show optimal character action - or that Elrond Half-Elven was an optimizing schemer who accepted Frodo's volunteering out of some weird calculating bullshit - then you have very much missed the entire point of not only the scene but the story in general.

Somfin
Oct 25, 2010

In my🦚 experience🛠️ the big things🌑 don't teach you anything🤷‍♀️.

Nap Ghost

Mors Rattus posted:

If you think the point of the Rivendell scenes was to show optimal character action - or that Elrond Half-Elven was an optimizing schemer who accepted Frodo's volunteering out of some weird calculating bullshit - then you have very much missed the entire point of not only the scene but the story in general.

I never argued that this was 'the point' of the Rivendell scenes. I do think that there is more to Elrond's acceptance of Frodo's position as Ringbearer than thematic resonance or blind hope. Each of the characters in the story has their own justifications for everything they do, driven by their personality, history, and assumptions they make about people. Elrond doesn't trust mortals. It wouldn't make sense for him to just go with the vague hope, when The Best Human already failed.

Yudkowsky's writing style is based on forcing characters to obey themes, regardless of motivation or personality. Harry gives a lecture because IT PROVES RATIONALITY, his opponent gets flustered because DUMB PEOPLE GET FLUSTERED AROUND SMART PEOPLE, Draco tells Harry he plans to rape a girl because EVIL PEOPLE DO RAPE. Tolkien's narrative decisions, by contrast, all shape believable character motivations.

DStecks
Feb 6, 2012

Mors Rattus posted:

Oh, sure.

Except that, in this case, we're arguing about character motivations when the truth is that authors control characters, how they are written and what they do. And Tolkien had a very clear theme and arc in mind. He shaped his story around that theme and the type of story he wanted to tell.

If you think the point of the Rivendell scenes was to show optimal character action - or that Elrond Half-Elven was an optimizing schemer who accepted Frodo's volunteering out of some weird calculating bullshit - then you have very much missed the entire point of not only the scene but the story in general.

Characters doing things for reasons = weird calculating bullshit, apparently.

Jenny Angel
Oct 24, 2010

Out of Control
Hard to Regulate
Anything Goes!
Lipstick Apathy

Djeser posted:

"Why do people even write characters that aren't Rule Of Awesome?"

Which was strange to read initially, given my (admittedly limited) understanding of Methods of Rationality. What I recall from the first ten or so chapters was a never-ending stream of characters who weren't awesome, who were weak and inarticulate - or who could at least be reduced to weak inarticulacy whenever Harry wanted to long-windedly dunk on them. You can see the thought process in Yudkowsky's head, going down a list of beloved Harry Potter characters: "You're stupid. You're stupid. You're worthy. You're sorta worthy, but not enough so that I won't write you as a buffoon every so often." If Yudkowsky were writing as many badasses as he's implying, then surely they'd be able to "react in a story-suitable way to any situation you throw at them" beyond "getting frustrated with Harry's obviously superior intellect and shouting at him to shut up"

But I don't think there's an actual contradiction of terms here. Yudkowsky says that the awesome characters are the ones that are more fun to write about, and that's what I feel is going on here. Yudkowsky has his characters that he's blessed with awesomeness, and those are the ones he's writing about. Everyone else is essentially a drone, purely functional, on standby for whenever the wish-fulfillment characters need someone to clown on.

It's a mentality that comes up fairly frequently in socially isolated nerdy types who eventually find a social circle in the form of a forum, their college's geek club, etc. 99% of people are chaff: jocks, bimbos, "Wal*Mart people", etc., while the remaining 1% are the truly interesting folks. And yes, it just so happens that "truly interesting folks" has perfect overlap with "the people who are willing to put up with me for more than 5 minutes", why do you ask?

I apologize for skirting the territory of amateur psychology hour here, but this is the same kind of mentality I see whenever Yudkowsky talks about ethics. It's why we again get an apparent contradiction of terms - a guy who fervently insists that he's a deep ethical thinker with deep ethical foundations, whose hypothetical AIs are always friendly and benevolent, and who inevitably takes 30 seconds to devolve into "and that's why THIS instance of eternal torture is not only justified but morally necessary". Creating healthy, safe, and prosperous societies is a goal, but it's so utterly removed from any real human consequence. The vast majority of people are extras, NPCs, utterly unimportant - except as inputs into an equation that generates maximum utility.

"Please don't run a calculation to determine the minimum number of utilons you'd need to generate in order to justify arbitrarily torturing me", she said. But who cares about her take, she's not a character nearly awesome enough to write about.

Djeser
Mar 22, 2013


it's crow time again

Regarding the goal of creating a utopia while being utterly divorced from reality, Yudkowsky has argued that the ideal way to contribute to charity is to maximize the utility of each dollar spent. (Actually, he specifically says that you should do minor things that make you feel good for the "warm fuzzies", then ruthlessly number-crunch until you figure out which charity is best.) Remember, though, that Yudkowsky treats utility not just as a function of the expected returns, but also of the probability of those returns.

For instance, I think he's estimated something like 8 billion trillion lives have a 1 in one trillion chance of being saved by his nonprofit, if they could only get one billion dollars. Therefore this means that statistically, that billion dollars is equivalent to 8 billion lives. With some quick division, you come out to eight lives saved per dollar.

So that way, instead of feeding a family for a month with $400, you can donate to AI research and save the lives of 3,200 people with your $400. Clearly, the rational choice is to fund AI development!

Basically, back volunteer organizations with preposterous claims about how unlikely it is that they will save a really big number of people.
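
Spelled out, the arithmetic being mocked is just this (a minimal Python sketch using the post's own made-up figures; none of these numbers are real estimates):

code:

# Expected-value arithmetic from the figures above.
# Every number is the hypothetical one quoted there, not a real estimate.

lives_at_stake = 8 * 10**21        # "8 billion trillion lives"
odds_against = 10**12              # "a 1 in one trillion chance"
donation_needed = 10**9            # "one billion dollars"

expected_lives_saved = lives_at_stake // odds_against      # 8 billion "statistical" lives
lives_per_dollar = expected_lives_saved / donation_needed  # 8.0

print(lives_per_dollar)            # 8.0 lives per dollar
print(lives_per_dollar * 400)      # 3200.0 "lives saved" by a $400 donation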

Laserjet 4P
Mar 28, 2005

What does it mean?
Fun Shoe
The name didn't immediately register when I read the thread title. Then I read the OP.

A few years back I ended up on the Singularity Institute page. Back then I still thought that was a pretty neat idea. I read the mission statement; making a friendly AI, hey, cool. So what's the state of the research? Hm, nothing. So it's a webpage for a concept. OK - could be that this dude just started doing this and was still laying down his Three Laws or whatever.

Then, quite a while later, I returned to the site and still nothing. No code samples, no working anything, except for this post quoted here (yeah, it's from a few days back, but credit where it's due, etc.)

su3su2u1 posted:

Here is Yudkowsky suggesting that the elite REALLY ARE BETTER: http://lesswrong.com/lw/ub/competent_elites/ Among the many, many money shots are these quotes: "So long as they can talk to each other, there's no point in taking a chance on outsiders [non-elites] who are statistically unlikely to sparkle with the same level of life force."

Yeah, that sort of sounded like he was ready to drop on his knees to service those true alpha bros of which he could never be a part. He was whining about not being part of the Shiny Happy People and still didn't show jack poo poo in terms of developing that AI, of which still no trace was to be found. Perhaps he thinks that as long as he's going to write enough :words: it'll spring into being magically all by itself.

Anyway, I got cured of the Singularity stuff after reading enough scifi where the concept was pretty much demolished, or at least shown to not be as attractive as I initially thought it would be. It does, however, seem to be a kind of 'god mode engaged' trope in SF, which sort of ruins the rest of the open world.

Jenny Angel
Oct 24, 2010

Out of Control
Hard to Regulate
Anything Goes!
Lipstick Apathy

Djeser posted:

Regarding the goal of creating a utopia while being utterly divorced from reality, Yudkowsky has argued that the ideal way to contribute to charity is to maximize the utility of each dollar spent. (Actually, he specifically says that you should do minor things that make you feel good for the "warm fuzzies", then ruthlessly number-crunch until you figure out which charity is best.) Remember, though, that Yudkowsky treats utility not just as a function of the expected returns, but also of the probability of those returns.

For instance, I think he's estimated something like 8 billion trillion lives have a 1 in one trillion chance of being saved by his nonprofit, if they could only get one billion dollars. Therefore this means that statistically, that billion dollars is equivalent to 8 billion lives. With some quick division, you come out to eight lives saved per dollar.

So that way, instead of feeding a family for a month with $400, you can donate to AI research and save the lives of 3,200 people with your $400. Clearly, the rational choice is to fund AI development!

Basically, back volunteer organizations with preposterous claims about how unlikely it is that they will save a really big number of people.

And of course, the first half of this claim is absolutely a wise and moral one: whenever someone's donating to charity, they should do their due diligence and make sure it's one where the money will go to actual charitable work, as opposed to 90% of it being spent on advertising and overhead, etc. It really is something moral that you can just math out - I don't think there's a moral imperative to only ever donate to the charity that comes out at the top of the percentage list, but absolutely eliminate the ones near the bottom of the scale and be really careful about the ones in the middle.

But then he veers away from math into arbitrary fantasy land. Where did his "1 in 1 trillion chance" figure come from? From nowhere, essentially, given that it's a nice round meaningless number that just so happens to prove his point. Nope, Yudkowsky, I say that instead your charity has a 1 in 100 quintillion chance of working, and my claim is equally well-backed-up. Hell, my claim's actually the more accurate one, because my probability for "Yudkowsky's charity works" is closer to its real probability of zero, but of course zero isn't a real probability and all that jazz.

When someone's moral arguments are based on calculating utility, that's theoretically fine but generally very, very suspect. When those arguments start being predicated on arbitrarily-chosen probabilities, and the probabilities always work out to prove the point that the person was trying to make before any numbers entered the equation... well, that's a bit of a problem.

Somfin
Oct 25, 2010

In my🦚 experience🛠️ the big things🌑 don't teach you anything🤷‍♀️.

Nap Ghost

Jonny Angel posted:

But then he veers away from math into arbitrary fantasy land. Where did his "1 in 1 trillion chance" figure come from? From nowhere, essentially, given that it's a nice round meaningless number that just so happens to prove his point. Nope, Yudkowsky, I say that instead your charity has a 1 in 100 quintillion chance of working, and my claim is equally well-backed-up. Hell, my claim's actually the more accurate one, because my probability for "Yudkowsky's charity works" is closer to its real probability of zero, but of course zero isn't a real probability and all that jazz.

Yes, but don't you understand that it could potentially help seventy thousand bajillion people? HOW CAN YOU TURN YOUR BACK ON THEM

Ugly In The Morning
Jul 1, 2010
Pillbug
I'd like to see Yudkowsky do this poo poo like I used to have to do my freshman engineering assignments, where you had to write the question, then your assumptions, then actually do the math out.

Because I'm pretty sure if he did that for even a basic assignment, the assumptions, and everything after, would have a big red "X" and "gently caress YOU!" written through them.

Chamale
Jul 11, 2010

I'm helping!



Ugly In The Morning posted:

I'd like to see Yudkowsky do this poo poo like I used to have to do my freshman engineering assignments, where you had to write the question, then your assumptions, then actually do the math out.

Because I'm pretty sure if he did that for even a basic assignment, the assumptions, and everything after, would have a big red "X" and "gently caress YOU!" written through them.

It's the Paul Ryan approach to math. "Here are some numbers, and then there's the variable n, where n is just big enough to make the other numbers work out."

Zohar
Jul 14, 2013

Good kitty
http://lesswrong.com/r/discussion/lw/k66/the_extended_livingforever_strategyspace/

quote:

“You see my Wager suggests you should worship the largest subset of non-contradictory Gods you possibly can; although I acknowledge that the probability of selecting the true God out of all of God-space is small (and for that God to both exist and select for heaven based on faithfulness is also unlikely), the payoff is sufficiently wonderful to make it worth the small up-front cost of most religions’ declarations of faith. I can only imagine what sort of a fanatic would seriously propose this argument in a Universe where all Gods demand you sample only once from God-space!”

Same.
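
For the record, the "extended wager" logic in that quote is just a payoff-times-probability game; here's a toy Python sketch with completely invented numbers (a million gods, a uniform prior, an astronomically large reward), where the only point is that any nonzero probability times a big enough payoff swamps the small cost of one more declaration of faith:

code:

# Toy formalization of the "worship the largest subset of gods" wager.
# All numbers are invented for illustration.

n_gods = 10**6              # size of "God-space" (made up)
payoff_if_right = 10**12    # "sufficiently wonderful" reward (made up)
cost_per_faith = 1.0        # "small up-front cost" per declaration (made up)

def expected_value(num_worshipped):
    # Chance the one true god is in your worshipped subset (uniform prior),
    # times the payoff, minus the cost of each declaration of faith.
    return num_worshipped * payoff_if_right / n_gods - num_worshipped * cost_per_faith

print(expected_value(1))       # 999999.0
print(expected_value(1000))    # 999999000.0 -- keeps growing with every god added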

CROWS EVERYWHERE
Dec 17, 2012

CAW CAW CAW

Dinosaur Gum

It's a good thing no major religions state their god/pantheon as the only one true god/pantheon.

GEORGE W BUSHI
Jul 1, 2012

The most rational man on earth.

Runcible Cat
May 28, 2007

Ignoring this post

Mors Rattus posted:

Has Yudkowsky or any of his buddies ever explained how the AI is supposed to get enough information to perfectly reconstruct the lives and thought processes of an arbitrarily large number of humans?

I mean, to perfectly simulate me, it has to be able to perfectly simulate everyone I've ever interacted with at least well enough to provide the same environment, plus everything I have ever done, plus every electrochemical interaction within my body.

And for all that the world is currently significantly more surveilled and recorded than it was, say, 60 years ago, most of that information still isn't being recorded. I don't have an EEG monitoring my brain constantly. I don't have chemical analysis going on in my lower GI tract whenever I eat something. And this has to be a perfect simulation, because any imperfection will imply that Timeless Decision Theory is incorrect and that the simulation is, in fact, not exactly the same as I am and cannot be used to accurately predict or model behavior.

So, even assuming infinite processing power and infinite time in which to process, where the gently caress is this information coming from?
You're looking at it from the wrong end. The basic question is how do you know you're not a simulation who thinks it and its environment are real? How would a simulation know? What basis for comparison would it have? How do you know that the bits of the universe you're not looking at/interacting with aren't just blurry rhubarby low-fidelity crap? Etc etc nerd :tinfoil:

Come to think of it I have proof I'm in AI hell. Cats. Nothing like those perverse, obnoxious little shits could have evolved in a rational universe, let alone conned people into making pets of them; ergo they were created by a malicious AI to torment people. Namely me. Stop fighting to the death to stop me getting your medicine down your throat you furry fuckwit!

Somfin
Oct 25, 2010

In my🦚 experience🛠️ the big things🌑 don't teach you anything🤷‍♀️.

Nap Ghost

Runcible Cat posted:

You're looking at it from the wrong end. The basic question is how do you know you're not a simulation who thinks it and its environment are real? How would a simulation know? What basis for comparison would it have? How do you know that the bits of the universe you're not looking at/interacting with aren't just blurry rhubarby low-fidelity crap? Etc etc nerd :tinfoil:

Come to think of it I have proof I'm in AI hell. Cats. Nothing like those perverse, obnoxious little shits could have evolved in a rational universe, let alone conned people into making pets of them; ergo they were created by a malicious AI to torment people. Namely me. Stop fighting to the death to stop me getting your medicine down your throat you furry fuckwit!

But... if that's true, that means that I can't be in AI hell, which means that slight delays in my internet connection must be the natural result of the world's random nature, rather than an AI loving with me!

Runcible Cat
May 28, 2007

Ignoring this post

Somfin posted:

But... if that's true, that means that I can't be in AI hell, which means that slight delays in my internet connection must be the natural result of the world's random nature, rather than an AI loving with me!
There are 3^^^3 AI hells, you're probably in a different one.

The Hell of the Imperfect Internet Connections.
