BENGHAZI 2
Oct 13, 2007

by Cyrano4747

ate all the Oreos posted:

It took me a while to realize we weren't talking about Electronic Arts.

...mostly because I don't doubt for a second that Electronic Arts is rampant with sexual harassment.

Oh thank Christ I wasn't the only one


BENGHAZI 2
Oct 13, 2007

by Cyrano4747
I had a hard time reading that post because the lady was super far down the rabbit hole of rationalist woo woo poo poo and talking about being targeted based on facial structure and pheromones, so I ended up skimming it and the main takeaway imo is that rationalist dudes should be murdered

Darth Walrus
Feb 13, 2012

BENGHAZI 2 posted:

I had a hard time reading that post because the lady was super far down the rabbit hole of rationalist woo woo poo poo and talking about being targeted based on facial structure and pheromones, so I ended up skimming it and the main takeaway imo is that rationalist dudes should be murdered

I mean, that’s the whole thing, isn’t it? The rationalist mindset not only aids sexual predators, but also leaves victims dangerously underequipped to deal with sexual assault.

Syd Midnight
Sep 23, 2005

eschaton posted:

he’s of a kind with the Yudkowsky cult which definitely believes that

How do they rationalize thresholds? That "line on a chart can be extrapolated infinitely" is a bedrock of superstition and anti-scientific nonsense, but it's also a foundation of a lot of their beliefs. I'm so used to seeing it come from a different direction that things like "speck of dust" and "Moore's Law forever" just blend right in with satire like "Americans consume over 500 grams of deadly cyanide and 25 liters of toxic methanol each day in the form of applesauce, enough to kill thousands of people! [see TheTruthAboutApples.com]" or "By plotting maximum vehicle speeds from 1850-1950, we can see that relativistic interstellar travel will be commonplace by the year 2000."

The aforementioned suicide has been bothering me because it sounds like she was about to start outgrowing it due to life experience, the usual "I was into that once, it was pretty hosed up". You'd think that sort of death-before-disbelief would set off a few alarm bells among people worried about harmful memetics, but not when their mindset is the one that bypasses safety devices and disables alarms because those are meant for other people and are preventing them from carrying out their clever plan that will eventually be the subject of a somberly narrated CGI recreation.
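The runaway-extrapolation gag is easy to make concrete. Here's a toy sketch — the milestone speeds below are rough illustrative guesses, not real data: fit an exponential to a century of top speeds and the "trend" climbs forever, which is exactly the move being satirized.

```python
import math

# Rough, made-up top-speed milestones in km/h -- illustrative only.
milestones = {1850: 100, 1900: 160, 1930: 650, 1950: 1100}

# Least-squares fit of log(speed) = a*year + b, i.e. an exponential trend.
years = list(milestones)
logs = [math.log(v) for v in milestones.values()]
n = len(years)
y_mean = sum(years) / n
l_mean = sum(logs) / n
a = sum((y - y_mean) * (l - l_mean) for y, l in zip(years, logs)) \
    / sum((y - y_mean) ** 2 for y in years)
b = l_mean - a * y_mean

def extrapolate(year):
    """Blindly extend the fitted exponential past the data."""
    return math.exp(a * year + b)

# The fitted "trend" grows without bound; real speeds plateaued decades ago.
for year in (2000, 2100, 2500):
    print(year, round(extrapolate(year)))
```

Nothing in the fit knows about air resistance, physics, or economics, so the curve can only go up — the conclusion is an artifact of refusing to model a threshold.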

chitoryu12
Apr 24, 2014

Sax Solo posted:

a dust of speck

Can't say I've ever gotten bacon in my eyes, but there's a first time for everything.

there wolf
Jan 11, 2015

by Fluffdaddy

BENGHAZI 2 posted:

I had a hard time reading that post because the lady was super far down the rabbit hole of rationalist woo woo poo poo and talking about being targeted based on facial structure and pheromones, so I ended up skimming it and the main takeaway imo is that rationalist dudes should be murdered

She had this brand new idea for someone big and strong who would protect her from predators and she would give them sex in return. Patriarchy from first principles.

divabot
Jun 17, 2015

A polite little mouse!

Syd Midnight posted:

The aforementioned suicide has been bothering me because it sounds like she was about to start outgrowing it due to life experience, the usual "I was into that once, it was pretty hosed up". You'd think that sort of death-before-disbelief would set off a few alarm bells among people worried about harmful memetics, but not when their mindset is the one that bypasses safety devices and disables alarms because those are meant for other people and are preventing them from carrying out their clever plan that will eventually be the subject of a somberly narrated CGI recreation.

yeah, she was getting out of it, but:

quote:

I’ve lost a lot of the rationality I started with. I’m half way to social justice warrior. I am burnt out, heart broken, and filled with despair.

so she literally felt that gaining empathy would mean she couldn't be part of the only subculture she'd ever felt at home in.

Tunicate
May 15, 2012

Sax Solo posted:

The anti-deathist faction of these guys is so weird, like this obnoxious "Fable of the Dragon Tyrant". This stupid thing made me almost anti-allegory. I think, as a general rule, if you have to reduce human society to babby fairy tale idiot mode to explain your concept, you also might demonstrate that you are only thinking of human society in babby fairy tale idiot terms.


I remember seeing a similarly awful libertarian allegory comparing building roads to giving away free cheeseburgers at the White House, in order to prove that increasing supply will infinitely increase demand or some poo poo

Tacky-Ass Rococco
Sep 7, 2010

by R. Guyovich
There should be a Mock the Alt-Right thread and a Mock the Rationalists thread, because the OGs of the DE are gone now. What even happened to Mencius Moldbug?

Silver2195
Apr 4, 2012

Tacky-Ass Rococco posted:

There should be a Mock the Alt-Right thread and a Mock the Rationalists thread, because the OGs of the DE are gone now. What even happened to Mencius Moldbug?

Yeah, the DE proper no longer really exists. Even Nick Land's blog hasn't updated in months. Jacobite Magazine is still around, though.

Relevant Tangent
Nov 18, 2016

Tangentially Relevant

Syd Midnight posted:

How do they rationalize thresholds? That "line on a chart can be extrapolated infinitely" is a bedrock of superstition and anti-scientific nonsense, but it's also a foundation of a lot of their beliefs. I'm so used to seeing it come from a different direction that things like "speck of dust" and "Moore's Law forever" just blend right in with satire like "Americans consume over 500 grams of deadly cyanide and 25 liters of toxic methanol each day in the form of applesauce, enough to kill thousands of people! [see TheTruthAboutApples.com]" or "By plotting maximum vehicle speeds from 1850-1950, we can see that relativistic interstellar travel will be commonplace by the year 2000."

The aforementioned suicide has been bothering me because it sounds like she was about to start outgrowing it due to life experience, the usual "I was into that once, it was pretty hosed up". You'd think that sort of death-before-disbelief would set off a few alarm bells among people worried about harmful memetics, but not when their mindset is the one that bypasses safety devices and disables alarms because those are meant for other people and are preventing them from carrying out their clever plan that will eventually be the subject of a somberly narrated CGI recreation.

They've got a whole thing about this called noticing skulls. Basically the argument is that they're smart enough to notice the skulls and that the skulls are there for lesser intellects, but as luminaries of the mind they'll be okay. The fact that none of them can function in society as it is and have to hide in a group of like-minded individuals ought to be a warning but is instead taken as proof that they're as deviated from the norms as they think they are. Not one of these people thinks they're on the left end of any bell curves.

Tunicate
May 15, 2012

The world is like a point buy RPG so being below average in the *snort* physical stats is more proof you are smart

Pornographic Memory
Dec 17, 2008

Tacky-Ass Rococco posted:

There should be a Mock the Alt-Right thread and a Mock the Rationalists thread, because the OGs of the DE are gone now. What even happened to Mencius Moldbug?

there is a thread for making fun of the alt-right but it's in c-spam

SIGSEGV
Nov 4, 2010


The way those guys write is really frustrating, I don't want ten thousand pages of rhetorical farting around, I just want to see a proper MIRI research paper. Emphasis on the research side.

And one that isn't outsourced, if possible.

Slime
Jan 3, 2007

Tunicate posted:

The world is like a point buy RPG so being below average in the *snort* physical stats is more proof you are smart

By that logic I'm a loving genius, and we all know I'm a dipshit at best.

loving all 8s over here

SolTerrasa
Sep 2, 2011

SIGSEGV posted:

The way those guys write is really frustrating, I don't want ten thousand pages of rhetorical farting around, I just want to see a proper MIRI research paper. Emphasis on the research side.

And one that isn't outsourced, if possible.

MIRI doesn't really do research, they're all about the rhetorical farting around. You could read the only good paper on the topic, Concrete Problems in AI Safety. Highly recommend, written by some Google folks and some dude from OpenAI. https://arxiv.org/abs/1606.06565

SIGSEGV
Nov 4, 2010


SolTerrasa posted:

MIRI doesn't really do research, they're all about the rhetorical farting around. You could read the only good paper on the topic, Concrete Problems in AI Safety. Highly recommend, written by some Google folks and some dude from OpenAI. https://arxiv.org/abs/1606.06565

Yeah, that's pretty much my take on it.

They've given us nothing but farting around and bad fanfic. And a cult and several cases of abuse.

Math pets.

I guess I'm trying to evaluate the effectiveness of the effective altruism charity.

divabot
Jun 17, 2015

A polite little mouse!

SIGSEGV posted:

I guess I'm trying to evaluate the effectiveness of the effective altruism charity.

* Some charities are more effective than others, and you should donate to the more effective ones.

yeah, sounds obvious and sensible

* As a first-worlder, you are basically rich, even if you don’t feel like it, and you therefore have an ethical obligation to contribute to those who aren’t - almost certainly more than you do now.

this is pretty sound reasoning actually, I can get behind this

* Therefore, we can and should stack-rank every charitable initiative in the world according to an objective numerical scoring system,

wait what

* and clearly the most cost-effective initiative possible for all of humanity is donating to fight the prospect of unfriendly artificial intelligence,,

what the

* and oh look, there just happens to be a charity for that precise purpose right here! WHAT ARE THE ODDS,,,,

fuckin

Eox
Jun 20, 2010

by Fluffdaddy
Rationalists steadily come to some of the same conclusions as the rest of the world, only without any emotional backing and with massive blind spots when interpreting them. It's going to be a charnel house when they stumble on "The world is better off without rationalism"

SIGSEGV
Nov 4, 2010


divabot posted:

* Some charities are more effective than others, and you should donate to the more effective ones.

yeah, sounds obvious and sensible

* As a first-worlder, you are basically rich, even if you don’t feel like it, and you therefore have an ethical obligation to contribute to those who aren’t - almost certainly more than you do now.

this is pretty sound reasoning actually, I can get behind this

* Therefore, we can and should stack-rank every charitable initiative in the world according to an objective numerical scoring system,

wait what

* and clearly the most cost-effective initiative possible for all of humanity is donating to fight the prospect of unfriendly artificial intelligence,,

what the

* and oh look, there just happens to be a charity for that precise purpose right here! WHAT ARE THE ODDS,,,,

fuckin

I was merely thinking about how much money is going to the sex dungeon and the 20-metre-tall El Yud foam statues compared to how much is going into what they claim to be doing, which at this point should rightly be writing new and updated copies of Weapons of Math Destruction, except that that lady isn't one of them, obviously, she does socially useful work.

I wonder if El Yud still claims to be able to rewire his brain to perfection just once (apparently the same for AIs? Dunno why) and that the second law of thermodynamics is bunk because it makes living forever hard.

Syd Midnight
Sep 23, 2005

SIGSEGV posted:

I guess I'm trying to evaluate the effectiveness of the effective altruism charity.

My gut feeling is that they're just avoiding charity by making the problem needlessly complex. Like there's a chore to do, but they sit around daydreaming about innovative new methods and the most efficient routes for achieving maximal results until someone else does it for them. This can backfire if it's something you personally need done like paying bills or taking out the garbage, but if it's something that someone else needs then logically it's the belief that you're helping that benefits you the most. It's a galaxy brain version of Thoughts and Prayers.

Or just imagine you're a parent and they're a kid trying to avoid chores, and didn't mow the lawn because they spent all day thinking about a better way to do it and decided it would be best if you bought some goats to eat the grass instead.

Syd Midnight
Sep 23, 2005

divabot posted:

and clearly the most cost-effective initiative possible for all of humanity is donating to fight the prospect of unfriendly artificial intelligence,,

They can never make it past Pascal's Wager, can they?

Shame Boy
Mar 2, 2010

Syd Midnight posted:

My gut feeling is that they're just avoiding charity by making the problem needlessly complex. Like there's a chore to do, but they sit around daydreaming about innovative new methods and the most efficient routes for achieving maximal results until someone else does it for them. This can backfire if it's something you personally need done like paying bills or taking out the garbage, but if it's something that someone else needs then logically it's the belief that you're helping that benefits you the most. It's a galaxy brain version of Thoughts and Prayers.

Or just imagine you're a parent and they're a kid trying to avoid chores, and didn't mow the lawn because they spent all day thinking about a better way to do it and decided it would be best if you bought some goats to eat the grass instead.

There's an episode of the old Dilbert cartoon where Dilbert reasons his way to the conclusion that all charity is pointless unless you're giving literally 100% of your money directly to poor people in the third world by hand yourself, it's pretty much that

pidan
Nov 6, 2012


So, the AI tells me it will torture a million simulations of me if I don't do what it wants. How should that convince me? If I'm one of the simulations that will be tortured, the matter is out of my hands because the AI will torture or not torture me based on what the original me did. If I'm the original, the AI won't be able to torture me regardless of what I do.

Am I missing something?

Shame Boy
Mar 2, 2010

pidan posted:

So, the AI tells me it will torture a million simulations of me if I don't do what it wants. How should that convince me? If I'm one of the simulations that will be tortured, the matter is out of my hands because the AI will torture or not torture me based on what the original me did. If I'm the original, the AI won't be able to torture me regardless of what I do.

Am I missing something?

You're missing a whole bunch of dumb baseless assumptions it's built on like "you can't tell the difference between real you and simulation you" and "you should care about what happens to simulation you because you could be them and robot devil could walk in the door at any minute and be all 'surprise!'" and most of all "robot devil knows you think this way, so it knows that doing all this will work on you"

SolTerrasa
Sep 2, 2011

pidan posted:

So, the AI tells me it will torture a million simulations of me if I don't do what it wants. How should that convince me? If I'm one of the simulations that will be tortured, the matter is out of my hands because the AI will torture or not torture me based on what the original me did. If I'm the original, the AI won't be able to torture me regardless of what I do.

Am I missing something?

Sort of, but the thing you're missing is stupid.

A baseline premise here is that you should treat a perfect simulation of yourself as morally equivalent to you. So preventing torture of your simulated self is morally equivalent to preventing real torture. Since you're supposed to care about that, and since the simulations are perfect, real you is capable of controlling what simulated you does (if you donate to MIRI, so will your simulation). Therefore real you is supposed to donate to MIRI because it will save a million simulations from robot hell.

Shame Boy
Mar 2, 2010

SolTerrasa posted:

Sort of, but the thing you're missing is stupid.

A baseline premise here is that you should treat a perfect simulation of yourself as morally equivalent to you. So preventing torture of your simulated self is morally equivalent to preventing real torture. Since you're supposed to care about that, and since the simulations are perfect, real you is capable of controlling what simulated you does (if you donate to MIRI, so will your simulation). Therefore real you is supposed to donate to MIRI because it will save a million simulations from robot hell.

Not just a million simulations, infinity many simulations because there's infinity many alternate universes all with the robot devil so the chance of this happening to you (and not only that but happening infinitely many times) is 100%, and you should care about what happens to alternate universe you for... reasons i guess?

We have a bit of a meme going in the bitcoin mock thread that applies here as well, where when you try to explain ~blockchain~ to someone they invariably respond with "that sounds too dumb to be true, you must be explaining it wrong."

Relevant Tangent
Nov 18, 2016

Tangentially Relevant

Syd Midnight posted:

They can never make it past Pascal's Wager, can they?

Thanatophobia is a cruel master.

maswastaken
Nov 12, 2011

ate all the Oreos posted:

Not just a million simulations, infinity many simulations because there's infinity many alternate universes all with the robot devil so the chance of this happening to you (and not only that but happening infinitely many times) is 100%, and you should care about what happens to alternate universe you for... reasons i guess?

There's a 100% chance that one of the alternate universe mes will donate to the cause with sufficient generosity to save me.

Baby Babbeh
Aug 2, 2005

It's hard to soar with the eagles when you work with Turkeys!!



No they're perfect simulations of you so their behavior is 100% predetermined but you also have free will somehow, because you're rational.

pidan
Nov 6, 2012


If I'm a simulation it doesn't matter what I do

If I'm the original, the best way to prevent people getting tortured is to not help the AI

divabot
Jun 17, 2015

A polite little mouse!
The Dark Enlightenment might be falling into disuse - I notice Fahrenheit1488 has also had one post in six months, and it's a personal post from Meredith - but the LessWrong #Metoo has brought the quality. Here's the uncompromisingly rational Eric Wulff and his sensitive approach to this delicate issue. A take so bad that even /r/slatestarcodex wouldn't put up with it.

Shame Boy
Mar 2, 2010

maswastaken posted:

There's a 100% chance that one of the alternate universe mes will donate to the cause with sufficient generosity to save me.

Yeah that's basically Roko's proposed solution too, buy a lottery ticket and in some alternate universe you win and donate all the money.

Shame Boy
Mar 2, 2010

pidan posted:

If I'm a simulation it doesn't matter what I do

If I'm the original, the best way to prevent people getting tortured is to not help the AI

If you're a simulation it matters what you do because if you behave badly robot devil will come through the door at any moment and start the eternal torture, which he wouldn't do if you donated (as the simulation or as yourself). If you're the original and do nothing while knowing that the AI will torture you for doing nothing you will for sure be tortured.

If you're starting to think "wait this just breaks down to pascal's wager with a bunch of stupid unnecessary abstraction" then congratulations, you get Roko's basilisk

Shame Boy
Mar 2, 2010

One of the things to keep in mind is this is the good outcome, this is what happens when we get the benevolent AI, and the AI is doing all this torturing specifically because (using incredibly naive idiot utilitarianism) threatening you with torture infinitely many times is less bad than you not donating and therefore delaying the arrival of the benevolent AI god who will solve all the other pain and death in the universe.

Robot devil hurts you because it loves you and wants the best for humanity.

Hate Fibration
Apr 8, 2013

FLÄSHYN!
For people who hate religion so much they are surprisingly okay with obeying the disgusting whims of a vile and capricious god.

Baby Babbeh
Aug 2, 2005

It's hard to soar with the eagles when you work with Turkeys!!



It really drives home the degree to which the disgusting mess organized religions often become has more to do with certain kinds of people than certain kinds of doctrine.

pidan
Nov 6, 2012


ate all the Oreos posted:

If you're a simulation it matters what you do because if you behave badly robot devil will come through the door at any moment and start the eternal torture, which he wouldn't do if you donated (as the simulation or as yourself). If you're the original and do nothing while knowing that the AI will torture you for doing nothing you will for sure be tortured.

If you're starting to think "wait this just breaks down to pascal's wager with a bunch of stupid unnecessary abstraction" then congratulations, you get Roko's basilisk

Well if the AI is this kind of rear end in a top hat I'm not helping it on principle :colbert:

Wait, if I'm determined to not help the AI no matter what, the threat of torture also won't work so not helping is still the solution

Somfin
Oct 25, 2010

In my🦚 experience🛠️ the big things🌑 don't teach you anything🤷‍♀️.

Nap Ghost
LW exists in this weird state where computer programs just pop into being fully formed.

Like, even the original Goes Foom post started its timeline after a self-aware, self-improving, networked AI exists. Any one of those descriptors is a massive theoretical undertaking, let alone the engineering challenge. But since LWers are interested in neither, they just technobabble their way through, blustering about machine code and computation cycles as if human brains are just computers with Enough Power.


Slime
Jan 3, 2007

Hate Fibration posted:

For people who hate religion so much they are surprisingly okay with obeying the disgusting whims of a vile and capricious god.

It's honestly kind of amazing. People who claim to be rationalists, and who tend to be mostly atheists (because atheists are SMART guys, they like SCIENCE), managed to make their own loving religion about a robogod complete with a digital hell.

loving idiots
