Fututor Magnus
Feb 22, 2016

by FactsAreUseless

The Vosgian Beast posted:

https://twitter.com/nydwracu/status/718499072768913408

"What an unbelievable oval office"-St_Rev, after seeing a piece of architecture he doesn't like

Takin' after Lovecraft in more ways than one

EDIT: Oh my, this is a can of worms https://twitter.com/search?q=st_rev%20architecture&src=typd

Wesley, Rev, and architecture are a can of worms in general. It's funny how far Dunning-Kruger extends.

BattleMaster
Aug 14, 2000

Is there a proper name for the "we are all a simulation and if we do even the slightest thing to annoy the AI we will be tortured forever" nerd religion?

Shame Boy
Mar 2, 2010

BattleMaster posted:

Is there a proper name for the "we are all a simulation and if we do even the slightest thing to annoy the AI we will be tortured forever" nerd religion?

The idea originated as Roko's Basilisk:

http://rationalwiki.org/wiki/Roko%27s_basilisk

e: Okay I didn't realize it didn't originate as that but it's what everyone calls it so there

Shame Boy has a new favorite as of 07:59 on Jun 5, 2016

Fututor Magnus
Feb 22, 2016

by FactsAreUseless

Dmitri-9 posted:

To him effective altruism is synonymous with MIRI because the Robot Devil might torture every atom in the observable universe and then create 2^16 simulations where each atom is tortured 2^16 times per simulation. If that is possible then the expected value of MIRI approaches infinity.

More basically, the idea that donating money to a bunch of pro-fascist libertarians to save a fascist programmer's talk on his ultimately useless vanity project is better than giving it to an EA cause, or so Scott Alexander implies.
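
(For anyone who wants the arithmetic being mocked in that quote spelled out: a minimal sketch, where every number except the quote's 2^16 figures is made up for illustration, of how "shut up and multiply" lets an absurdly small probability swamp any real-world charity.)

code:
# Sketch of the expected-value move being mocked above.
# p_robot_god and atoms are invented placeholders; only the 2**16 figures come from the quote.
p_robot_god = 1e-30                 # assumed probability the scenario is even possible
atoms = 1e80                        # rough atom count of the observable universe
torture_per_atom = 2**16 * 2**16    # simulations times tortures per simulation, per the quote
expected_disutility = p_robot_god * atoms * torture_per_atom
print(expected_disutility)          # ~4.3e59, "infinite enough" to outweigh any mosquito net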

BattleMaster
Aug 14, 2000

Parallel Paraplegic posted:

The idea originated as Roko's Basilisk:

http://rationalwiki.org/wiki/Roko%27s_basilisk

e: Okay I didn't realize it didn't originate as that but it's what everyone calls it so there

Thanks! I've referenced it a few times lately but I never knew for sure what the name was. I saw "the basilisk" name a bunch in this thread but I never knew for sure that it was the same thing.

edit: oh wow my wrong version of it is actually less stupid than the real thing; they actually think that the pain from the copy of you being tortured will go back in time and affect the real you in a time before the AI was even created :psyduck:

BattleMaster has a new favorite as of 08:11 on Jun 5, 2016

divabot
Jun 17, 2015

A polite little mouse!
That new other thread is not entirely terrible (and everyone hates Imm*rt*n there too), and the link to http://www.amerika.org/politics/what-if-the-alternative-right-took-over/ was suitably hilarious.

Tesseraction
Apr 5, 2009

BattleMaster posted:

Thanks! I've referenced it a few times lately but I never knew for sure what the name was. I saw "the basilisk" name a bunch in this thread but I never knew for sure that it was the same thing.

edit: oh wow my wrong version of it is actually less stupid than the real thing; they actually think that the pain from the copy of you being tortured will go back in time and affect the real you in a time before the AI was even created :psyduck:

And remember, this is because it's a benevolent AI that's punishing you for not inventing it faster.

neonnoodle
Mar 20, 2008

by exmarx
Just like Original God.

The Vosgian Beast
Aug 13, 2011

Business is slow

Fututor Magnus posted:

More basically, the idea that donating money to a bunch of pro-fascist libertarians to save a fascist programmer's talk on his ultimately useless vanity project is better than giving it to an EA cause, or so Scott Alexander implies.

P sure Scott is the kind of person who segregates things happening to people he doesn't know and things happening to his ingroup onto entirely different levels of reality.

Heresiarch
Oct 6, 2005

Literature is not exhaustible, for the sufficient and simple reason that no single book is. A book is not an isolated being: it is a relationship, an axis of innumerable relationships.

divabot posted:

That new other thread is not entirely terrible (and everyone hates Imm*rt*n there too), and the link to http://www.amerika.org/politics/what-if-the-alternative-right-took-over/ was suitably hilarious.

The guy who wrote that piece is amazing.

quote:

Brett Stevens has worked in information technology while maintaining a dual life as a writer for over 20 years. His writings, generally considered too realistic and not emotional enough for publication, cover the topics of politics and environmentalism through a philosophical filter.

Also not surprising that a racist IT dude sometimes publishes with a group founded by a racist IT dude.

Zemyla
Aug 6, 2008

I'll take her off your hands. Pleasure doing business with you!

Tesseraction posted:

And remember, this is because it's a benevolent AI that's punishing you for not inventing it faster.

Well, it'll only punish you if you believe it should. It's the hell you deserve.

A Festivus Miracle
Dec 19, 2012

I have come to discourse on the profound inequities of the American political system.

Lemme get this straight: Basically, Roko's Basilisk is the Old Hebrew God, except now he's real and is going to punish you forever because he loves you.

darthbob88
Oct 13, 2011

YOSPOS

A White Guy posted:

Lemme get this straight: Basically, Roko's Basilisk is the Old Hebrew God, except now he's real and is going to punish you forever because he loves you.

More or less. Except it's not going to punish you, it's going to punish a simulation of you, because you'll have been dead for years/decades/centuries by the time all this happens. And it's only going to punish (simulated) you forever if you knowingly turned from the path of hastening the arrival of friendly AI; if you didn't know it mattered, you're safe, but if you know that FAI will save billions of future lives and do not do everything you can to make it happen, you're condemning millions of future people to suffering and death because you wanted to spend that money on hookers and blow instead.

Terrible Opinions
Oct 18, 2013



Why anyone should care is only answered by very strange cult think.

neonnoodle
Mar 20, 2008

by exmarx

Terrible Opinions posted:

Why anyone should care is only answered by very strange cult think.

Unfortunately we'll all have to deal with it if the Emperor converts before a big battle.

Shame Boy
Mar 2, 2010

darthbob88 posted:

More or less. Except it's not going to punish you, it's going to punish a simulation of you, because you'll have been dead for years/decades/centuries by the time all this happens. And it's only going to punish (simulated) you forever if you knowingly turned from the path of hastening the arrival of friendly AI; if you didn't know it mattered, you're safe, but if you know that FAI will save billions of future lives and do not do everything you can to make it happen, you're condemning millions of future people to suffering and death because you wanted to spend that money on hookers and blow instead.

I know it's a stupid idea for a lot of reasons but why would torturing a simulation of you based on whatever facts it can glean from (presumably) the internet's past be any different than torturing a completely different person who happens to also play TF2 at 4 AM every night and fap to diaperfur or whatever? Why does it even have to be a simulation of you, or a simulation at all - why can't it just threaten to torture a percentage of future people unless you help it because you're such an effective altruist you'd do whatever you can to reduce the suffering of hypothetical future people?

I mean I realize the answer is actually that it's a stupid poorly-thought-out idea based on dumb science fiction but do they have a "real" reason?

Woolie Wool
Jun 2, 2006


divabot posted:

That new other thread is entirely terrible and full of nazis who think Jewish echoes are a funny meme

Fixed that for you.

90s Cringe Rock
Nov 29, 2006
:gay:

Parallel Paraplegic posted:

I know it's a stupid idea for a lot of reasons but why would torturing a simulation of you based on whatever facts it can glean from (presumably) the internet's past be any different than torturing a completely different person who happens to also play TF2 at 4 AM every night and fap to diaperfur or whatever? Why does it even have to be a simulation of you, or a simulation at all - why can't it just threaten to torture a percentage of future people unless you help it because you're such an effective altruist you'd do whatever you can to reduce the suffering of hypothetical future people?

I mean I realize the answer is actually that it's a stupid poorly-thought-out idea based on dumb science fiction but do they have a "real" reason?

Because you could be that simulation, and in thirty seconds it'll move from simulating your historical normal mortal life into infinite torture mode.

This isn't very likely, but if you multiply the odds by a trillion trillion trillion then it's practically certain!

I AM GRANDO
Aug 20, 2006

Parallel Paraplegic posted:

I know it's a stupid idea for a lot of reasons but why would torturing a simulation of you based on whatever facts it can glean from (presumably) the internet's past be any different than torturing a completely different person who happens to also play TF2 at 4 AM every night and fap to diaperfur or whatever? Why does it even have to be a simulation of you, or a simulation at all - why can't it just threaten to torture a percentage of future people unless you help it because you're such an effective altruist you'd do whatever you can to reduce the suffering of hypothetical future people?

I mean I realize the answer is actually that it's a stupid poorly-thought-out idea based on dumb science fiction but do they have a "real" reason?

Wasn't the big scary there that you are the simulation but you won't know it until the ai pulls back the curtain? Like, it's all techno-Calvinism based on that bullshit claim about the odds being in favor of our universe being a simulation and not an actual universe.

Strom Cuzewon
Jul 1, 2010

Parallel Paraplegic posted:

I know it's a stupid idea for a lot of reasons but why would torturing a simulation of you based on whatever facts it can glean from (presumably) the internet's past be any different than torturing a completely different person who happens to also play TF2 at 4 AM every night and fap to diaperfur or whatever? Why does it even have to be a simulation of you, or a simulation at all - why can't it just threaten to torture a percentage of future people unless you help it because you're such an effective altruist you'd do whatever you can to reduce the suffering of hypothetical future people?

I mean I realize the answer is actually that it's a stupid poorly-thought-out idea based on dumb science fiction but do they have a "real" reason?

The simulation is a perfect duplicate of you, and in terms of information-flow is indistinguishable. There was a pattern of matter that made up "you". It was duplicated, and there are now two. You therefore can't distinguish between them. Nevermind that one is made out of meat, and the other exists only in the memory banks of some future robot-deity.

Yud's response to the basilisk is as well thought out and coherent as you'd expect from him:

quote:

One might think that the possibility of CEV punishing people couldn't possibly be taken seriously enough by anyone to actually motivate them. But in fact one person at SIAI was severely worried by this, to the point of having terrible nightmares, though ve wishes to remain anonymous. I don't usually talk like this, but I'm going to make an exception for this case.
Listen to me very closely, you idiot.
YOU DO NOT THINK IN SUFFICIENT DETAIL ABOUT SUPERINTELLIGENCES CONSIDERING WHETHER OR NOT TO BLACKMAIL YOU. THAT IS THE ONLY POSSIBLE THING WHICH GIVES THEM A MOTIVE TO FOLLOW THROUGH ON THE BLACKMAIL.
There's an obvious equilibrium to this problem where you engage in all positive acausal trades and ignore all attempts at acausal blackmail. Until we have a better worked-out version of TDT and we can prove that formally, it should just be OBVIOUS that you DO NOT THINK ABOUT DISTANT BLACKMAILERS in SUFFICIENT DETAIL that they have a motive to ACTUALLY BLACKMAIL YOU.
If there is any part of this acausal trade that is positive-sum and actually worth doing, that is exactly the sort of thing you leave up to an FAI. We probably also have the FAI take actions that cancel out the impact of anyone motivated by true rather than imagined blackmail, so as to obliterate the motive of any superintelligences to engage in blackmail.
Meanwhile I'm banning this post so that it doesn't (a) give people horrible nightmares and (b) give distant superintelligences a motive to follow through on blackmail against people dumb enough to think about them in sufficient detail, though, thankfully, I doubt anyone dumb enough to do this knows the sufficient detail. (I'm not sure I know the sufficient detail.)
You have to be really clever to come up with a genuinely dangerous thought. I am disheartened that people can be clever enough to do that and not clever enough to do the obvious thing and KEEP THEIR IDIOT MOUTHS SHUT about it, because it is much more important to sound intelligent when talking to your friends. This post was STUPID.
(For those who have no idea why I'm using capital letters for something that just sounds like a random crazy idea, and worry that it means I'm as crazy as Roko, the gist of it was that he just did something that potentially gives superintelligences an increased motive to do extremely evil things in an attempt to blackmail us. It is the sort of thing you want to be EXTREMELY CONSERVATIVE about NOT DOING.)

Shame Boy
Mar 2, 2010

Strom Cuzewon posted:

The simulation is a perfect duplicate of you, and in terms of information-flow is indistinguishable. There was a pattern of matter that made up "you". It was duplicated, and there are now two. You therefore can't distinguish between them. Nevermind that one is made out of meat, and the other exists only in the memory banks of some future robot-deity.

Yud's response to the basilisk is as well thought out and coherent as you'd expect from him:

I thought the whole point was it wasn't a perfect duplicate of you, it was a duplicate created with the extant information left over after x hundred years, and the AI just filled in the blanks for the rest using probabilities, and this was "close enough" to being you that you should treat it like it was actually you or something.

Because if robot devil is creating exact duplicates of you then it becomes even dumber, somehow.

chrisoya posted:

Because you could be that simulation, and in thirty seconds it'll move from simulating your historical normal mortal life into infinite torture mode.

This isn't very likely, but if you multiply the odds by a trillion trillion trillion then it's practically certain!

Jack Gladney posted:

Wasn't the big scary there that you are the simulation but you won't know it until the ai pulls back the curtain? Like, it's all techno-Calvinism based on that bullshit claim about the odds being in favor of our universe being a simulation and not an actual universe.

Oh okay, so it is in fact significantly dumber than I thought it was.

I AM GRANDO
Aug 20, 2006

Is Yud fat? I feel like he has to be fat.

Curvature of Earth
Sep 9, 2011

Projected cost of
invading Canada:
$900
"What if we're all a simulation?" is honestly just Last Thursdayism for sci-fi nerds. The fact that it gets discussed seriously goes to show how little difference there is between them and the biblical literalists they look down upon.

Curvature of Earth
Sep 9, 2011

Projected cost of
invading Canada:
$900

Jack Gladney posted:

Is Yud fat? I feel like he has to be fat.

His ego's certainly big.

I AM GRANDO
Aug 20, 2006

They're still high on the thrill of being in the gifted program in elementary school and remain at the same level of intellectual development, which is why the most basic concepts continually fascinate them. Has anyone ever suggested to them that they read a book?

I AM GRANDO
Aug 20, 2006

Curvature of Earth posted:

His ego's certainly big.


He has the beard/glasses/ponytail trio of a fat man. And the standard-issue dress-up polo of a fat man. Curious.

GIANT OUIJA BOARD
Aug 22, 2011

177 Years of Your Dick
All
Night
Non
Stop
Remember, Big Yud can rewire his brain through sheer force of will but he can only do it once so he's saving it for when the earth really needs it

Qwertycoatl
Dec 31, 2008

Jack Gladney posted:

Is Yud fat? I feel like he has to be fat.

What does your heart tell you?

Eliezer Yudkowsky posted:

I am questioning the value of diet and exercise. Thermodynamics is technically true but useless, barring the application of physical constraint or inhuman willpower to artificially produce famine conditions and keep them in place permanently. You, clearly, are one of the metabolically privileged, so let me assure you that I could try exactly the same things you do to control your weight and fail. My fat cells would keep the energy that yours release; a skipped meal you wouldn't notice would have me dizzy when I stand up; exercise that grows your muscle mass would do nothing for mine.

I AM GRANDO
Aug 20, 2006

Qwertycoatl posted:

What does your heart tell you?

Are you loving kidding me? I want to bully the poo poo out of him so badly. This thread does terrible things to me.

Pieces of Peace
Jul 8, 2006
Hazardous in small doses.

Parallel Paraplegic posted:

I thought the whole point was it wasn't a perfect duplicate of you, it was a duplicate created with the extant information left over after x hundred years, and the AI just filled in the blanks for the rest using probabilities, and this was "close enough" to being you that you should treat it like it was actually you or something.

Because if robot devil is creating exact duplicates of you then it becomes even dumber, somehow.

But if the super AI can't create exact duplicates how will white male nerds the intellectual worthy become truly immortal like my fear of death demands they deserve to be???

The Vosgian Beast
Aug 13, 2011

Business is slow
https://twitter.com/AltRightWho/status/735314528968269825

90s Cringe Rock
Nov 29, 2006
:gay:
Which one's the Doctor?

Woolie Wool
Jun 2, 2006


If London turns into Mos Eisley in 2060 I think that will be an improvement over its current status.

TinTower
Apr 21, 2010

You don't have to 8e a good person to 8e a hero.
Roko's basilisk is basically the transporter problem but stupider.

Wanamingo
Feb 22, 2008

by FactsAreUseless

TinTower posted:

Roko's basilisk is basically the transporter problem but stupider.

Frogisis
Apr 15, 2003

relax brother relax
I always got "the Monty Hall Problem + the Game" vibes. Now that you know about it you're playing, and you have to gamble on some choice to get the good payoff / avoid the bad one. I think when I first heard about it that was the original idea, and it was just supposed to be some guy named Roko's entry in a "post ur ideas for dangerous knowledge hazards lol" thread, but then, LW being LW, people took it way too seriously and reinvented Pascal's loving Wager.

BENGHAZI 2
Oct 13, 2007

by Cyrano4747
The Monty Hall problem is actually sensical and easily figured out though

This poo poo is retarded
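
(That claim checks out, for what it's worth. A minimal sketch, assuming nothing from the thread beyond the standard three-door setup: simulating the game shows sticking wins about 1/3 of the time and switching about 2/3.)

code:
# Monty Hall simulation: compare sticking with switching over many games.
import random

def play(switch, trials=100_000):
    wins = 0
    for _ in range(trials):
        doors = [0, 1, 2]
        car = random.choice(doors)
        pick = random.choice(doors)
        # Host opens a door that is neither the player's pick nor the car.
        opened = random.choice([d for d in doors if d != pick and d != car])
        if switch:
            # Move to the one remaining unopened door.
            pick = next(d for d in doors if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print("stay:  ", play(switch=False))   # ~0.33
print("switch:", play(switch=True))    # ~0.67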

Frogisis
Apr 15, 2003

relax brother relax

Literally The Worst posted:

The Monty Hall problem is actually sensical and easily figured out though

This poo poo is retarded

Yeah, it just kinda triggers a free association to it, the way the vague shape of a cloud might remind you of a tubby internet cult leader.

I AM GRANDO
Aug 20, 2006


Well, I can't think of anything whiter than Dr. Who. I think he's had sex though.

TinTower
Apr 21, 2010

You don't have to 8e a good person to 8e a hero.

Jack Gladney posted:

Well, I can't think of anything whiter than Dr. Who. I think he's had sex though.

He has a granddaughter, you know.

Do you know what she's called? She's called Incontinentia. Incontinentia Buttocks.

The Doctor is basically one of the few fictional characters that the fandom wants to be asexual. Which, I admit, is better than people wanting Sam and Dean from Supernatural to gently caress each other with dog penises.
