Tiberius Thyben
Feb 7, 2013

Gone Phishing


Ronwayne posted:

Okay, what the gently caress is this Roko Basilisk thing and why are nerds afraid of it? From what I can tell it's basically:

Correct?

It's a really stupid, impossible idea built upon a pile of other really stupid, impossible a priori assumptions.

These idiot futurists believe that a technological singularity is fast approaching.
This will bring about an omnibenevolent AI with literally infinite computational power.
This Omnibenevolent AI will solve all problems ever.
However, realizing that it cannot solve problems before it was created, it will want to have been designed as early as possible.
In order to encourage this, these idiots believe, this computer will 'logically' create infinite perfect simulations of anyone who is aware that the above will happen, using its infinite computational power.
The reason it does this is to torture these simulated people for all eternity if they didn't donate to its development to the best of their ability (read: donate to Yukowhatever's AI research scam).
Since you cannot tell if you are one of the simulations, and 'logically' it is far more likely than not that you are one, if you are aware of the coming of such an AI, you are afflicted with 'Roko's Basilisk' and are in danger of eternal suffering unless you donate.
People who are not aware of it, obviously, will not be tortured, because they don't realize they need to donate to avoid torture.

I'm skipping over all the logical faults, leaps, and physical impossibilities; any attempt to list them on my part would be absurd, because the whole thing is incorrect on every level.

Now that you have read this, you are doomed to an eternity in Robot Hell.

Tiberius Thyben has a new favorite as of 07:21 on Aug 3, 2015

Ronwayne
Nov 20, 2007

That warm and fuzzy feeling.
So weird reactionary wannabe space nobles battle against the concept of space-future-satan. Add some potato men wearing refrigerators and hitting each other with chainsaws and you basically have warhammer 40k.

Qwertycoatl
Dec 31, 2008

Ronwayne posted:

So weird reactionary wannabe space nobles battle against the concept of space-future-satan. Add some potato men wearing refrigerators and hitting each other with chainsaws and you basically have warhammer 40k.

Against? These people want space-future-satan to arrive as soon as possible, before the world is ruined by the low-IQ masses.

Munin
Nov 14, 2004


Ronwayne posted:

So weird reactionary wannabe space nobles battle against the concept of space-future-satan. Add some potato men wearing refrigerators and hitting each other with chainsaws and you basically have warhammer 40k.

They'd probably also be up for a giant space furnace to feed plebs into to power their omniscient godhead.

Cingulate
Oct 23, 2012

by Fluffdaddy

Tiberius Thyben posted:

It's a really stupid, impossible idea built upon a pile of other really stupid, impossible a priori assumptions.

These idiot futurists believe that a technological singularity is fast approaching.
This will bring about an omnibenevolent AI with literally infinite computational power.
This Omnibenevolent AI will solve all problems ever.
However, realizing that it cannot solve problems before it was created, it will want to have been designed as early as possible.
In order to encourage this, these idiots believe, this computer will 'logically' create infinite perfect simulations of anyone who is aware that the above will happen, using its infinite computational power.
The reason it does this is to torture these simulated people for all eternity if they didn't donate to its development to the best of their ability (read: donate to Yukowhatever's AI research scam).
Since you cannot tell if you are one of the simulations, and 'logically' it is far more likely than not that you are one, if you are aware of the coming of such an AI, you are afflicted with 'Roko's Basilisk' and are in danger of eternal suffering unless you donate.
People who are not aware of it, obviously, will not be tortured, because they don't realize they need to donate to avoid torture.

I'm skipping over all the logical faults, leaps, and physical impossibilities; any attempt to list them on my part would be absurd, because the whole thing is incorrect on every level.

Now that you have read this, you are doomed to an eternity in Robot Hell.
What bugs me so much about it is not, I think, the more obvious fallacies, but actually the misapplication of Bayesian stochastic reasoning. Bayes means the optimal redistribution of subjective belief away from prior assumptions, based on evidence. A core element of applied Bayesian inference is that your priors should always have support over the whole logically possible range - that is, always consider that you might be wrong, because if your priors disallow something that is logically possible, you will be unable to reach the true solution, no matter how strong your evidence is. Furthermore, evidence from nature is always distributed - the truth is normally distributed. This means that your posterior is also always some distribution with non-zero spread - unless your prior has zero spread, for example probability 0 on some hypothesis, or probability 1 on "The Singularity Will Happen Tomorrow", in which case your posterior stays at 0, or at "The Singularity Will Happen Tomorrow", regardless of the evidence.
So Bayes always leaves us with some uncertainty. It tells us how certain we should be, but if it tells you to be 100% certain, you either did something wrong, or you are not talking about the real world.
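
To make that concrete, here's a minimal sketch (in Python, purely illustrative - the numbers are made up and this has nothing to do with anything LW actually computes): a hypothetical bayes_update over a single yes/no hypothesis. An agent with a small but non-zero prior gets pushed around by evidence the way it should; an agent that starts at absolute certainty never moves, no matter what it sees.

# Illustrative only: Bayes' rule on a single binary hypothesis.
# Certainty (a prior of exactly 0 or 1) is immune to evidence.

def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Return P(H|E) given P(H), P(E|H) and P(E|not H)."""
    evidence = likelihood_if_true * prior + likelihood_if_false * (1 - prior)
    return likelihood_if_true * prior / evidence

p = 0.001  # a reasonable agent: small but non-zero prior on "singularity tomorrow"
q = 1.0    # a dogmatic agent: absolutely certain of it

# Show both agents ten rounds of the same strongly disconfirming evidence
# (each observation is 100x more likely if the hypothesis is false).
for _ in range(10):
    p = bayes_update(p, 0.01, 1.0)
    q = bayes_update(q, 0.01, 1.0)

print(p)  # ~1e-23: the evidence did its job
print(q)  # still exactly 1.0: a zero-spread prior can never be updated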

So it is quite reasonable for us, as Bayesian agents, to have some uncertainty about what the future will bring. Similarly, a Bayesian AI will have some uncertainty left. Also, a Bayesian AI will probably understand that it is rational for us to never have a zero-spread posterior (i.e., to never be 100% certain).

Not so Yudkowsky's AI, which seemingly assumes that we should be 100% certain, even though it presumably knows how loving dumb humans are and how sensible it is to be skeptical of, first and foremost, one's own convictions.

neonnoodle
Mar 20, 2008

by exmarx
So I'm guessing pretty much that LW is to Bayes' theorem what Objectivism is to Aristotle's axioms.

A whole pseudophilosophy based on complete and total misreading.

Cingulate
Oct 23, 2012

by Fluffdaddy

neonnoodle posted:

So I'm guessing pretty much that LW is to Bayes' theorem what Objectivism is to Aristotle's axioms.

A whole pseudophilosophy based on complete and total misreading.
That'd be my guess too - that it's a cargo cult - although I haven't bothered to check out their actual usage of Bayesian methods.

Somfin
Oct 25, 2010

In my🦚 experience🛠️ the big things🌑 don't teach you anything🤷‍♀️.

Nap Ghost
A reminder that Yud also believes that AI will simply 'happen' some day. Like, some day, some programmer will slap the run button and it'll be the future AI god, unstoppable and glorious. He refers to this as "hard takeoff" officially, but if there's one thing I love about Yud, it's his amazing tendency to reveal the childishness of his beliefs in unofficial channels. "AI-go-FOOM" is a tenet of Yudkowsky's view of AI.

Somfin has a new favorite as of 14:15 on Aug 3, 2015

neonnoodle
Mar 20, 2008

by exmarx

Somfin posted:

A reminder that Yud also believes that AI will simply 'happen' some day. Like, some day, some programmer will slap the run button and it'll be the future AI god, unstoppable and glorious. He refers to this as "hard takeoff" officially, but if there's one thing I love about Yud, it's his amazing tendency to reveal the childishness of his beliefs in unofficial channels. "AI-go-FOOM" is a tenet of Yudkowsky's view of AI.

Doesn't seem that different from Kurzweil...

Cingulate
Oct 23, 2012

by Fluffdaddy

Somfin posted:

A reminder that Yud also believes that AI will simply 'happen' some day. Like, some day, some programmer will slap the run button and it'll be the future AI god, unstoppable and glorious. He refers to this as "hard takeoff" officially, but if there's one thing I love about Yud, it's his amazing tendency to reveal the childishness of his beliefs in unofficial channels. "AI-go-FOOM" is a tenet of Yudkowsky's view of AI.
I'm mostly working on machine intelligence right now (from a purely applied perspective), and from what I can tell, in contrast, their whole idea of AI seems like a "not-entirely-unlike-tea" thing through and through. Sure, these people have seen a computer before, and they've read a scifi book featuring AIs, and some of them are smart (albeit not, err, conventionally educated). But I don't see any contact with actual machine learning research.

It's a lot like William Gibson inventing the internet on a typewriter, only 30 years after the internet was invented and without the genius, and not intended as scifi.

Woolie Wool
Jun 2, 2006


coyo7e posted:

I'd like to push Charles Krauthammer into a wide, deep pool of thin concrete.

Am I a bad person?

Yes, because you should tilt the wheelchair so he falls into the concrete without the chair following him. Somebody else could use that chair.

neonnoodle posted:

Doesn't seem that different from Kurzweil...

That's because the entire singularity movement is just Kurzweil fanfiction.

Woolie Wool has a new favorite as of 14:51 on Aug 3, 2015

ArchangeI
Jul 15, 2010
I love how the entire thing is pretty much just extrapolating current trends into infinity and then freaking out when the results are terrifying. Computers have gotten increasingly powerful over the last few decades. If we extrapolate that trend into the future, eventually they will become so powerful they can simulate the position of every molecule in the universe with perfect precision, allowing them to predict the future perfectly! Why is no one taking this seriously!?
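
To put actual numbers on it (a back-of-the-envelope sketch, not anybody's real projection): take Moore's-law-style doubling every two years and just keep the curve going.

# Back-of-the-envelope only: naive "extrapolate the trend into infinity".
transistors_now = 4e9          # rough order of magnitude for a 2015-era chip
doubling_period_years = 2.0    # Moore's-law-ish doubling time

for years_ahead in (10, 50, 100, 300, 500):
    count = transistors_now * 2 ** (years_ahead / doubling_period_years)
    print(f"{years_ahead:>4} years out: {count:.2e} transistors")

# Around 300 years out the naive curve passes ~1e50, roughly the number of
# atoms in the Earth - which is the point: a curve fit is not a law of nature.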

Of course, there is a small problem. If we extrapolate the current trend of global warming, the Earth's atmosphere might start hydrogen fusion before then and turn the planet into a second sun.

divabot
Jun 17, 2015

A polite little mouse!

ArchangeI posted:

I love how the entire thing is pretty much just extrapolating current trends into infinity and then freaking out when the results are terrifying. Computers have gotten increasingly powerful over the last few decades. If we extrapolate that trend into the future, eventually they will become so powerful they can simulate the position of every molecule in the universe with perfect precision, allowing them to predict the future perfectly! Why is no one taking this seriously!?

These people have never had someone tell them "don't fall in love with your model". And have never heard of reductio ad absurdum.

The Vosgian Beast
Aug 13, 2011

Business is slow
There's a quote I like by the Chinese author Lu Xun, who said: “I thought: hope cannot be said to exist, nor can it be said not to exist. It is just like roads across the earth. For actually the earth had no roads to begin with, but when many men pass one way, a road is made.”

The Neo-Reactionary is someone who goes "Well, do you notice how roads all link up on a given continent, and that people keep building them, and sometimes they clear forests or hills? Clearly roads are a sentient memetic virus trying to spread themselves via humans. Furthermore, many forests are nice, and you can learn a lot more by just wandering around, roads make it too easy really. You don't want every square inch of the world to be covered by roads, do you? Well that's the logical roady end game. I say we censor the roady press, and abolish roads."

DStecks
Feb 6, 2012

PYF Dark Enlightenment Thinker: Less Wrong Thread 2.0

The Vosgian Beast
Aug 13, 2011

Business is slow

DStecks posted:

PYF Dark Enlightenment Thinker: Less Wrong Thread 2.0

Yeah kinda.

I mean, there's the Book Barn HPMOR thread for Yudchat too; this thread doesn't really need Roko's Basilisk derails.

I mean, the old dead LW mock thread had a "wait, let's go over how Roko's Basilisk works in excruciating detail again" every five pages.

Evrart Claire
Jan 11, 2008
All that needs to be said about Roko's Basilisk or Yud's AI theories is that they're nothing more than religion for smug nerds.

Chocolate Teapot
May 8, 2009

Tiberius Thyben posted:

It's a really stupid, impossible idea built upon a pile of other really stupid, impossible a priori assumptions.

These idiot futurists believe that a technological singularity is fast approaching.
This will bring about an omnibenevolent AI with literally infinite computational power.
This Omnibenevolent AI will solve all problems ever.
However, realizing that it cannot solve problems before it was created, it will want to have been designed as early as possible.
In order to encourage this, these idiots believe, this computer will 'logically' create infinite perfect simulations of anyone who is aware that the above will happen, using its infinite computational power.
The reason it does this is to torture these simulated people for all eternity if they didn't donate to its development to the best of their ability (read: donate to Yukowhatever's AI research scam).
Since you cannot tell if you are one of the simulations, and 'logically' it is far more likely than not that you are one, if you are aware of the coming of such an AI, you are afflicted with 'Roko's Basilisk' and are in danger of eternal suffering unless you donate.
People who are not aware of it, obviously, will not be tortured, because they don't realize they need to donate to avoid torture.

I'm skipping over all the logical faults, leaps, and physical impossibilities; any attempt to list them on my part would be absurd, because the whole thing is incorrect on every level.

Now that you have read this, you are doomed to an eternity in Robot Hell.

That's whatcha whatcha get on level 5 (Jehovah's Witnesses apparently)

neonnoodle
Mar 20, 2008

by exmarx


"Actually we're Quantum Presbyterians."

divabot
Jun 17, 2015

A polite little mouse!
Yeah, the Basilisk is hilarious but is not in fact the Dark Enlightenment (EVEN IF MIKE ANISSIMOV, THE SLAVIC UNTERMENSCH WITH WHITE-NATIONALIST ENVY, SERIOUSLY DOES IN FACT ACTUALLY BELIEVE IN THE BASILISK). Let's see what else is out there in the land of the Tumblr Endarkenment!

Nyan Sandwich: "At this blog, we are interested in discovering and presenting the underreported truth in these areas." This reads like a college student whose mind has been set ablaze by attempting to understand Moldbug.

Sons of the Sun: some sort of white nationalist ramble that presumably has a point in there somewhere, gently caress knows what.

The Hestia Society, whichever the gently caress of these idiots that is (either Nyan or Wesley, looks like): an exciting photoblog. Well, my dear neoreactionary, reblogging Cathedral imagery is hardly going to bring about the monarchy, is it now?

(from their site: Currently Seeking: people who can actually do poo poo, unlike us. We're more your ideas men.)

Dunno who this dick is. Likes pretty white girls, hates blacks and Arthur Chu.

And just for the horribly unrational people from low-IQ gene pools who are pining for the LW thread, Scott wonders if there really are enough people to get HPMOR that Hugo by spamming the slate.

Curvature of Earth
Sep 9, 2011

Projected cost of
invading Canada:
$900

divabot posted:

And just for the horribly unrational people from low-IQ gene pools who are pining for the LW thread, Scott wonders if there really are enough people to get HPMOR that Hugo by spamming the slate.

Even if the World Science Fiction Society gets spammed with HPMOR votes it might be disqualified on the grounds that it has neither been copyrighted nor published anywhere within the eligible date range. If they want to rules-lawyer, then its last few chapters might still be eligible.

divabot
Jun 17, 2015

A polite little mouse!

Curvature of Earth posted:

Even if the World Science Fiction Society gets spammed with HPMOR votes it might be disqualified on the grounds that it has neither been copyrighted nor published anywhere within the eligible date range. If they want to rules-lawyer, then its last few chapters might still be eligible.

Serial novels published over multiple years are a tricky case, but I'm sure there's precedent for them. (Not just the four or five parts in Analog, but longer-running serial novels.)

The Vosgian Beast
Aug 13, 2011

Business is slow

divabot posted:

Yeah, the Basilisk is hilarious but is not in fact the Dark Enlightenment (EVEN IF MIKE ANISSIMOV, THE SLAVIC UNTERMENSCH WITH WHITE-NATIONALIST ENVY, SERIOUSLY DOES IN FACT ACTUALLY BELIEVE IN THE BASILISK). Let's see what else is out there in the land of the Tumblr Endarkenment!

Nyan Sandwich: "At this blog, we are interested in discovering and presenting the underreported truth in these areas." This reads like a college student whose mind has been set ablaze by attempting to understand Moldbug.

Sons of the Sun: some sort of white nationalist ramble that presumably has a point in there somewhere, gently caress knows what.

The Hestia Society, whichever the gently caress of these idiots that is (either Nyan or Wesley, looks like): an exciting photoblog. Well, my dear neoreactionary, reblogging Cathedral imagery is hardly going to bring about the monarchy, is it now?

(from their site: Currently Seeking: people who can actually do poo poo, unlike us. We're more your ideas men.)

Dunno who this dick is. Likes pretty white girls, hates blacks and Arthur Chu.

And just for the horribly unrational people from low-IQ gene pools who are pining for the LW thread, Scott wonders if there really are enough people to get HPMOR that Hugo by spamming the slate.

They're just never gonna give up on memetics, huh.

Sure why not, it's too fun to accuse your opponent of being infected by evil mind viruses.

The Vosgian Beast
Aug 13, 2011

Business is slow
Also I am slowly coming to realize that the entire Dark Enlightenment is that assignment in Social Studies where you and your group had to come up with ideas for a society and how it should be run, and you just kinda winged most of it and came up with National Video Games Day as your major holiday or whatever.

Except it's adult nazis instead of 11 year olds, thus removing the charm.

divabot
Jun 17, 2015

A polite little mouse!

The Vosgian Beast posted:

Also I am slowly coming to realize that the entire Dark Enlightenment is that assignment in Social Studies where you and your group had to come up with ideas for a society and how it should be run, and you just kinda winged most of it and came up with National Video Games Day as your major holiday or whatever.

Except it's adult nazis instead of 11 year olds, thus removing the charm.

Did I say pretty white girls? I meant of course pretty white girls with big eyes who are totally not like anime chicks (and who almost certainly don't deserve to be reblogged by a white nationalist neoreactionary).

I AM GRANDO
Aug 20, 2006

divabot posted:

Did I say pretty white girls? I meant of course pretty white girls with big eyes who are totally not like anime chicks (and who almost certainly don't deserve to be reblogged by a white nationalist neoreactionary).

I wonder how this Christian teenager feels about becoming an object of lust for a bunch of middle-aged nazis.

Curvature of Earth
Sep 9, 2011

Projected cost of
invading Canada:
$900

The Vosgian Beast posted:

Also I am slowly coming to realize that the entire Dark Enlightenment is that assignment in Social Studies where you and your group had to come up with ideas for a society and how it should be run, and you just kinda winged most of it and came up with National Video Games Day as your major holiday or whatever.

Except it's adult nazis instead of 11 year olds, thus removing the charm.

Once, in summer camp, we were split into "archeology" teams. Each team had to create fake artifacts for an imaginary civilization, then bury them. We were supposed to include information on what the government, religion, and language was like. We then dug up a different team's artifacts and tried to interpret them.

Because we were 12-year-olds, ideas included:
  • A civilization based entirely around shopping.
  • A civilization that was afraid of technology.
  • One weird attempt at merging democracy and theocracy.
The Dark Enlightenment is both dumber and less creative than any of these.

SurreptitiousMuffin
Mar 21, 2010
Isn't the Hestia Society that weird women-only school that trains them to be perfect Victorian housewives and also to hate minorities and degenerates?

There was an A/T Thread by a former member at some point. If it's not the same, the name is very similar.

I AM GRANDO
Aug 20, 2006

SurreptitiousMuffin posted:

Isn't the Hestia Society that weird women-only school that trains them to be perfect Victorian housewives and also to hate minorities and degenerates?

There was an A/T Thread by a former member at some point. If it's not the same, the name is very similar.

Like as a sex thing or as a legit part of some oppressive subculture? That sounds terrifying.

Hellequin
Feb 26, 2008

You Scream! You open your TORN, ROTTED, DECOMPOSED MOUTH AND SCREAM!

SurreptitiousMuffin posted:

Isn't the Hestia Society that weird women-only school that trains them to be perfect Victorian housewives and also to hate minorities and degenerates?

There was an A/T Thread by a former member at some point. If it's not the same, the name is very similar.

I think you're thinking of Aristasia, which had a weird fantasy world as part of it and involved a lot of Victorian S&M. https://en.wikipedia.org/wiki/Miss_Martindale

I've got some links saved somewhere because it was such a fascinating and bizarre thing.

Shame Boy
Mar 2, 2010

Hellequin posted:

I think you're thinking of Aristasia, which had a weird fantasy world as part of it and involved a lot of Victorian S&M. https://en.wikipedia.org/wiki/Miss_Martindale

I've got some links saved somewhere because it was such a fascinating and bizarre thing.

That wiki page is very sparse, care to post the links?

darkwasthenight
Jan 7, 2011

GENE TRAITOR

Parallel Paraplegic posted:

That wiki page is very sparse, care to post the links?

It was in the old 'Ask me about being friends with a girl who thinks she is married to an anime' thread and the poster was Hibiscus.

Somewhere around this part of the thread.

I AM GRANDO
Aug 20, 2006

Hellequin posted:

I think you're thinking of Aristasia, which had a weird fantasy world as part of it and involved a lot of Victorian S&M. https://en.wikipedia.org/wiki/Miss_Martindale

I've got some links saved somewhere because it was such a fascinating and bizarre thing.

http://www.youtube.com/watch?v=_3hxVnKUrHU

Seems right at home with nazi fantasies, really. I'm sure their sexy parties involve a lot of conversation about exterminating the brutes and the white man's burden.

Hellequin
Feb 26, 2008

You Scream! You open your TORN, ROTTED, DECOMPOSED MOUTH AND SCREAM!
Yeah sure, I was just looking for everything I had saved. There was a derail in the "Ask me about growing up with a girl who thinks she's married to anime guys" thread, around 100 or so pages in, about this stuff. Posts are here: http://forums.somethingawful.com/showthread.php?threadid=3444416&userid=207587

http://aristasia.info/
http://www.mother-god.com/aristasia.html
https://aristasia.wordpress.com/
http://aristasia-central.com/
http://tvtropes.org/pmwiki/pmwiki.php/ARG/Aristasia
http://aristasia.net/index.html
http://newsgroups.derkeiler.com/Archive/Talk/talk.religion.newage/2005-11/msg00000.html

Hellequin has a new favorite as of 01:40 on Aug 5, 2015

divabot
Jun 17, 2015

A polite little mouse!

Jack Gladney posted:

I wonder how this Christian teenager feels about becoming an object of lust for a bunch of middle-aged nazis.

If you click on the link he cites, you'll see she deleted the post. Edit: sorry, that was some intermediate reblogger.

divabot has a new favorite as of 23:22 on Aug 3, 2015

Hellequin
Feb 26, 2008

You Scream! You open your TORN, ROTTED, DECOMPOSED MOUTH AND SCREAM!
To be honest though, Aristasia is pretty benign next to the Dark Enlightenment stuff, even if it does have some of the telling signs of Ur-Fascism (http://www.publiceye.org/fascist/eco/ur-fascism.html). It comes off more as the product of sexually repressed women creating escapist fantasies to express their closeted lesbianism.

Pope Guilty
Nov 6, 2006

The human animal is a beautiful and terrible creature, capable of limitless compassion and unfathomable cruelty.
Aristasians seem perfectly happy to have their little world and live in it with their fantasies rather than believe they're going to take over the world, though. They're weird but seem harmless.

divabot
Jun 17, 2015

A polite little mouse!

Pope Guilty posted:

Aristasians seem perfectly happy to have their little world and live in it with their fantasies rather than believe they're going to take over the world, though. They're weird but seem harmless.

They've yet to manifest their Peter Thiel or Justine Tunney?

Pope Guilty
Nov 6, 2006

The human animal is a beautiful and terrible creature, capable of limitless compassion and unfathomable cruelty.

divabot posted:

They've yet to manifest their Peter Thiel or Justine Tunney?

And never will, imo, since Aristasians would rather withdraw than conquer.

I AM GRANDO
Aug 20, 2006

That story of them getting destroyed by the internet is pretty sad. Vulnerable closeted people role-playing the last time in history that there were socially acceptable closets to live in. And then 6 years of broadband turned them all into anime people and they were gone.

I hope that pre-internet lady shows up again some day. It must have been so sad for her to see that all happen.
