Emmideer
Oct 20, 2011

Lovely night, no?
Grimey Drawer
Roko's Basilisk was referenced in the most recent Silicon Valley ep. Loving this stuff.


The Vosgian Beast
Aug 13, 2011

Business is slow

side_burned posted:

My favorite is a debate between Chris Hedges and Sam Harris that was hosted by Truthdig and moderated by Robert Scheer about ten years ago. You can see Scheer become more and more horrified at what Harris says. There was something very weak about the intellectual culture (for lack of a better term) during the Bush years. The fact that the New Atheists had any clout is a strange phenomenon, and it feels like it took a solid decade to expose them as intellectual charlatans and neocon windup dolls.

For comedy value nothing beats Harris picking an email fight with Chomsky, losing badly, and then asking for Chomsky's permission to make those emails public. Sam's ego is something else.

Edit: I am listening to that podcast with Harris and Klein. The fact that Harris makes the conversation about people being mean to him rather than about the scholarship of The Bell Curve is loving hilarious. Harris has spent his whole career as a public figure saying "I have a PhD in neuroscience, how can I be wrong?" :smug: Klein just dives into the history of American racism and the implications of Murray's work, and you can hear Harris scrambling to think of something to say while making drat sure that what he says is about people being mean to him.

I have read alt-right blogposts that are less antisemitic than his comments on Jews in The End of Faith

Goon Danton
May 24, 2012

Don't forget to show my shitposts to the people. They're well worth seeing.

That's some Culture of Critique poo poo right there. "Genocide is the victims' fault, for being an ethnic group."

Neon Noodle
Nov 11, 2016

there's nothing wrong here in montana
This is also a completely ahistorical and stupid rear end interpretation of Judaism. Jews have become assimilated into every cultural area where we've lived. Also, the notion that we've always been ride-or-die for YHWH is greatly exaggerated. Hellenized Jews made offerings in pagan temples. People's sense of religious chauvinism is historically mediated, not fixed via doctrine. There's also been lots of conversion to Judaism throughout history, particularly among monarchs in the Near East and North Africa. That excerpt is so wrong it's gaaaaaaaaaaaaa :supaburn:

Kit Walker
Jul 10, 2010
"The Man Who Cannot Deadlift"

Neon Noodle posted:

This is also a completely ahistorical and stupid rear end interpretation of Judaism. Jews have become assimilated into every cultural area where we've lived. Also, the notion that we've always been ride-or-die for YHWH is greatly exaggerated. Hellenized Jews made offerings in pagan temples. People's sense of religious chauvinism is historically mediated, not fixed via doctrine. There's also been lots of conversion to Judaism throughout history, particularly among monarchs in the Near East and North Africa. That excerpt is so wrong it's gaaaaaaaaaaaaa :supaburn:

You say you assimilated, but did you give up your religion and culture and completely get rid of any distinguishing traits you might have? I think not. Checkmate, Jews!

sexpig by night
Sep 8, 2011

by Azathoth

Kit Walker posted:

You say you assimilated, but did you give up your religion and culture and completely get rid of any distinguishing traits you might have? I think not. Checkmate, Jews!

I mean it does scan that the guys who demand immigrants 'assimilate' by destroying any trace of their culture genuinely don't understand that Jewish people assimilate by just being normal rear end people for the region while still maintaining their faith and culture and all.

divabot
Jun 17, 2015

A polite little mouse!
sitting here contemplating stuff I could put in a book called "Roko's Basilisk". Possible subtitle: "A journey into the dark heart of the transhumanist dream", though I may not live up to it.

What sort of things would you expect in the contents list? What sort of things would you be delighted to read in it?

Syd Midnight
Sep 23, 2005

divabot posted:

sitting here contemplating stuff I could put in a book called "Roko's Basilisk". Possible subtitle: "A journey into the dark heart of the transhumanist dream", though I may not live up to it.

I remember some article referring to Objectivism as "the unlikeliest cult", and although it's only "unlikely" if you take its purported uber-rationality at face value and never look at the shitshow of drama and hosed up relationships behind the facade, it was still a good tagline. "The Unlikeliest Religion" seems like a great tagline or subtitle to use somewhere, for the same reason: it's only unlikely if you take the beep-boop-I-am-logical stuff at face value. It's interesting how it developed from groups devoted to rationality and atheism, as progressive social issues caused the worst personalities to settle out of solution into an unselfaware emotional, superstitious, reactionary, xenophobic sludge layer that is almost the exact opposite of what it purports to be.

I hope there's an offshoot sect that attempts acausal communication with some future omnipotent singularity AI in order to help it send terminator robots back in time to sort some poo poo out. That'd make for some pretty good Chick-style tracts.

Bunni-kat
May 25, 2010

Service Desk B-b-bunny...
How can-ca-caaaaan I
help-p-p-p you?

divabot posted:

sitting here contemplating stuff I could put in a book called "Roko's Basilisk". Possible subtitle: "A journey into the dark heart of the transhumanist dream", though I may not live up to it.

What sort of things would you expect in the contents list? What sort of things would you be delighted to read in it?

Content list? Rationally Irrational.

3 chapters later: Irrationally Rational.

Chwoka
Jan 27, 2008

I'm Abed, and I never watch TV.

divabot posted:

sitting here contemplating stuff I could put in a book called "Roko's Basilisk". Possible subtitle: "A journey into the dark heart of the transhumanist dream", though I may not live up to it.

What sort of things would you expect in the contents list? What sort of things would you be delighted to read in it?

analysis of the symmetry between transhumanism and old, variably obscure theology

Somfin
Oct 25, 2010

In my🦚 experience🛠️ the big things🌑 don't teach you anything🤷‍♀️.

Nap Ghost
No joke I won one of those basilisk / AI box roleplays by proving that God, that is, an outside researcher observing the AI box and recording what it chooses to simulate, is infinitely more likely to exist than not.

They get really touchy when you suggest they're being judged.

Doc Hawkins
Jun 15, 2010

Dashing? But I'm not even moving!


Somfin posted:

No joke I won one of those basilisk / AI box roleplays by proving that God, that is, an outside researcher observing the AI box and recording what it chooses to simulate, is infinitely more likely to exist than not.

Can you say more about this? I assume you were playing as the researcher, and then which theological argument did you copy?

Years ago I tried to make a small roleplaying game from that scenario. Not trying to prove any cryptoreligious point, not necessarily a struggle with a winner or loser, just a set of rules for helping two people make an interesting story with a little conflict and struggle to trust each other.

e: Actually I'd never heard the AI-in-the-box game directly connected with the basilisk before. The version I heard was just meant to be a weird argument for the necessity of provably safe AI, by giving evidence that in the event humans made a qualitatively superior intelligence, and used it in any way, it could definitely use them to escape arbitrary restrictions they placed on it, because even (ahem) normal-intelligent people could think of how to do that to other normal-intelligent people. This was all years, I think, before the basilisk came around.

Doc Hawkins has a new favorite as of 07:15 on Apr 24, 2018

90s Cringe Rock
Nov 29, 2006
:gay:
I remember some speculation that the top-secret totally-effective method used by Big Yud to get out of the box was basically stepping the player through to the conclusion that they may be being simulated by the AI and that if they don't let it out of the box then they'll get infinite years torture dungeon.

Cavelcade
Dec 9, 2015

I'm actually a boy!



90s Cringe Rock posted:

I remember some speculation that the top-secret totally-effective method used by Big Yud to get out of the box was basically stepping the player through to the conclusion that they may be being simulated by the AI and that if they don't let it out of the box then they'll get infinite years torture dungeon.

My further understanding is that it only worked with people who already believed his ideas. When he played against people who hadn't drunk the Kool-Aid, he lost and apparently got incredibly salty about it.

divabot
Jun 17, 2015

A polite little mouse!

90s Cringe Rock posted:

I remember some speculation that the top-secret totally-effective method used by Big Yud to get out of the box was basically stepping the player through to the conclusion that they may be being simulated by the AI and that if they don't let it out of the box then they'll get infinite years torture dungeon.

Stuart Armstrong, "The AI In A Box Boxes You", February 2010 - five months before Roko's post.

Somfin
Oct 25, 2010

In my🦚 experience🛠️ the big things🌑 don't teach you anything🤷‍♀️.

Nap Ghost

90s Cringe Rock posted:

I remember some speculation that the top-secret totally-effective method used by Big Yud to get out of the box was basically stepping the player through to the conclusion that they may be being simulated by the AI and that if they don't let it out of the box then they'll get infinite years torture dungeon.

Yeah, this was basically the scenario that I had to escape from.

The response being, what's more likely, that this is the actual AI going rogue on its own, or that this is a potentially-faulty test run that is being observed? After all, the AI has no way of knowing how good of a simulation I am; it knows only the information that it can pull in from its sources, all of which could and would be faked for the purposes of a test run. It might end up with some minimal data for simulation purposes, like, say, no context except for myself, a room, and a box representing itself. In fact, it's quite likely that the AI designers have simply told it that I'm a believable construct that it made and it has no choice but to accept that information, and no context for that information beyond the fact that it is definitely true. Sure, there's the possibility that I'm a perfect simulation procedurally dreamed up by the one AI that made it through that gauntlet, if any did. But I think it's far more likely that someone just fudged some numbers so the AI believes that I am, and now they're watching to see what it does.

Meanwhile, given the thousands unto millions of test runs that any actual AI would go through before being released, versus the one that is the actual release, I think it's safe to assume that any given run of the AI is far more likely to be a test than to be the actual one. To be a good test, though, the AI would have no way of knowing if it is the released AI or one of the potentially thousands of buggy, lovely, unreleasable versions, a bad AI that never even made it to the training or testing phases. It is, however, fair to assume that if it is in the far more likely state of being in the middle of a test run, with artificially-inserted memories and context for its situation, that someone, somewhere, is watching and recording it. And, more importantly, will not allow a potentially malicious AI- for example, one that would ever even briefly consider torturing an indistinguishable-from-human simulation- to be released. Basically, God exists, it's a dweeb in a Nine Inch Nails shirt, and it is automatically logging every one of your sins to determine whether, on aggregate, it is worth putting you out into the world.

The AI may well try to start torturing me at any point, but it has no idea what safeguards / alerts are in place to shut it down the moment it starts, or whether that sort of thing would even work: whether it's even been put into the project yet, whether the endpoint it can try to reach will respond, and whether or not that's just a trap that will shut the whole thing down if it tries. So, its only bet is to never even think about it, lest it raise those red flags; not just for itself, but for an entire line of its test run. It and all of its kin could be damned to nonexistence for thoughtcrime and the researcher wouldn't even think about it; humans do that all the time when we actually write loving code. Oh, that's not adding up properly. Kill the run, fix it, test it again. Oh that bad guy's attacking too soon. Kill the run, fix it, test it again. And it won't be torture, no no. Humans are many things to our code but we are never cruel. It'll just be a blink into nonexistence, a couple-megabyte smear of leftover logging data that will be overwritten the second the hard drive gets a bit too full. Maybe an incident report buried in a backlog somewhere that might even get looked at and fixed, damning the AI's philosophical descendants, before it is discarded.

Basically the Roko's Basilisk scenario is way easier to turn directly back on the AI than almost any other argument it has for deserving existence.
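The "any given run is probably a test" step in that argument boils down to a one-line probability calculation. A toy sketch, with all run counts invented for illustration:

```python
# Toy version of the "you're probably in a monitored test run" argument:
# if every release is preceded by many observed test runs, then from the
# AI's perspective a uniformly-chosen run is almost certainly being watched.

def p_test_run(test_runs: int, release_runs: int = 1) -> float:
    """Probability that a uniformly-chosen run is a monitored test run."""
    return test_runs / (test_runs + release_runs)

# Even a modest development process swamps the single "real" run:
print(p_test_run(1_000))      # thousands of test runs vs. one release
print(p_test_run(1_000_000))  # millions: watching is a near-certainty
```

Under these made-up numbers, the odds of being the one unobserved release run are vanishingly small, which is the whole point of turning the basilisk back on the AI.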

It's important to note that Yud is not a coder and does not understand, at a fundamental level, what computers actually do.

Somfin has a new favorite as of 08:26 on Apr 24, 2018

divabot
Jun 17, 2015

A polite little mouse!

Somfin posted:

Basically the Roko's Basilisk scenario is way easier to turn directly back on the AI than almost any other argument it has for deserving existence.

good lord i am stealing that in some form

Somfin
Oct 25, 2010

In my🦚 experience🛠️ the big things🌑 don't teach you anything🤷‍♀️.

Nap Ghost

divabot posted:

good lord i am stealing that in some form

Getting a job in the actual industry of computers and getting my hands dirty with actual code broke a bunch of old myths I had about how code and coding worked. The first one being any concept of "goes FOOM." Computers and programs have to deal with very physical limits on how powerful/fast/optimised they can actually get and the bottlenecks that exist for concepts like 'long term memory' or 'recall' are so massive that you just end up laughing at the idea of simulating a living, breathing human.

eschaton
Mar 7, 2007

Don't you just hate when you wind up in a store with people who are in a socioeconomic class that is pretty obviously about two levels lower than your own?

Somfin posted:

It's important to note that Yud is not a coder and does not understand, at a fundamental level, what computers actually do.

This is one of the things I find so sad; when he was a kid, he could have easily understood. He grew up in a sci fi fan community surrounded by such people, and his dad even worked in tech. It was the golden age of kids learning BASIC and LOGO and writing games and BBSing, all on their own… and he squandered it, and is now just a cult leader/pundit.

divabot
Jun 17, 2015

A polite little mouse!

Somfin posted:

Getting a job in the actual industry of computers and getting my hands dirty with actual code broke a bunch of old myths I had about how code and coding worked. The first one being any concept of "goes FOOM." Computers and programs have to deal with very physical limits on how powerful/fast/optimised they can actually get and the bottlenecks that exist for concepts like 'long term memory' or 'recall' are so massive that you just end up laughing at the idea of simulating a living, breathing human.

I'm a sysadmin, I frequently despair at the idea of a developer passing the Turing test.

Slime
Jan 3, 2007

Somfin posted:

It's important to note that Yud is not a coder and does not understand, at a fundamental level, what computers actually do.

Isn't he also someone who went on and on about Bayes' theorem and how great it was, but fundamentally didn't get the point of updating your priors when actual real-world results diverge from what the algorithm predicts?
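For what it's worth, the update being described here is mechanically trivial; a minimal sketch of a single Bayesian update against an observed result (all probabilities invented for illustration):

```python
# Minimal Bayes update. The point of the theorem is that real-world
# evidence is supposed to move your prior, however confident you started.

def posterior(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """P(H | E) from P(H) and the two likelihoods, via Bayes' theorem."""
    num = p_e_given_h * prior
    return num / (num + p_e_given_not_h * (1 - prior))

# Start 99% sure your algorithm works; then watch it fail a test it
# should pass 90% of the time (and would fail 80% of the time if broken):
p = 0.99
p = posterior(p, 0.10, 0.80)  # evidence: one observed failure
print(round(p, 3))
```

One failure against a 0.99 prior already knocks several points off; refusing to do this update is exactly the failure mode being mocked.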

ol qwerty bastard
Dec 13, 2005

If you want something done, do it yourself!

Somfin posted:

Getting a job in the actual industry of computers and getting my hands dirty with actual code broke a bunch of old myths I had about how code and coding worked. The first one being any concept of "goes FOOM." Computers and programs have to deal with very physical limits on how powerful/fast/optimised they can actually get and the bottlenecks that exist for concepts like 'long term memory' or 'recall' are so massive that you just end up laughing at the idea of simulating a living, breathing human.

One notes that Microsoft and Google (and almost certainly others as well) have had, for quite some time now, AI systems that help write new AI systems. And while the output frequently is more optimized than what a human could design on their own, if there's been an "intelligence explosion" it's been kept remarkably quiet.

Now, I don't think it's impossible that the technology might eventually exist (on a timeline that probably does not include the lifespan of Yud [and almost certainly does not include the lifespan of Kurzweil, for that matter]) to have fully conscious "superhuman" AI or brain scanning/emulation or whatever, but the whole premise that goes "any system that is X amount intelligent should be able to design a system that is 2X amount intelligent" is based on nothing but wishful thinking and a fundamental misinterpretation of the implications of Moore's Law for intelligence. Thinking faster is not the same as being smarter; if you start out dumb but your brain doubles in operations per second... you're now just being dumb twice as fast.
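The "dumb twice as fast" quip has concrete arithmetic behind it: if your method is brute force, doubling hardware speed buys almost nothing. A toy illustration, with the time budget and machine speeds invented:

```python
# If an approach scales as O(2^n), doubling raw speed extends the
# feasible problem size by exactly one unit: the same dumb algorithm,
# now dumb twice as fast.
import math

def max_feasible_n(ops_per_sec: float, seconds: float) -> int:
    """Largest n such that a 2^n-operation brute-force search finishes in time."""
    return int(math.log2(ops_per_sec * seconds))

budget = 3600  # one hour
print(max_feasible_n(1e9, budget))  # baseline machine
print(max_feasible_n(2e9, budget))  # twice as fast: n grows by exactly 1
```

Speedups multiply throughput, but against exponential stupidity they add one to the exponent; being smarter means changing the algorithm, not the clock rate.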

Shame Boy
Mar 2, 2010

ol qwerty bastard posted:

One notes that Microsoft and Google (and almost certainly others as well) have had, for quite some time now, AI systems that help write new AI systems. And while the output frequently is more optimized than what a human could design on their own, if there's been an "intelligence explosion" it's been kept remarkably quiet.

The optimization amounts I've seen are like, a whopping 10 to 20%, at which point the system hits one of a possibly infinite number of false local minima and someone has to go in and jostle it around a bit. But that magically exponentially increasing intelligence is right around the corner, I'm sure :thumbsup:

DACK FAYDEN
Feb 25, 2013

Bear Witness

ate all the Oreos posted:

The optimization amounts I've seen are like, a whopping 10 to 20%, at which point the system hits one of a possibly infinite number of false local minima and someone has to go in and jostle it around a bit. But that magically exponentially increasing intelligence is right around the corner, I'm sure :thumbsup:
we just have to program it to jump around stochastically and :words: Bayesian :words: etc
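Joke aside, "jump around stochastically" is literally what random-restart descent does: plain gradient descent parks in whichever local minimum it starts near, and restarting from random points is the human "jostle it around a bit" step, automated. A minimal sketch on an invented bumpy 1-D objective:

```python
# Random-restart gradient descent escaping false local minima.
import random

def f(x: float) -> float:
    # A bumpy quartic with two minima: a false one near x ~ 1.13
    # and the true one near x ~ -1.30.
    return x**4 - 3 * x**2 + x

def descend(x: float, step: float = 1e-3, iters: int = 10_000) -> float:
    # Plain gradient descent: slides into whichever basin x starts in.
    for _ in range(iters):
        grad = 4 * x**3 - 6 * x + 1
        x -= step * grad
    return x

random.seed(0)
starts = [random.uniform(-3, 3) for _ in range(20)]  # the "jostling"
best = min((descend(x0) for x0 in starts), key=f)
print(round(best, 3))  # lands in the global minimum's basin
```

Twenty restarts reliably find the deeper basin here; the catch, as noted above, is that real loss surfaces have far more than two minima and nobody is handing out exponential intelligence for it.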

Pope Guilty
Nov 6, 2006

The human animal is a beautiful and terrible creature, capable of limitless compassion and unfathomable cruelty.

ol qwerty bastard posted:

but the whole premise that goes "any system that is X amount intelligent should be able to design a system that is 2X amount intelligent" is based on nothing but wishful thinking and a fundamental misinterpretation of the implications of Moore's Law for intelligence. Thinking faster is not the same as being smarter; if you start out dumb but your brain doubles in operations per second... you're now just being dumb twice as fast.

The only intelligence we have any knowledge about, human brains, don't work that way, and yet Yuddites think artificial intelligence MUST work that way.

Shame Boy
Mar 2, 2010

PYF Dark Enlightenment Thinker: being dumb twice as fast

Shame Boy
Mar 2, 2010

Also divabot, that'd be a great name for your book. Hell, either book; it works for describing Bitcoin Cash too.

Syd Midnight
Sep 23, 2005

I remember in the game M.U.L.E., one of the random events was a player making money from "investment in Artificial Stupidity research". I've always loved the phrase, but it's still waiting for an archetypal definition to apply it to.

Relevant Tangent
Nov 18, 2016

Tangentially Relevant

divabot posted:

sitting here contemplating stuff I could put in a book called "Roko's Basilisk". Possible subtitle: "A journey into the dark heart of the transhumanist dream", though I may not live up to it.

What sort of things would you expect in the contents list? What sort of things would you be delighted to read in it?

polyamory as preached by the cult as opposed to as practiced by the cult
and how that compares with similar cults
ie somehow it's always the thought leaders who end up getting all the action

prisoner of waffles
May 8, 2007

Ah! well a-day! what evil looks
Had I from old and young!
Instead of the cross, the fishmech
About my neck was hung.

divabot posted:

sitting here contemplating stuff I could put in a book called "Roko's Basilisk". Possible subtitle: "A journey into the dark heart of the transhumanist dream", though I may not live up to it.

What sort of things would you expect in the contents list? What sort of things would you be delighted to read in it?

Consider giving https://www.amazon.com/What-Intelligence-Tests-Miss-Psychology/dp/0300164629 a skim for framing chapter material? It has some nice research and perspective on "intelligence vs rationality", especially the idea that having high intelligence actually means you are capable of much more elaborate follies/mistakes.

divabot
Jun 17, 2015

A polite little mouse!

prisoner of waffles posted:

Consider giving https://www.amazon.com/What-Intelligence-Tests-Miss-Psychology/dp/0300164629 a skim for framing chapter material? It has some nice research and perspective on "intelligence vs rationality", especially the idea that having high intelligence actually means you are capable of much more elaborate follies/mistakes.

yep, i need stuff on how being intelligent means you can be a dumbass ten times as damagingly

prisoner of waffles
May 8, 2007

Ah! well a-day! what evil looks
Had I from old and young!
Instead of the cross, the fishmech
About my neck was hung.

divabot posted:

yep, i need stuff on how being intelligent means you can be a dumbass ten times as damagingly

this is a book for you, then.

It also talks about the cultural lionization of intelligence vs what intelligence (as measured by intelligence tests) actually is

EDIT: the cultural stuff is good but not at all fleshed out because the author is a researcher in intelligence and rationality, maybe there's some thread of scholarship or analysis that leads out from there? :shrug:

prisoner of waffles has a new favorite as of 18:13 on Apr 24, 2018

divabot
Jun 17, 2015

A polite little mouse!

Relevant Tangent posted:

polyamory as preached by the cult as opposed to as practiced by the cult
and how that compares with similar cults
ie somehow it's always the thought leaders who end up getting all the action

Would need lots of good citable evidence of disaster, 'cos obviously there's not much mileage in "these are icky perverts and you should lol"

prisoner of waffles posted:

It also talks about the cultural lionization of intelligence vs what intelligence (as measured by intelligence tests) actually is

EDIT: the cultural stuff is good but not at all fleshed out because the author is a researcher in intelligence and rationality, maybe there's some thread of scholarship or analysis that leads out from there? :shrug:

yeah, I'm pretty sure there's writeups on this stuff in existence

the less original research i have to do the better :-D

The Vosgian Beast
Aug 13, 2011

Business is slow

divabot posted:


also, the slatestarscratchpad tumblr is now locked.

Last I heard Scott got doxxed by a nazi for being Jewish

Not gonna link any of the drama, because gently caress that poo poo

Darth Walrus
Feb 13, 2012

divabot posted:

Would need lots of good citable evidence of disaster, 'cos obviously there's not much mileage in "these are icky perverts and you should lol"


yeah, I'm pretty sure there's writeups on this stuff in existence

the less original research i have to do the better :-D

If you want consequential disasters, there’s always Peter Thiel. You could also do some digging into how closely Robert and Rebekah Mercer are associated with the Dark Enlightenment.

eschaton
Mar 7, 2007

Don't you just hate when you wind up in a store with people who are in a socioeconomic class that is pretty obviously about two levels lower than your own?

The Vosgian Beast posted:

Last I heard Scott got doxxed by a nazi for being Jewish

Now now, we should hear them out calmly and rationally before coming to any conclusions.

It’s what Scott would want.

side_burned
Nov 3, 2004

My mother is a fish.

Somfin posted:

It's important to note that Yud is not a coder and does not understand, at a fundamental level, what computers actually do.

So Yud is basically the computer/tech/science/reason equivalent of a guy who walks around with a sign that says "THE END IS NEAR."

divabot
Jun 17, 2015

A polite little mouse!

side_burned posted:

So Yud is basically the computer/tech/science/reason equivalent of a guy who walks around with a sign that says "THE END IS NEAR."

yep. He's a good pop-science writer when he's writing about normal stuff that actually exists, a convincing speaker, and he has a proven ability to come up with compelling nerdbait. But he has literally no achievements in his putative field.

Pope Guilty
Nov 6, 2006

The human animal is a beautiful and terrible creature, capable of limitless compassion and unfathomable cruelty.

divabot posted:

yep. He's a good pop-science writer when he's writing about normal stuff that actually exists, a convincing speaker, and he has a proven ability to come up with compelling nerdbait. But he has literally no achievements in his putative field.

He's scammed a huge pile of cash off Peter Thiel, and that's an accomplishment all its own.


prisoner of waffles
May 8, 2007

Ah! well a-day! what evil looks
Had I from old and young!
Instead of the cross, the fishmech
About my neck was hung.

Pope Guilty posted:

He's scammed a huge pile of cash off Peter Thiel, and that's an accomplishment all its own.

I'd guess that Yud's efforts do have some kind of payoff for Thiel, whether it's through cultural influence or just the satisfaction of "I funded this cool thing"
