eschaton
Mar 7, 2007

Don't you just hate when you wind up in a store with people who are in a socioeconomic class that is pretty obviously about two levels lower than your own?

Syd Midnight posted:

To be a fly on the wall if he ever had to spend the night in a drunk tank or county lockup and tried engaging some bored, annoyed cop in debate over the rationality and justness of their imprisonment. There's a fanfic I'd like to read.

Someone please write this.

eschaton
Mar 7, 2007

Don't you just hate when you wind up in a store with people who are in a socioeconomic class that is pretty obviously about two levels lower than your own?

Curvature of Earth posted:

Yudkowsky's just a lot better at cargo culting science because science-esque* is his first language and built into his philosophy, as opposed to young earth creationists, for whom it's a second language that feels awkward when stapled on top of their beliefs.

*Not science, science-esque. These are different things.

He has to be such an embarrassment to his father, who is actually a very skilled engineer.

I'm okay with that, though: his dad went from being ultra-libertarian "the state can do no right" but otherwise live-and-let-live to full-ESR "rah rah military, crush the Muslims, anyone who disagrees is an anti-Semite" after 9/11.

Gitro
May 29, 2013

quote:

And cryonics, of course, is the default extrapolation from known neuroscience: if memories are stored the way we now think, and cryonics organizations are not disturbed by any particular catastrophe, and technology goes on advancing toward the physical limits, then it is possible to revive a cryonics patient (and yes you are the same person). There are negative possibilities (woken up in dystopia and not allowed to die) but they are exotic, not having equal probability weight to counterbalance the positive possibilities.

I love the outright dismissal of possibilities like 'it literally does not work and you are just dead' or even 'you're one of the first people woken up and it doesn't work very well because the technology is still new. Enjoy chronic pain/neurological issues/who knows what.' Just assign good things an arbitrarily high number and bad things an arbitrarily low number and you too can prove whatever conclusion you'd like!
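To make that literal: if you get to pick both the probabilities and the payoffs, expected-value arithmetic will bless whatever conclusion you started with. A throwaway Python sketch (every figure in it is invented, which is rather the point):

code:

# Expected value of signing up for cryonics, Yud-style.
# All of these numbers are made up -- that's the trick.
outcomes = {
    "revived into transhuman paradise":   (0.200, +1_000_000),
    "woken up in a dystopia":             (0.001, -1_000),  # "exotic", apparently
    "it simply never works, you're dead": (0.799, 0),       # conveniently weightless
}
expected_value = sum(p * utility for p, utility in outcomes.values())
print(expected_value)  # a large positive number, therefore cryonics, QED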

This popped up on my facebook feed today. Not exactly DE, but fits in the current line of conversation. Choice bits:

quote:

Bostrom, a 43-year-old Swedish-born philosopher, has lately acquired something of the status of prophet of doom among those currently doing most to shape our civilisation: the tech billionaires of Silicon Valley...The book is a lively, speculative examination of the singular threat that Bostrom believes – after years of calculation and argument – to be the one most likely to wipe us out. This threat is not climate change, nor pandemic, nor nuclear winter; it is the possibly imminent creation of a general machine intelligence greater than our own.

quote:

“Machine learning and deep learning [the pioneering ‘neural’ computer algorithms that most closely mimic human brain function] have over the last few years moved much faster than people anticipated,” he says. “That is certainly one of the reasons why this has become such a big topic just now. People can see things moving forward in the technical field, and they become concerned about what next.”

I admit to a lot of ignorance here, but from what other posters were saying earlier in this thread and the old Yud thread, aren't the absolute best-performing bits of the field still absolutely nowhere near anything that could be called general intelligence, especially superhuman general intelligence?

quote:

But he thinks there is a value in cryogenic research?

“It seems a pretty rational thing for people to do if they can afford it,” he says. “When you think about what life in the quite near future could be like, trying to store the information in your brain seems like a conservative option as opposed to burning the brain down and throwing it away. Unless you are really confident that the information will never be useful…”

Of course. Shredding your brain with icicles is so much better than burning it down, guys.

quote:

Somehow, and he has no idea how really, he thinks [ethics] will need to be hardwired from the outset to avoid catastrophe. It is no good getting your owl first then wondering how to train it. And with artificial systems already superior to the best human intelligence in many discrete fields, a conversation about how that might be done is already overdue.

Is the last sentence meaningfully true? Like I'm sure some robots can assemble the poo poo out of a car or play a mad game of chess but that's pretty far off anything like general intelligence.

quote:

“Sometimes the work does seem strange,” he says. “Then from another point it seems strange that most of the world is completely oblivious to the most major things that are going to happen in the 21st century. Even people who talk about global warming never mention any threat posed by AI.”

:rolleyes:

eschaton
Mar 7, 2007

Don't you just hate when you wind up in a store with people who are in a socioeconomic class that is pretty obviously about two levels lower than your own?

Gitro posted:

I admit to a lot of ignorance here, but from what other posters were saying earlier in this thread and the old Yud thread, aren't the absolute best-performing bits of the field still absolutely nowhere near anything that could be called general intelligence, especially superhuman general intelligence?

Yes, and what's more, there are vanishingly few projects even attempting to work on things that might fit under the umbrella of "general intelligence."

The only one I know of that's still ongoing is Cyc, and while it sounds scary-smart from the things I've been told by people who work with the full system, it's still far more of a decision support system than an intelligence in its own right: It's basically a self-modifying inference engine with a huge base of facts accumulated over 30+ years.

You can tell Cyc new facts (whether in English, encoded into CycL, or via its data source mechanisms), and then ask it things as well as to explain its reasoning. It might ask questions as part of answering your question, but it's not having ideas and taking actions on its own, and interfacing anything with it is significant work. (A friend has used OpenCyc in home automation, and had an exciting time adding any support for triggers/events to it.)
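To give a flavour of the shape of the thing for anyone who hasn't touched it: this is not Cyc or CycL, just a toy forward-chaining inference engine I'm sketching in Python, with every fact and rule made up. The real system is this idea plus thirty-odd years of hand-curated knowledge and much fancier machinery.

code:

# A toy "pile of facts plus rules, and it can explain why it believes
# something" engine. Every name and fact here is invented.

facts = {"socrates is a human"}
rules = [
    ("socrates is a human", "socrates is mortal"),
    ("socrates is mortal", "socrates will die"),
]

def infer(facts, rules):
    """Apply rules until nothing new can be derived, remembering why."""
    because = {}
    changed = True
    while changed:
        changed = False
        for premise, conclusion in rules:
            if premise in facts and conclusion not in facts:
                facts.add(conclusion)
                because[conclusion] = premise
                changed = True
    return because

def explain(fact, because):
    """Walk a derived fact back to something that was asserted directly."""
    chain = [fact]
    while fact in because:
        fact = because[fact]
        chain.append(fact)
    return " <- ".join(chain)

because = infer(facts, rules)
print(explain("socrates will die", because))
# socrates will die <- socrates is mortal <- socrates is a human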

Shame Boy
Mar 2, 2010

Lottery of Babylon posted:

Actually, he believes the laws of thermodynamics are false.

No, really.


He's terrified of death, and doesn't like that the laws of physics say death is inevitable. So he decides that since he doesn't like the laws of physics, they're probably wrong. After all, in Conway's Game of Life you can be immortal, and the rules of Conway's Game of Life are very simple (and therefore by Occam's Razor very likely to be true), so if we just ignore all those inconvenient observations we've made of the ways in which our universe's physics is not like Conway's Game of Life, it becomes obvious that we're pretty much living in Conway's Game of Life and therefore immortality is possible.

As someone with a passing understanding of theoretical physics, I can tell he is so hilariously wrong, and it's painfully obvious that he's just making poo poo up that sounds correct to him, like a 10th grader who knows he can get away with it because nobody else knows what he's talking about.


In other news a gay friend I previously respected just told me about his love for Milo and how Milo is right about Islam and the pay gap :smith:

I Killed GBS
Jun 2, 2011

by Lowtax

Parallel Paraplegic posted:

In other news a gay friend I previously respected just told me about his love for Milo and how Milo is right about Islam and the pay gap :smith:

As an LGBTQ person, I give you permission to sever

Shame Boy
Mar 2, 2010

Small Frozen Thing posted:

As an LGBTQ person, I give you permission to sever

I mean I'm A Gay myself and we had sex once so this is very disappointing

I Killed GBS
Jun 2, 2011

by Lowtax

Parallel Paraplegic posted:

I mean I'm A Gay myself and we had sex once so this is very disappointing

oh

:rip:

ikanreed
Sep 25, 2009

I honestly I have no idea who cannibal[SIC] is and I do not know why I should know.

syq dude, just syq!

Antivehicular posted:

Well, I'm talking about the article itself -- I don't actually find any of the arguments compelling, but you've got quotes in the article like these that suggest the author does:




It basically reeks of the author thinking that any Gatekeeper who isn't persuaded by Big Yud the Box AI must be a stupid, stubborn, irrational child-person or deliberately playing "dishonestly," instead of just not being swayed by the goddamn Basilisk.

It's also kind of weird that the strategies they list for an alleged experiment in rational thinking are almost all rooted in writing impromptu SF -- telling the AI it's been corrupted by a virus, or whatever, or otherwise just roleplaying the scenario as writing competitive fiction against each other. It's like... maybe the whole thing is some kind of glorified Let's Pretend from someone too delusional to realize that his ideas aren't reality-based? What a shocker!!

I'll fess up, I wrote that last bit. Me. I didn't really touch the rest of the article, but I found the notion that higher intelligence -> necessarily better ability to persuade to be a pretty laughable core construct.

All I was saying there is that the whole thing just presupposes people are way more inclined to follow "rational" argument than we actually are.

Oligopsony
May 17, 2007
Re: Basilisk chat, Eliezer's official line, the logic of which is pretty sound, is that TDT entails not caving to the Basilisk; since TDT is required in some of the steps to make the Basilisk work, the Basilisk doesn't work. He may or may not have won games with it, but if so it was by exploiting an uncanny taking-ideas-seriously valley in the human player.

(His response to NAB made it pretty obvious he hadn't read the most basic things about it, but honestly, who wouldn't have an ick reaction at the mere thought of being associated with those guys.)

Peztopiary
Mar 16, 2009

by exmarx

Parallel Paraplegic posted:

I mean I'm A Gay myself and we had sex once so this is very disappointing

You wore tinfoil right? Memetic protection is very important.

Nessus
Dec 22, 2003

After a Speaker vote, you may be entitled to a valuable coupon or voucher!



Gitro posted:

I admit to a lot of ignorance here, but from what other posters were saying earlier in this thread and the old Yud thread, aren't the absolute best-performing bits of the field still absolutely nowhere near anything that could be called general intelligence, especially superhuman general intelligence?

It occurs to me that maybe this terror is because these fellows are so terribly used to developing technologies that put people out of work, and so they think: What if that happens to me, the smarty man?

But yeah the only way I could see AI 'destroying humanity' is if, say, AI-assisted trading algorithms devastate the economy completely in a way that maximizes short-term gains for the owners and the damage is too systemic by the time anyone's willing to overcome inertia and fix it.

Curvature of Earth
Sep 9, 2011

Projected cost of
invading Canada:
$900
I typed up a long response to The Vosgian Beast about free will, but I deleted it, because it was a bad post and a bad derail.

Instead, enjoy this ClarkHat update.

NRxers are having a field day with the whole "Islamist guns down gays" narrative:



ClarkHat deflects all blame from Christian homophobes, singlehandedly ends centuries of debate about the Old Testament within Christianity:


Also lots of posturing about how awesome guns are:



He predicts the total dissolution of the US federal government within a couple years:


Yeah, that's definitely a prediction that'll pan out. :thumbsup:

And in the world's least surprising turn of events, ClarkHat is a bitcoiner:

I AM GRANDO
Aug 20, 2006

What will Milo say when he finds out that the killer was a self-hating gay just like him? I know the answer is nothing.

Curvature of Earth
Sep 9, 2011

Projected cost of
invading Canada:
$900

Jack Gladney posted:

What will Milo say when he finds out that the killer was a self-hating gay just like him? I know the answer is nothing.

I really do hate the "self-hating gay hurts/murders other gays" thing. It's usually not true, and worse, it changes the focus from "gays are under attack by bigots" to "gays are self-destructive and destroy their own kind".

I AM GRANDO
Aug 20, 2006

It's the fact that gays are under attack that makes people turn out like that. This killer was on grindr and a regular at the club he shot up.

Shame Boy
Mar 2, 2010

This all came up because Milo's giving a speech at UCF. I live ~1hr away, maybe I should go protest.

Then again I get the feeling that I'd wind up being harassed for life and doxxed and poo poo for my troubles. But no, Milo is the one whose free speech is threatened, guys, honest.

Also kinda unrelated but having grown up like 45 minutes from Orlando I've been to Pulse a few times and it's really spooky having a mass murder happen this nearby. I don't know anyone who died, at... least?

Fututor Magnus
Feb 22, 2016

by FactsAreUseless
In another instalment of things which Rev feels threatened by:

https://twitter.com/St_Rev/status/742674689399480321
https://twitter.com/St_Rev/status/742675238891061248

The Vosgian Beast
Aug 13, 2011

Business is slow

Jack Gladney posted:

It's the fact that gays are under attack that makes people turn out like that. This killer was on grindr and a regular at the club he shot up.

Probably just for research though

Darth Walrus
Feb 13, 2012

The Vosgian Beast posted:

Probably just for research though

Yeah, no joke, that does actually seem like what he was doing. His questions on Grindr were stuff like 'so, any really big gay clubs in the area? How popular are they? What time of night are they most crowded at?'

Goon Danton
May 24, 2012

Don't forget to show my shitposts to the people. They're well worth seeing.

Lottery of Babylon posted:

Actually, he believes the laws of thermodynamics are false.

No, really.


He's terrified of death, and doesn't like that the laws of physics say death is inevitable. So he decides that since he doesn't like the laws of physics, they're probably wrong. After all, in Conway's Game of Life you can be immortal, and the rules of Conway's Game of Life are very simple (and therefore by Occam's Razor very likely to be true), so if we just ignore all those inconvenient observations we've made of the ways in which our universe's physics is not like Conway's Game of Life, it becomes obvious that we're pretty much living in Conway's Game of Life and therefore immortality is possible.

I love how he confidently asserts that MWI is an obvious and incontrovertible implication of quantum mechanics despite actual physicists being heavily divided on the question, but claims that the Second Law of Thermodynamics is somehow in question despite the fact that you can derive it solely from "things are made of quantized particles at quantized energy levels" and the First Law. It's almost as if he's basing his theories on whatever fits his predefined conclusion that nothing bad ever has to happen to him.
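To spell out how little machinery the counting argument needs, here's a quick toy model in Python: two "Einstein solids" sharing energy quanta, where the multiplicity of each split is pure combinatorics. The sizes and numbers are mine and nothing hinges on them; the point is that the equal-temperature split dominates so overwhelmingly that heat flowing the "wrong" way isn't forbidden by some extra law, it's just absurdly improbable.

code:

# Two Einstein solids share q_total quanta of energy. The number of
# microstates for each way of splitting the quanta is stars-and-bars
# combinatorics, and the probable splits swamp everything else.
from math import comb

def multiplicity(oscillators, quanta):
    # ways to distribute `quanta` among `oscillators` (stars and bars)
    return comb(quanta + oscillators - 1, quanta)

N_A, N_B, q_total = 300, 200, 100
omega = [multiplicity(N_A, q_A) * multiplicity(N_B, q_total - q_A)
         for q_A in range(q_total + 1)]

most_probable = max(range(q_total + 1), key=lambda q_A: omega[q_A])
print("most probable split: solid A holds", most_probable, "quanta")
print("fraction of all microstates where A holds 10 or fewer quanta:",
      f"{sum(omega[:11]) / sum(omega):.1e}")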

Syd Midnight
Sep 23, 2005

Jack Gladney posted:

I wonder if you could use some dumb logic game to get Yud to kill himself like an evil supercomputer on Star Trek.



edit: This scene is very similar to Yudkowsky's "Timeless Decision Theory" as I understand it. Truly a dizzying intellect.


Sax Solo
Feb 18, 2011



Old man yells at word cloud.

Puppy Time
Mar 1, 2005


quote:

A stress ball stamping on a human face forever.

That sounds mildly annoying at worst.

"Oh no, something squishy is pressing on me repeatedly! I WILL NEVER RECOVER!"

Shame Boy
Mar 2, 2010

Goon Danton posted:

I love how he confidently asserts that MWI is an obvious and incontrovertible implication of quantum mechanics despite actual physicists being heavily divided on the question, but claims that the Second Law of Thermodynamics is somehow in question despite the fact that you can derive it solely from "things are made of quantized particles at quantized energy levels" and the First Law. It's almost as if he's basing his theories on whatever fits his predefined conclusion that nothing bad ever has to happen to him.

Yeah he's like one step above the new age weirdos who skim a quantum mechanics for dummies book and declare that science says you can control energy with your ~mind~

Tesseraction
Apr 5, 2009

Lottery of Babylon posted:

Actually, he believes the laws of thermodynamics are false.

No, really.


He's terrified of death, and doesn't like that the laws of physics say death is inevitable. So he decides that since he doesn't like the laws of physics, they're probably wrong. After all, in Conway's Game of Life you can be immortal, and the rules of Conway's Game of Life are very simple (and therefore by Occam's Razor very likely to be true), so if we just ignore all those inconvenient observations we've made of the ways in which our universe's physics is not like Conway's Game of Life, it becomes obvious that we're pretty much living in Conway's Game of Life and therefore immortality is possible.

How in the blue gently caress :psyduck:

So Conway's Game of Life can run indefinitely, right. That's fine. It works on a loop basis. It iterates over simple memory states. What does that have to do with living forever? Is he saying that one program can run forever so all programs can run forever, even really complicated ones like AI? Has he heard of the halting problem? gruagh
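For anyone who's never actually run it, the whole "iterates over simple memory states" thing fits in a few lines of Python. This is just the standard B3/S23 rule over a set of live cells, nothing to do with anything Yud wrote:

code:

# The entire Game of Life update rule, using a set of live (x, y) cells.
# A cell is alive next step if it has exactly 3 live neighbours,
# or 2 live neighbours and was already alive.
from collections import Counter

def life_step(live):
    neighbours = Counter(
        (x + dx, y + dy)
        for x, y in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    return {cell for cell, n in neighbours.items()
            if n == 3 or (n == 2 and cell in live)}

# A blinker flips between horizontal and vertical forever; that's the
# kind of "running forever" the grid actually gives you.
blinker = {(0, 1), (1, 1), (2, 1)}
for _ in range(4):
    print(sorted(blinker))
    blinker = life_step(blinker)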

Shame Boy
Mar 2, 2010

Tesseraction posted:

How in the blue gently caress :psyduck:

So Conway's Game of Life can run indefinitely, right. That's fine. It works on a loop basis. It iterates over simple memory states. What does that have to do with living forever? Is he saying that one program can run forever so all programs can run forever, even really complicated ones like AI? Has he heard of the halting problem? gruagh

Again it seems a lot like he read the wikipedia article and just took from it what he wanted. "Conway's game of life is Turing complete" and "Conway's game of life can run forever" equals "a simulation of me can run forever!"

The Vosgian Beast
Aug 13, 2011

Business is slow

Puppy Time posted:

That sounds mildly annoying at worst.

"Oh no, something squishy is pressing on me repeatedly! I WILL NEVER RECOVER!"

Standard prog grimbertarian moralism


Curvature of Earth posted:

I typed up a long response to The Vosgian Beast about free will, but I deleted it, because it was a bad post and a bad derail.

I'll take "things Cingulate never said to himself" for 300$ Alex

Tesseraction
Apr 5, 2009

Parallel Paraplegic posted:

Again it seems a lot like he read the wikipedia article and just took from it what he wanted. "Conway's game of life is Turing complete" and "Conway's game of life can run forever" equals "a simulation of me can run forever!"

But that's even funnier because I even checked that page and it mentions the problem. Sure, you can run an infinite loop forever but if you want to run anything more complicated than "count how many 1s are around you" you need to have complicated branching and decision trees and hell at this point you're going well beyond Turing complete. And what about if you end up thinking about something undecidable? Is there a way to cancel your machine thoughts if you're mid-thought?

Imagine your stupid AI fuckshit body trying to think about something and then never ending for eternity because you thought of something outside the bounds of computational complexity.

Shame Boy
Mar 2, 2010

Tesseraction posted:

But that's even funnier because I even checked that page and it mentions the problem. Sure, you can run an infinite loop forever but if you want to run anything more complicated than "count how many 1s are around you" you need to have complicated branching and decision trees and hell at this point you're going well beyond Turing complete. And what about if you end up thinking about something undecidable? Is there a way to cancel your machine thoughts if you're mid-thought?

Imagine your stupid AI fuckshit body trying to think about something and then never ending for eternity because you thought of something outside the bounds of computational complexity.

I guess it would be a good end state once you've lived forever and everything's boring like in that episode of Star Trek where they go to the Q continuum and nobody talks because "it's all been said"

Frogisis
Apr 15, 2003

relax brother relax
Has he ever actually played around with the game of life? Like the very first thing you realize is how easily and inevitably anything interesting and "organic" collapses into its version of uselessly blinking heat death.

Goon Danton
May 24, 2012

Don't forget to show my shitposts to the people. They're well worth seeing.

Frogisis posted:

Has he ever actually played around with the game of life? Like the very first thing you realize is how easily and inevitably anything interesting and "organic" collapses into its version of uselessly blinking heat death.

There are plenty of interesting "machines" that infinitely expand or crap out gliders or whatever. The problem is that the impressive ones don't arise without an outside force arranging every cell perfectly, and they tend to be extremely fragile if there's anything else in their universe.

Parallel Paraplegic posted:

Yeah he's like one step above the new age weirdos who skim a quantum mechanics for dummies book and declare that science says you can control energy with your ~mind~

My friend is getting a PhD in robotics, and her reaction to Yud is pretty much the same as when those of us in physical chemistry come across Deepak Chopra for the first time.

I sent her that quote about Conway's Game of Physics, we'll see how she responds!

Tesseraction
Apr 5, 2009

Goon Danton posted:

My friend is getting a PhD in robotics, and her reaction to Yud is pretty much the same as when those of us in physical chemistry come across Deepak Chopra for the first time.

I sent her that quote about Conway's Game of Physics, we'll see how she responds!

"A young woman was arrested today in connection with the shooting of AI scholar Elizier Yudkowsky."

Syd Midnight
Sep 23, 2005

Gitro posted:

Is the last sentence meaningfully true? Like I'm sure some robots can assemble the poo poo out of a car or play a mad game of chess but that's pretty far off anything like general intelligence.

Vosgian Beast put it pretty well in an earlier post:

quote:

When does an AI successfully run through Tomb of Horrors? Get Deep Blue to do that, and I'd be really impressed

hackbunny
Jul 22, 2007

I haven't been on SA for years but the person who gave me my previous av as a joke felt guilty for doing so and decided to get me a non-shitty av

Nessus posted:

Not quite: he believes the computer would create another universe with different physical laws.

Now I wonder, is the writer of The Metamorphosis of Prime Intellect related to these goobers? It's an interesting book in that it hits all the "technology singularity happens" beats (including the technological god creating another universe with simpler laws), but without all the weird technology worship (replaced with... plenty of other weird stuff)

Jack Gladney posted:

It's the fact that gays are under attack that makes people turn out like that. This killer was on grindr and a regular at the club he shot up.

My favorite part? His dad has been reported as saying he told him not to hurt gay people. What isn't reported as often is that he then segued into "... because god will punish them"

The Vosgian Beast
Aug 13, 2011

Business is slow
So did the shooter being Muslim stave off Manosphere think pieces about how the killer was an omega incel whose life, and the lives of others, would have been saved if he had just learned Game or not grown up in a feminized society?

Or are we going to get those regardless?

Space Gopher
Jul 31, 2006

BLITHERING IDIOT AND HARDCORE DURIAN APOLOGIST. LET ME TELL YOU WHY THIS SHIT DON'T STINK EVEN THOUGH WE ALL KNOW IT DOES BECAUSE I'M SUPER CULTURED.

Tesseraction posted:

But that's even funnier because I even checked that page and it mentions the problem. Sure, you can run an infinite loop forever but if you want to run anything more complicated than "count how many 1s are around you" you need to have complicated branching and decision trees and hell at this point you're going well beyond Turing complete. And what about if you end up thinking about something undecidable? Is there a way to cancel your machine thoughts if you're mid-thought?

Imagine your stupid AI fuckshit body trying to think about something and then never ending for eternity because you thought of something outside the bounds of computational complexity.

Turing machines can handle all those problems (except the halting one, but you can give yourself an escape valve of, "count the cycles spent thinking about Issue X, if greater than Y, abort and declare the problem too difficult to solve" without much trouble).
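That escape valve is completely mundane in practice, by the way. Here's a made-up Python sketch of it, with the Collatz sequence standing in for "a computation nobody has proved always halts":

code:

# Run a generator-based computation for at most `budget` steps. If it
# finishes, return its result; if the budget runs out, give up instead
# of risking thinking forever. All names here are invented.

def run_with_budget(gen, budget):
    for _ in range(budget):
        try:
            next(gen)
        except StopIteration as done:
            return True, done.value
    return False, None  # declared too hard, abort rather than loop forever

def collatz_steps(n):
    """Yields once per step; nobody has proved this terminates for every n."""
    count = 0
    while n != 1:
        n = 3 * n + 1 if n % 2 else n // 2
        count += 1
        yield
    return count

print(run_with_budget(collatz_steps(27), budget=1000))  # (True, 111)
print(run_with_budget(collatz_steps(27), budget=10))    # (False, None)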

Yud's thing here isn't even a logical issue - it's nothing more than a cheap rhetorical trick, where he shifts the definition of "Turing complete" and hopes you won't notice. Turing machines are assumed to be infinite and free of physical limitations, because it makes the math a lot easier. When we talk about real-world systems being Turing complete, there's always an implicit "assuming the presence of similarly infinite time, memory, and so on..." statement in there. Otherwise, it's pointless to even talk about the concept, because we do exist in a world with known physical limits and nothing will ever be truly Turing complete in that sense.

But, if you start at "I believe robo-Jesus will carry me away to perfectly simulated Heaven" and work backwards, you can do something like this:
  • I will live forever in Heaven
  • To live forever, it is necessary for my computer God to work forever
  • My computer God will be Turing complete, like almost all computers. Everybody knows that all kinds of stuff, from Conway's Life to C to Scala, is Turing complete
  • The definition of Turing complete includes infinite memory and a machine which never degrades over time
  • Therefore there are no problems with computer God simulating perfect ecstasy for 10^100 copies of me, singing AI-leluia forever

GunnerJ
Aug 1, 2005

Do you think this is funny?

hackbunny posted:

Now I wonder, is the writer of The Metamorphosis of Prime Intellect related to these goobers? It's an interesting book in that it hits all the "technology singularity happens" beats (including the technological god creating another universe with simpler laws), but without all the weird technology worship (replaced with... plenty of other weird stuff)

IMO, it's opposed to the Yud AI agenda in that its central conceit is that a perfectly benevolent AI could make human life actually kind of suck for a lot of people.

Shame Boy
Mar 2, 2010

The Vosgian Beast posted:

So did the shooter being Muslim stave off Manosphere think pieces about how the killer was an omega incel whose life, and the lives of others, would have been saved if he had just learned Game or not grown up in a feminized society?

Or are we going to get those regardless?

My former friend basically told me "it's either Islam told him to kill gays or he did it for no reason at all" when I forced him to actually think critically about why he would do such a thing, so yeah, I guess so.

The Vosgian Beast
Aug 13, 2011

Business is slow
Let's check in with Mister Mean-Spirited, a man so edgy he thinks hate fucks are the only genuine fucks: http://mister-mean-spirited.blogspot.com/2016/06/hate-fucks-are-only-genuine-fucks.html :barf::barf::barf:

:yikes:
