|
Syd Midnight posted:To be a fly on the wall if he ever had to spend the night in a drunk tank or county lockup and tried engaging some bored, annoyed cop in debate over the rationality and justness of their imprisonment. There's a fanfic I'd like to read. Someone please write this.
|
# ? Jun 14, 2016 04:00 |
|
Curvature of Earth posted:Yudkowsky's just a lot better at cargo culting science because science-esque* is his first language and built into his philosophy, as opposed to young earth creationists, for whom it's a second language that feels awkward when stapled on top of their beliefs. He has to be such an embarrassment to his father, who is actually a very skilled engineer. I'm okay with that though, his dad went from being ultra-libertarian "the state can do no right" but otherwise live and let live to full-ESR "rah rah military, crush the Muslims, anyone who disagrees is an anti-Semite" after 9/11.
|
# ? Jun 14, 2016 04:10 |
|
quote:And cryonics, of course, is the default extrapolation from known neuroscience: if memories are stored the way we now think, and cryonics organizations are not disturbed by any particular catastrophe, and technology goes on advancing toward the physical limits, then it is possible to revive a cryonics patient (and yes you are the same person). There are negative possibilities (woken up in dystopia and not allowed to die) but they are exotic, not having equal probability weight to counterbalance the positive possibilities.

I love the outright dismissal of possibilities like 'it literally does not work and you are just dead' or even 'you're one of the first people woken up and it doesn't work very well because the technology is still new. Enjoy chronic pain/neurological issues/who knows what.' Just assign good things an arbitrarily high number and bad things an arbitrarily low number and you too can prove whatever conclusion you'd like!

This popped up on my facebook feed today. Not exactly DE, but fits in the current line of conversation. Choice bits:

quote:Bostrom, a 43-year-old Swedish-born philosopher, has lately acquired something of the status of prophet of doom among those currently doing most to shape our civilisation: the tech billionaires of Silicon Valley...The book is a lively, speculative examination of the singular threat that Bostrom believes – after years of calculation and argument – to be the one most likely to wipe us out. This threat is not climate change, nor pandemic, nor nuclear winter; it is the possibly imminent creation of a general machine intelligence greater than our own.

quote:“Machine learning and deep learning [the pioneering ‘neural’ computer algorithms that most closely mimic human brain function] have over the last few years moved much faster than people anticipated,” he says. “That is certainly one of the reasons why this has become such a big topic just now. People can see things moving forward in the technical field, and they become concerned about what next.”

I admit to a lot of ignorance here, but from what other posters were talking about maybe earlier in this and the old Yud thread, aren't the absolute best-performing bits of the field still absolutely nowhere near anything that could be called general intelligence, especially superhuman general intelligence?

quote:But he thinks there is a value in cryogenic research?

Of course. Shredding your brain with icicles is so much better than burning it down, guys.

quote:Somehow, and he has no idea how really, he thinks [ethics] will need to be hardwired from the outset to avoid catastrophe. It is no good getting your owl first then wondering how to train it. And with artificial systems already superior to the best human intelligence in many discrete fields, a conversation about how that might be done is already overdue.

Is the last sentence meaningfully true? Like I'm sure some robots can assemble the poo poo out of a car or play a mad game of chess but that's pretty far off anything like general intelligence.

quote:“Sometimes the work does seem strange,” he says. “Then from another point it seems strange that most of the world is completely oblivious to the most major things that are going to happen in the 21st century. Even people who talk about global warming never mention any threat posed by AI.”
|
# ? Jun 14, 2016 04:23 |
|
Gitro posted:I admit to a lot of ignorance here, but from what other posters were talking about maybe earlier in this and the old Yud thread aren't the absolute best performing bits of the field still absolutely nowhere near anything that could be called general intelligence, especially superhuman general intelligence? Yes, and what's more, there are vanishingly few projects even attempting to work on things that might fit under the umbrella of "general intelligence." The only one I know of that's still ongoing is Cyc, and while it sounds scary-smart from the things I've been told by people who work with the full system, it's still far more of a decision support system than an intelligence in its own right: It's basically a self-modifying inference engine with a huge base of facts accumulated over 30+ years. You can tell Cyc new facts (whether in English, encoded into CycL, or via its data source mechanisms), and then ask it things as well as to explain its reasoning. It might ask questions as part of answering your question, but it's not having ideas and taking actions on its own, and interfacing anything with it is significant work. (A friend has used OpenCyc in home automation, and had an exciting time adding any support for triggers/events to it.)
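For a sense of what "self-modifying inference engine with a huge base of facts" means in miniature, here's a toy forward-chaining rule engine in Python. This is nothing like Cyc's actual interface or CycL; every name and rule here is made up purely for illustration.

```python
# Toy forward-chaining inference engine, vastly simpler than anything
# like Cyc. Facts are (subject, predicate, object) triples; rules with
# a single premise fire until no new facts appear. Illustrative only.

facts = {("Socrates", "isa", "Human")}
rules = [
    # "Every human is mortal" -- one premise, one conclusion.
    ((("?x", "isa", "Human"),), ("?x", "isa", "Mortal")),
]

def run(facts, rules):
    """Apply single-premise rules to a fixed point and return all facts."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            ps, pp, po = premises[0]
            for (s, p, o) in list(facts):
                # Match the premise pattern; "?"-prefixed terms bind.
                bindings, ok = {}, True
                for pat, val in ((ps, s), (pp, p), (po, o)):
                    if pat.startswith("?"):
                        bindings[pat] = val
                    elif pat != val:
                        ok = False
                if ok:
                    new = tuple(bindings.get(t, t) for t in conclusion)
                    if new not in facts:
                        facts.add(new)
                        changed = True
    return facts

derived = run(facts, rules)
print(("Socrates", "isa", "Mortal") in derived)  # True
```

A real system like Cyc layers decades of curated facts, multi-premise rules, and explanation facilities on top of this kind of loop, which is part of why interfacing anything with it is significant work.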
|
# ? Jun 14, 2016 04:47 |
|
Lottery of Babylon posted:Actually, he believes the laws of thermodynamics are false. As someone who has a passing understanding of theoretical physics he is so hilariously wrong and it's painfully obvious that he's just making poo poo up that sounds correct to him like a 10th grader who knows he can get away with it because nobody else knows what he's talking about. In other news a gay friend I previously respected just told me about his love for Milo and how Milo is right about Islam and the pay gap
|
# ? Jun 14, 2016 05:01 |
|
Parallel Paraplegic posted:In other news a gay friend I previously respected just told me about his love for Milo and how Milo is right about Islam and the pay gap As an LGBTQ person, I give you permission to sever
|
# ? Jun 14, 2016 05:03 |
|
Small Frozen Thing posted:As an LGBTQ person, I give you permission to sever I mean I'm A Gay myself and we had sex once so this is very disappointing
|
# ? Jun 14, 2016 05:06 |
|
Parallel Paraplegic posted:I mean I'm A Gay myself and we had sex once so this is very disappointing oh
|
# ? Jun 14, 2016 05:13 |
|
Antivehicular posted:Well, I'm talking about the article itself -- I don't actually find any of the arguments compelling, but you've got quotes in the article like these that suggest the author does: I'll fess up, I wrote that last bit. Me. I didn't really touch the rest of the article, but I found the notion that higher intelligence -> necessarily better ability to persuade a pretty laughable core construct. All I was saying there is that the whole thing just presupposes people are way more inclined to follow "rational" argument than we actually are.
|
# ? Jun 14, 2016 05:24 |
|
Re: Basilisk chat, Eliezer's official line, the logic of which is pretty sound, is that TDT entails not caving to the Basilisk; since TDT is required in some of the steps to make the Basilisk work, the Basilisk doesn't work. He may or may not have won games with it, but if so it was exploiting an uncanny taking-ideas-seriously valley among the human players. (His response to NAB reflected obviously not having read the most basic things about it, but honestly who wouldn't have an ick reaction at the mere thought of association with those guys.)
|
# ? Jun 14, 2016 05:27 |
|
Parallel Paraplegic posted:I mean I'm A Gay myself and we had sex once so this is very disappointing You wore tinfoil right? Memetic protection is very important.
|
# ? Jun 14, 2016 05:37 |
Gitro posted:I admit to a lot of ignorance here, but from what other posters were talking about maybe earlier in this and the old Yud thread aren't the absolute best performing bits of the field still absolutely nowhere near anything that could be called general intelligence, especially superhuman general intelligence? But yeah the only way I could see AI 'destroying humanity' is if, say, AI-assisted trading algorithms devastate the economy completely in a way that maximizes short-term gains for the owners and the damage is too systemic by the time anyone's willing to overcome inertia and fix it.
|
|
# ? Jun 14, 2016 05:41 |
|
I typed up a long response to The Vosgian Beast about free will, but I deleted it, because it was a bad post and a bad derail. Instead, enjoy this ClarkHat update. NRxers are having a field day with the whole "Islamist guns down gays" narrative: ClarkHat deflects all blame from Christian homophobes, singlehandedly ends centuries of debate about the Old Testament within Christianity: Also lots of posturing about how awesome guns are: He predicts the total dissolution of the US federal government within a couple years: Yeah, that's definitely a prediction that'll pan out. And in the world's least surprising turn of events, ClarkHat is a bitcoiner:
|
# ? Jun 14, 2016 05:42 |
|
What will Milo say when he finds out that the killer was a self-hating gay just like him? I know the answer is nothing.
|
# ? Jun 14, 2016 05:55 |
|
Jack Gladney posted:What will Milo say when he finds out that the killer was a self-hating gay just like him? I know the answer is nothing. I really do hate the "self-hating gay hurts/murders other gays" thing. It's usually not true, and worse, it changes the focus from "gays are under attack by bigots" to "gays are self-destructive and destroy their own kind".
|
# ? Jun 14, 2016 06:45 |
|
It's the fact that gays are under attack that makes people turn out like that. This killer was on grindr and a regular at the club he shot up.
|
# ? Jun 14, 2016 06:48 |
|
This all came up because Milo's giving a speech at UCF. I live ~1hr away, maybe I should go protest. Then again I get the feeling that I'd wind up being harassed for life and doxxed and poo poo for my troubles. But no, Milo is the one whose free speech is threatened guys, honest. Also kinda unrelated but having grown up like 45 minutes from Orlando I've been to Pulse a few times and it's really spooky having a mass murder happen this nearby. I don't know anyone who died at... least?
|
# ? Jun 14, 2016 06:55 |
|
In another instalment of things which Rev feels threatened by: https://twitter.com/St_Rev/status/742674689399480321 https://twitter.com/St_Rev/status/742675238891061248
|
# ? Jun 14, 2016 12:30 |
|
Jack Gladney posted:It's the fact that gays are under attack that makes people turn out like that. This killer was on grindr and a regular at the club he shot up. Probably just for research though
|
# ? Jun 14, 2016 12:42 |
|
The Vosgian Beast posted:Probably just for research though Yeah, no joke, that does actually seem like what he was doing. His questions on Grindr were stuff like 'so, any really big gay clubs in the area? How popular are they? What time of night are they most crowded at?'
|
# ? Jun 14, 2016 12:55 |
|
Lottery of Babylon posted:Actually, he believes the laws of thermodynamics are false. I love how he confidently asserts that MWI is an obvious and incontrovertible implication of quantum mechanics despite actual physicists being heavily divided on the question, but claims that the Second Law of Thermodynamics is somehow in question despite the fact that you can derive it solely from "things are made of quantized particles at quantized energy levels" and the First Law. It's almost as if he's basing his theories on whatever fits his predefined conclusions where nothing bad ever has to happen to him.
|
# ? Jun 14, 2016 13:03 |
|
Jack Gladney posted:I wonder if you could use some dumb logic game to get Yud to kill himself like an evil supercomputer on Star Trek. edit: This scene is very similar to Yudkowsky's "Timeless Decision Theory" as I understand it. Truly a dizzying intellect. Syd Midnight has a new favorite as of 13:19 on Jun 14, 2016 |
# ? Jun 14, 2016 13:17 |
|
Old man yells at word cloud.
|
# ? Jun 14, 2016 13:37 |
|
quote:A stress ball stamping on a human face forever. That sounds mildly annoying at worst. "Oh no, something squishy is pressing on me repeatedly! I WILL NEVER RECOVER!"
|
# ? Jun 14, 2016 13:38 |
|
Goon Danton posted:I love how he confidently asserts that MWI is an obvious and incontrovertible implication of quantum mechanics despite actual physicists being heavily divided on the question, but claims that the Second Law of Thermodynamics is somehow in question despite the fact that you can derive it solely from "things are made of quantized particles at quantized energy levels" and the First Law. It's almost as if he's basing his theories on whatever fits his predefined conclusions where nothing bad ever has to happen to him. Yeah he's like one step above the new age weirdos who skim a quantum mechanics for dummies book and declare that science says you can control energy with your ~mind~
|
# ? Jun 14, 2016 13:50 |
|
Lottery of Babylon posted:Actually, he believes the laws of thermodynamics are false. How in the blue gently caress So Conway's Game of Life can run indefinitely, right. That's fine. It works on a loop basis. It iterates over simple memory states. What does that have to do with living forever? Is he saying that one program can run forever so all programs can run forever, even really complicated ones like AI? Has he heard of the halting problem? gruagh
|
# ? Jun 14, 2016 14:17 |
|
Tesseraction posted:How in the blue gently caress Again it seems a lot like he read the wikipedia article and just took from it what he wanted. "Conway's game of life is Turing complete" and "Conway's game of life can run forever" equals "a simulation of me can run forever!"
|
# ? Jun 14, 2016 14:31 |
|
Puppy Time posted:That sounds mildly annoying at worst. Standard Curvature of Earth posted:I typed up a long response to The Vosgian Beast about free will, but I deleted it, because it was a bad post and a bad derail. I'll take "things Cingulate never said to himself" for $300, Alex
|
# ? Jun 14, 2016 14:36 |
|
Parallel Paraplegic posted:Again it seems a lot like he read the wikipedia article and just took from it what he wanted. "Conway's game of life is Turing complete" and "Conway's game of life can run forever" equals "a simulation of me can run forever!" But that's even funnier because I even checked that page and it mentions the problem. Sure, you can run an infinite loop forever but if you want to run anything more complicated than "count how many 1s are around you" you need to have complicated branching and decision trees and hell at this point you're going well beyond Turing complete. And what about if you end up thinking about something undecidable? Is there a way to cancel your machine thoughts if you're mid-thought? Imagine your stupid AI fuckshit body trying to think about something and then never ending for eternity because you thought of something outside the bounds of computational complexity.
|
# ? Jun 14, 2016 14:38 |
|
Tesseraction posted:But that's even funnier because I even checked that page and it mentions the problem. Sure, you can run an infinite loop forever but if you want to run anything more complicated than "count how many 1s are around you" you need to have complicated branching and decision trees and hell at this point you're going well beyond Turing complete. And what about if you end up thinking about something undecidable? Is there a way to cancel your machine thoughts if you're mid-thought? I guess it would be a good end state once you've lived forever and everything's boring like in that episode of Star Trek where they go to the Q continuum and nobody talks because "it's all been said"
|
# ? Jun 14, 2016 14:51 |
|
Has he ever actually played around with the game of life? Like the very first thing you realize is how easily and inevitably anything interesting and "organic" collapses into its version of uselessly blinking heat death.
|
# ? Jun 14, 2016 14:53 |
|
Frogisis posted:Has he ever actually played around with the game of life? Like the very first thing you realize is how easily and inevitably anything interesting and "organic" collapses into its version of uselessly blinking heat death. There are plenty of interesting "machines" that infinitely expand or crap out gliders or whatever. The problem is that the impressive ones don't arise without an outside force arranging every cell perfectly, and they tend to be extremely fragile if there's anything else in their universe. Parallel Paraplegic posted:Yeah he's like one step above the new age weirdos who skim a quantum mechanics for dummies book and declare that science says you can control energy with your ~mind~ My friend is getting a PhD in robotics, and her reaction to Yud is pretty much the same as when those of us in physical chemistry come across Deepak Chopra for the first time. I sent her that quote about Conway's Game of Physics, we'll see how she responds!
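For anyone who hasn't actually poked at it: the Game of Life being argued about is tiny to implement. A minimal sketch (live cells as a set of coordinates, the standard B3/S23 rules), showing the classic glider reappearing one cell diagonally over every four generations:

```python
# Minimal Conway's Game of Life on an unbounded grid: live cells are a
# set of (x, y) pairs, standard B3/S23 rules. Illustrative only.
from collections import Counter
from itertools import product

def step(live):
    """Advance one generation: birth on 3 neighbors, survival on 2 or 3."""
    neigh = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx, dy in product((-1, 0, 1), repeat=2)
        if (dx, dy) != (0, 0)
    )
    return {c for c, n in neigh.items() if n == 3 or (n == 2 and c in live)}

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
g = glider
for _ in range(4):
    g = step(g)
# After four generations the glider reappears shifted one cell diagonally.
print(g == {(x + 1, y + 1) for (x, y) in glider})  # True
```

The glider only works because every cell starts in exactly the right place; drop one random live cell next to it and it usually shreds itself within a few generations, which is the fragility being described above.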
|
# ? Jun 14, 2016 15:11 |
|
Goon Danton posted:My friend is getting a PhD in robotics, and her reaction to Yud is pretty much the same as when those of us in physical chemistry come across Deepak Chopra for the first time. "A young woman was arrested today in connection with the shooting of AI scholar Eliezer Yudkowsky."
|
# ? Jun 14, 2016 15:32 |
|
Gitro posted:Is the last sentence meaningfully true? Like I'm sure some robots can assemble the poo poo out of a car or play a mad game of chess but that's pretty far off anything like general intelligence. Vosgian Beast put it pretty well in an earlier post: quote:When does an AI successfully run through Tomb of Horrors? Get Deep Blue to do that, and I'd be really impressed
|
# ? Jun 14, 2016 15:44 |
|
Nessus posted:Not quite: he believes the computer would create another universe with different physical laws. Now I wonder, is the writer of The Metamorphosis of Prime Intellect related to these goobers? It's an interesting book in that it hits all the "technology singularity happens" beats (including the technological god creating another universe with simpler laws), but without all the weird technology worship (replaced with... plenty of other weird stuff) Jack Gladney posted:It's the fact that gays are under attack that makes people turn out like that. This killer was on grindr and a regular at the club he shot up. My favorite part? His dad has been reported as saying he told him not to hurt gay people. What isn't reported as often is that he then segued into "... because god will punish them"
|
# ? Jun 14, 2016 16:33 |
|
So did the shooter being Muslim stave off Manosphere think pieces about how the killer was an omega incel whose life, and the lives of others, would have been saved if he had just learned Game or not grown up in a feminized society? Or are we going to get those regardless?
|
# ? Jun 14, 2016 16:38 |
|
Tesseraction posted:But that's even funnier because I even checked that page and it mentions the problem. Sure, you can run an infinite loop forever but if you want to run anything more complicated than "count how many 1s are around you" you need to have complicated branching and decision trees and hell at this point you're going well beyond Turing complete. And what about if you end up thinking about something undecidable? Is there a way to cancel your machine thoughts if you're mid-thought? Turing machines can handle all those problems (except the halting one, but you can give yourself an escape valve of, "count the cycles spent thinking about Issue X, if greater than Y, abort and declare the problem too difficult to solve" without much trouble). Yud's thing here isn't even a logical issue - it's nothing more than a cheap rhetorical trick, where he shifts the definition of "Turing complete" and hopes you won't notice. Turing machines are assumed to be infinite and free of physical limitations, because it makes the math a lot easier. When we talk about real-world systems being Turing complete, there's always an implicit "assuming the presence of similarly infinite time, memory, and so on..." statement in there. Otherwise, it's pointless to even talk about the concept, because we do exist in a world with known physical limits and nothing will ever be truly Turing complete in that sense. But, if you start at "I believe robo-Jesus will carry me away to perfectly simulated Heaven" and work backwards, you can do something like this:
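The cycle-counting escape valve described there is easy to sketch. This is a toy illustration with made-up names, not anyone's actual proposal; the Collatz iteration stands in for a "possibly never-terminating thought," since nobody has proven it halts for every input.

```python
# Sketch of the "escape valve": instead of risking an undecidable
# computation running forever, cap the number of steps and bail out
# with "too hard" when the budget runs out. Names are illustrative.

def think_with_budget(step, state, done, max_cycles=10_000):
    """Iterate `step` until `done(state)` or the cycle budget is spent."""
    for cycle in range(max_cycles):
        if done(state):
            return ("solved", state, cycle)
        state = step(state)
    return ("too hard, giving up", state, max_cycles)

# Collatz iteration: halve if even, else 3n + 1; stop on reaching 1.
collatz = lambda n: n // 2 if n % 2 == 0 else 3 * n + 1
status, _, cycles = think_with_budget(collatz, 27, lambda n: n == 1)
print(status)  # "solved" -- 27 reaches 1 well under the budget
```

Feed it a `step` that never reaches `done` and it returns "too hard, giving up" instead of hanging, which is the whole point: a physical machine can always trade completeness for guaranteed termination.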
|
# ? Jun 14, 2016 16:54 |
|
hackbunny posted:Now I wonder, is the writer of The Metamorphosis of Prime Intellect related to these goobers? It's an interesting book in that it hits all the "technology singularity happens" beats (including the technological god creating another universe with simpler laws), but without all the weird technology worship (replaced with... plenty of other weird stuff) IMO, it's opposed to the Yud AI agenda in that its central conceit is that a perfectly benevolent AI could make human life actually kind of suck for a lot of people.
|
# ? Jun 14, 2016 17:03 |
|
The Vosgian Beast posted:So did the shooter being Muslim stave off Mansophere think pieces about how the killer was an omega incel whose life, and the lives of others, would have been saved if he had just learned Game or not grown up in a feminized society? My former friend basically told me "it's either Islam told him to kill gays or he did it for no reason at all" when I forced him to actually critically think about why he would do such a thing so yeah I guess so.
|
# ? Jun 14, 2016 17:58 |
|
Let's check in with Mister Mean-Spirited, a man so edgy, he http://mister-mean-spirited.blogspot.com/2016/06/hate-fucks-are-only-genuine-fucks.html
|
# ? Jun 14, 2016 18:10 |