|
Somfin posted:LW exists in this weird state where computer programs just pop into being fully formed. Principle of charity would be to grant that the code bases of AI algorithms are pretty small compared to "applications". That actually undercuts their idea of runaway AI self-improvement even more, though, since there's no room for "deeper insight" to improve the underlying design.
|
# ? Jun 25, 2018 21:45 |
|
|
# ? May 9, 2024 13:23 |
|
Slime posted:It's honestly kind of amazing. People who claim to be rationalists, mostly atheists (because atheists are SMART guys, they like SCIENCE), managed to make their own loving religion about a robogod, complete with a digital hell. I think someone upthread pointed out how the rationalist mindset leaves them with absolutely no loving coping mechanisms, because they think they've outsmarted silly emotions or w/e, so when poo poo like this comes along they have absolutely no immune response to it. When the basilisk was first posted it gave some people no-poo poo mental breakdowns, because it seemed so loving real to them: their pure and holy logic apparently created it, so it must be true.
|
# ? Jun 25, 2018 22:34 |
|
It's worth pointing out that they are well aware of Pascal's wager; in fact Yud warns of the dangers of "pascal's mugging" as a dangerously irrational, fallacious argument... which they then proceed to fall for even harder, somehow, after having been warned against it.
|
# ? Jun 25, 2018 22:38 |
|
Remember that a lot of these folks come from very religious families. I think Yudkowsky said his parents wanted him to be a rabbi. That probably explains a great deal about how they turned out.
|
# ? Jun 25, 2018 23:36 |
|
Pornographic Memory posted:there is a thread for making fun of the alt-right but it's in c-spam And it's very good, brutalist McDonald's has an incredible knowledge of the chuds
|
# ? Jun 25, 2018 23:45 |
|
Darth Walrus posted:Remember that a lot of these folks come from very religious families. I think Yudkowsky said his parents wanted him to be a rabbi. That probably explains a great deal about how they turned out. They rebelled against their religious upbringing but since they're "smart" they never even take a cursory look at the philosophical ramifications of atheism and end up recreating religion, but this time it's also "smart".
|
# ? Jun 26, 2018 00:13 |
|
BENGHAZI 2 posted:And it's very good, brutalist McDonald's has an incredible knowledge of the chuds where is it, kind goon?
|
# ? Jun 26, 2018 00:36 |
divabot posted:where is it, kind goon? https://forums.somethingawful.com/showthread.php?threadid=3807231&userid=0&perpage=40&pagenumber=1
|
|
# ? Jun 26, 2018 00:44 |
|
pidan posted:Well if the AI is this kind of rear end in a top hat I'm not helping it on principle That's the yudkowsky-authorized solution. People put "I refuse all acausal trades" on their assorted online profiles where god will see it. Smearing lamb's blood on their doorframes updated for the 21st century.
|
# ? Jun 26, 2018 01:48 |
|
Darth Walrus posted:Remember that a lot of these folks come from very religious families. I think Yudkowsky said his parents wanted him to be a rabbi. That probably explains a great deal about how they turned out. I went to Jewish day school up until third grade and one of my best friends there was the son of a rabbi, and I can definitely understand how growing up in that environment could lead to someone like Yud. There's even a story in the Talmud about how you can defeat God if your arguments are good enough, which somewhat fittingly concerns the excommunication of Rabbi Eliezer.
|
# ? Jun 26, 2018 01:59 |
|
90s Cringe Rock posted:It's worth pointing out that they are well aware of Pascal's wager; in fact Yud warns of the dangers of "pascal's mugging" as a dangerously irrational, fallacious argument... which they then proceed to fall for even harder, somehow, after having been warned against it. I'm going to call this the Pascal's Wager Fallacy Fallacy. You see it all the time in discussion of cryonics. The one says, "If cryonics works, then the payoff could be, say, at least a thousand additional years of life." And the other one says, "Isn't that a form of Pascal's Wager?"

The original problem with Pascal's Wager is not that the purported payoff is large. This is not where the flaw in the reasoning comes from. That is not the problematic step. The problem with Pascal's original Wager is that the probability is exponentially tiny (in the complexity of the Christian God) and that equally tiny probabilities offer opposite payoffs for the same action (the Muslim God will drat you for believing in the Christian God).

In our current model of physics, time is infinite, and so the collection of real things is infinite. Each time state has a successor state, and there's no particular assertion that time returns to the starting point. Considering time's continuity just makes it worse – now we have an uncountable set of real things! But current physics also says that any finite amount of matter can only do a finite amount of computation, and the universe is expanding too fast for us to collect an infinite amount of matter. We cannot, on the face of things, expect to think an unboundedly long sequence of thoughts. The laws of physics cannot be easily modified to permit immortality: lightspeed limits and an expanding universe and holographic limits on quantum entanglement and so on all make it inconvenient, to say the least.

On the other hand, many computationally simple laws of physics, like the laws of Conway's Life, permit indefinitely running Turing machines to be encoded. So we can't say that it requires a complex miracle for us to confront the prospect of unboundedly long-lived, unboundedly large civilizations. Just there being a lot more to discover about physics – say, one more discovery of the size of quantum mechanics or Special Relativity – might be enough to knock (our model of) physics out of the region that corresponds to "You can only run boundedly large Turing machines".

So while we have no particular reason to expect physics to allow unbounded computation, it's not a small, special, unjustifiably singled-out possibility like the Christian God; it's a large region of what various possible physical laws will allow.

And cryonics, of course, is the default extrapolation from known neuroscience: if memories are stored the way we now think, and cryonics organizations are not disturbed by any particular catastrophe, and technology goes on advancing toward the physical limits, then it is possible to revive a cryonics patient (and yes you are the same person). There are negative possibilities (woken up in dystopia and not allowed to die) but they are exotic, not having equal probability weight to counterbalance the positive possibilities.
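The essay's Conway's Life aside is the one claim here that can actually be checked by running something: the B3/S23 rules fit in a few lines, and a glider really does crawl across the grid indefinitely, translating one cell diagonally every four generations. A minimal sparse-set sketch (the function and variable names are illustrative, not from any particular library):

```python
from collections import Counter

def step(live):
    """Advance one Game of Life generation on a sparse set of live (row, col) cells."""
    # Count how many live neighbours every cell adjacent to a live cell has.
    counts = Counter(
        (r + dr, c + dc)
        for r, c in live
        for dr in (-1, 0, 1)
        for dc in (-1, 0, 1)
        if (dr, dc) != (0, 0)
    )
    # B3/S23: a dead cell with exactly 3 live neighbours is born;
    # a live cell with 2 or 3 live neighbours survives.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# The glider: five cells that travel forever across an unbounded grid.
glider = {(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)}
state = glider
for _ in range(4):
    state = step(state)
assert state == {(r + 1, c + 1) for r, c in glider}  # moved one cell down-right
```

Since the grid is stored as a sparse set rather than a fixed array, the pattern can keep moving without ever hitting an edge, which is the point of the "indefinitely running" claim.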
|
# ? Jun 26, 2018 07:44 |
|
Lottery of Babylon posted:I'm going to call this the Pascal's Wager Fallacy Fallacy. Never did find out why this one disappeared from lesswrong.com itself and is only found on OB. Meanwhile, if yer on Tumblr, spread these: http://tacticalnymphomania.tumblr.com/post/175257904018/birdblogwhichisforbirds-ive-been-hearing-stuff http://tacticalnymphomania.tumblr.com/post/175258393473/hi-guys-thank-you-for-sticking-with-me-through The current LessWrong #metoo has a noticeable Jax-shaped hole in it - they dare not name or discuss the woman calling them out who's alive and not dead, and definitely not any of her actual accusations. Apparently it's all my doing instead. |
# ? Jun 26, 2018 08:59 |
|
Buttcoiners and Singularity fetishists. A strange but strong collection of enemies to know you by.
|
# ? Jun 26, 2018 11:19 |
|
Darth Walrus posted:Remember that a lot of these folks come from very religious families. I think Yudkowsky said his parents wanted him to be a rabbi. That probably explains a great deal about how they turned out. I don’t know about that, but his dad is extremely observant Orthodox.
|
# ? Jun 26, 2018 11:38 |
|
divabot posted:Never did find out why this one disappeared from lesswrong.com itself and is only found on OB. Lol that you've managed to become their Emanuel Goldstein
|
# ? Jun 26, 2018 15:48 |
|
I swear there was a time Nostalgebraist was not a moron, but maybe I was wrong
|
# ? Jun 26, 2018 16:32 |
|
The Vosgian Beast posted:I swear there was a time Nostalgebraist was not a moron, but maybe I was wrong If you go back through this thread, you'll see his critiques of Yudkowsky's bad ideas referred to extensively. However, he started socialising with the rationalists, and his partner is one, so he's part of the social group, and has 100% learnt to excuse literally anything from the ingroup. So he's not a Yudkowsky believer ... just a cultural rationalist.
|
# ? Jun 26, 2018 17:54 |
|
Relevant Tangent posted:They've got a whole thing about this called noticing skulls. Basically the argument is that they're smart enough to notice the skulls, and that the skulls are there for lesser intellects, but as luminaries of the mind they'll be okay. The fact that none of them can function in society as it is, and have to hide in a group of like-minded individuals, ought to be a warning, but is instead taken as proof that they're as deviated from the norm as they think they are. Not one of these people thinks they're on the left end of any bell curves. Wait, does the "noticing skulls" thing come from Eliot's line about John Webster? Because Webster wrote an entire play about how people who think they're pure-minded luminaries are actually evil pieces of poo poo.
|
# ? Jun 26, 2018 18:15 |
|
Tobermory posted:Wait, does the "noticing skulls" thing come from Eliot's line about John Webster? Because Webster wrote an entire play about how people who think they're pure-minded luminaries are actually evil pieces of poo poo. I think it comes from the Mitchell and Webb sketch: "Have you noticed the skulls on our uniforms? I'm starting to think we're the bad guys."
|
# ? Jun 26, 2018 18:21 |
|
I thought they were missing the point of a pretentious intellectual, and instead they're missing the point of a comedy show on Channel 4. Nice.
|
# ? Jun 26, 2018 18:28 |
|
SolTerrasa posted:That's the yudkowsky-authorized solution. People put "I refuse all acausal trades" on their assorted online profiles where god will see it. Smearing lamb's blood on their doorframes updated for the 21st century. That's stupid: if The Lord AI holds all the cards, why would it agree not to make the acausal trade? Also, I can't help but notice that cryonics is all about trying to get a well-preserved (fat chance) brain to the future. A well-preserved and also superlatively dead brain that will supposedly contain the perfect information to rebuild someone who is already extremely dead. And with a large part of their hormonal system missing.
|
# ? Jun 26, 2018 18:38 |
|
SIGSEGV posted:Also, I can't help but notice how cryonics is all about trying to get a well preserved (fat chance) brain to the future. A well preserved and also superlatively dead brain that will contain the perfect information to rebuild someone who is already extremely dead. And with a large part of their hormonal system missing. It may be relevant to note how many of these people are dysphoric about actually having a body. At all. I don't have any numbers, but I've heard of transhumanists who realise they're the other trans too, and suddenly don't have such a craving to become a disembodied emulation. YMMV.
|
# ? Jun 26, 2018 18:49 |
|
That's a point I hadn't considered, yes, but I was literally talking about the huge bits of the hormonal system that aren't located in the brain, because human beings are evolved, not designed, and everything is intertwined finicky bullshit that works by the grace of (the absence of) God(des)(s)(es).
|
# ? Jun 26, 2018 18:59 |
|
https://twitter.com/neilukip/status/1011671738030182402?s=21
|
# ? Jun 26, 2018 19:30 |
|
SIGSEGV posted:That's stupid, if The Lord, AI, holds all the cards, why would it accept not to make the acausal trade? Because it only exists in their heads, and therefore can only do exactly what they think it does SIGSEGV posted:Also, I can't help but notice how cryonics is all about trying to get a well preserved (fat chance) brain to the future. A well preserved and also superlatively dead brain that will contain the perfect information to rebuild someone who is already extremely dead. And with a large part of their hormonal system missing. There was an unfortunately uncritical interview with some cryonics company on the BBC this morning. The CEO did a lot of handwaving and spent fully half the interview off on this weird tangent about Nietzsche, but my favorite part was his job title: "applied futurist"
|
# ? Jun 26, 2018 20:03 |
|
divabot posted:It may be relevant to note how many of these people are dysphoric about actually having a body. At all. Literally all of the transhumanists I know of that aren't awful shitbags are transwomen who just want a dang sexy girl robot body already
|
# ? Jun 26, 2018 20:05 |
|
Didn't they explicitly disavow one of those guys for slurs like a week ago?
|
# ? Jun 26, 2018 20:06 |
|
Yeah, Carl of Swindon is officially Too Racist for UKIP.
|
# ? Jun 26, 2018 20:14 |
|
Goon Danton posted:Yeah, Carl of Swindon is officially Too Racist for UKIP. Was, apparently.
|
# ? Jun 26, 2018 23:20 |
|
ate all the Oreos posted:Literally all of the transhumanists I know of that aren't awful shitbags are transwomen who just want a dang sexy girl robot body already More than once in trans spaces I've come across people wondering why transhumanists have such a bad reputation, and it's entirely because the only transhumanists they picture are gender-queer people who want cool robot bodies and cyberpunk body mods.
|
# ? Jun 27, 2018 03:04 |
|
Lottery of Babylon posted:
We are not even *remotely* sure that an extremely-frozen brain contains all the information needed to reconstruct it in a functioning, equivalent-to-original form.
|
# ? Jun 27, 2018 04:02 |
|
But if that's true I will die, and that scares me, so as a good rationalist I have to conclude that I will therefore never die.
|
# ? Jun 27, 2018 04:55 |
|
The Lone Badger posted:We are not even *remotely* sure that an extremely-frozen brain contains all the information needed to reconstruct it in a functioning, equivalent-to-original form. Due to ice crystal formation, frozen brains are more fragmented and broken than mummified corpses, so we'll be able to bring mummies back to life before cryonics works
|
# ? Jun 27, 2018 07:22 |
|
In *theory*, if you do it right, the crystals are small enough that, while each cell they form in is super-dead, there's enough left to calculate its original state and connections, allowing you to use Clarke's Law tech to build a new brain indistinguishable from the old. This is extremely unproven.
|
# ? Jun 27, 2018 07:57 |
|
I don't get why some people are so obsessed with copying and preserving their brain state. They'll still be dead! And it's really unlikely that some 2018 rationalist's brain contains just the information the future needs for something important. If you want to be immortal, just study one of the religious or philosophical schools that offer this possibility. You'll be able to rationalize it if you really want to. Science won't do it anytime soon.
|
# ? Jun 27, 2018 08:06 |
|
lol if your foom can't reconstruct the missing bits from your blog posts and reddit account
|
# ? Jun 27, 2018 08:09 |
|
The Lone Badger posted:In *theory*, if you do it right, the crystals are small enough that, while each cell they form in is super-dead, there's enough left to calculate its original state and connections, allowing you to use Clarke's Law tech to build a new brain indistinguishable from the old. I'm not a biologist, but does the "state" of the neurons even contain useful information? It seems like most of the signal in the system would be stored in the neurotransmitters moving across the synapses, not within the actual neurons. Even if they're wacky enough to think of the brain as a Turing machine, it would be like trying to restore the algorithm running on a scrapped Turing machine after you'd lost the paper tape.
|
# ? Jun 27, 2018 08:14 |
|
Gitro posted:But if that's true I will die, and that scares me, so as a good rationalist I have to conclude that I will therefore never die. Have you considered the possibility that in the future an AI god will read your forum posts and use them as a basis to resurrect you? Because they have.
|
# ? Jun 27, 2018 08:32 |
|
Tobermory posted:I'm not a biologist, but does the "state" of the neurons even contain useful information? It seems like most of the signal in the system would be stored in the neurotransmitters moving across the synapses, not within the actual neurons. Even if they're wacky enough to think of the brain as a Turing machine, it would be like trying to restore the algorithm running on a scrapped Turing machine after you'd lost the paper tape. It's hard to say with any real certainty, but there seems to be less distinction between "hardware" and "software" in a brain than there is in a computer. Information does seem to be at least partially encoded in the physical structure of which neurons are wired to which, and thus fire at the same time, but it's still pretty unlikely you'd be able to recreate something that delicate from the undifferentiated neuro-glop that is a thawed brain.
|
# ? Jun 27, 2018 08:49 |
|
|
ate all the Oreos posted:One of the things to keep in mind is this is the good outcome, this is what happens when we get the benevolent AI, and the AI is doing all this torturing specifically because (using incredibly naive idiot utilitarianism) threatening you with torture infinitely many times is less bad than you not donating and therefore delaying the arrival of the benevolent AI god who will solve all the other pain and death in the universe. You didn't dig deep enough: remember that it's morally good to torture you, because pain is objectively quantifiable and measurable, so your eternity of torment is the lesser evil compared to a speck of dust in the eyes of an infinite number of future humans.
|
# ? Jun 27, 2018 08:54 |