ikanreed
Sep 25, 2009

I honestly I have no idea who cannibal[SIC] is and I do not know why I should know.

syq dude, just syq!

Somfin posted:

LW exists in this weird state where computer programs just pop into being fully formed.

Like, even the original Goes Foom post started its timeline after a self-aware, self-improving, networked AI exists. Any one of those descriptors is a massive theoretical undertaking, let alone the engineering challenge. But since LWers are interested in neither, they just technobabble their way through, blustering about machine code and computation cycles as if human brains are just computers with Enough Power.

Principle of charity would be to grant that the code bases of AI algorithms are pretty small compared to "applications".

This actually undercuts their idea of runaway AI self-improvement even more, though, since a small code base leaves no room for "deeper insight" to improve the underlying design.

Shame Boy
Mar 2, 2010

Slime posted:

It's honestly kind of amazing. People who claim to be rationalists and tend to be comprised mostly of atheists (because atheists are SMART guys, they like SCIENCE) managed to make their own loving religion about a robogod complete with a digital hell.

loving idiots

I think someone up thread a bit pointed out how the rationalist mindset leaves them with absolutely no loving coping mechanisms because they think they've outsmarted silly emotions or w/e, so when poo poo like this comes along they have absolutely no immune response to it. When the basilisk was first posted it gave some people no-poo poo mental breakdowns because it seemed so loving real to them, because their pure and holy logic apparently created it so it must be true.

90s Cringe Rock
Nov 29, 2006
:gay:
It's worth pointing out that they are well aware of Pascal's wager, and in fact Yud warns of the dangers of "pascal's mugging" as a dangerously irrational fallacious argument that... they then proceed to fall for even harder after having been warned against it somehow?

Darth Walrus
Feb 13, 2012
Remember that a lot of these folks come from very religious families. I think Yudkowsky said his parents wanted him to be a rabbi. That probably explains a great deal about how they turned out.

BENGHAZI 2
Oct 13, 2007

by Cyrano4747

Pornographic Memory posted:

there is a thread for making fun of the alt-right but it's in c-spam

And it's very good, brutalist McDonald's has an incredible knowledge of the chuds

Improbable Lobster
Jan 6, 2012

What is the Matrix 🌐? We just don't know 😎.


Buglord

Darth Walrus posted:

Remember that a lot of these folks come from very religious families. I think Yudkowsky said his parents wanted him to be a rabbi. That probably explains a great deal about how they turned out.

They rebelled against their religious upbringing but since they're "smart" they never even take a cursory look at the philosophical ramifications of atheism and end up recreating religion, but this time it's also "smart".

divabot
Jun 17, 2015

A polite little mouse!

BENGHAZI 2 posted:

And it's very good, brutalist McDonald's has an incredible knowledge of the chuds

where is it, kind goon?

uber_stoat
Jan 21, 2001



Pillbug

divabot posted:

where is it, kind goon?

https://forums.somethingawful.com/showthread.php?threadid=3807231&userid=0&perpage=40&pagenumber=1

SolTerrasa
Sep 2, 2011

pidan posted:

Well if the AI is this kind of rear end in a top hat I'm not helping it on principle :colbert:

Wait, if I'm determined to not help the AI no matter what, the threat of torture also won't work so not helping is still the solution

That's the yudkowsky-authorized solution. People put "I refuse all acausal trades" on their assorted online profiles where god will see it. Smearing lamb's blood on their doorframes updated for the 21st century.

Sit on my Jace
Sep 9, 2016

Darth Walrus posted:

Remember that a lot of these folks come from very religious families. I think Yudkowsky said his parents wanted him to be a rabbi. That probably explains a great deal about how they turned out.

I went to Jewish day school up until third grade and one of my best friends there was the son of a rabbi, and I can definitely understand how growing up in that environment could lead to someone like Yud. There's even a story in the Talmud about how you can defeat God if your arguments are good enough, which somewhat fittingly concerns the excommunication of Rabbi Eliezer.

Lottery of Babylon
Apr 25, 2012

STRAIGHT TROPIN'

90s Cringe Rock posted:

It's worth pointing out that they are well aware of Pascal's wager, and in fact Yud warns of the dangers of "pascal's mugging" as a dangerously irrational fallacious argument that... they then proceed to fall for even harder after having been warned against it somehow?

I'm going to call this the Pascal's Wager Fallacy Fallacy.

You see it all the time in discussion of cryonics. The one says, "If cryonics works, then the payoff could be, say, at least a thousand additional years of life." And the other one says, "Isn't that a form of Pascal's Wager?"

The original problem with Pascal's Wager is not that the purported payoff is large. This is not where the flaw in the reasoning comes from. That is not the problematic step. The problem with Pascal's original Wager is that the probability is exponentially tiny (in the complexity of the Christian God) and that equally large tiny probabilities offer opposite payoffs for the same action (the Muslim God will drat you for believing in the Christian God).

In our current model of physics, time is infinite, and so the collection of real things is infinite. Each time state has a successor state, and there's no particular assertion that time returns to the starting point. Considering time's continuity just makes it worse – now we have an uncountable set of real things!

But current physics also says that any finite amount of matter can only do a finite amount of computation, and the universe is expanding too fast for us to collect an infinite amount of matter. We cannot, on the face of things, expect to think an unboundedly long sequence of thoughts.

The laws of physics cannot be easily modified to permit immortality: lightspeed limits and an expanding universe and holographic limits on quantum entanglement and so on all make it inconvenient to say the least.

On the other hand, many computationally simple laws of physics, like the laws of Conway's Life, permit indefinitely running Turing machines to be encoded. So we can't say that it requires a complex miracle for us to confront the prospect of unboundedly long-lived, unboundedly large civilizations. Just there being a lot more to discover about physics – say, one more discovery of the size of quantum mechanics or Special Relativity – might be enough to knock (our model of) physics out of the region that corresponds to "You can only run boundedly large Turing machines".
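The Life claim in the paragraph above is easy to check concretely: under Conway's rules a glider never dies out, it just reproduces its own shape shifted one cell diagonally every four generations, so on an unbounded grid it runs forever. A minimal sketch (the set-of-live-cells representation and the names here are illustrative, not anything from the post):

```python
from collections import Counter

def step(live):
    """One generation of Conway's Life; `live` is a set of (x, y) live cells."""
    # Count live neighbours of every cell adjacent to a live cell.
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Birth on exactly 3 neighbours; survival on 2 or 3.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

# The canonical glider, the classic pattern that travels indefinitely.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}

state = glider
for _ in range(4):
    state = step(state)

# After 4 generations the glider is its old self, translated by (1, 1):
# simple local rules, unbounded evolution.
assert state == {(x + 1, y + 1) for (x, y) in glider}
```

Run it for 4n steps and you get the same five cells shifted by (n, n); nothing in the rules ever halts it, which is the sense in which a computationally simple physics can still host unboundedly long-running machines.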

So while we have no particular reason to expect physics to allow unbounded computation, it's not a small, special, unjustifiably singled-out possibility like the Christian God; it's a large region of what various possible physical laws will allow.

And cryonics, of course, is the default extrapolation from known neuroscience: if memories are stored the way we now think, and cryonics organizations are not disturbed by any particular catastrophe, and technology goes on advancing toward the physical limits, then it is possible to revive a cryonics patient (and yes you are the same person). There are negative possibilities (woken up in dystopia and not allowed to die) but they are exotic, not having equal probability weight to counterbalance the positive possibilities.

divabot
Jun 17, 2015

A polite little mouse!

Lottery of Babylon posted:

I'm going to call this the Pascal's Wager Fallacy Fallacy.

Never did find out why this one disappeared from lesswrong.com itself and is only found on OB.

Meanwhile, if yer on Tumblr, spread these:

http://tacticalnymphomania.tumblr.com/post/175257904018/birdblogwhichisforbirds-ive-been-hearing-stuff
http://tacticalnymphomania.tumblr.com/post/175258393473/hi-guys-thank-you-for-sticking-with-me-through

The current LessWrong #metoo has a noticeable Jax-shaped hole in it - they dare not name or discuss the woman calling them out who's alive and not dead, and definitely not any of her actual accusations. Apparently it's all my doing instead.


Relevant Tangent
Nov 18, 2016

Tangentially Relevant

Buttcoiners and Singularity fetishists. A strange but strong collection of enemies to know you by.

eschaton
Mar 7, 2007

Don't you just hate when you wind up in a store with people who are in a socioeconomic class that is pretty obviously about two levels lower than your own?

Darth Walrus posted:

Remember that a lot of these folks come from very religious families. I think Yudkowsky said his parents wanted him to be a rabbi. That probably explains a great deal about how they turned out.

I don’t know about that, but his dad is extremely observant Orthodox.

ikanreed
Sep 25, 2009

I honestly I have no idea who cannibal[SIC] is and I do not know why I should know.

syq dude, just syq!

divabot posted:

Never did find out why this one disappeared from lesswrong.com itself and is only found on OB.

Meanwhile, if yer on Tumblr, spread these:

http://tacticalnymphomania.tumblr.com/post/175257904018/birdblogwhichisforbirds-ive-been-hearing-stuff
http://tacticalnymphomania.tumblr.com/post/175258393473/hi-guys-thank-you-for-sticking-with-me-through

The current LessWrong #metoo has a noticeable Jax-shaped hole in it - they dare not name or discuss the woman calling them out who's alive and not dead, and definitely not any of her actual accusations. Apparently it's all my doing instead.

Lol that you've managed to become their Emmanuel Goldstein

The Vosgian Beast
Aug 13, 2011

Business is slow
I swear there was a time Nostalgebraist was not a moron, but maybe I was wrong

divabot
Jun 17, 2015

A polite little mouse!

The Vosgian Beast posted:

I swear there was a time Nostalgebraist was not a moron, but maybe I was wrong

If you go back through this thread, you'll see his critiques of Yudkowsky's bad ideas referred to extensively.

However, he started socialising with the rationalists, and his partner is one, so he's part of the social group, and has 100% learnt to excuse literally anything from the ingroup.

So he's not a Yudkowsky believer ... just a cultural rationalist.

Tobermory
Mar 31, 2011

Relevant Tangent posted:

They've got a whole thing about this called noticing skulls. Basically the argument is that they're smart enough to notice the skulls and that the skulls are there for lesser intellects, but as luminaries of the mind they'll be okay. The fact that none of them can function in society as it is and have to hide in a group of like-minded individuals ought to be a warning but is instead taken as proof that they're as deviated from the norms as they think they are. Not one of these people thinks they're on the left end of any bell curves.

Wait, does the "noticing skulls" thing come from Eliot's line about John Webster? Because Webster wrote an entire play about how people who think they're pure-minded luminaries are actually evil pieces of poo poo.

darthbob88
Oct 13, 2011

YOSPOS

Tobermory posted:

Wait, does the "noticing skulls" thing come from Eliot's line about John Webster? Because Webster wrote an entire play about how people who think they're pure-minded luminaries are actually evil pieces of poo poo.

I think it comes from the Mitchell and Webb sketch: "Have you noticed the skulls on our uniforms? I'm starting to think we're the bad guys."

Tobermory
Mar 31, 2011

I thought they were missing the point of a pretentious intellectual, and instead they're missing the point of a comedy show on Channel 4. Nice.

SIGSEGV
Nov 4, 2010


SolTerrasa posted:

That's the yudkowsky-authorized solution. People put "I refuse all acausal trades" on their assorted online profiles where god will see it. Smearing lamb's blood on their doorframes updated for the 21st century.

That's stupid; if The Lord, AI, holds all the cards, why would it agree not to make the acausal trade?

Also, I can't help but notice how cryonics is all about trying to get a well preserved (fat chance) brain to the future. A well preserved and also superlatively dead brain that will contain the perfect information to rebuild someone who is already extremely dead. And with a large part of their hormonal system missing.

divabot
Jun 17, 2015

A polite little mouse!

SIGSEGV posted:

Also, I can't help but notice how cryonics is all about trying to get a well preserved (fat chance) brain to the future. A well preserved and also superlatively dead brain that will contain the perfect information to rebuild someone who is already extremely dead. And with a large part of their hormonal system missing.

It may be relevant to note how many of these people are dysphoric about actually having a body. At all.

I don't have any numbers, but I've heard of transhumanists who realise they're the other trans too, and suddenly don't have such a craving to become a disembodied emulation. YMMV.

SIGSEGV
Nov 4, 2010


That's a point I hadn't considered, yes, but I was literally talking about the huge bits of the hormonal system that aren't located in the brain because human beings are evolved and not designed and everything is intertwined finicky bullshit that works by the grace of (the absence) of God(des)(s)(es).

TinTower
Apr 21, 2010

You don't have to 8e a good person to 8e a hero.
https://twitter.com/neilukip/status/1011671738030182402?s=21

Shame Boy
Mar 2, 2010

SIGSEGV posted:

That's stupid, if The Lord, AI, holds all the cards, why would it accept not to make the acausal trade?

Because it only exists in their heads, and therefore can only do exactly what they think it does :ssh:

SIGSEGV posted:

Also, I can't help but notice how cryonics is all about trying to get a well preserved (fat chance) brain to the future. A well preserved and also superlatively dead brain that will contain the perfect information to rebuild someone who is already extremely dead. And with a large part of their hormonal system missing.

There was an unfortunately uncritical interview with some cryonics company on the BBC this morning. The CEO did a lot of handwaving and spent fully half the interview off on this weird tangent about Nietzsche, but my favorite part was his job title: "applied futurist" :allears:

Shame Boy
Mar 2, 2010

divabot posted:

It may be relevant to note how many of these people are dysphoric about actually having a body. At all.

I don't have any numbers, but I've heard of transhumanists who realise they're the other trans too, and suddenly don't have such a craving to become a disembodied emulation. YMMV.

Literally all of the transhumanists I know of that aren't awful shitbags are transwomen who just want a dang sexy girl robot body already

Shame Boy
Mar 2, 2010


Didn't they explicitly disavow one of those guys for slurs like a week ago?

Goon Danton
May 24, 2012

Don't forget to show my shitposts to the people. They're well worth seeing.

Yeah, Carl of Swindon is officially Too Racist for UKIP.

Lemniscate Blue
Apr 21, 2006

Here we go again.

Goon Danton posted:

Yeah, Carl of Swindon is officially Too Racist for UKIP.

Was, apparently.

there wolf
Jan 11, 2015

by Fluffdaddy

ate all the Oreos posted:

Literally all of the transhumanists I know of that aren't awful shitbags are transwomen who just want a dang sexy girl robot body already

More than once in trans spaces I've come across people wondering why transhumanists have such a bad reputation and it's entirely because their image of a transhumanist is a gender-queer person wanting cool robot bodies and cyberpunk body mods.

The Lone Badger
Sep 24, 2007

Lottery of Babylon posted:


And cryonics, of course, is the default extrapolation from known neuroscience: if memories are stored the way we now think, and cryonics organizations are not disturbed by any particular catastrophe, and technology goes on advancing toward the physical limits, then it is possible to revive a cryonics patient (and yes you are the same person). There are negative possibilities (woken up in dystopia and not allowed to die) but they are exotic, not having equal probability weight to counterbalance the positive possibilities.

We are not even *remotely* sure that an extremely-frozen brain contains all the information needed to reconstruct it in a functioning, equivalent-to-original form.

Gitro
May 29, 2013
But if that's true I will die, and that scares me, so as a good rationalist I have to conclude that I will therefore never die.

Shame Boy
Mar 2, 2010

The Lone Badger posted:

We are not even *remotely* sure that an extremely-frozen brain contains all the information needed to reconstruct it in a functioning, equivalent-to-original form.

Due to ice crystal formation frozen brains are more fragmented and broken than mummified corpses, so we'll be able to bring mummies back to life before cryonics works

The Lone Badger
Sep 24, 2007

In *theory*, if you do it right, the crystals are small enough that, while the cells they form in are super-dead, there's enough left to calculate their original state and connections, allowing you to use Clarke's Law tech to build a new brain indistinguishable from the old.

This is extremely unproven.

pidan
Nov 6, 2012


I don't get why some people are so obsessed with copying and preserving their brain state. They'll still be dead! And it's really unlikely that some 2018 rationalist's brain contains just the information the future needs for something important.

If you want to be immortal, just study one of the religious or philosophical schools that offer this possibility. You'll be able to rationalize it if you really want to. Science won't do it anytime soon.

90s Cringe Rock
Nov 29, 2006
:gay:
lol if your foom can't reconstruct the missing bits from your blog posts and reddit account

Tobermory
Mar 31, 2011

The Lone Badger posted:

In *theory*, if you do it right the crystals are small enough that while the cell it's in is super-dead there's enough left to calculate its original state and connections, allowing you to use Clarke's Law tech to build a new brain indistinguishable from the old.

This is extremely unproven.

I'm not a biologist, but does the "state" of the neurons even contain useful information? It seems like most of the signal in the system would be stored in the neurotransmitters moving across the synapses, not within the actual neurons. Even if they're wacky enough to think of the brain as a Turing machine, it would be like trying to restore the algorithm running on a scrapped Turing machine after you'd lost the paper tape.

Qwertycoatl
Dec 31, 2008

Gitro posted:

But if that's true I will die, and that scares me, so as a good rationalist I have to conclude that I will therefore never die.

Have you considered the possibility that in the future an AI god will read your forum posts and use them as a basis to resurrect you? Because they have.

Baby Babbeh
Aug 2, 2005

It's hard to soar with the eagles when you work with Turkeys!!



Tobermory posted:

I'm not a biologist, but does the "state" of the neurons even contain useful information? It seems like most of the signal in the system would be stored in the neurotransmitters moving across the synapses, not within the actual neurons. Even if they're wacky enough to think of the brain as a Turing machine, it would be like trying to restore the algorithm running on a scrapped Turing machine after you'd lost the paper tape.

It's hard to say with any real certainty but it seems likely that there's less distinction between "hardware" and "software" in a brain than there is in a machine brain. Information does seem to be at least partially encoded into the physical structure of which neurons are wired to which and thus fire at the same time, but it's still pretty unlikely you'd be able to recreate something that delicate from the undifferentiated neuro-glop that is a thawed brain.

Liquid Communism
Mar 9, 2004


Out here, everything hurts.




ate all the Oreos posted:

One of the things to keep in mind is this is the good outcome, this is what happens when we get the benevolent AI, and the AI is doing all this torturing specifically because (using incredibly naive idiot utilitarianism) threatening you with torture infinitely many times is less bad than you not donating and therefore delaying the arrival of the benevolent AI god who will solve all the other pain and death in the universe.

Robot devil hurts you because it loves you and wants the best for humanity.

You didn't dig deep enough, remember that it's morally good to torture you, because pain is objectively quantifiable and measurable, so your eternity of torment is the lesser evil to a speck of dust in the eye of an infinite number of future humans.
