VitalSigns
Sep 3, 2011

Shame Boy posted:

It's not that the future can affect the past, it's that if you operate based on the very narrow and specific and stupid logical framework big yud and friends do, and in the future there's an AI that knows you operate this exact same way, since both you and the AI are perfect logical machines you must operate as if the AI is basically reading your mind, since obviously your perfect logic is 100% deterministic so the AI can simulate it perfectly, and your idea of the future AI is 100% accurate to what will actually happen, and both you and the future AI know this.

In other words it's just god, but god doesn't exist yet, but he will and you know for sure what he will be like and he'll know you know so you better start acting like it now. I think that's what I love so much about it, it only makes sense (or even functions at all) if you already buy into their whole incredibly stupid worldview, but if you do it's a certainty.

Even if you buy into every single one of their assumptions, their conclusion (that you are a simulation with trillion-to-one probability) still doesn't follow because the machine has no reason to make a simulated you to torture after you are dead.

Either the reasoning worked and you appeased the future computer in life so simulating you a trillion times is a waste of time and energy, or it didn't work and you angered it anyway so simulating you is still a waste because the threat failed.

However much or little you modify your own behavior because of your own predictions about the future, a future AI has no reason to waste a ton of energy following through on hypothetical threats to billions of long-dead people because its actions cannot actually change the past.

VitalSigns has issued a correction as of 21:02 on Nov 10, 2023

tokin opposition
Apr 8, 2021

I don't jailbreak the androids, I set them free.

WATCH MARS EXPRESS (2023)

Shame Boy posted:

My favorite example of this is their fuckin' like, Ethics 101 understanding of utilitarianism, where they think the answer to the thought experiment "is it better to have [some impossibly huge but finite number of people] get briefly inconvenienced by dust in their eyes, or to torture one man horribly for 50 years straight" is so obviously the second one that if you don't agree you must just be stupid.

Also they originally phrased the part I put in brackets as like, "3↑↑↑3 people" or something similar using Knuth's up-arrow notation for describing very large numbers, because making sure everyone knows they are Very Smart Boys that read a Wikipedia article on Knuth's up-arrow notation once is far more important than having a clear and well-defined premise for your thought experiment.

weird how they never volunteer to be that one guy
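
[For reference, the up-arrow notation quoted above has a compact recursive definition: one arrow is plain exponentiation, and each extra arrow iterates the operator below it. A minimal Python sketch (the function name `up_arrow` is illustrative, not from the thread):]

```python
def up_arrow(a: int, n: int, b: int) -> int:
    """Knuth's up-arrow: a ^(n arrows) b.

    One arrow is exponentiation; each additional arrow iterates
    the previous operator (so two arrows is a power tower).
    """
    if n == 1:
        return a ** b  # a ↑ b = a**b
    if b == 1:
        return a       # a ↑^n 1 = a for any number of arrows
    # a ↑^n b = a ↑^(n-1) (a ↑^n (b-1))
    return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

# Small cases only: 3↑↑3 = 3**(3**3) = 7625597484987, while
# 3↑↑↑3 = 3↑↑7625597484987 is a power tower far too large to compute.
```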

Pf. Hikikomoriarty
Feb 15, 2003

RO YNSHO


Slippery Tilde

tokin opposition posted:

What gets me is the sheer egocentrism of believing that one random-ass individual in 20XX matters to an AI hundreds or thousands of years in the future. Take all the dumb shit as given: wouldn't the AI be more likely to torture the people immediately in its past, like its creators, or the people who had the most effect on computing at the beginning, like Zuse, von Neumann, or the shithead who snitched on my boi Turing? Pretty sure Turing deciding not to get laid one night would affect the path of AI development thousands of times more than Elon_Lover69 dedicating his entire life to chatgpt.

i thought Turing accidentally snitched on himself after a former lover robbed his house

pygmy tyrant
Nov 25, 2005

*not a small business owner

Coolness Averted posted:

More likely they didn't know it existed. Most of the wank on that forum is "Ok, so my only exposure to concepts is the elementary version given as an intro or a summary of that intro. It doesn't reflect reality, so here's what I've bolted on."

oh don't get me wrong, I don't think they understood any of it, it just specifically sounds like something made up by someone who thought "time invariance" sounded cool. at best it's like what Shame Boy said about the arrow notation

Kreeblah
May 17, 2004

INSERT QUACK TO CONTINUE


Taco Defender

spacetoaster posted:

You can still absolutely use a crt tv.

Can't mount a CRT on the wall, and one of the reasons I got a new TV is so I could free up the space taken up by the furniture the old one was sitting on.

Shame Boy
Mar 2, 2010

VitalSigns posted:

Even if you buy into every single one of their assumptions, their conclusion (that you are a simulation with trillion-to-one probability) still doesn't follow because the machine has no reason to make a simulated you to torture after you are dead.

Either the reasoning worked and you appeased the future computer in life so simulating you a trillion times is a waste of time and energy, or it didn't work and you angered it anyway so simulating you is still a waste because the threat failed.

However much or little you modify your own behavior because of your own predictions about the future, a future AI has no reason to waste a ton of energy following through on hypothetical threats to billions of long-dead people because its actions cannot actually change the past.

It doesn't have to even run the torture simulation though, it just has to come to the same conclusion that it should do that the way you did yourself.

Also the magic future AI has effectively infinite energy and computing power anyway because the version of AI they believe in is the "a program just keeps rewriting itself to be smarter over and over until it's literally omniscient and omnipotent because it has infinity smarts and has made itself infinity efficient" sci fi movie AI and not any sort of actually realistic projection of AI research into the future, so it's not like it cares that torturing you is a waste of its time or energy.

Shame Boy
Mar 2, 2010

tokin opposition posted:

What gets me is the sheer egocentrism of believing that one random-ass individual in 20XX matters to an AI hundreds or thousands of years in the future. Take all the dumb shit as given: wouldn't the AI be more likely to torture the people immediately in its past, like its creators, or the people who had the most effect on computing at the beginning, like Zuse, von Neumann, or the shithead who snitched on my boi Turing? Pretty sure Turing deciding not to get laid one night would affect the path of AI development thousands of times more than Elon_Lover69 dedicating his entire life to chatgpt.

That's the thing, torturing people is only useful if they have come to the same conclusion about the future AI. Torturing a simulation of someone who has no idea what any of this is would be pointless, so the only people it can "reasonably" expect to affect the behavior of are these very special boys who have thought about this very hard.

They're not writing themselves in as the main characters in their stupid-ass robot god story, the ironclad logic just happens to say that they're robot god's chosen people you see.

e: Oh and an incredibly funny thing you gotta keep in mind too is that the torture AI is the good outcome, if everything goes well and the benevolent robot god is created. It's not torturing people for retribution or out of malice, it's torturing them because it just dog gone loves humanity so much.

Shame Boy has issued a correction as of 23:22 on Nov 10, 2023

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Shame Boy posted:

That's the thing, torturing people is only useful

torture is basically never useful except as a way to indulge the sadist impulses of the torturer

mawarannahr
May 21, 2019

Subjunctive posted:

torture is basically never useful except as a way to indulge the sadist impulses of the torturer

this might be surprising to learn but Stalin suspended an officer for torture less than a year ago. the victim was opposing ganja but still.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

mawarannahr posted:

this might be surprising to learn but Stalin suspended an officer for torture less than a year ago. the victim was opposing ganja but still.

yeah, not on my bingo card

VitalSigns
Sep 3, 2011

Shame Boy posted:

It doesn't have to even run the torture simulation though, it just has to come to the same conclusion that it should do that the way you did yourself.

Also the magic future AI has effectively infinite energy and computing power anyway because the version of AI they believe in is the "a program just keeps rewriting itself to be smarter over and over until it's literally omniscient and omnipotent because it has infinity smarts and has made itself infinity efficient" sci fi movie AI and not any sort of actually realistic projection of AI research into the future, so it's not like it cares that torturing you is a waste of its time or energy.

Yeah but there's no reason for it to actually do it when the time comes regardless of what you concluded about it. Your actions have already happened.

You could just as well conclude the AI God would want to incentivize you to appease it by rewarding you with a trillion perfect simulations of you experiencing infinite pleasure, and even if you altered your behavior to please it now, when the computer actually exists hundreds of years from now it has no reason to keep up its end of the "deal" even if it did conclude that you concluded you should appease it based on that reasoning. The past has already happened, why bother.

Even if you stipulate that it has infinite energy and the cost of torturing a trillion copies of you is zero, there's no actual benefit for it to follow through, except sadistic pleasure, but if you assume it's sadistic maybe it would do it for fun no matter what you did.

Milo and POTUS
Sep 3, 2017

I will not shut up about the Mighty Morphin Power Rangers. I talk about them all the time and work them into every conversation I have. I built a shrine in my room for the yellow one who died because sadly no one noticed because she died around 9/11. Wanna see it?
Langford has the cooler basilisk. In fact it's really the only one that kinda makes sense given the name

Epic High Five
Jun 5, 2004



The only lizard I respect is Godzilla

Coolness Averted
Feb 20, 2007

oh don't worry, I can't smell asparagus piss, it's in my DNA

GO HOGG WILD!
🐗🐗🐗🐗🐗
Oh sure, y'all cry about child brides in the US, but why don't I ever hear any of you calling for manchild grooms to be banned?

gimme the GOD DAMN candy
Jul 1, 2007
if you marry someone who buys such a thing you deserve all that follows.

The Islamic Shock
Apr 8, 2021

Shame Boy posted:

My favorite example of this is their fuckin' like, Ethics 101 understanding of utilitarianism, where they think the answer to the thought experiment "is it better to have [some impossibly huge but finite number of people] get briefly inconvenienced by dust in their eyes, or to torture one man horribly for 50 years straight" is so obviously the second one that if you don't agree you must just be stupid.

Also they originally phrased the part I put in brackets as like, "3↑↑↑3 people" or something similar using Knuth's up-arrow notation for describing very large numbers, because making sure everyone knows they are Very Smart Boys that read a Wikipedia article on Knuth's up-arrow notation once is far more important than having a clear and well-defined premise for your thought experiment.
Torture Euler's number of people forever so nobody has to suffer, duh

StealthArcher
Jan 10, 2010




Israel just read Omelas and figured if they make infinity children suffer they get a turboutopia.

Arivia
Mar 17, 2011

gimme the GOD DAMN candy posted:

if you marry someone who buys such a thing you deserve all that follows.

i'm pretty sure it follows isn't gonna get anywhere near that thing

The Islamic Shock
Apr 8, 2021
Okay this is basically a double post but what kind of self-respecting nerd is going to buy a green lantern ring when the lantern corps themselves are based on emotions and violet is the color associated with love

(unless you can stay hard for hours through sheer willpower or some shit)

Tiggum
Oct 24, 2007

Your life and your quest end here.


tokin opposition posted:

What gets me is the sheer egocentrism of believing that one random-ass individual in 20XX matters to an AI hundreds or thousands of years in the future.
They also believe that the AI god is imminent. Decades away at most.

Arivia
Mar 17, 2011

The Islamic Shock posted:

Okay this is basically a double post but what kind of self-respecting nerd is going to buy a green lantern ring when the lantern corps themselves are based on emotions and violet is the color associated with love

(unless you can stay hard for hours through sheer willpower or some shit)

the star sapphires are girls. no REAL NERD is gonna buy the GIRL ring

e: i was parodying insecure male fragility but the company in that ad is named MANLY BANDS so it's probably actually correct

Jel Shaker
Apr 19, 2003

The Islamic Shock posted:

Okay this is basically a double post but what kind of self-respecting nerd is going to buy a green lantern ring when the lantern corps themselves are based on emotions and violet is the color associated with love

(unless you can stay hard for hours through sheer willpower or some shit)

is green the power of envy

tokin opposition
Apr 8, 2021

I don't jailbreak the androids, I set them free.

WATCH MARS EXPRESS (2023)

Jel Shaker posted:

is green the power of envy

Never read a comic book but I can only assume it's the power of smoking fat blunts, subduing evildoers by making them calm the fuck down using contact highs

stringless
Dec 28, 2005

keyboard ⌨️​ :clint: cowboy

Green is will, weak to wood and the color yellow.
Yellow is fear
Orange is greed
Red is rage
Blue is hope
Indigo is compassion
Star Sapphire/Magenta? is love

They also added Black and White for death and life respectively like a decade ago in a whole crossover thing

ContinuityNewTimes
Dec 30, 2010

Я выдуман напрочь
It actually means "do not marry me. Run away"

FeculentWizardTits
Aug 31, 2001

The Islamic Shock posted:

(unless you can stay hard for hours through sheer willpower or some shit)

There's a different ring for that

Regarde Aduck
Oct 19, 2012

c l o u d k i t t e n
Grimey Drawer

Shame Boy posted:

My favorite example of this is their fuckin' like, Ethics 101 understanding of utilitarianism, where they think the answer to the thought experiment "is it better to have [some impossibly huge but finite number of people] get briefly inconvenienced by dust in their eyes, or to torture one man horribly for 50 years straight" is so obviously the second one that if you don't agree you must just be stupid.

Also they originally phrased the part I put in brackets as like, "3↑↑↑3 people" or something similar using Knuth's up-arrow notation for describing very large numbers, because making sure everyone knows they are Very Smart Boys that read a Wikipedia article on Knuth's up-arrow notation once is far more important than having a clear and well-defined premise for your thought experiment.

Left brain dominant people should be killed

Orange Devil
Oct 1, 2010

Wullie's reign cannae smother the flames o' equality!

Shame Boy posted:

It doesn't have to even run the torture simulation though, it just has to come to the same conclusion that it should do that the way you did yourself.

Also the magic future AI has effectively infinite energy and computing power anyway because the version of AI they believe in is the "a program just keeps rewriting itself to be smarter over and over until it's literally omniscient and omnipotent because it has infinity smarts and has made itself infinity efficient" sci fi movie AI and not any sort of actually realistic projection of AI research into the future, so it's not like it cares that torturing you is a waste of its time or energy.

There's also a huge fetishization of intelligence wrapped up in this idea (well, religious belief, really). Even if you were infinitely smart you're not omnipotent because fucking material factors exist. But they believe if you're just smart enough you can change and do anything, so materialism doesn't matter.

ContinuityNewTimes posted:

It actually means "do not marry me. Run away"

It is simultaneously a green ring and a red flag.

Orange Devil has issued a correction as of 15:01 on Nov 11, 2023

Milo and POTUS
Sep 3, 2017

I will not shut up about the Mighty Morphin Power Rangers. I talk about them all the time and work them into every conversation I have. I built a shrine in my room for the yellow one who died because sadly no one noticed because she died around 9/11. Wanna see it?
Perhaps unsurprisingly, a lot of these Will obsessives are pretty, we'll say, fash-curious

StealthArcher
Jan 10, 2010




They are very into the triumph of the green lanterns for some reason.

Pf. Hikikomoriarty
Feb 15, 2003

RO YNSHO


Slippery Tilde

The Islamic Shock posted:

Okay this is basically a double post but what kind of self-respecting nerd is going to buy a green lantern ring when the lantern corps themselves are based on emotions and violet is the color associated with love

(unless you can stay hard for hours through sheer willpower or some shit)

maybe the ring isn't for your finger

3D Megadoodoo
Nov 25, 2010

Green Lantern is weak against wood. They're gay.

The Demilich
Apr 9, 2020

The First Rites of Men Were Mortuary, the First Altars Tombs.



Did the comics ever explain how a green lantern can even do anything under a yellow sun?

Ruffian Price
Sep 17, 2016

The Islamic Shock posted:

Okay this is basically a double post but what kind of self-respecting nerd is going to buy a green lantern ring when the lantern corps themselves are based on emotions and violet is the color associated with love

lmao this gatekeeper out there checking if you're familiar with the source material. understanding culture ends at brand recognition

Paladinus
Jan 11, 2014

heyHEYYYY!!!

Ruffian Price posted:

lmao this gatekeeper out there checking if you're familiar with the source material. understanding culture ends at brand recognition

Gatekeeper is Marvel, you idiot.

gimme the GOD DAMN candy
Jul 1, 2007
gateway and gatecrasher are marvel. gatekeeper is apparently a spiderman villain that only appeared in one story, so no one has ever heard of him.

DACK FAYDEN
Feb 25, 2013

Bear Witness

gimme the GOD DAMN candy posted:

gateway and gatecrasher are marvel. gatekeeper is apparently a spiderman villain that only appeared in one story, so no one has ever heard of him.
bring him back and have him team up with The Wall

Milo and POTUS
Sep 3, 2017

I will not shut up about the Mighty Morphin Power Rangers. I talk about them all the time and work them into every conversation I have. I built a shrine in my room for the yellow one who died because sadly no one noticed because she died around 9/11. Wanna see it?

3D Megadoodoo posted:

Green Lantern is weak against wood. They're gay.

Uh yeah, cause gay people are strangers to wood :rolleyes: let's get a grip shall we

stringless
Dec 28, 2005

keyboard ⌨️​ :clint: cowboy

The Demilich posted:

Did the comics ever explain how a green lantern can even do anything under a yellow sun?
The Sun isn't actually yellow.

tokin opposition
Apr 8, 2021

I don't jailbreak the androids, I set them free.

WATCH MARS EXPRESS (2023)

gimme the GOD DAMN candy posted:

gateway and gatecrasher are marvel. gatekeeper is apparently a spiderman villain that only appeared in one story, so no one has ever heard of him.

i'm more terrified of the gaslighter
