Ham
Apr 30, 2009

You're BALD!

Cojawfee posted:

If you go in with the intention of not doing what it shows you, it can't show you what you are actually going to do. If it shows you flipping the bird, and then after seeing that, you decide you'll put your hand in your pocket, the machine can't go back in time and actually show you putting your hand in your pocket. It can't take everything into account if you decide not to follow its calculation. There's no way for it to show you what it calculates will happen and then for you to think you're doing the opposite but actually doing what it predicted.

Particles don't have a choice about their interactions based on stimuli, what you're proposing is literally magic or randomness.

It's funny but randomness is actually the mainstream interpretation of quantum mechanics which the show completely ignores.


exmarx
Feb 18, 2012


The experience over the years
of nothing getting better
only worse.

Strange Matter posted:

EDIT: I also don't believe there's anything inherently special about Lily, she's just outside the closed system that Forrest has constructed around the Devs machine.

yeah, i think it's this. throughout the series, lily is presented with fait accompli decisions and rejects them ("gently caress you"), while devs is full of tech freaks who get fired if they challenge a specific type of determinism.

i feel like the machine breaking is some kind of double-slit metaphor. like the devs team is using one method to observe its output, which collapses when a different method is used. idk.

ghostwritingduck
Aug 26, 2004

"I hope you like waking up at 6 a.m. and having your favorite things destroyed. P.S. Forgive me because I'm cuter than that $50 wire I just ate."

Ham posted:

Particles don't have a choice about their interactions based on stimuli, what you're proposing is literally magic or randomness.

It's funny but randomness is actually the mainstream interpretation of quantum mechanics which the show completely ignores.

There’s a scene where Forest says he’s afraid that defying the machine would make them magicians, after which neither he nor Katie attempts the simple experiment Forest outlines. Forest is afraid because he knows it’s possible that the machine might allow it.

veni veni veni
Jun 5, 2005


QuoProQuid posted:

Are people angry? I get the sense people are just trying to discuss the ending.


No? I don't think so. People just keep responding to me like it was my idea lol.

The simple answer is that the show didn't cover all of its bases, and no one is likely to come to a satisfying conclusion on this.

veni veni veni fucked around with this message at 02:01 on Apr 20, 2020

Rochallor
Apr 23, 2010

ふっっっっっっっっっっっっck
Since the post-death versions of Lily and Forrest we see are presumably the ones from the simulation, preserved at the moment of their deaths when the projected future ended, do they remember that Lily threw the gun away? I can't remember if that was ever brought up, or if Katie ever said she did something to upload the brains of the dead into the simulation.

Tiggum
Oct 24, 2007

Your life and your quest end here.


QuoProQuid posted:

The show seems to come firmly down on the view that Lyndon was right and many worlds do exist both in its filmmaking (all those scenes of alternate Katies and Lyndons) and in its narrative (the machine only starts working when Lyndon throws out Forrest's one-world determinism and switches to the Everett interpretation).
It's not at all clear that those scenes of multiple alternatives represent reality actually following the many-worlds model. And surely the fact that the machine broke suggests that whatever it was doing wasn't working 100%, so its model of the universe was, at best, incomplete.

Cojawfee posted:

They all believe that they have to do exactly what the machine says they will do, so they do it.
They do it exactly, without any measurable deviation? Because even a difference imperceptible to a human should make the prediction wrong as far as a computer is concerned.

D-Pad posted:

Up until Lily is shown the future she is on deterministic rails in which time is a straight arrow. When she is shown the future, whether she is shown herself shooting Forest OR throwing the gun or even a bunch of puppies, what the machine shows her becomes the cause of her future actions (the effect). Therefore the cause takes place after the effect in time, and this creates a paradox in that it becomes possible for the effect to be different from the prediction even though it's still all deterministic and no actual choice was made.

Ugh, this is hard to put into words, but basically from the universe's point of view everything that happened was still deterministic and Lily made no actual free-will choice, but it caused the machine to bug out because it was trying to calculate outside the straight arrow of time, and determinism only works and can be calculated if time runs one way with no ability to see the future.
Nah. If the universe is deterministic and the machine can predict the future then it should be able to account for its own predictions. If the initial prediction being observed alters the outcome then the prediction must be updated until it reaches equilibrium (or the machine runs out of time to calculate it but that doesn't seem to be a factor for whatever reason, it can just simulate the entire universe instantaneously). But apparently in the show the machine showed a static, unchanging prediction. It wasn't iterating through possible futures affected by its own predictions, it was just showing one consistently - which, in a deterministic universe, would have to have happened. You can't beat the prediction because it's already accounting for you trying to do that.
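To put that "iterate until equilibrium" idea in code terms (a toy sketch of my own, not anything from the show): a self-consistent prediction is a fixed point of the subject's reaction function, i.e. an action that the subject still performs after being shown it.

```python
# Toy model: the machine must show a prediction that remains true once seen,
# i.e. a fixed point of the viewer's reaction function react(shown) -> action.

def find_fixed_point(react, actions):
    """Return a prediction p with react(p) == p, or None if none exists."""
    for p in actions:
        if react(p) == p:
            return p
    return None

ACTIONS = ["raise hand", "flip the bird", "hand in pocket"]

# A "true believer" does whatever is shown: every action is a fixed point.
believer = lambda shown: shown
assert find_fixed_point(believer, ACTIONS) == "raise hand"

# A contrarian like Lily always does something other than what was shown:
# no fixed point exists, so no self-consistent prediction can be displayed.
def contrarian(shown):
    return next(a for a in ACTIONS if a != shown)

assert find_fixed_point(contrarian, ACTIONS) is None
```

On this toy model the Devs zealots and Lily come apart cleanly: the machine "works" on believers because any prediction is self-fulfilling for them, and has no consistent output at all for a committed contrarian.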

Martman posted:

Otherwise you have not explained at all what force is preventing someone from simply not doing the thing that they see.
The same force that lies behind all of your choices. If the universe is deterministic then choice is an illusion and, having seen the future predicted, you will proceed to act in accordance with the prediction while still feeling like you're making conscious decisions. As Forrest put it, repeating the lines from the simulation he'd watched just felt, to him, like saying the things he wanted to say in that moment, even though he knew in advance that he was going to say them. Whatever the machine showed you doing, you would do, and you would feel as though you were doing it of your own free will and for perfectly explicable reasons.

A deterministic universe is contrary to our perception of reality, but that doesn't actually mean it has to be wrong.

Ham
Apr 30, 2009

You're BALD!

Tiggum posted:

A deterministic universe is contrary to our perception of reality, but that doesn't actually mean it has to be wrong.

This isn't entirely true, it's just contrary to our conception of self. Our perception of reality quite easily accepts that event A causes event B, it's just unintuitive to apply that to ourselves as well as the world around us so people rarely do so.

Zachack
Jun 1, 2000




Cojawfee posted:

We don't know how it works because the only people we have seen interact with the machine are a bunch of true believers who would do exactly as the machine says, and Lily, who wouldn't do what the machine says. That doesn't really tell us anything.

I would have agreed with you until the one-second projection scene. The whole point of that scene seemed to be that people who are not true believers (because apparently they didn't think about the implications) got brain hosed. If Lily isn't special or whatever, then any one of them could have (and from their reactions, would have) stuck their hands in their pockets. And Stewart, who I took to want to resist the machine, outright tells them that they are in the box.

I initially took that scene as a ballsy move because it seemed to really "go there" with the deterministic theme, but now I think it may have been something that Garland thought sounded cool but that wrote the show into a corner.

Mr Shiny Pants
Nov 12, 2012

ghostwritingduck posted:

I feel like the majority of the criticism about the lead’s acting stem from the fact that she’s outside of the typical leading actress mold. You complaining about diversity just adds to that.

All of the characters outside of Katie and Forest independently reminded me and my friend of people we know.

Mwah, the wokeness comment was a bit much, I agree. My apologies.

The acting though, I'll stand by that. I don't have a problem with people not being the typical "leading actress mold" ( What is that anyway? ) but I thought it was just bad.

I expected more from Garland I guess, especially after seeing Annihilation.

Mr Shiny Pants fucked around with this message at 07:25 on Apr 20, 2020

veni veni veni
Jun 5, 2005


I dunno, personally I thought she was a good lead. Everything about the show is pretty deadpan and it all felt very intentional. It's not like she was some outlier amongst the other characters. I liked it. For me it totally worked. However, If you didn't like it I'd say point your finger at Alex Garland's direction and not Sonoya Mizuno's performance because it's not like anyone else was a ball of energy either.

Tbh I don't see how Annihilation was even that different.

I'd almost say it's part of his style, but Ex Machina is way more high energy than Devs or Annihilation.

grate deceiver
Jul 10, 2009

Just a funny av. Not a redtext or an own ok.

Ham posted:

But there is no infinite loop involved since the machine is capable of simulating itself over and over again as seen with Stewart and the other devs 2 episodes prior.

For all intents and purposes the machine seems to have infinite power and is fine simulating a simulation within the simulation to reach the inevitable outcome.
I don't think that's the case. I think we've seen the machine simulate its physical casing, but do we have any proof it can recreate all of its simulations, infinitely nested? I know this is a show with a magical superpowered quantum computer, but I don't think it goes that far. Forrest seems to believe it's all-knowing and perfect, but Forrest is also a brokebrained sociopath driven crazy with grief.

quote:

That explanation is also against the entire message of the show unless that message is "the universe is completely deterministic and you don't have free will but at least you can't predict it".

I mean, that's kinda exactly the message I got.


Zachack posted:

This doesn't work because humans (and particle interactions?) aren't capable of making infinite adjustments to the predictions. From a human standpoint, at some point a person is going to get tired or run out of time or whatever and stop adjusting to the predictions, at which point the prediction locks, in which case a prediction can be made from the start because there is an outcome.

Additionally, if the machine is based on particle interactions alone, then you can't "change your mind" because your mind is constructed of particles bouncing around.

Lily doesn't make infinite adjustments, she makes one. It's Deus that has to make infinite nested predictions. Lily looks at the screen, and based on what she sees she makes a decision. Deus, in order to predict what she does, has to also simulate the simulation she sees. And inside that simulation is another Lily that sees a simulation on a screen, and that simulation also needs to be simulated, etc to infinity.

She doesn't change her mind at all; it's more that Deus can't cope with its predictions affecting the object of those predictions. It wasn't a problem with other people before Lily, because they just accepted it and went with it. You can read Stewart crashing the cube at the end as some kind of dumb Final Destination cop-out, where it was always 'meant' to be like that. Or you could read it as Stewart being so convinced that it 'must' happen that he did it on purpose as a 'failsafe'.
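The regress being described can be sketched in a few lines (my own toy illustration, not the show's mechanics): once the viewer is watching the screen, predicting them requires simulating the screen, which contains another Deus that must do the same, with no base case.

```python
import sys

# Toy model of the nested-simulation regress: predicting a viewer who is NOT
# watching the output is easy; predicting one who IS watching requires the
# simulation to contain a working copy of itself, one level deeper, forever.

def deus_predict(viewer_watches_screen, depth=0):
    if not viewer_watches_screen:
        return "prediction"  # no feedback loop: a plain forward simulation
    # Feedback loop: the simulated world contains a simulated Deus, which
    # must itself run deus_predict on the same situation, one level down.
    return deus_predict(True, depth + 1)

assert deus_predict(False) == "prediction"

sys.setrecursionlimit(200)  # any finite budget will do; the point is it's finite
try:
    deus_predict(True)
    bottomed_out = False
except RecursionError:
    bottomed_out = True  # the nesting never terminates on finite hardware

assert bottomed_out
```

This is the sense in which Lily's one decision forces infinitely many nested predictions out of Deus, even though she herself only reacts once.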

Zachack posted:

The show explicitly depicts this accuracy in the 7th episode with the one-second delay and IMO it really mangles a lot of the "cult/zealot" arguments because the other characters seemed horrified by being predicted and were trying to "beat" the machine.

They weren't trying to beat it. I don't think 1 second is even physically enough to notice something like this and make a decision. Nowhere in the show do we see anyone from Devs run an actual experiment to see if they can beat a prediction.

Zachack posted:

Further, that "wrong" information does not seem to matter because Katie and Forrest (and others?) were operating on knowledge from the grainy predictions which are correct for our universe. They may be fuzzy because the machine can't perfectly predict every particle but what it can predict should be correct.

They're not 'correct'. The whole point of the graininess is that it was a probabilistic approximation. Sure, what they saw was close enough, but they didn't do anything that would challenge the system. They never had to deal with the simulation influencing the objects of the prediction, until Lily. They were actively working towards fulfilling the predictions, watching them several times, memorizing them. Kinda no wonder they came true; that's my point.

grate deceiver
Jul 10, 2009

Just a funny av. Not a redtext or an own ok.
I guess if I had to say in one sentence what I think the show is saying, it's: determinism is not the same as predictability, and it's also not a good excuse for passivity. Also Gödel's Incompleteness Theorem, maybe?

But that last part in computer heaven, I don't know what was the point of that.

Strange Matter
Oct 6, 2009

Ask me about Genocide
I realize now that the machine should work again. It hit a breaking point because it couldn't square Lily's behavior with its own predictive models, but now that it's in the past it should be able to factor what has already happened into future predictions.

Basically this season is a set-up for a long running and disappointingly banal network TV show where the Devs team has to investigate deterministic paradoxes each week in order to keep their predictions running.

EDIT: Forrest's giant talking digital head shows up to brief the team each week.

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.
One thing I thought was interesting, and I'm not sure if it was done just to tweak the nose of story conventions, was when Lily rattles off number sequences in a scene only for that to never come back for a payoff at all. In most shows we're almost preconditioned to expect that to play an essential part at some point, but nope!

Martman
Nov 20, 2006

Tiggum posted:

The same force that lies behind all of your choices.
But this isn't true at all if someone is consciously attempting to do the opposite of what the machine shows.

When I want food because I'm hungry, I act in accordance with that desire and acquire and eat some food. If I want to prove to myself that I can not raise my hand even though the machine says I will, you're saying there's a gap there where I simply stop wanting the thing I want.

To say "you'll just find yourself not wanting to beat the machine for no reason" doesn't square with the way sentience works imo. It sounds more like the machine is actively hypnotizing people.

Martman fucked around with this message at 21:08 on Apr 20, 2020

grate deceiver
Jul 10, 2009

Just a funny av. Not a redtext or an own ok.

Strange Matter posted:

I realize now that the machine should work again. It hit a breaking point because it couldn't square Lily's behavior with its own predictive models, but now that it's in the past it should be able to factor what has already happened into future predictions.

It's running computer heaven 24/7 now, so doing predictions is out of the question unless they build a second one.

Vanilla Bison
Mar 27, 2010




Martman posted:

But this isn't true at all if someone is consciously attempting to do the opposite of what the machine shows.

When I want food because I'm hungry, I act in accordance with that desire and acquire and eat some food. If I want to prove to myself that I can not raise my hand even though the machine says I will, you're saying there's a gap there where I simply stop wanting the thing I want.

To say "you'll just find yourself not wanting to beat the machine for no reason" doesn't square with the way sentience works imo. It sounds more like the machine is actively hypnotizing people.

Since the machine necessarily contains itself and the effects of its own forecasting, if it works it's not going to predict and show you a future that contains you doing something you would want to contradict. It's presumably easy to find this kind of "prediction equilibrium" for the people at Devs, who are already heavily invested in the success of the project by the time they get picked for it, but impossible for someone like Lily, who was going to want to tear the temple down no matter what.

If the people working on Devs were proper scientists trying to prove the system doesn't work, they would have broken it the way Lily did long ago. But they're tech priests worshipping at the altar.

stephenthinkpad
Jan 2, 2020
Nice try show. But historical Jesus didn't exist.

Sharks Eat Bear
Dec 25, 2004

Martman posted:

But this isn't true at all if someone is consciously attempting to do the opposite of what the machine shows.

When I want food because I'm hungry, I act in accordance with that desire and acquire and eat some food. If I want to prove to myself that I can not raise my hand even though the machine says I will, you're saying there's a gap there where I simply stop wanting the thing I want.

To say "you'll just find yourself not wanting to beat the machine for no reason" doesn't square with the way sentience works imo. It sounds more like the machine is actively hypnotizing people.

The whole point is that the Devs machine proves that sentience/free will doesn't work the way we think/feel it does. All of your feelings, decisions, experiences, ideas, states of cognition, etc. can be reduced down to the interaction of particles, and knowing where every particle is in relation to each other in a given moment means we can trace back where that particle came from (i.e. perfect sim of the past) and predict where that particle will be in the future.

This is obviously not remotely plausible to achieve in a "near-future" timeframe that the show seems to depict, but I think the show also makes it pretty clear that we're supposed to suspend disbelief about whether such a machine could be created, and instead it asks the question "what if this machine WAS in fact created?" Which is a tantalizing premise for a smart sci-fi show, but ultimately I think why many of us are dissatisfied with the Devs finale is that it kind of pulls a last-minute bait-and-switch and instead of actually addressing the question it says "ACTUALLY LILY IS A GOD"

I think it would be even more dissatisfying/nonsensical if the show's answer was "actually the machine never really worked in the first place, and all the incredible sims of the past and accurate predictions of the future until Lily's Choice were just a coincidence"


grate deceiver posted:

They weren't trying to beat it. I don't think 1 second is even physically enough to notice something like this and make a decision. Nowhere in the we see anyone from Devs make an actual experiment to see if they can beat a prediction.

I have a different read on the scene. It's true that we don't see anyone from Devs specifically make an experiment to beat it, but I think this scene is supposed to indicate that the machine works, to put it simply. The random no-name Devs programmers are clearly horrified, in a way that I think strongly reads as "they are realizing they do not have the power to do something other than what the machine shows them, but they still feel as though they're making free choices and not just following a script".

https://www.youtube.com/watch?v=vOr9XB5rtJE

ghostwritingduck
Aug 26, 2004

"I hope you like waking up at 6 a.m. and having your favorite things destroyed. P.S. Forgive me because I'm cuter than that $50 wire I just ate."

grate deceiver posted:

I guess if I had to say in one sentence what I think the show is saying, it's: determinism is not the same as predictability and it's also not a good excuse for passivity. Also Godel's Incompleteness Theorem maybe?

But that last part in computer heaven, I don't know what was the point of that.

My friend suggested to me that Forest ending up in the sim was the result of his purposeful plan. The sim breaking had more to do with a purposeful obfuscation by Forest to prevent Katie from seeing his long term plan. Since she was the only one who was looking in the future, he had to use psychological manipulation to keep her from catching on. In other words, Forest believed in free will and used that free will to manipulate people into designing and uploading him into heaven with his family.

Evidence for this plan includes:

-Forest insisting to Jamie and Lily that everything was going to be ok.
-Forest monitoring the tram lines to see if they were intact earlier.
-Forest telling Lily that he would be resurrected
-Forest’s magician speech to Katie being psychological manipulation, discouraging her from attempting to disobey the system’s predictions

I thought it was an interesting enough theory to write up here, especially since my friend thought it was the obvious interpretation.

Cojawfee
May 31, 2006
I think the US is dumb for not using Celsius

grate deceiver posted:



She doesn't change her mind at all; it's more that Deus can't cope with its predictions affecting the object of those predictions. It wasn't a problem with other people before Lily, because they just accepted it and went with it. You can read Stewart crashing the cube at the end as some kind of dumb Final Destination cop-out, where it was always 'meant' to be like that. Or you could read it as Stewart being so convinced that it 'must' happen that he did it on purpose as a 'failsafe'.


He does it both times. You can see him messing with the panel in the simulation.

grate deceiver
Jul 10, 2009

Just a funny av. Not a redtext or an own ok.

Sharks Eat Bear posted:

I have a different read on the scene. It's true that we don't see anyone from Devs specifically make an experiment to beat it, but I think this scene is supposed to indicate that the machine works, to put it simply. The random no-name Devs programmers are clearly horrified, in a way that I think strongly reads as "they are realizing they do not have the power to do something other than what the machine shows them, but they still feel as though they're making free choices and not just following a script".

https://www.youtube.com/watch?v=vOr9XB5rtJE

I mean, sure, but at that moment why wouldn't the machine accurately predict them? They're having instinctive emotional reactions; that's like the easiest thing for Deus. Of course Stewart didn't choose his reaction on purpose, but wouldn't a 1-second prediction be kinda the hardest possible to try and disprove?


ghostwritingduck posted:

My friend suggested to me that Forest ending up in the sim was the result of his purposeful plan.

Seems like a super convoluted way to go about it, but then again maybe he was just going through the motions because that's what he saw in the sim. On the other hand, computer heaven happens after the Deus breakdown, so he had no way of knowing that it would end up like this.


Cojawfee posted:

He does it both times. You can see him messing with the panel in the simulation.

Oh right, I forgot about that. So it makes perfect sense then.

Martman
Nov 20, 2006

Sharks Eat Bear posted:

The whole point is that the Devs machine proves that sentience/free will doesn't work the way we think/feel it does. All of your feelings, decisions, experiences, ideas, states of cognition, etc. can be reduced down to the interaction of particles, and knowing where every particle is in relation to each other in a given moment means we can trace back where that particle came from (i.e. perfect sim of the past) and predict where that particle will be in the future.
Showing someone the prediction is different from just predicting it. It affects the outcome.

The show does not in any way portray Lily as a god. A bunch of characters who have completely bought into the machine's correctness react as if she is one when she's the only person we're ever shown even trying to beat the machine.

The fact that the projections of history are correct isn't inconsistent; the machine can accurately guess what would happen at any point in time until it starts changing the future by showing its predictions to people.

Martman fucked around with this message at 23:40 on Apr 20, 2020

Strange Matter
Oct 6, 2009

Ask me about Genocide

Vanilla Bison posted:

If the people working on Devs were proper scientists trying to prove the system doesn't work, they would have broken it the way Lily did long ago. But they're tech priests worshipping at the altar.
this is extremely true and reinforced by the fact that when their machine fails to produce projections past Lily's death their conclusion is that the laws of reality are breaking down and not that there's a flaw in their system.

Tiggum
Oct 24, 2007

Your life and your quest end here.


grate deceiver posted:

I don't think that's the case. I think we've seen the machine simulate it's physical casing, but do we have any proof it can recreate all of its simulations infinitely nested?
Yes. The mere fact that it works on people who've seen their own future is proof that it is accounting for their reactions to seeing the predictions.

grate deceiver posted:

They weren't trying to beat it. I don't think 1 second is even physically enough to notice something like this and make a decision.
You don't have to (and in fact can't) make a decision. Seeing the prediction is going to alter your behaviour in some way. In order to be accurate the machine has to already be accounting for that. Even if the way you reacted was only microscopically or undramatically different, to the computer it would still be wrong. If the prediction diverging from reality is what breaks it then it should already be broken.

Strange Matter posted:

I realize now that the machine should work again.
Yeah? It does. Did you not see the bit at the end where Forrest and Lily were in the simulation?

Martman posted:

But this isn't true at all if someone is consciously attempting to do the opposite of what the machine shows.

When I want food because I'm hungry, I act in accordance with that desire and acquire and eat some food. If I want to prove to myself that I can not raise my hand even though the machine says I will, you're saying there's a gap there where I simply stop wanting the thing I want.

To say "you'll just find yourself not wanting to beat the machine for no reason" doesn't square with the way sentience works imo. It sounds more like the machine is actively hypnotizing people.
If the universe is deterministic then choice is an illusion. Always; regardless of whether you've predicted the future or not. If you've got a machine that can predict the future by simulating the entire universe then it must be able to predict itself because it is part of the universe and its predictions must account for people seeing its predictions. You will feel like you're consciously making decisions because that's how the human brain works but you will do exactly as you're predicted to do because that's (in this hypothetical scenario) how the universe works. There's no way around it.

Sharks Eat Bear posted:

I have a different read on the scene. It's true that we don't see anyone from Devs specifically make an experiment to beat it, but I think this scene is supposed to indicate that the machine works, to put it simply.
There's really no other way to read it that doesn't turn the entire premise into nonsense.

ghostwritingduck posted:

-Forest insisting to Jamie and Lily that everything was going to be ok.
He says this because he doesn't believe in free will. To his way of thinking there is no difference between the past, present and future. As he says to Lily, his daughter is as much alive as anyone else is, because she was alive in the past. It's why the idea of getting off the rails, breaking determinism, horrifies him; because if the universe is not deterministic then his daughter is really dead because the present is fundamentally different from the past in that it's the only point at which the future can be altered.

grate deceiver posted:

I mean, sure, but at that moment why wouldn't the machine accurately predict them?
Do you behave exactly the same when looking at a blank wall as you do when looking at a mirror? There's no prediction there, just reflection, but the mere fact that you can see yourself will change your behaviour to some extent. There is no way the machine can predict their reactions to seeing the simulation without also simulating itself.

Martman
Nov 20, 2006

Tiggum posted:

If the universe is deterministic then choice is an illusion. Always; regardless of whether you've predicted the future or not. If you've got a machine that can predict the future by simulating the entire universe then it must be able to predict itself because it is part of the universe and its predictions must account for people seeing its predictions. You will feel like you're consciously making decisions because that's how the human brain works but you will do exactly as you're predicted to do because that's (in this hypothetical scenario) how the universe works. There's no way around it.

There's really no other way to read it that doesn't turn the entire premise into nonsense.

...

Do you behave exactly the same when looking at a blank wall as you do when looking at a mirror? There's no prediction there, just reflection, but the mere fact that you can see yourself will change your behaviour to some extent. There is no way the machine can predict their reactions to seeing the simulation without also simulating itself.
But... the machine stops predicting reactions after Lily decides to change her behavior in response to seeing its prediction. That's the exact scenario you're talking about that would prove Forest's view of determinism wrong. And it happens the first time we see anyone bother to try beating the machine.

I also think, on the other side of what you're claiming about determinism, that there's "no way around" the fact that sentient creatures can react to their perceptions. So if they want to beat the machine, the machine has to be doing something to actively change the way their brain works in order to guarantee they still do as it shows them. I feel like you're saying something along the lines of "If you have a machine that can draw green squares that are also red circles, then there's simply no way around the fact that green squares and red circles are the same thing," and it ignores the actual underlying meaning of the claims involved.

EDIT: Sorry, rereading your posts I realize I misunderstood you. I thought you were arguing that the show comes out in support of the determinism answer and requires that Lily must be magic or something in order to beat it. I was only meaning to argue about what I believe is shown in the show, not what a separate hypothetical machine that definitely does always make correct predictions would do.

Martman fucked around with this message at 05:33 on Apr 21, 2020

Tiggum
Oct 24, 2007

Your life and your quest end here.


Martman posted:

But... the machine stops predicting reactions after Lily decides to change her behavior in response to seeing its prediction.
Does it? Everyone seems to be taking this as fact but the machine doesn't stop when Lily contradicts it. It stops shortly after that for unknown reasons. We also never learn why it was unable to accurately predict Lily's actions. That's part of the problem with the final episode; the stuff that happens apparently contradicts the way the machine has worked in previous episodes. The characters don't understand it and the show never offers an explanation that fits the facts as previously established.

Martman posted:

I also think, on the other side of what you're claiming about determinism, that there's "no way around" the fact that sentient creatures can react to their perceptions.
You don't need a way around it. It's not a contradiction. If the machine is predicting itself (as it must in order to predict people's reactions to it in previous instances) then it will always work whether you're trying to beat it or not. If the universe is deterministic and can be perfectly simulated then you simply cannot beat the simulation because it's impossible by definition.

Martman posted:

I feel like you're saying something along the lines of "If you have a machine that can draw green squares that are also red circles, then there's simply no way around the fact that green squares and red circles are the same thing," and it ignores the actual underlying meaning of the claims involved.
This is essentially what the show is saying, because in reality a perfect simulation of the universe actually is impossible. The machine is magic and it stops working because, I guess, the magic ran out. The show is dumb.

Martman posted:

I thought you were arguing that the show comes out in support of the determinism answer and requires that Lily must be magic or something in order to beat it.
"Lily is magic" is one of two consistent interpretations of the show. The other is that the machine just didn't work right. Neither is satisfying, but that's what we've got because the show is dumb.

Sharks Eat Bear
Dec 25, 2004

Martman posted:

Showing someone the prediction is different from just predicting it. It affects the outcome.

The show does not in any way portray Lily as a god. A bunch of characters who have completely bought into the machine's correctness react as if she is one when she's the only person we're ever shown even trying to beat the machine.

The fact that the projections of history are correct isn't inconsistent; the machine can accurately guess what would happen at any point in time until it starts changing the future by showing its predictions to people.

The prediction correctly predicts what happens after people are shown the prediction, as clearly shown by the 1-second in the future scene. That’s why Stewart says “uh oh” and “there’s a box within the box, ad infinitum” during that scene. This scene really doesn’t make sense if the show doesn’t want us to think the machine works.

Think about it another way. If determinism is false, then there would be “random” phenomena (choices made by free will) throughout all of human history that the machine wouldn’t be able to extrapolate into the past, let alone predict the future. In order to have a perfect simulation of the past, the world would have to have been deterministic.

The show is asking us to accept that the world is (or rather, was) deterministic, up until Lily makes a free choice. I think the show actually does itself a disservice by not spending more time grappling with how awe-inspiring a Devs machine would be. Most of the Kenton and Russia subplots would have been time better spent on this (or spent on showing the signs of flaws in the machine, if that actually is the intention). But based on what’s shown to us, there’s no reason to think the machine doesn’t work as intended.

I actually find the assessment of Forrest and Katie being sociopathic monsters to be a little... shallow. Like yes clearly that would be true in a world where they’re making choices... except what if they’ve just legitimately discovered that the world is deterministic? Which is the scenario the show presents us with, and I think is a clever way to get the viewer to consider how morality in a deterministic world is not so straightforward.

grate deceiver
Jul 10, 2009

Just a funny av. Not a redtext or an own ok.
If the machine didn't exist, would Katie's and Forrest's choices be any different morally? The fact that you know the future does not absolve you of responsibility.

Tiggum posted:

You don't have to (and in fact can't) make a decision. Seeing the prediction is going to alter your behaviour in some way. In order to be accurate the machine has to already be accounting for that. Even if the way you reacted was only microscopically or undramatically different, to the computer it would still be wrong. If the prediction diverging from reality is what breaks it then it should already be broken.

I'm not saying the machine is completely broken, I'm saying it's working only to a degree. I'm also not saying that anyone in the show has any 'magical' free will. They don't. The show's universe is completely deterministic. Our universe is probably completely deterministic. But deterministic does not necessarily mean predictable.

Martman
Nov 20, 2006

Tiggum posted:

Does it? Everyone seems to be taking this as fact but the machine doesn't stop when Lily contradicts it. It stops shortly after that for unknown reasons. We also never learn why it was unable to accurately predict Lily's actions.
Whatever happens afterwards doesn't really matter because the machine has failed. It is no longer infallible. Unless you add Lily being magic (which I don't think the show hints at at all), then the show is straightforwardly showing that the machine can be beaten.

Sharks Eat Bear posted:

The prediction correctly predicts what happens after people are shown the prediction, as clearly shown by the 1-second in the future scene. That’s why Stewart says “uh oh” and “there’s a box within the box, ad infinitum” during that scene. This scene really doesn’t make sense if the show doesn’t want us to think the machine works.
It makes perfect sense to me that people who are comfortable buying into the perfection of the machine could be guided to do what it says. There's no reason to believe any of them have any intention of trying to beat it.

quote:

Think about it another way. If determinism is false, then there would be “random” phenomena (choices made by free will) throughout all of human history that the machine wouldn’t be able to extrapolate into the past, let alone predict the future. In order to have a perfect simulation of the past, the world would have to have been deterministic.
My argument is that the predictions only work as long as people aren't shown their future. I'm saying that specific act, and only that act, actually necessarily changes the way the predictions work. I'm not even really arguing about determinism vs. no determinism; I think it's perfectly possible that the show's universe is deterministic, but the machine is shown to be unable to predict its own impact on the universe. It is unable to predict the future correctly, because it showing people the future eventually leads someone to violate its prediction.

I mean this is Alex Garland. I'm much more likely to believe "It turns out we shouldn't have trusted the premises of these tech bros when they went around murdering people and claiming it was destiny" over "a magic person showed up for some reason and broke a machine that was otherwise perfect."

Martman
Nov 20, 2006

In talking through this, I'm reminded of the Spanish movie Los Cronocrímenes (Timecrimes in English). In it, the main character is attacked by a spooky masked attacker, then encounters a mysterious lab and yada yada yada gets sent back in time by one hour. A bunch of time fuckery occurs, and he realizes he was actually the masked maniac from the future. He ends up recreating the actions of his own future self in order to not gently caress up the timeline, including committing murder, etc... Ultimately, there's this big hanging question of why any of that had to happen in the first place. The only answer I can understand is that that behavior was in the man all along waiting for an excuse to come out.

I think this show goes in similar directions with its exploration of the future-predicting device. I think Forest and Lily are opposite sides of the way the machine can work. Forest buys into it completely, and this actually causes/allows him to become evil. His willingness to commit murder is predicated not just on his investment in the correctness of the machine, but the resulting belief that people are machines and don't really exist anymore as "people."

I kind of think Lily breaking it is a big gently caress you to the whole thing. I don't think there's anything magical about her, except her ability to continue believing in her own free will in the face of the machine's predictions.

grate deceiver
Jul 10, 2009

Just a funny av. Not a redtext or an own ok.
Let's not forget that Forest straight up ordered the murder of Sergei and stood there watching, and also openly discussed eliminating other witnesses with his hitman employee. No amount of speaking softly and staring wistfully into the horizon can change the fact that he's a sociopath techbro scumbag.

Katie also effectively killed Lyndon and walked it off like nbd. The machine predicted them doing those things because they always were terrible people. Maybe the reason that Deus didn't experience the infinite loop/whatever breakdown before is that they would have done those things even without Deus.

grate deceiver fucked around with this message at 09:55 on Apr 21, 2020

Tiggum
Oct 24, 2007

Your life and your quest end here.


grate deceiver posted:

But deterministic does not necessarily mean predictable.
The show's universe is predictable though. The machine works.

Martman posted:

I think it's perfectly possible that the show's universe is deterministic, but the machine is shown to be unable to predict its own impact on the universe.
Actually the exact opposite is shown. Several times. Imagine you walk into an empty room. Now imagine you walk into the same room but there's a big mirror on the wall. Your actions are going to be different. You're not trying to "beat the mirror", you're just behaving differently because the situation is different. The people seeing their one-second future were essentially looking into a mirror. If the machine's simulation didn't contain itself and account for its predictions in its predictions then it could not have accounted for the people's behaviour when looking at themselves one second in the future. The fact that it did proves that its predictions are self-consistent and cannot be disrupted by intent or by accident.

Martman posted:

Forest buys into it completely, and this actually causes/allows him to become evil. His willingness to commit murder is predicated not just on his investment in the correctness of the machine, but the resulting belief that people are machines and don't really exist anymore as "people."
There's no need to speculate about this, he explicitly explained himself. If the universe is deterministic then there's no difference between past, present and future; everything that has or will happen is set. It's no different to be alive today than to be alive twenty years ago or 200 years in the future. Anyone who was or will be alive is alive (at some point in time, but time is irrelevant). You can't really kill anyone because they're still alive in the past - just as alive as anyone is now or will be in the future.

Martman posted:

I kind of think Lily breaking it is a big gently caress you to the whole thing. I don't think there's anything magical about her, except her ability to continue believing in her own free will in the face of the machine's predictions.
Believing in free will doesn't make it real any more than disbelief in gravity lets you fly. So yeah, "Lily is magic" is one possible explanation. The other is "the machine just broke for some unexplained reason."

Martman
Nov 20, 2006

Tiggum posted:

If the machine's simulation didn't contain itself and account for its predictions in its predictions then it could not have accounted for the people's behaviour when looking at themselves one second in the future. The fact that it did proves that its predictions are self-consistent and cannot be disrupted by intent or by accident.
This is not even close to true.

First of all, the machine absolutely, unquestionably failed. You can't deny it. You can't just point to the times it succeeded and say that's proof that it works. If it works sometimes and doesn't work other times, then it doesn't work.

Second, it is not hard to believe it can "predict" or comfortably influence the behavior of people who want to go along with it. That doesn't prove much. Would you watch a hypnotist succeed on a bunch of unpaid, unscripted audience members and decide that hypnotism must be completely real?

quote:

There's no need to speculate about this, he explicitly explained himself. If the universe is deterministic then there's no difference between past, present and future; everything that has or will happen is set. It's no different to be alive today than to be alive twenty years ago or 200 years in the future. Anyone who was or will be alive is alive (at some point in time, but time is irrelevant). You can't really kill anyone because they're still alive in the past - just as alive as anyone is now or will be in the future.
This is the point of view of an insane murderer who goes into complete denial when he sees the proof that the machine doesn't actually do what he thinks it does. Why take his justification for murder at face value?

quote:

Believing in free will doesn't make it real any more than disbelief in gravity lets you fly. So yeah, "Lily is magic" is one possible explanation. The other is "the machine just broke for some unexplained reason."
I'm not saying she has free will. I'm saying her believing she has free will allows her to violate the predictions of the machine. Those are different.

Tiggum
Oct 24, 2007

Your life and your quest end here.


Martman posted:

First of all, the machine absolutely, unquestionably failed. You can't deny it. You can't just point to the times it succeeded and say that's proof that it works. If it works sometimes and doesn't work other times, then it doesn't work.
It worked perfectly right up until the point that it failed completely for unknown reasons. It wasn't working inconsistently or intermittently, it just worked. Until it didn't.

Martman posted:

Second, it is not hard to believe it can "predict" or comfortably influence the behavior of people who want to go along with it.
Yeah it is. It's impossible to believe that, in fact, unless you think that seeing yourself in a mirror is the same as seeing a blank wall. At the very least the activity inside those people's brains will be different for having seen the prediction, which makes the prediction wrong. You keep coming back to this idea that if it looks the same to a human observer then it is the same but that's simply not true. From the perspective of a machine simulating the universe on a subatomic scale there's no such thing as imperceptible differences. The prediction is wrong and if being wrong was enough to break the machine it would already be broken. And not just when someone looked at their own future but as soon as they looked at anything. If the simulation doesn't contain itself then it can't account for anything anyone does in response to anything it shows them. And we know that people's behaviour was altered by things they saw in the simulation, so that clearly cannot be the case. The machine works, so it must contain itself.

Martman posted:

That doesn't prove much. Would you watch a hypnotist succeed on a bunch of unpaid, unscripted audience members and decide that hypnotism must be completely real?
The situation is not remotely analogous.

Martman posted:

This is the point of view of an insane murderer who goes into complete denial when he sees the proof that the machine doesn't actually do what he thinks it does. Why take his justification for murder at face value?
Because it's consistent with his actions.

Martman posted:

I'm saying her believing she has free will allows her to violate the predictions of the machine.
How?

Martman
Nov 20, 2006

Tiggum posted:

It worked perfectly right up until the point that it failed completely for unknown reasons. It wasn't working inconsistently or intermittently, it just worked. Until it didn't.
This is a description of a thing that doesn't work.

quote:

You keep coming back to this idea that if it looks the same to a human observer then it is the same but that's simply not true.
I haven't said anything like this so I don't understand this part.

quote:

The machine works, so it must contain itself.
The machine only works until the first time we see someone bother to attempt to beat it, at which point it fails. In other words, it only works as long as people do not attempt to deviate from its predictions.

quote:

How?
All she has to do is not do what it shows, just the way the events play out in the show. Forest never bothers to try that because he believes he has no choice.

Martman fucked around with this message at 12:31 on Apr 21, 2020

Tiggum
Oct 24, 2007

Your life and your quest end here.


Martman posted:

This is a description of a thing that doesn't work.
So if I tell you that my fridge used to keep things cold until one day when it stopped your conclusion is that my fridge never worked?

Martman posted:

I haven't said anything like this so I don't understand this part.
You're not using those exact words but you keep drawing this distinction between "trying to beat the machine" and "going along with it" when both are reactions to the predictions. They are not fundamentally different. They look different to a human because of the story we build in our minds but the universe doesn't operate on the rules of human perception.

Martman posted:

The machine only works until the first time we see someone bother to attempt to beat it, at which point it fails. In other words, it only works as long as people do not attempt to deviate from its predictions.
See, this is the bit you don't seem to get: it doesn't matter if you attempt to beat it or attempt to go along with it or attempt to ignore it entirely. Seeing the simulation alters your behaviour. If the machine couldn't account for someone trying to undermine it then it also couldn't account for someone trying to follow the script or someone just reacting to seeing literally anything in the simulation. If it works in one case then it must work in all these cases because they are identical.

Martman posted:

All she has to do is not do what it shows
HOW? If the universe is deterministic and the simulation contains itself then it has already accounted for her seeing the prediction and therefore the prediction should show what she will do in response to seeing it. There is no room for deviation unless you introduce some element from outside the system (i.e. the universe). Either Lily is magic or the machine suddenly broke.

Strange Matter
Oct 6, 2009

Ask me about Genocide
I think the issue is that the Devs machine and the actual reality that we live in aren't the same. Reality is the actual sum of physical laws and the progression of causality collectively experienced by the universe. Devs is code running on an immensely powerful computer in a highly controlled environment. It crashes because it encounters a bug that throws it into a loop, one that it's never experienced before and which its programmers didn't account for, which is something that happens all the time in actual coding, because for as powerful as the machine is it's still something that was developed by fallible humans. I'm not a programmer but I do work with plenty of them, and not to paint too broad a portrait of an entire profession, but they tend to be highly results oriented and tend to fall into the mindset that as long as a system works it doesn't matter too much how or why it works.

Forrest complicates this by adding ideology into the mix. Lyndon successfully makes the system work by using Many Worlds and Forrest fires him because it contradicts his ideology instead of investigating why it works and whether that knowledge could be used to improve his version of the system. The process, the details don't matter, he's just aiming at a specific result, in essence trying to force a specific cause for the effect he's pursuing.

Devs appears to perfectly simulate reality, but at the end of the day it's still a simulation. Katie and Forrest insist that there's no difference between the realities inside Devs and outside, but there is, because Devs is a machine that can be turned off and is guided by conscious, fallible input, whereas true reality, metaphysics aside, is not. There's no hardware it's running on, no source code, so it can't suffer a fatal error. It's impossible to crash reality, but not so with Devs.

True Reality has no problem with what Lily does, because her actions are deterministically coherent. At point A Lily sees a mathematical prediction of her future at point B, so when point B occurs she does the opposite. The chain of events leading from point A to point B is unbroken and perfectly deterministic in the real world. But Devs can't cope with that, because it correctly calculates that she will defy whatever projection it shows her, which traps the system in a loop. If it shows her X she does Y, and vice versa, so its prediction beyond that event can never settle, and it reacts like any computer stuck in an unresolvable loop would and crashes. It's an inherent flaw in the concept, one that would have been exposed pretty early on if the Devs team had pursued the implications with actual scientific rigor instead of ideology.
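That loop is basically a fixed-point search with no fixed point. As a rough sketch (names and the two-option setup are mine, not anything from the show): a predictor whose output is shown to the subject can only be "correct" if it finds a prediction that still comes true after being seen. A believer who follows the machine makes every guess a fixed point; a contrarian who always does the opposite leaves no fixed point at all, so the search just oscillates forever.

```python
# Hypothetical sketch of the prediction loop: the machine must find a
# forecast that remains true even after the subject has seen it
# (a fixed point of the subject's reaction function).

def predict(subject_reaction, options=("X", "Y"), max_rounds=1000):
    """Search for a self-consistent prediction.

    subject_reaction(shown) -> what the subject actually does after
    being shown the prediction `shown`.
    """
    guess = options[0]
    for _ in range(max_rounds):
        actual = subject_reaction(guess)
        if actual == guess:   # prediction survives being seen: done
            return guess
        guess = actual        # revise the forecast and try again
    raise RuntimeError("no fixed point: the prediction never settles")

# A believer does whatever the machine shows, so any guess is stable.
believer = lambda shown: shown
print(predict(believer))      # "X" on the first try

# A Lily-style contrarian does the opposite of whatever she is shown:
# show X, she does Y; show Y, she does X. The search oscillates and
# the machine "crashes".
contrarian = lambda shown: "Y" if shown == "X" else "X"
try:
    predict(contrarian)
except RuntimeError as e:
    print("machine crashed:", e)
```

This is only an analogy for the mechanism described above, but it makes the asymmetry concrete: the machine working flawlessly on compliant Devs employees and failing on the one person who decides to contradict it are both exactly what you'd expect from the same algorithm.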

grate deceiver
Jul 10, 2009

Just a funny av. Not a redtext or an own ok.

Tiggum posted:

Yeah it is. It's impossible to believe that, in fact, unless you think that seeing yourself in a mirror is the same as seeing a blank wall. At the very least the activity inside those people's brains will be different for having seen the prediction, which makes the prediction wrong. You keep coming back to this idea that if it looks the same to a human observer then it is the same but that's simply not true. From the perspective of a machine simulating the universe on a subatomic scale there's no such thing as imperceptible differences. The prediction is wrong and if being wrong was enough to break the machine it would already be broken. And not just when someone looked at their own future but as soon as they looked at anything. If the simulation doesn't contain itself then it can't account for anything anyone does in response to anything it shows them. And we know that people's behaviour was altered by things they saw in the simulation, so that clearly cannot be the case. The machine works, so it must contain itself.

I mean sure, if we're talking from an objective standpoint, then the existence of a machine like this is straight up impossible, it would never have worked in the first place. But within the confines of a TV show, accepting that small insignificant changes are fine, and big 'deliberate' changes are system-breaking is not that big of a suspension of disbelief. It's a better reading for me than Lily just being special and destroying Deus with magic or w/e.


Tiggum
Oct 24, 2007

Your life and your quest end here.


Strange Matter posted:

But Devs can't cope with that because it correctly calculates that she will defy whatever projection it shows her, which traps the system in a loop.
This is inconsistent with how the system has been shown to operate in previous episodes. It does seem to be the writer's intended interpretation, but it is a problem not with the fictional computer but with the actual, real-world writing of the show. They made the rules and then they refused to engage with them. The show is dumb.

grate deceiver posted:

But within the confines of a TV show, accepting that small insignificant changes are fine, and big 'deliberate' changes are system-breaking is not that big of a suspension of disbelief.
I mean, it's fine if it's magic. But it's inconsistent with the rest of what we're being asked to believe of this fictional system/reality.
