Strange Matter
Oct 6, 2009

Ask me about Genocide

Tiggum posted:

loving finally. You finally get it. That's been my point this whole time. THE SHOW IS DUMB. It breaks its own rules. This is what I'm saying.
I mean, it's science-fiction, my dude, I don't know what else to tell you. It requires a certain degree of suspension of disbelief to tell the story; it's no different from Star Trek or Back to the Future. And you're ignoring the second part of my statement, which is that the story doesn't hinge on whether the technology is 100% theoretically accurate; it's about how people interact with it. The story hinges on the idea that the characters believe their system works in a way that it actually doesn't, and are blind to the possibility that they might be wrong because of their ideology.

And even if it's not 100% scientifically coherent due to chaos theory or whatever it's still thematically coherent which is just as important, if not more so. Katie and Forrest are convinced that their system cannot be defied and choose to believe that causality itself has broken down instead of the possibility that they are wrong and their system and thus their ideology is fundamentally flawed. That's the whole point of the story.

ghostwritingduck posted:

Agreed on all points. We never see the machine go beyond human history or after Lily’s death so speculation about meteor strikes isn’t really necessary anyway.
In theory it wouldn't be hard to manually adjust the prediction to account for Lily's paradox. Now that it's occurred and causality has moved past it, Katie should be able to go into the system and correct the projection to account for what she actually did rather than for what she was supposed to have done. But is there even a point to that? Lily proved that Devs is fallible. If Katie was looking at it pragmatically, or even just scientifically, then that's fine, because it can still produce results in all but the most fringe circumstances, and if those were adequately controlled it could continue to function. But it's just as likely that Katie's ideology would cause her to abandon that angle entirely, because even if it produced results it's still flawed and therefore, for all intents and purposes, a false idol. Better instead to focus on what it can do correctly, which is simulate many worlds whose inhabitants can't deviate from the simulation because they exist inside it, and more specifically give Forrest the opportunity to have his family back.

Strange Matter fucked around with this message at 18:00 on Apr 22, 2020


Martman
Nov 20, 2006

Strange Matter posted:

I think the point you're ultimately shooting at is that the Devs system is either flawless or it doesn't work at all, because there's no such thing as an inconsequential deviation in a system where every part is dependent on the absolute behavior of every other part. Which is probably true, and it's a concession that the show has made in order to tell a story that's as much about hubris, delusion and grief as it is about speculative technology and philosophy.
There has to be a limit on how far it can project though. It's still a machine, and unless that limit is equal to the expansion rate of the universe it's going to reach a point where it just runs out of memory.

Tiggum posted:

loving finally. You finally get it. That's been my point this whole time. THE SHOW IS DUMB. It breaks its own rules. This is what I'm saying.
I don't see how what Strange Matter said is drastically different from what I've been saying. When I said the failure meant Devs doesn't actually work (at least, the way Forest and Katie think it does), you likened it to a fridge that does a pretty good job.

SaTaMaS
Apr 18, 2003
Or a weather prediction simulation that can be perfect one second from now and useless more than a week from now. There are other options between flawless and useless.
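
To make that concrete, here's a quick toy sketch in Python (my own illustration, nothing to do with the show's tech): in a chaotic system a one-step forecast is essentially perfect and a long-range forecast is worthless, even though the dynamics are fully deterministic. The logistic map here is just a stand-in for "the weather".

code:

# Toy illustration: tiny errors in the initial state grow exponentially,
# so the forecast is near-perfect one step ahead and useless far ahead.
def logistic(x, r=4.0):
    return r * x * (1.0 - x)       # a standard chaotic map at r = 4.0

def simulate(x0, steps, r=4.0):
    xs = [x0]
    for _ in range(steps):
        xs.append(logistic(xs[-1], r))
    return xs

true_state = 0.123456789
measured   = 0.123456790           # "sensor" off by one part in a billion

truth = simulate(true_state, 50)
model = simulate(measured, 50)

for t in (1, 5, 10, 20, 30, 50):
    print(f"step {t:2d}: forecast error = {abs(truth[t] - model[t]):.6f}")
# The step-1 error is around 1e-8; by step ~30 the forecast is no better
# than a random guess, despite the system being completely deterministic.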

Tiggum
Oct 24, 2007

Your life and your quest end here.


Strange Matter posted:

I mean, it's science-fiction, my dude, I don't know what else to tell you. It requires a certain degree of suspension of disbelief to tell the story
OK, I thought we were finally on the same page but it turns out we're not. What I'm saying - what I have been saying all along - is that the show contradicts itself. The way the machine works in one instance is not how it works in a different instance. This has nothing to do with suspension of disbelief or realism - you can have straight-up magic in a fictional world and it's fine - you just need to be consistent. If, to use magic as an example, you say that magic can heal any injury but not revive the dead, and then in the last episode the wizard brings someone back to life just because, that is a contradiction of the established rules of the system. If you don't justify it then it ruins the story. It destroys the impact of anything that was built around that limitation, or really any other limitation, because it proves that the writers don't care about their own rules and nothing matters at all because they'll just write their way out of any problem with whatever contrivance is easiest. That is what happened here. The machine was shown (and explained) to work in one particular way and then it didn't. That is the problem.

Strange Matter posted:

And even if it's not 100% scientifically coherent due to chaos theory or whatever it's still thematically coherent which is just as important, if not more so.
I have no idea how scientifically accurate it is (I suspect not at all) but you are 100% wrong about it being thematically consistent. Everything hangs together except for the machine breaking and the happy ending being pulled out of nowhere. The final episode ruins the story. Not because the machine is unrealistic, but because it suddenly works in a different way to how it has worked previously, which undermines everything the show is trying to say.

Strange Matter posted:

Katie and Forrest are convinced that their system cannot be defied and choose to believe that causality itself has broken down instead of the possibility that they are wrong and their system and thus their ideology is fundamentally flawed.
They know the machine is simulating many worlds though? They know that its predictions are not 100% accurate. It's the exact thing that Forrest is complaining about when Lyndon first implements the many worlds solution. It's not that they think their system can't be defied. It certainly surprises Forrest that Lily doesn't behave as predicted but that's not the thing they said "broke causality". It's the point in time the machine can't predict past, which they speculate is caused by something Lily does but never actually find an explanation for. And it's that barrier that is the massive, gaping plot hole that ruins the story.

Strange Matter posted:

In theory it wouldn't be hard to manually adjust the prediction to account for Lily's paradox.
What theory? What paradox? There is no paradox!

Strange Matter posted:

Lily proved that Devs is fallible.
I do not know how you can actually watch the show and come to this conclusion. It is absolute nonsense. It requires you to flat out ignore things that happened in previous episodes.


Martman posted:

When I said the failure meant Devs doesn't actually work (at least, the way Forest and Katie think it does), you likened it to a fridge that does a pretty good job.
No. To a fridge that does a perfect job until it doesn't work at all. Because that's what happens in the show. The machine works perfectly up until the point at which it breaks. It works exactly as they think it does, and then it stops working for reasons that are never explained.

Martman
Nov 20, 2006

Tiggum posted:

No. To a fridge that does a perfect job until it doesn't work at all. Because that's what happens in the show. The machine works perfectly up until the point at which it breaks. It works exactly as they think it does, and then it stops working for reasons that are never explained.
You're the one arguing for the view of determinism where time is basically an illusion. The machine is always failing because Lily violating its prediction is baked into it. It's a guaranteed outcome.

Martman
Nov 20, 2006

Tiggum posted:

To clarify, because I had to click back through several quotes to remember what this conversational thread is even about, we're talking about whether there's any difference between "choosing" to do as the simulation predicted or not. And it doesn't matter whether the machine is good at predictions or not, the fact is that in a deterministic universe choice is an illusion.
You said the "one second in advance" prediction proves beyond a shadow of a doubt that "the universe does not operate on the rules of human perception." This is utter nonsense. Another possible explanation is that the machine is simply very good at guessing the outcomes of people seeing its predictions, but not perfect.

e: oops double post

IMO you've taken a scene of the machine doing a very impressive trick, and have decided for no reason that the characters within that scene are speaking unadulterated truths about the rules of the show in that scene. People are in awe about what's happening and Stewart, a man who loves to wax poetic, worries in a moment that the machine must be an impossible bit of magic.

Martman fucked around with this message at 20:02 on Apr 22, 2020

Clanpot Shake
Aug 10, 2006
shake shake!

You guys are getting really hung up on the exact nature of reality and the exercise of free will versus an omniscient machine and are missing the extremely obvious and dumb reason why the machine failed: Stewart broke it. It had nothing at all to do with Lily, or Forest, or causality or free will. Stewart sent the elevator pod crashing into the machine's precious golden vacuum, ruining who knows how many sensors or whatever, and with garbage input, the machine produced garbage output.

Martman
Nov 20, 2006

Clanpot Shake posted:

You guys are getting really hung up on the exact nature of reality and the exercise of free will versus an omniscient machine and are missing the extremely obvious and dumb reason why the machine failed: Stewart broke it. It had nothing at all to do with Lily, or Forest, or causality or free will. Stewart sent the elevator pod crashing into the machine's precious golden vacuum, ruining who knows how many sensors or whatever, and with garbage input, the machine produced garbage output.
But it was wrong before that happened. Unless him breaking it at any time broke it forever which I dunno.

Strange Matter
Oct 6, 2009

Ask me about Genocide
Boy it sure has been a long time since I've seen someone get this mad over media on a forum. Takes me back to console warring when I was a teenager.

Tiggum posted:

OK, I thought we were finally on the same page but it turns out we're not. What I'm saying - what I have been saying all along - is that the show contradicts itself. The way the machine works in one instance is not how it works in a different instance. This has nothing to do with suspension of disbelief or realism - you can have straight-up magic in a fictional world and it's fine - you just need to be consistent. If, to use magic as an example, you say that magic can heal any injury but not revive the dead then in the last episode the wizard brings someone back to life just because, that is a contradiction of the established rules of the system. If you don't justify it then it ruins the story. It destroys the impact of anything that was built around that limitation, or really any other limitation because it's proved that the writers don't care about their own rules and nothing matters at all because they'll just write their way out of any problem with whatever contrivance is easiest. That is what happened here. The machine was shown (and explained) to work in one particular way and then it didn't. That is the problem.
What looks to you like a contradiction is actually the entire point of the show and justifies it from episode 1 onward, because the show isn't about the scientific plausibility of the Devs system, it's about the implications that system would have for the people affected by it. Katie and Forrest do cruel things across the scope of the show and absolve themselves of responsibility through their faith in determinism, faith that their machine demonstrates, but the machine only works because they have that faith. It goes back to the folded arms experiment; if they actually attempted the experiment the machine would have failed right then and there, but they didn't, and the machine correctly calculated that they wouldn't because to do so would contradict their ideology. Jamie says "the problem with tech leaders is that they start to think they're Messiahs", and a messiah by definition is ideologically driven. Their understanding of the machine is that its projections are absolute, but their understanding is proven wrong, because the machine doesn't work as well as they think it should. Should it have even worked in the first place? Maybe not. Does that undermine the entire show? No, because the show is speculating on the idea of "what if this machine exists, how would it affect people?" If your answer is "the machine is dumb, it doesn't matter" then I guess go read a book instead? I'm really not sure what you think this show is actually supposed to be about beyond the boot of causality stomping on a human face forever, at all points in time.

quote:

I have no idea how scientifically accurate it is (I suspect not at all) but you are 100% wrong about it being thematically consistent. Everything hangs together except for the machine breaking and the happy ending being pulled out of nowhere. The final episode ruins the story. Not because the machine is unrealistic, but because it suddenly works in a different way to how it has worked previously, which undermines everything the show is trying to say.
The fact that they specifically bring out the folded arms experiment a couple episodes earlier shows the consistency. Forrest and Katie think about it for a second, and the notion that maybe it would work chills Forrest to the bone. As a result he doesn't do it, and the machine predicted that he wouldn't, which is why it keeps chugging along. Katie says it wouldn't matter because they live in a physical world, not a magical one. Forrest ponders that maybe they're just magicians living in a physical world. But, again, this is ideology speaking, not actual science. What ends up happening is much simpler, which is that his computer breaks because it can't process the contradiction. It either breaks, or it diverges from their world track.

quote:

They know the machine is simulating many worlds though? They know that its predictions are not 100% accurate. It's the exact thing that Forrest is complaining about when Lyndon first implements the many worlds solution. It's not that they think their system can't be defied. It certainly surprises Forrest that Lily doesn't behave as predicted but that's not the thing they said "broke causality". It's the point in time the machine can't predict past, which they speculate is caused by something Lily does but never actually find an explanation for. And it's that barrier that is the massive, gaping plot hole that ruins the story.
Here's what Katie tells Lily in episode 6:

"Tomorrow night, just after 1am, something happens. An unknown event. And it triggers a total breakdown of cause and effect. A breakdown of determinism. A breakdown of the literal laws of the universe. And we think it involves you."

Sure sounds to me like Katie is saying that whatever Lily does violates causality rather than just proves their system is broken.

quote:

What theory? What paradox? There is no paradox!
I mean I outlined the loop pretty exhaustively in the post that you're quoting. The Devs system experiences what can best be described as a paradox because it cannot successfully simulate Lily's actions inside the lift. But once those actions take place in reality I don't see why an engineer couldn't go into the system and say "okay so here's what actually happened" and set the system back on track.
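
Just to spell out what "go into the system and set it back on track" could look like in the abstract (purely hypothetical; the show never says how Devs is actually implemented, and the names below are made up for the sketch): it's nothing more exotic than overwriting the model's state with what was actually observed and letting the projection roll forward again from there.

code:

from dataclasses import dataclass, field

# Hypothetical sketch of "correcting the projection": roll a model state
# forward, and when reality contradicts the projection, re-seed the model
# from the observed state and resume. Not from the show; just the idea.
@dataclass
class Projector:
    state: dict                              # the model's picture of the world
    history: list = field(default_factory=list)

    def project(self, step_fn, steps):
        """Roll a copy of the current state forward `steps` ticks."""
        s = dict(self.state)
        trajectory = []
        for _ in range(steps):
            s = step_fn(s)
            trajectory.append(dict(s))
        return trajectory

    def reconcile(self, observed_state):
        """Manual correction: replace the stale state with what actually happened."""
        self.history.append(dict(self.state))
        self.state = dict(observed_state)

step = lambda s: {"x": s["x"] + 1}           # a made-up one-variable "world"
p = Projector(state={"x": 0})
predicted = p.project(step, 3)               # model says x will be 1, 2, 3
p.reconcile({"x": 42})                       # reality deviated; re-seed from the observation
resumed = p.project(step, 3)                 # projections continue from x = 42
print(predicted, resumed)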

quote:

I do not know how you can actually watch the show and come to this conclusion. It is absolute nonsense. It requires you to flat out ignore things that happened in previous episodes.
A computer program that runs perfectly until someone does something it didn't expect and causes it to crash isn't nonsense. The show isn't just about determinism, it's about the tech bro god complex taken to a logical extreme.

Strange Matter fucked around with this message at 20:36 on Apr 22, 2020

Tiggum
Oct 24, 2007

Your life and your quest end here.


I don't care any more.

Haptical Sales Slut
Mar 15, 2010

Age 18 to 49
Man this show was pretty god damned fun to watch. My favorite thing is the music, felt almost Trent Reznorish.

The end did leave me disappointed. But, overall it's great just to see this type of idea explored in a mainstream show. It brings up all sorts of uncomfortable questions, which is really all that matters.

In any case, I never did buy Lily being special. How did no one ever deviate from the path after watching the future, just to see what would happen? That doesn't make Lily special, that makes everyone else a moron.

Zachack
Jun 1, 2000




Strange Matter posted:

What looks to you like a contradiction is actually the entire point of the show and justifies it from episode 1 onward, because the show isn't about the scientific plausibility of the Devs system, it's about the implications of what that system would have on the people affected by it...
Should it have even worked in the first place? Maybe not. Does that undermine the entire show? No, because the show is speculating on the idea of "what if this machine exists, how would it affect people?"
Generally, I think you're correct about the first part: the show is about the impact of a determinism machine on the human psyche. The problem with the show, and this has befallen other shows/works*, is that the creator invests too much time and interest in the mystery/mcguffin/donkey wheel, to the point where claims that the story is "about the characters" or whatever ring hollow. Without counting the minutes of each episode, I'd hazard over half the show focuses on Lily's potboiler adventure, which isn't exploring the implications of the machine at all (the hobo spy could have been replaced with a friendly but crazy SF parrot that saves the day and it would have had the same effect). When the show does focus on the machine, it spends a lot of effort on what the machine can do, rather than what it does. Determinism Mindfuck is like... given 3rd place in focus time. And when a show spends a lot of time talking about the fireworks factory, the factory is expected to deliver.

*Lost is an obvious example, as is the comic book Y: The Last Man, which for its claim of being about characters spends a huge amount of effort exploring the setup mystery. BSG is another one.

quote:

I mean I outlined the loop pretty exhaustively in the post that you're quoting. The Devs system experiences what can best be described as a paradox because it cannot successfully simulate Lily's actions inside the lift.
I just don't agree with your description because the 1-second scene works directly against that, and dismissing that as "one second isn't enough time" is simply wrong, because one second is actually a lot of time for True Free Will to break determinism by just folding arms or looking left instead of right. The machine isn't guessing actions of humans, it's tracking the particles that happen to, at times, be someone's brain or skull or balls or pee. Further, during that scene the machine is successfully predicting an infinite chain - the people in the machine are watching people one second ahead watching one second ahead etc, so the machine must have a ludicrous amount of magic computer juice.

Remove that scene and the show has a lot more leeway to argue the Tech Priesthood outcome (although I think that's kinda stupid and not explored nearly enough), but I think Garland et al thought that scene was just too cool to leave out... and it kinda was, because it's a big "oh poo poo" moment that confirms rules the show was stating (until, suddenly, it isn't).

quote:

A computer program that runs perfectly until someone does something it didn't expect and causes it to crash isn't nonsense. The show isn't just about determinism, it's about the tech bro god complex taken to a logical extreme.
I think the show really loves the latter statement, but I think it did a terrible job of it, and it comes across like his idea of "tech bros" comes from an SNL sketch of the last episode of Silicon Valley - "tech bros", if anything, would turn the Box into a game that they try to beat, and if you beat it you get a new hat or knife skin and tell all your AmayaFriends about how much you love breaking determinism and want to hug and kiss breaking determinism/anti-vaxxers.

Martman
Nov 20, 2006

Zachack posted:

I just don't agree with your description because the 1-second scene works directly against that, and dismissing that as "one second isn't enough time" is simply wrong, because one second is actually a lot of time for True Free Will to break determinism by just folding arms or looking left instead of right. The machine isn't guessing actions of humans, it's tracking the particles that happen to, at times, be someone's brain or skull or balls or pee. Further, during that scene the machine is successfully predicting an infinite chain - the people in the machine are watching people one second ahead watching one second ahead etc, so the machine must have a ludicrous amount of magic computer juice.
My argument is that the show is never saying Free Will exists, but is instead saying that human perception and intention make a difference in the possible range of a human's (still deterministic) choices. And it's saying that this intention matters enough that the act of showing someone their prediction is doomed to fail when that person commits to deviating from it. I don't think anyone is actually arguing that the end of the show proves free will exists.

Think about how blatantly the show avoids any of the characters doing any kind of serious attempt at scientifically testing the machine's limitations. I think that scene works perfectly as an example of people being so dazzled by a flashy display of technology that they can be convinced of things far beyond what the tech actually does. Their gullibility is what allows it to be accurate.

EDIT:

Nuts and Gum posted:

That doesn't make Lily special, that makes everyone else a moron.
I think Garland having made Ex Machina makes it very plausible that this is intentional. The dude is not a fan of "tech bros."

Martman fucked around with this message at 07:16 on Apr 23, 2020

ghostwritingduck
Aug 26, 2004

"I hope you like waking up at 6 a.m. and having your favorite things destroyed. P.S. Forgive me because I'm cuter than that $50 wire I just ate."

Zachack posted:


I just don't agree with your description because the 1-second scene works directly against that, and dismissing that as "one second isn't enough time" is simply wrong, because one second is actually a lot of time for True Free Will to break determinism by just folding arms or looking left instead of right. The machine isn't guessing actions of humans, it's tracking the particles that happen to, at times, be someone's brain or skull or balls or pee. Further, during that scene the machine is successfully predicting an infinite chain - the people in the machine are watching people one second ahead watching one second ahead etc, so the machine must have a ludicrous amount of magic computer juice.

Remove that scene and the show has a lot more leeway to argue the Tech Priesthood outcome (although I think that's kinda stupid and not explored nearly enough), but I think Garland et al thought that scene was just too cool to leave out... and it kinda was, because it's a big "oh poo poo" moment that confirms rules the show was stating (until, suddenly, it isn't).


I don't think 1 second is all that much time if the machine is making adjustments, undetectable to the user, based on subconscious decisions the user isn't aware of. It's sort of like this, but in Devs the machine has way more information to make a near-perfect calculation with only 1 second of processing time, especially when the surprise and shock take up so much of the 30-second experiment.

14:09 if the time doesn't work.
https://www.youtube.com/watch?v=lmI7NnMqwLQ&t=849s

https://www.youtube.com/watch?v=AR3hY9iB5-I

ghostwritingduck fucked around with this message at 12:58 on Apr 23, 2020

Strange Matter
Oct 6, 2009

Ask me about Genocide

Zachack posted:

Generally, I think you're correct about the first part: the show is about the impact of a determinism machine on the human psyche. The problem with the show, and this has befallen other shows/works*, is that the creator invests too much time and interest on the mystery/mcguffin/donkey wheel, to the point where claims that the story is "about the characters" or whatever rings hollow. Without counting the minutes of each episode, I'd hazard over half the show focuses on Lily's pot boiler adventure, which isn't exploring the implications of the machine at all (the hobo spy could have been replaced with a friendly but crazy SF parrot that saves the day and it would have had the same effect). When the show does focus on the machine, it spends a lot of effort on what the machine can do, rather than what it does. Determinism Mindfuck is like... given 3rd place in focus time. And when a show spends a lot of time talking about the fireworks factory, the factory is expected to deliver.

*Lost is an obvious example, as is the comic book Y: The Last Man, which for its claim of being about characters spends a huge amount of effort exploring the setup mystery. BSG is another one.
See, this is a very valid take, and I've had to ask myself repeatedly whether the show did or didn't work for me on those grounds. Speaking personally I'd say it did, because I truly felt that the show placed very clear emphasis on Forrest and Katie's psychology in relation to the Devs system. My favorite scene in the entire show is actually Stewart grilling Forrest about history, because it horrifies him that a machine which allows a person to look at any point in human history is in the hands of a person who could not care less about it. For Stewart, Devs is the answer to every question mankind has ever had about its own existence, and to Forrest it's a vanity project.

Personally it worked for me, but that's entirely subjective. I was wholly let down by Mr. Robot's ending, a show that shares a lot of Devs' DNA, for exactly the reason you listed, but other people really liked it.

quote:

I just don't agree with your description because the 1-second scene works directly against that, and dismissing that as "one second isn't enough time" is simply wrong, because one second is actually a lot of time for True Free Will to break determinism by just folding arms or looking left instead of right. The machine isn't guessing actions of humans, it's tracking the particles that happen to, at times, be someone's brain or skull or balls or pee. Further, during that scene the machine is successfully predicting an infinite chain - the people in the machine are watching people one second ahead watching one second ahead etc, so the machine must have a ludicrous amount of magic computer juice.

Remove that scene and the show has a lot more leeway to argue the Tech Priesthood outcome (although I think that's kinda stupid and not explored nearly enough), but I think Garland et al thought that scene was just too cool to leave out... and it kinda was, because it's a big "oh poo poo" moment that confirms rules the show was stating (until, suddenly, it isn't).
You could be right about this, but the way I read it is that the machine is very good at predicting, for lack of a better word, reflexive behavior. Tap a person's kneecap and their leg jumps. Blow in their face and they wince, shine a light in their eyes and they blink. The one-second experiment only lasts like 30 seconds and during that time the users are so thoroughly freaked out that testing the machine's limits doesn't cross their mind. BUT your point does stand and it highlights how incredibly unstable the system actually is. Independent of Lily's behavior it's inevitable that someone would have attempted the arms folding test and caused it to crash.

Really, look at the timeline; the one-second experiment is done earlier that day, and it's evidently the first time the team has really gotten together to test the completed system. Up until that point they've been following the wrong course by rejecting the Many Worlds interpretation, plus there's the fear of getting fired by Forrest and exiled like Lyndon. Maybe 12 hours later, after running without issue for months, the system crashes. It can't even go a full day without breaking down once it's capable of giving perfect projections.

Actually, the timeline is my biggest issue with the story. Forrest and Katie talk about the last day as if they've watched it a hundred times, yet it's only in the last two days (or so) of the story that the system reaches a point where it can clearly project sound and visuals. Were they just in crunch time for those two days? Because before when they show the projection of Lily's death it's so grainy that it's very unclear who we're even looking at. THAT feels like a plot hole to me.

Strange Matter fucked around with this message at 14:15 on Apr 23, 2020

ghostwritingduck
Aug 26, 2004

"I hope you like waking up at 6 a.m. and having your favorite things destroyed. P.S. Forgive me because I'm cuter than that $50 wire I just ate."

Strange Matter posted:

Actually, the timeline is my biggest issue with the story. Forrest and Katie talk about the last day as if they've watched it a hundred times, yet it's only in the last two days (or so) of the story that the system reaches a point where it can clearly project sound and visuals. Were they just in crunch time for those two days? Because before when they show the projection of Lily's death it's so grainy that it's very unclear who we're even looking at. THAT feels like a plot hole to me.

I think the closer they were in time, the clearer the projection. Arthur Miller and Marilyn Monroe were pretty clear despite being decades further away.

Strange Matter
Oct 6, 2009

Ask me about Genocide
That is true, it's just not super well communicated by the story.

One other thing that I keep coming back to is that the statement Devs is trying to make is that whether or not determinism is true, it can't reasonably be used as a system of ethics or morality. At various points Katie and Forrest are both tortured by the things they are party to-- Forrest ordering Sergei's murder and Katie enabling Lyndon's and Forrest's deaths. Both of them feel that they have no choice in the matter because the events are all predetermined, but they're only predetermined because they built the Devs system and saw the projections. In the absence of a machine mapping out their actions based on deterministic interactions, those choices would have seemed freely chosen, and based on their emotional responses would never even have occurred to them had they not seen the projections. The creation of the machine in the first place sets in place a fixed sequence of events for them-- they never would have built it if they didn't believe in determinism, and because they are strict determinists they are impelled to follow it, which is what allows it to continue functioning. In that case, is the Devs system just the double-slit experiment again, where the result is changed by introducing a fixed and objective measurement?

Terry Grunthouse
Apr 9, 2007

I AM GOING TO EAT YOU LOOK MY TEETH ARE REALLY GOOD EATERS
Listen, there are two types of people in this show. Those who believe in God and have complete faith that it exists and is infallible, and the people (Lilly) who don't believe in God and reject its existence. The faithful can't believe that she did this because, like, that's religion. To those who don't believe, there can be no all-knowing, all-seeing God. It's impossible. If you have faith there's nothing anyone or anything can do to change your mind. You are devout. If you don't have faith, then you don't subscribe to that higher power, and you are in control of your own choices.

Forest transcends to an afterlife. Lilly ends up in a computer simulation.

Martman
Nov 20, 2006

https://twitter.com/Nick_Offerman/status/1253348142436810752

some kind of bug
Jul 10, 2011

Just binged the whole show, really enjoyed it, but was there a point to the vacuum Devs was in besides making it so the entrance was really cool... Computer seemed to keep working just fine once the seal was broken

Strange Matter
Oct 6, 2009

Ask me about Genocide

some kind of bug posted:

Just binged the whole show, really enjoyed it, but was there a point to the vacuum Devs was in besides making it so the entrance was really cool... Computer seemed to keep working just fine once the seal was broken
My guess would be that for the initial testing, where they were digitizing objects down to the molecular level, they needed to ensure that there could be no possible outside interference, and I guess Forrest thought that crazy set-up was preferable to building it at the bottom of a salt mine.

dpkg chopra
Jun 9, 2007

Fast Food Fight

Grimey Drawer

Strange Matter posted:

My guess would be that for the initial testing, where they were digitizing objects down to the molecular level, they needed to ensure that there could be no possible outside interference, and I guess Forrest thought that crazy set-up was preferable to building it at the bottom of a salt mine.

I figured it was an extreme form of air gapping for security but your take makes sense

Also metaphorically I guess it works as a way to show that tech people are completely disconnected from our reality

Strange Matter
Oct 6, 2009

Ask me about Genocide

Ur Getting Fatter posted:

I figured it was an extreme form of air gapping for security but your take makes sense

Also metaphorically I guess it works as a way to show that tech people are completely disconnected from our reality
It also works as a physical representation of the Devs system's extreme sensitivity. It only functions as a hermetically sealed system, and as soon as you break that seal by introducing contradictory ideologies it all comes crashing down.

some kind of bug
Jul 10, 2011

Strange Matter posted:

It also works as a physical representation of the Devs system's extreme sensitivity. It only functions as a hermetically sealed system, and as soon as you break that seal by introducing contradictory ideologies it all comes crashing down.

Yeah I get that in a metaphorical sense, it did continue to function though

bewilderment
Nov 22, 2007
man what



OK I finally watched the show.

I feel a little bit let down at the ending not going to as exciting places as I wanted - especially given the 'infinite chain' idea being teased, because it's popped up in scifi before and so it wasn't so mindblowing to me. It's even in... two Destiny lore cards.
For context: a Vex is a kinda-sorta alien AI.

quote:

SI: Maya, I need your help. I don't know how to fix this.

SUNDARESH: What is it? Chioma. Sit. Tell me.

ESI: I've figured out what's happening inside the specimen.

SUNDARESH: Twelve? The operational Vex platform? That's incredible! You must know what this means - ah, so. It's not good, or you'd be on my side of the desk. And it's not urgent, or you'd already have evacuated the site. Which means...

ESI: I have a working interface with the specimen's internal environment. I can see what it's thinking.

SUNDARESH: In metaphorical terms, of course. The cognitive architectures are so -

ESI: No. I don't need any kind of epistemology bridge.

SUNDARESH: Are you telling me it's human? A human merkwelt? Human qualia?

ESI: I'm telling you it's full of humans. It's thinking about us.

SUNDARESH: About - oh no.

ESI: It's simulating us. Vividly. Elaborately. It's running a spectacularly high-fidelity model of a Collective research team studying a captive Vex entity.

SUNDARESH:...how deep does it go?

ESI: Right now the simulated Maya Sundaresh is meeting with the simulated Chioma Esi to discuss an unexpected problem.

[indistinct sounds]

SUNDARESH: There's no divergence? That's impossible. It doesn't have enough information.

ESI: It inferred. It works from what it sees and it infers the rest. I know that feels unlikely. But it obviously has capabilities we don't. It may have breached our shared virtual workspace...the neural links could have given it data...

SUNDARESH: The simulations have interiority? Subjectivity?

ESI: I can't know that until I look more closely. But they act like us.

SUNDARESH: We're inside it. By any reasonable philosophical standard, we are inside that Vex.

ESI: Unless you take a particularly ruthless approach to the problem of causal forks: yes. They are us.

SUNDARESH: Call a team meeting.

ESI: The other you has too.

quote:

SUNDARESH: So that's the situation as we know it.

ESI: To the best of my understanding.

SHIM: Well I'll be a [profane] [profanity]. This is extremely [profane]. That thing has us over a barrel.

SUNDARESH: Yeah. We're in a difficult position.

DUANE-MCNIADH: I don't understand. So it's simulating us? It made virtual copies of us? How does that give it power?

ESI: It controls the simulation. It can hurt our simulated selves. We wouldn't feel that pain, but rationally speaking, we have to treat an identical copy's agony as identical to our own.

SUNDARESH: It's god in there. It can simulate our torment. Forever. If we don't let it go, it'll put us through hell.

DUANE-MCNIADH: We have no causal connection to the mind state of those sims. They aren't us. Just copies. We have no obligation to them.

ESI: You can't seriously - your OWN SELF -

SHIM: [profane] idiot. Think. Think. If it can run one simulation, maybe it can run more than one. And there will only ever be one reality. Play the odds.

DUANE-MCNIADH: Oh...uh oh.

SHIM: Odds are that we aren't our own originals. Odds are that we exist in one of the Vex simulations right now.

ESI: I didn't think of that.

SUNDARESH: [indistinct percussive sound]

quote:


SUNDARESH: I have a plan.

ESI: If you have a plan, then so does your sim, and the Vex knows about it.

DUANE-MCNIADH: Does it matter? If we're in Vex hell right now, there's nothing we can -

SHIM: Stop talking about 'real' and 'unreal.' All realities are programs executing laws. Subjectivity is all that matters.

SUNDARESH: We have to act as if we're in the real universe, not one simulated by the specimen. Otherwise we might as well give up.

ESI: Your sim self is saying the same thing.

SUNDARESH: Chioma, love, please hush. It doesn't help.

DUANE-MCNIADH: Maybe the simulations are just billboards! Maybe they don't have interiority! It's bluffing!

SHIM: I wish someone would simulate you shutting up.

SUNDARESH: If we're sims, we exist in the pocket of the universe that the Vex specimen is able to simulate with its onboard brainpower. If we're real, we need to get outside that bubble.

ESI: ...we call for help.

SUNDARESH: That's right. We bring in someone smarter than the specimen. Someone too big to simulate and predict. A warmind.

SHIM: In the real world, the warmind will be able to behave in ways the Vex can't simulate. It's too smart. The warmind may be able to get into the Vex and rescue - us.

DUANE-MCNIADH: If we try, won't the Vex torture us for eternity? Or just erase us?

SUNDARESH: It may simply erase us. But I feel that's preferable to...the alternatives.

ESI: I agree.

SHIM: Once we try to make the call, the Vex may...react. So let's all savor this last moment of stability.

So - Deus isn't hostile like a Vex and it doesn't have an interest in messing with the people in it - although those in control of it can probably fiddle with those settings. But the idea of that story above comes from a nerd hypothetical called 'Roko's Basilisk' and while the details of the Basilisk itself are very silly and not worth getting into, it's been pointed out that the concept is basically 'nerds reinventing god'... which is exactly what the Deus project is.

So it's a little disappointing to see that in this show, the equivalent of the 'warmind' is instead someone who is able to see themselves on the screen and not feel beholden to it. I'm totally on board with the idea that Forest and Katie are 'true believers' and can't imagine going against a prediction, and that the programmer room was off-balance because of the one-second limit and initial shock, but Lily can easily do it because she doesn't feel like she has to and so... she can't.
The prediction model keeps going for a few seconds and then stops working because that's as far as it can easily extrapolate in that mode.

Elias_Maluco
Aug 23, 2007
I need to sleep
I found this show kinda by accident and decided to give it a go and it was surprisingly interesting.

But to me it kinda felt like a story that was conceived to be told over several seasons but then compressed to fit 8 episodes. It would have worked a lot better that way, imo. By the halfway point pretty much everything but the ending is explained, and then the rest feels like one long series finale

And I have to admit I was a bit annoyed that they never tried a simple test to see if they could consciously contradict the machine's predictions (like a test where someone could choose between two objects, A and B; before the test, they see ahead of time what the choice will be, and then do the opposite). I know determinism is basically their ideology, but they also knew something was wrong from the fact that they could not predict past that point. It is kinda dumb that not a single person involved decided to try such a simple test, which would have broken the machine a lot earlier. Forrest even suggests something similar, but they never try it, which would make sense only if they didn't already know that something was wrong with their assumptions

And also, yeah, it's weird that it's established that the machine can only work under the many-universes theory, but then in the last episode everyone forgets about it. Like, in that conversation between Forrest and Katie after the fact, it's weird that they wouldn't realize "yeah, maybe many universes is real, like we already knew it is, since the machine only works inside that theory, so what we saw was just one possible outcome, not the only one, and what we got was another" instead of going "welp, causality was broken, Lily has the magical power of Choice"

Elias_Maluco fucked around with this message at 13:23 on Apr 27, 2020

precision
May 7, 2006

by VideoGames
I finally watched this and it's probably the best show I've seen in a while.

Didn't feel like I needed to suspend my disbelief too much once I realized the real reason the machine wouldn't work didn't involve souls or free will at all; it's just a problem of infinite data. It'd take an infinite amount of time to simulate the result of showing someone a simulation which they then react to because the loop would never end.

It's a limitation of the machine and premise, not reality or free will.
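
Here's that loop spelled out as a toy in Python, assuming the most extreme case of a subject who always does the opposite of whatever they're shown (my own illustration, not anything the show specifies). A self-consistent prediction would have to be a fixed point of "predict, show, react", and for a contrarian no fixed point exists, so a naive simulator just recurses forever; the depth cap below is only there so the example halts.

code:

def react(shown_prediction: str) -> str:
    """A contrarian subject: does the opposite of whatever they are shown."""
    return "stay" if shown_prediction == "leave" else "leave"

def predict(depth: int = 0, max_depth: int = 20) -> str:
    """Naively simulate the subject watching the prediction of themselves
    watching the prediction of themselves... There is no natural base case,
    so we bail out at max_depth with an arbitrary guess."""
    if depth >= max_depth:
        return "leave"
    inner = predict(depth + 1, max_depth)    # what the simulated display shows
    return react(inner)                      # what the subject then actually does

guess = predict()
print("machine predicts:", guess)
print("subject, shown that prediction, actually does:", react(guess))
# Whatever the machine outputs, showing it changes the outcome: there is no
# self-consistent prediction for a contrarian, only an endless regress.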

Strange Matter
Oct 6, 2009

Ask me about Genocide

precision posted:

I finally watched this and it's probably the best show I've seen in a while.

Didn't feel like I needed to suspend my disbelief too much once I realized the real reason the machine wouldn't work didn't involve souls or free will at all; it's just a problem of infinite data. It'd take an infinite amount of time to simulate the result of showing someone a simulation which they then react to because the loop would never end.

It's a limitation of the machine and premise, not reality or free will.
Exactly; the core issue isn't about whether or not determinism is a valid vision of reality, the core theme is Forrest's ideology and delusion preventing him from acknowledging the necessary limitations of his technology.

precision
May 7, 2006

by VideoGames
I'm surprised that didn't come up more since there were a few direct references in that direction, including the literal scene of watching them watch themselves. The machine would have to simulate an infinite nested loop of that, which is impossible.

You might still be reacting deterministically, but you're reacting to data that changes the instant you observe it (like the double-slit experiment lecture).

I feel like that's what was meant when Lyndon says "I get it, it's a circle". Whatever the machine shows you can never be the future because the act of observing it changes the way you behave, whether you have free will or not.

It's essentially just a time travel paradox.

dpkg chopra
Jun 9, 2007

Fast Food Fight

Grimey Drawer
If that was true then his decision to test what fate had determined for him (by hanging off the bridge) made no sense.

Elias_Maluco
Aug 23, 2007
I need to sleep

Strange Matter posted:

Exactly; the core issue isn't about whether or not determinism is a valid vision of reality, the core theme is Forrest's ideology and delusion preventing him from acknowledging the necessary limitations of his technology.

I like that interpretation but I'm not sure that's what the show's creators had in mind, since it's never mentioned, and even Lily seems to accept the explanation at the end that determinism was somehow broken when she refused to kill Forrest

precision
May 7, 2006

by VideoGames

Ur Getting Fatter posted:

If that was true then his decision to test what fate had determined for him (by hanging off the bridge) made no sense.

No, it fits with that, because she refuses to tell him whether he lives or dies. If the machine shows you anything at all, it can be contradicted. It's a binary problem, as Forest says, but not in the way he thinks. You can never predict THE future, because the act of observing the prediction changes the data and makes the prediction flawed.

So Lyndon is choosing to commit quantum suicide. He believes that in some worlds, he lives. The show implies that Katie knows he always falls.

Elias_Maluco
Aug 23, 2007
I need to sleep

precision posted:

No, it fits with that, because she refuses to tell him whether he lives or dies. If the machine shows you anything at all, it can be contradicted. It's a binary problem, as Forest says, but not in the way he thinks. You can never predict THE future, because the act of observing the prediction changes the data and makes the prediction flawed.

So Lyndon is choosing to commit quantum suicide. He believes that in some worlds, he lives. The show implies that Katie knows he always falls.

Isn't Lyndon a she?

edit: also, how does the "we can't predict after this point" thing fit with that interpretation? It implies something different actually did happen when Lily did something other than what was predicted, which seems to contradict that "seeing the future automatically invalidates the prediction" rule

Elias_Maluco fucked around with this message at 19:20 on May 2, 2020

Oasx
Oct 11, 2006

Freshly Squeezed

Elias_Maluco posted:

Isn't Lyndon a she?

No. Lyndon is a female actor playing a male character (both gender and sex)

Elias_Maluco
Aug 23, 2007
I need to sleep

Oasx posted:

No. Lyndon is a female actor playing a male character (both gender and sex)

I really didn't notice that

precision
May 7, 2006

by VideoGames

Elias_Maluco posted:

Isn't Lyndon a she?

edit: also, how does the "we can't predict after this point" thing fit with that interpretation? It implies something different actually did happen when Lily did something other than what was predicted, which seems to contradict that "seeing the future automatically invalidates the prediction" rule

Oh, I think there is still a lot of ambiguity; I was just speaking specifically to the point that I didn't find the show implausible at any point. I would have found it more implausible if the machine just worked and people who saw their future inexplicably did the same things even if they tried not to

Though that might make for an interesting show or movie on its own; having people be like "it's so weird, when those moments are happening, I literally lose control of my body AND mind"

Looks to the Moon
Jun 23, 2017

You are not the only lost soul in this world.
I found the ending bittersweet but as optimistic as it could be, considering what happened right before. I'm just glad Lily found Jamie again, even if they were devs (deus) simulations. They were easily the sweetest element of the series.

Goreld
May 8, 2002

"Identity Crisis" MurdererWild Guess Bizarro #1Bizarro"Me am first one I suspect!"

Nuts and Gum posted:


In any case, I never did buy Lily being special. How did no one ever deviate from the path after watching the future, just to see what would happen? That doesn't make Lily special, that makes everyone else a moron.

I was really disappointed at the ending mostly because the Prisoneresque ‘I am a free woman!!’ human ex machina out of nowhere could have gone to tons of interesting places but went for the ‘I guess she has free will because the plot says so’ instead.

I was expecting something crazy like the machine proving we’re all living in a sim, followed by security protocols starting to zap poo poo out of existence or something.

Lily and most of the characters were also intensely narcissistic and unlikeable. Tech bro to the extreme.

Elias_Maluco
Aug 23, 2007
I need to sleep

Nuts and Gum posted:

Man this show was pretty god damned fun to watch. My favorite thing is the music, felt almost Trent Reznorish.

The end did leave me disappointed. But, overall it's great just to see this type of idea explored in a mainstream show. It brings up all sorts of uncomfortable questions, which is really all that matters.

In any case, I never did buy Lily being special. How did no one ever deviate from the path after watching the future, just to see what would happen? That doesn't make Lily special, that makes everyone else a moron.

Pretty much. Anyone could have tested at any time whether they could deviate from the predictions; they never even try. Even Lilly follows the prediction word for word up to the moment she throws away the gun

Maybe we are to understand that's because they are blinded by their deterministic ideology, like people here said, but nothing in the ending or the aftermath seems to even hint at that. It's just "wow, she is special, she has free will"


WaffleACAB
Oct 31, 2010
Great series. Rewatched the last episode - Julia almost literally says Lily is supernatural and she broke determinism. It feels like the point is what some people have been saying, that Forest and the others were so invested in proving determinism that they followed what the machine showed them.

Forest's end goal was a perfect simulation so that he could live with his daughter again. Using the Many Worlds interpretation meant that he would experience all the bad timelines as well, like he says. Julia asks him if he understands this when he is on the screen.

The only thing that I am confused about is just after that - Forest clearly says 'I just want him back so much'. Subtitles say the same.

Who is he referring to? Could be:
-Himself
-Kenton
-Lyndon
-Jamie - frisbee was fun
-Sergei??

Is it a combination of Forest and Lily saying that? Or is it meant to show a mixture of all possible Forests so he has a son or something?

I don't think the show was dumb, the plot was reasonably straightforward in the end but only if you pay close attention to... what the characters are saying I guess?

I also enjoyed that the Russians were the good guys.
