Aramoro
Jun 1, 2012




Scruffpuff posted:

It works for sorting, or financial predictions, or probability calculations, or random simulations - this is true. But they're claiming this can be used to predict player actions in a video game. Think of the infinite number of actions you can perform in even a relatively simple game. How much computational power will it take a machine to predict every possible player outcome, or even every likely outcome, or poo poo even a single random outcome, in a game like Red Dead Redemption 2? Thank god their tech predicted I would pull out my pistol on a whim, try to shoot a water bucket and miss, hop on a nearby horse and knock myself out on a lowhanging branch because I fumbled the controller swatting at a fly. You can really feel that half-frame the predictive algorithm saved us from having to render.

A player predictive algorithm is, by definition, true AI. Indistinguishable from a player because it's attempting to be a player.

You're getting caught up in the word "prediction". Usually prediction just means assuming that whatever you were just doing, you're going to keep doing. It's not going to guess that you're about to turn right, but if you've been walking straight down a corridor for the last 20 ms, you're probably going to keep doing so for the next 20 ms. In the end it's just latency mitigation, and I can't see it working over more than maybe tens of milliseconds. I think it could work, but it seems tremendously difficult for such a minor gain.
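That kind of dead-reckoning is simple enough to sketch. A minimal toy version (all names here are illustrative, not from any real engine): extrapolate the last known velocity forward, and when the authoritative position finally arrives and disagrees, blend toward it rather than snapping.

```python
# Dead reckoning: extrapolate a player's position from their last
# confirmed velocity while fresh input hasn't arrived yet.

def extrapolate(last_pos, last_vel, dt):
    """Predict position dt seconds past the last confirmed update."""
    return tuple(p + v * dt for p, v in zip(last_pos, last_vel))

def reconcile(predicted, actual, blend=0.5):
    """Blend a mispredicted position toward the authoritative one
    instead of snapping, to hide small errors."""
    return tuple(p + (a - p) * blend for p, a in zip(predicted, actual))

# Player moving straight down a corridor at 5 units/s along x,
# bridging a 20 ms gap:
pos = extrapolate((10.0, 0.0), (5.0, 0.0), 0.020)
# pos is approximately (10.1, 0.0)
```

Note the trade-off Aramoro describes: over tens of milliseconds the linear guess is almost always right; over longer gaps the error grows and the reconcile step becomes a visible stutter.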


Scruffpuff
Dec 23, 2015

Fidelity. Wait, was I'm working on again?

Aramoro posted:

You're getting caught up in the word "prediction". Usually prediction just means assuming that whatever you were just doing, you're going to keep doing. It's not going to guess that you're about to turn right, but if you've been walking straight down a corridor for the last 20 ms, you're probably going to keep doing so for the next 20 ms. In the end it's just latency mitigation, and I can't see it working over more than maybe tens of milliseconds. I think it could work, but it seems tremendously difficult for such a minor gain.

If that's what they mean then :agreed:. It comes across as a bunch of starry-eyed hand-waving trying to jam what they think is a visionary idea sideways into an unrelated industry to solve a problem that does not exist.

So Star Citizen, basically.

TheDeadlyShoe
Feb 14, 2014

“Net Citizen”

Sanya Juutilainen
Jun 19, 2019

by Jeffrey of YOSPOS

Scruffpuff posted:

(snip) A player predictive algorithm is, by definition, true AI. Indistinguishable from a player because it's attempting to be a player.

Basically this, yes. A machine that can predict a player in a complex game will be a player by itself. Heck, I believe that to be able to predict moves in chess you have to be on the level of neural engines like AlphaZero or Leela (and they'd have to be slightly modified to work with predictions); classical engines like Stockfish won't be able to do that.

chaosapiant
Oct 10, 2012

White Line Fever

IronicDongz posted:

it's not really what Stadia wants to accomplish, since Stadia aims to eliminate user ownership/local hardware and make users subscription-dependent

And this is why I'll never use it. I'm relatively OK with different stores/platforms on PC, but subscription-based services to play video games can go gently caress themselves. At least for me; it's not for me.

Aramoro
Jun 1, 2012




Sanya Juutilainen posted:

Basically this, yes. A machine that can predict a player in a complex game will be a player by itself. Heck, I believe that to be able to predict moves in chess you have to be on the level of neural engines like AlphaZero or Leela (and they'd have to be slightly modified to work with predictions); classical engines like Stockfish won't be able to do that.

Except it's not trying to predict decision points, it's trying to predict non-decision.

Imagine you're playing a game right now, running down a corridor, and you hit a lag spike. Generally what happens is you freeze, then jump forwards, then continue. All this has to do is, when you hit the lag spike, predict that yeah, you are going to continue down the corridor. If it gets it right, you don't notice the difference; if it gets it wrong, you stutter and jump to the right place just like before. So then you have to think about the probabilities of getting it right, which are probably pretty good for times under 500 ms.

Scruffpuff
Dec 23, 2015

Fidelity. Wait, was I'm working on again?

Aramoro posted:

Except it's not trying to predict decision points, it's trying to predict non-decision.

Imagine you're playing a game right now, running down a corridor, and you hit a lag spike. Generally what happens is you freeze, then jump forwards, then continue. All this has to do is, when you hit the lag spike, predict that yeah, you are going to continue down the corridor. If it gets it right, you don't notice the difference; if it gets it wrong, you stutter and jump to the right place just like before. So then you have to think about the probabilities of getting it right, which are probably pretty good for times under 500 ms.

It's funny that the deeper we dig into the viability of the approach, the further it gets from the marketing hype-speech. Marketing is basically fraud.

Carcer
Aug 7, 2010

Aramoro posted:

Except it's not trying to predict decision points, it's trying to predict non-decision.

Imagine you're playing a game right now, running down a corridor, and you hit a lag spike. Generally what happens is you freeze, then jump forwards, then continue. All this has to do is, when you hit the lag spike, predict that yeah, you are going to continue down the corridor. If it gets it right, you don't notice the difference; if it gets it wrong, you stutter and jump to the right place just like before. So then you have to think about the probabilities of getting it right, which are probably pretty good for times under 500 ms.

The problem here is that there are infinite variations of "Running down a corridor", and guessing wrong is going to lead to you jittering all over the place as the game corrects from what it predicted to what you actually did. Am I running on the left, right or middle? Am I rapidly changing sides, suddenly bunnyhopping, or doing literally anything that could change how fast I'm moving? Am I switching weapons, reloading, interrupting a reload to do whatever, opening a menu?

commando in tophat
Sep 5, 2019

Aramoro posted:

Except it's not trying to predict decision points, it's trying to predict non-decision.

Imagine you're playing a game right now, running down a corridor, and you hit a lag spike. Generally what happens is you freeze, then jump forwards, then continue. All this has to do is, when you hit the lag spike, predict that yeah, you are going to continue down the corridor. If it gets it right, you don't notice the difference; if it gets it wrong, you stutter and jump to the right place just like before. So then you have to think about the probabilities of getting it right, which are probably pretty good for times under 500 ms.

Except what you describe isn't anything new. It's basically what every multiplayer FPS has been doing since the dawn of time; at least I read an article about it regarding Quake 1. But funnily enough, the approach described for Quake 1 is even better, because it can do stuff like "I pressed the forward key", so your local machine begins executing it right away and later smoothly corrects if the server disagrees. In Google's case your local machine doesn't do anything, so this doesn't really affect latency. Or maybe I'm too tired to think about their marketing bullshit right now for it to make any sense :shrug:

e: also, "it's trying to predict non-decision". Isn't this just: run the game as if no input changed? That's truly a big prediction
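For reference, that Quake-style client-side prediction is easy to sketch. A toy 1-D version (all names made up for illustration): the client applies inputs immediately, remembers them, and when an authoritative server state arrives for an older tick, it rewinds to that state and replays the unacknowledged inputs, rather than waiting a round trip per keypress.

```python
# Quake-style client-side prediction with server reconciliation,
# in a toy one-dimensional world.

SPEED = 5.0  # units moved per tick while the forward key is held

class PredictingClient:
    def __init__(self):
        self.pos = 0.0
        self.pending = []  # (tick, move) inputs not yet acked by the server

    def press_forward(self, tick):
        # Apply locally right away: no perceived input latency.
        self.pos += SPEED
        self.pending.append((tick, SPEED))

    def on_server_state(self, acked_tick, server_pos):
        # Rewind to the server's authoritative position, drop acked
        # inputs, and replay the unacknowledged ones on top of it.
        self.pending = [(t, m) for t, m in self.pending if t > acked_tick]
        self.pos = server_pos + sum(m for _, m in self.pending)

client = PredictingClient()
client.press_forward(1)
client.press_forward(2)
client.press_forward(3)
# Server confirms tick 1 but disagrees slightly (say we clipped a wall):
client.on_server_state(1, 4.0)
# Ticks 2 and 3 are replayed on top of the corrected position: 4 + 5 + 5
print(client.pos)  # 14.0
```

The point of the comparison stands: this only works because the local machine simulates something. A thin streaming client has nothing to replay.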

commando in tophat fucked around with this message at 16:22 on Oct 10, 2019

Nicholas
Mar 7, 2001

Were those not fine days, when we drank of clear honey, and spoke in calm tones of our love for the stuff?
A game that starts feeling laggy as poo poo as soon as you do anything besides walk straight down a hallway.

chaosapiant
Oct 10, 2012

White Line Fever

Nicholas posted:

A game that starts feeling laggy as poo poo as soon as you do anything besides walk straight down a hallway.

The good news is that Star Citizen will never be coded in such a way that you could just walk straight down a hallway. Either you'll fall through the floor and swim, or collapse, or launch into the sun, or any number of things.

Hav
Dec 11, 2009

Fun Shoe

chaosapiant posted:

The good news is that Star Citizen will never be coded in such a way that you could just walk straight down a hallway. Either you'll fall through the floor and swim, or collapse, or launch into the sun, or any number of things.


The possibilities are endless!

Rotten Red Rod
Mar 5, 2002

commando in tophat posted:

In Google's case your local machine doesn't do anything, so this doesn't really affect latency. Or maybe I'm too tired to think about their marketing bullshit right now for it to make any sense :shrug:

e: also, "it's trying to predict non-decision". Isn't this just: run the game as if no input changed? That's truly a big prediction

It's more like, in Stadia's case, since your local machine doesn't do anything, EVERYTHING is affected by latency. Imagine playing COD and trying to just AIM with input lag, let alone land accurate headshots. The idea they could execute any sort of prediction of what you might do in that scenario is laughable. Will it predict which direction you move the camera? Will it predict when you're going to shoot? Are you just going to fire your gun every time you aim at an enemy? Will the firing animation play, and then "take back" the shot if you didn't actually do it?

I imagine the reason they chose Assassin's Creed for the demo of Stadia is because it's a game that doesn't need precision aiming and already has significant input lag, so a little more wouldn't be noticed. There are actually people who think that Stadia will be viable for VR, which is laughable. Any input lag with VR is vomit city.

Aramoro
Jun 1, 2012




Nicholas posted:

A game that starts feeling laggy as poo poo as soon as you do anything besides walk straight down a hallway.

I think the idea is that it will be no more laggy than normal and might be better.

The infinite-variations thing is just not that realistic though, is it? I mean, you might flail around completely unpredictably every couple of milliseconds, but are you really going to do that? Nope.

chaosapiant
Oct 10, 2012

White Line Fever

Rotten Red Rod posted:

It's more like, in Stadia's case, since your local machine doesn't do anything, EVERYTHING is affected by latency. Imagine playing COD and trying to just AIM with input lag, let alone land accurate headshots. The idea they could execute any sort of prediction of what you might do in that scenario is laughable. Will it predict which direction you move the camera? Will it predict when you're going to shoot? Are you just going to fire your gun every time you aim at an enemy? Will the firing animation play, and then "take back" the shot if you didn't actually do it?

There are also people who think that Stadia will be viable for VR, which is laughable. Any input lag with VR is vomit city.

I remember trying to play Quake over TCP/IP back in 1996/97 with a 56k modem. I'd go to move forward a bit, then I could get up and do some pushups or something, and when I sat back down my dude had begun to move. It was the first time I'd ever experienced "lag" and was dumbfounded.

rePool
Apr 18, 2019

Undercover Goon.

So long and thanks for all the fish

Reading this felt like some awards ceremony. I couldn't figure out why someone would be thanking an unreleased video game.

...then I saw his profile picture and it all made sense. I wonder if he teared up writing this?

Rotten Red Rod
Mar 5, 2002

chaosapiant posted:

I remember trying to play Quake over TCP/IP back in 1996/97 with a 56k modem. I'd go to move forward a bit, then I could get up and do some pushups or something, and when I sat back down my dude had begun to move. It was the first time I'd ever experienced "lag" and was dumbfounded.

Yeah, same, Quake was my first online game. Now imagine if your camera/aiming controls were tied to the same lag. That's Stadia!

Aramoro posted:

I think the idea is that it will be no more laggy than normal and might be better.


Yeaaaah I'll believe it when I see it, outside of Google's tightly controlled demos.

chaosapiant
Oct 10, 2012

White Line Fever

rePool posted:

Reading this felt like some awards ceremony. I couldn't figure out why someone would be thanking an unreleased video game.

...then I saw his profile picture and it all made sense. I wonder if he teared up writing this?

That croberts fist bump looks misaligned. Is croberts blind, or did the poster just photoshop croberts into the frame?

Nicholas
Mar 7, 2001

Were those not fine days, when we drank of clear honey, and spoke in calm tones of our love for the stuff?

Rotten Red Rod posted:

It's more like, in Stadia's case, since your local machine doesn't do anything, EVERYTHING is affected by latency. Imagine playing COD and trying to just AIM with input lag, let alone land accurate headshots. The idea they could execute any sort of prediction of what you might do in that scenario is laughable. Will it predict which direction you move the camera? Will it predict when you're going to shoot? Are you just going to fire your gun every time you aim at an enemy? Will the firing animation play, and then "take back" the shot if you didn't actually do it?

There are also people who think that Stadia will be viable for VR, which is laughable. Any input lag with VR is vomit city.

Exactly. Even input noise from the analogue sticks would destroy any prediction.

Nicholas
Mar 7, 2001

Were those not fine days, when we drank of clear honey, and spoke in calm tones of our love for the stuff?

Aramoro posted:

I think the idea is that it will be no more laggy than normal and might be better.

There have been studies on input/reaction-time perception showing that this is worse than just being consistently laggy.

Sanya Juutilainen
Jun 19, 2019

by Jeffrey of YOSPOS

Aramoro posted:

Except it's not trying to predict decision points, it's trying to predict non-decision.

Imagine right now you're playing a game, running down a corridor and you hit a lag spike, generally what happens is you freeze then jump forwards then continue. All this has to do is when you hit the lag spike predict that yeah you are going to continue down the corridor. If it gets it right then you don't notice the difference, if it gets it wrong you stutter and jump to the right place just like before. So then you have to think about that the probabilities of getting it right, probably pretty good for times <500 ms.

But that is extremely limited. Even in chess, which is way simpler than any FPS, non-decision/forced moves are pretty rare. And I might be wrong about this, but these non-decision moves are also the least network- and performance-expensive ones (in your example the hallway needs to be empty of stuff and enemies, because introducing any of those will make the player move randomly, i.e. non-predictably; but an empty straight hallway is easy to render and transfer, so why even bother predicting the movement instead of just transferring it?)

commando in tophat
Sep 5, 2019

Rotten Red Rod posted:

It's more like, in Stadia's case, since your local machine doesn't do anything, EVERYTHING is affected by latency. Imagine playing COD and trying to just AIM with input lag, let alone land accurate headshots. The idea they could execute any sort of prediction of what you might do in that scenario is laughable. Will it predict which direction you move the camera? Will it predict when you're going to shoot? Are you just going to fire your gun every time you aim at an enemy? Will the firing animation play, and then "take back" the shot if you didn't actually do it?

I imagine the reason they chose Assassin's Creed for the demo of Stadia is because it's a game that doesn't need precision aiming and already has significant input lag, so a little more wouldn't be noticed. There are actually people who think that Stadia will be viable for VR, which is laughable. Any input lag with VR is vomit city.

So what the gently caress are they smoking?

crobear working for google posted:

Google Stadia will be faster and more responsive than local gaming systems in “a year or two,”

It cannot be more responsive if you introduce input lag. What the gently caress :psyboom:

Mirificus
Oct 29, 2004

Kings need not raise their voices to be heard
https://twitter.com/CloudImperium/status/1170066790933598208
https://twitter.com/CloudImperium/status/1181207263400333312

trucutru
Jul 9, 2003

by Fluffdaddy

Bofast posted:

or somehow rolling back the entire memory state to the moment where prediction and actual input diverges. The latter might have been ok on a SNES emulator, but hardly reasonable for a full on PC game and would probably introduce a lot of rubber banding style disjointed moments on top of the network latency issues.

This is not an issue at all if you're running the game on multiple machines, which is kind of what cloud centers do.

The tech is perfectly doable if you consider that using a shitton of resources and tech to reduce perceived latency is worthwhile.

Scruffpuff
Dec 23, 2015

Fidelity. Wait, was I'm working on again?

Rotten Red Rod posted:

I imagine the reason they chose Assassin's Creed for the demo of Stadia is because it's a game that doesn't need precision aiming and already has significant input lag, so a little more wouldn't be noticed. There are actually people who think that Stadia will be viable for VR, which is laughable. Any input lag with VR is vomit city.

The choice of Assassin's Creed is actually pretty revealing - I think anyone can tell you, particularly in the earlier games in the series, you're barely controlling the character at all. At best you're roughly influencing where he goes. (Later games do away with this.) Most of the emergent gameplay in AC was reacting to the triple backflip faceplant into a crowd of guards you made while trying to climb a wall and go the opposite direction. A game like this is completely compatible with the concept in play here because the input is so loosely connected to the result.

Lots of games might work with this, now that I think about it. Think about the Batman and Shadow of War type games, where your combat is based on things like "hit a button when an enemy highlight comes on, and we'll play a canned animation that you literally can't affect or even move while it's playing." Dragon's Lair for the modern age. That poo poo, yeah, it'll work. But twitch games with full freedom?

Scruffpuff
Dec 23, 2015

Fidelity. Wait, was I'm working on again?

Nicholas posted:

There have been studies on input/reaction-time perception showing that this is worse than just being consistently laggy.

This is the same thing that makes some people queasy when they watch TV with motion smoothing on. Your brain detects that the computer-generated images between actual filmed frames are ... not exactly right, so you feel a bit off. It affects different people to different degrees and not everyone notices it.

Your brain evolved to fill in gaps, and it does it better than any modern system, "cloud" or otherwise. Maybe decades from now that will change, but it won't be due to "TEH CLOUDS".

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
It's actually really quite complicated. Your brain expects a certain degree of correlation between image clarity and motion clarity/consistency. This is why increasing resolution while maintaining the same framerate actually looks worse. Your brain's ability to determine your trajectory eventually breaks down, and you have a worse experience than the exact same framerate at a lower resolution. Obviously the hitching introduced by failed branch predictions and switching tracks is going to be much more dramatic and a worse experience than that.

Dark Off
Aug 14, 2015




https://massivelyop.com/2019/10/10/shroud-of-the-avatar-missing-kickstarter-book/

quote:

Shroud of the Avatar finally admits the Kickstarter book is not happening, offers in-game items instead
sneak peek at the future of star citizen

Agony Aunt
Apr 17, 2018

by LITERALLY AN ADMIN

skeletors_condom posted:

Another fun crobearism by Google:


They even have a variation on "answer the call 201X." Although I doubt they will keep up with "just wait till N+1," they will probably just shut it down in a year.

Perhaps we can test it on a VT220

colonelwest
Jun 30, 2018

Rotten Red Rod posted:

It's more like, in Stadia's case, since your local machine doesn't do anything, EVERYTHING is affected by latency. Imagine playing COD and trying to just AIM with input lag, let alone land accurate headshots. The idea they could execute any sort of prediction of what you might do in that scenario is laughable. Will it predict which direction you move the camera? Will it predict when you're going to shoot? Are you just going to fire your gun every time you aim at an enemy? Will the firing animation play, and then "take back" the shot if you didn't actually do it?

I imagine the reason they chose Assassin's Creed for the demo of Stadia is because it's a game that doesn't need precision aiming and already has significant input lag, so a little more wouldn't be noticed. There are actually people who think that Stadia will be viable for VR, which is laughable. Any input lag with VR is vomit city.

I’ve been trying out the Shadow streaming service on my laptop for a few months now. Unlike Stadia you are renting a virtual gaming pc running on a remote server, so you can play any game in your existing library. For the most part it works well, but it’s still basically in an Alpha state and has a lot of kinks.

Latency is definitely an issue in multiplayer FPS games, and I don’t think that can ever be fully resolved. I have a 100 Mbps internet connection, and Battlefield V is basically unplayable, and in Apex Legends it’s very difficult to aim properly.

Single player games are mostly fine, but there are some graphical issues due to the compression of the stream, like colors being washed out and a general graininess like in YouTube videos.

It’s a fine stopgap for me right now, since I sold my gaming rig, but I can’t say it’s a viable full replacement for your own hardware. Overall I think all of the companies piling on this bandwagon are going to face the exact same technical issues that they faced during the last big streaming gold rush in 2012-2013.

colonelwest fucked around with this message at 17:31 on Oct 10, 2019

Bofast
Feb 21, 2011

Grimey Drawer

trucutru posted:

If they are telling the truth, that is the way to do it, yeah. Really efficient, since each additional player in your (presumably) multiplayer experience increases the need for extra computations exponentially.

I didn't even take the multiplayer aspect into consideration. A single-player game would be possible, even if wonky, because each player could be connected to a separate server that would be in charge of everything for that player and could do whatever was necessary. I think it would be a poor experience, but still.

However, I would say that a multiplayer game couldn't just run multiple sessions in parallel for each player unless they were going to run the entire host server session in parallel, too. My thinking being that they can't just run two or more client sessions that both run/drive/fly/shoot around at the exact same coordinates in the exact same multiplayer map on the same host server and are somehow treated as the same player unless the developers specifically rewrote the game for that. As such, a typical multiplayer game should need to run enough host and client sessions in parallel to catch most prediction errors for every single participating player at once, no?

How bad would that be?
Let us take a rather simple 2D sidescrolling shooter like Teeworlds or Soldat and then strip it down further in terms of possible player actions. We'll take away things like variable height jumps, jetpacks, grappling hooks, weapon switching and the ability to combine actions (shooting while moving, shooting while jumping, jumping while moving etc.) so let us say that a player character can do one of five things: stand still (no input), move left, move right, jump or shoot. That's five really basic actions, far simpler than even a Gameboy game, and should be far easier to predict than more complex games. Teeworlds allows for 16 players per server so let us set that as a cap.

If I'm thinking about this correctly 16 players, each with 5 actions available, means the system would constantly be having to deal with any one of 5^16 possible control inputs. That would be something like 152 587 890 625 combinations. Even limiting it to 8 players (390 625 combinations) and assuming that 99.9% of possible combinations will be too rare to ever include in predictions leaves you with 390 possibilities. Would anyone really want to run something like 390 instances of clients and hosts just for 8 people to maybe get a slightly lower (but far less consistent) latency? A 4 player game with 20 possible control inputs and 99.9% of possible combinations not included in predictions would still be around 160 parallel sessions. These parallel sessions would all have to keep discarding wrong states and syncing up to the "right" state whenever predictions were wrong, too, so that's a massive amount of extra work and should be very difficult to make even remotely profitable.
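Bofast's arithmetic above checks out, for what it's worth. A quick sketch (function names are made up for illustration):

```python
# With A possible actions per player and N players, covering every joint
# input per tick means A**N parallel predictions; keeping only the most
# likely 0.1% still leaves a lot of sessions to run.

def joint_inputs(actions, players):
    return actions ** players

def sessions_kept(actions, players, fraction=0.001):
    """Rough count of parallel sessions if 99.9% of combinations
    are assumed rare enough to ignore."""
    return int(joint_inputs(actions, players) * fraction)

print(joint_inputs(5, 16))   # 152587890625
print(sessions_kept(5, 8))   # 390
print(sessions_kept(20, 4))  # 160
```

The exponential term is the killer: adding a player multiplies the joint input space by the whole action count, so no amount of pruning keeps up for long.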

The whole thing reads as Google management feeling like they are stagnating in their current markets and needing to display new growth opportunities, just to keep less tech savvy investors happy and stock prices up, with some talk of neural nets and other things like that to dismiss questions about how it even makes any sense.

Sabreseven
Feb 27, 2016


Some wisdom :

"The more you inflate the balloon, the closer it gets to the pin"

"Will have = Doesn't have" (Unrelated to quote but I like it)

Strangler 42
Jan 8, 2007

SHAVE IT ALL OFF
ALL OF IT

All the code is being sold to that guy that looks like that other guy from Nickelback. Chris and Co. are no longer involved in the project but will get producer credits. You get nothing sold to you outside the game. All the ships you bought are wiped. The game will be instanced. And you have to buy it all over again. There will only be one system that you can't leave. If you have a problem with that you can sue one of the shell corporations in the Caymans.

Pixelate
Jan 6, 2018

"You win by having fun"

monkeytek
Jun 8, 2010

It wasn't an ELE that wiped out the backer funds. It was Tristan Timothy Taylor.

Did the local tech/trade schools just have a graduation or whatever?

G0RF
Mar 19, 2015

Some galactic defender you are, Space Cadet.
It’s about that time:



quote:

”$25,000 for anyone curious”

I love the copywriting. “It is clear from your manner and bearing that you are Legatus Navium.”

Especially love the horked old Lexus slogan sandwiched in there like it ain’t no thing.

Strangler 42
Jan 8, 2007

SHAVE IT ALL OFF
ALL OF IT

G0RF posted:

It’s about that time:




I love the copywriting. “It is clear from your manner and bearing that you are Legatus Navium.”

Especially love the horked old Lexus slogan sandwiched in there like it ain’t no thing.

It's whale hunting season.

Dementropy
Aug 23, 2010



G0RF posted:

It’s about that time:




I love the copywriting. “It is clear from your manner and bearing that you are Legatus Navium.”

Especially love the horked old Lexus slogan sandwiched in there like it ain’t no thing.

G0RF
Mar 19, 2015

Some galactic defender you are, Space Cadet.

:lol:

Montoya ain’t wrong!


Bofast
Feb 21, 2011

Grimey Drawer

Aramoro posted:

For people not in the loop, leaving without a deal will mean people will die. That's the immediate part he's keen to get past.

While it will no doubt be a bit of an economic shock and a rough time for quite a while, I'm not sure how exactly a no-deal Brexit would directly cause people to die. From what, stress-related heart attacks? That seems a bit alarmist.

They could still trade with the EU and other countries, albeit with a whole bunch of tariffs on goods and a lot of red tape, tariffs and potential lack of access on the service side. Had they been wise (lol), they would have held behind-the-scenes negotiations ahead of time with a bunch of non-EU countries to have trade treaties ready to take effect upon an exit from the EU, though I totally expect the UK politicians to be like politicians everywhere else and have little to no preparations done.

chaosapiant posted:

The good news is that Star Citizen will never be coded in such a way that you could just walk straight down a hallway. Either you'll fall through the floor and swim, or collapse, or launch into the sun, or any number of things.

Croberts will truly save us
