Precambrian Video Games
Aug 19, 2002



https://www.reddit.com/r/starcitizen/comments/dfpzlt/just_an_fyi_from_an_astronomer_it_is_perfectly/

Counterpoint: no

Also :lol: at some scrub undergrad liking this poo poo rear end game with broken physics


Thoatse
Feb 29, 2016

Lol said the scorpion, lmao
https://i.imgur.com/jKAWnb1.mp4

Contingency
Jun 2, 2007

MURDERER

trucutru posted:

It's the opposite, the tele-kill happens because the games are server-authoritative. (modern ones, stuff like doom and so on is peer-to-peer anything-goes bullshit)

Anyways, we have reached such a level of self-parody as a species that I'm pretty sure that if someday we figure a way to time-travel the first thing the scientists will do is use it to shave a few extra milliseconds from the Super Mario Bros speedrun record.

I believe you're right. In BF3, if the client said "I just killed everyone on the server with a medpack," everyone would fall over dead. That doesn't mean the client is authoritative, it just means the server blindly trusts client inputs.

Nicholas
Mar 7, 2001

Were those not fine days, when we drank of clear honey, and spoke in calm tones of our love for the stuff?
Predicting your next movement accurately would be literally impossible in a game running at 60 frames per second. Even if some AI or heuristic could predict you're about to choose some input, it would also have to compensate for lag and your own reaction time within a 16ms window, as well as correctly guess all of the other controller inputs it might receive at that time. It's pure bullshit.
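
For a sense of scale on that 16ms window: at 60 fps the frame budget is roughly 16.7 ms, and a typical home connection's round trip to a datacentre already spans a couple of those frames, so any guess would have to hold across several frames of inputs the server hasn't seen yet. A throwaway calculation (the 30 ms round trip is an illustrative assumption, not a measured Stadia figure):

code:

#include <stdio.h>

/* Rough latency-budget arithmetic for the 60 fps argument above.
   The 30 ms round trip is an assumed "typical" broadband figure. */
int main(void)
{
    const double frame_ms = 1000.0 / 60.0;  /* ~16.7 ms per frame at 60 fps */
    const double rtt_ms   = 30.0;           /* assumed client <-> datacentre round trip */

    printf("frame budget at 60 fps: %.1f ms\n", frame_ms);
    printf("frames already in flight during a %.0f ms round trip: %.1f\n",
           rtt_ms, rtt_ms / frame_ms);
    return 0;
}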

Tokyo Sexwale
Jul 30, 2003

using advanced heuristics sounds like cheating, it should just run a brute force comparison of every possible input against the one you use

if there's one thing I've learned from Chris "no cheating" Roberts it's that the only good algorithm is a brute force one

Zazz Razzamatazz
Apr 19, 2016

by sebmojo

Scruffpuff posted:

It's bullshit at every level. It doesn't even pass the smallest effort at scrutiny. Say you're playing Pac-Man - the game will predict which way you'll go to make it faster! So let's say if there's ghosts coming from down, up, and left, it predicts you'll turn right. It's the most logical move, the other 3 will result in death.

But no matter what it predicts, here we go:

1) It takes more CPU cycles to run a prediction algorithm than it takes to just translate your input directly
2) If you go right, it still has to take your input (he went right) and compare it to what the AI chose for you (he should go right) - that's twice as much effort as just saying (he went right)
3) If you go in any other direction, it STILL takes your input (he went down), compares it to what the AI chose (he should go right), determines you did something unexpected, and takes your loving input anyway.

I'm convinced it's 100% for the Star Citizen type minds out there. It's complete bullshit. If I lived in any reality except the parody one we're currently inhabiting, I'd call it parody.

At some point Chris is going to rip off his mask and reveal that it's actually Ashton Kutcher and we've all been Punk'd, or whatever the kids are watching these days...

Sabreseven
Feb 27, 2016

I for one, am glad that Peter Molyduex is on the Google Stadia tech team. :thumbsup:

He's doing well these days, even making games for CR:

quote:

petermolydeux
‏Sep 24

Imagine a murder mystery game where you can interrogate ANY pixel in the game and ask it questions?

Sabreseven fucked around with this message at 04:07 on Oct 10, 2019

trucutru
Jul 9, 2003

by Fluffdaddy

Nicholas posted:

Predicting your next movement accurately would be literally impossible in a game running at 60 frames per second. Even if some AI or heuristic could predict you're about to choose some input, it would also have to compensate for lag and your own reaction time within a 16ms window, as well as correctly guess all of the other controller inputs it might receive at that time. It's pure bullshit.

It can be done, they just have to fork the game code into a predictive and a non-predictive version when their neural network (it's always a neural network) is fairly confident that it can guess which player inputs will arrive before the next frame. Then they can immediately feed those predictions to one of the code branches to calculate the next frame to be sent to the client without waiting for the real input. Then, if the client input matches the guess, they just instantly send the pre-computed frame to the client; otherwise they use the other branch of the code to do things the regular way. Rinse and repeat. As such they achieve negative (processing) latency when the guess is correct. There is, of course, some network latency to be added on top.

Funnily enough, with this approach the more accurate their predictions are, the more resources they'll need. If you limit the system to a single prediction and you're 100% accurate, you are basically doubling the amount of work (plus whatever the cost of forking is). Also, there is nothing stopping you from having more than one prediction and one fork, if you have processing power to burn.
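
What trucutru is describing is essentially speculative execution at whole-frame granularity. A toy, single-threaded sketch of the control flow under the assumption of a trivially simple game; GameState, Input, predict_input and simulate_frame are made-up stand-ins, and a real server would compute the speculative branch while waiting on the network rather than inline:

code:

#include <stdbool.h>
#include <stdio.h>

/* Toy sketch of "fork into a predictive and a non-predictive path".
   All types and functions are invented for illustration. */

typedef struct { int x, y; } GameState;
typedef struct { int dx, dy; } Input;

static GameState simulate_frame(GameState s, Input in)
{
    s.x += in.dx;
    s.y += in.dy;
    return s;
}

/* Stand-in predictor: assume the player keeps doing what they did last frame. */
static Input predict_input(Input last) { return last; }

static bool same_input(Input a, Input b) { return a.dx == b.dx && a.dy == b.dy; }

static GameState next_frame(GameState current, Input last_input, Input real_input)
{
    Input guess = predict_input(last_input);

    /* Speculative branch: computed before real_input is known. On a server
       this work would overlap with waiting for the client's packet. */
    GameState speculative = simulate_frame(current, guess);

    if (same_input(guess, real_input))
        return speculative;                      /* guessed right: no extra wait */

    return simulate_frame(current, real_input);  /* guessed wrong: regular path */
}

int main(void)
{
    GameState s = { 0, 0 };
    Input last = { 1, 0 };   /* player was holding "right" */
    Input real = { 1, 0 };   /* ...and still is, so the guess pays off */

    s = next_frame(s, last, real);
    printf("x=%d y=%d\n", s.x, s.y);
    return 0;
}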

Kosumo
Apr 9, 2016

All they have to do is ask Chris Roberts what would be the best move to do and then do that. Any other move is poo poo and an insult to the game and should not be allowed anyway.

You don't spend that much money on perfection only for some player to gently caress it up.

Rotten Red Rod
Mar 5, 2002

trucutru posted:

It can be done, they just have to fork the game code into a predictive and a non-predictive version when their neural network (it's always a neural network) is fairly confident that it can guess which player inputs will arrive before the next frame. Then they can immediately feed those predictions to one of the code branches to calculate the next frame to be sent to the client without waiting for the real input. Then, if the client input matches the guess, they just instantly send the pre-computed frame to the client; otherwise they use the other branch of the code to do things the regular way. Rinse and repeat. As such they achieve negative (processing) latency when the guess is correct. There is, of course, some network latency to be added on top.

Funnily enough, with this approach the more accurate their predictions are, the more resources they'll need. If you limit the system to a single prediction and you're 100% accurate, you are basically doubling the amount of work (plus whatever the cost of forking is). Also, there is nothing stopping you from having more than one prediction and one fork, if you have processing power to burn.

or you could just play the game installed on a console or pc that is already in your house

Stadia is trying to solve a problem no one has and creating multitudes of other problems in the process

TheDeadlyShoe
Feb 14, 2014

i guarantee the pitch for stadia was 'Spotify for Minecraft!'

BumbleOne
Jul 1, 2018

by Fluffdaddy

Experimental Skin posted:

I have not read the article, but off the top of my head, perhaps it predictively renders results of future actions to display, and discards them if incorrect. That could help in simpler games to reduce perceived lag.

i know of only one trick to really reduce input lag. the retroarch emulator has that. it runs two instances of a game, one is lagging a bit behind but you see the one "ahead" on your screen. your button press is then applied to the instance which is "behind", which really does reduce input lag. after the button press the redundant instance is discarded and yet another instance "behind" is created.

BumbleOne fucked around with this message at 07:03 on Oct 10, 2019
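
The RetroArch feature being described is called run-ahead; it can be done with two instances as above, or with one instance plus savestates. A rough sketch of the savestate version against a hypothetical emulator core (the core_* hooks and the one-integer "state" are invented for illustration and are not RetroArch's real API):

code:

#include <stdio.h>
#include <stdint.h>

/* Sketch of run-ahead on a toy "core" whose entire state is one counter.
   The core_* hooks stand in for a real emulator's savestate/run-frame API. */

#define RUN_AHEAD_FRAMES 1

static int toy_core_state;  /* stand-in for the emulated machine's RAM, VRAM, ... */

static int  core_save_state(void)  { return toy_core_state; }
static void core_load_state(int s) { toy_core_state = s; }

static void core_run_frame(uint32_t input, int render)
{
    toy_core_state += 1 + (int)input;   /* "emulate" one frame */
    if (render)
        printf("displayed frame, state=%d\n", toy_core_state);
}

static void run_one_displayed_frame(uint32_t real_input)
{
    int behind = core_save_state();

    /* Visible copy: run ahead of the real state, rendering only the last
       run-ahead frame, so this input's effect shows up early. */
    for (int i = 0; i < RUN_AHEAD_FRAMES; i++)
        core_run_frame(real_input, i == RUN_AHEAD_FRAMES - 1);

    /* Hidden copy: throw the speculation away and advance the real state
       by exactly one frame, ready for the next button press. */
    core_load_state(behind);
    core_run_frame(real_input, 0);
}

int main(void)
{
    for (int frame = 0; frame < 3; frame++)
        run_one_displayed_frame(0 /* button bits */);
    return 0;
}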

DigitalPenny
Sep 3, 2018

Rotten Red Rod posted:

Holy poo poo that's literally what they're saying

We’ve received a heads-up (thanks!) that negative latency, powered by a datacentre’s worth of compute silicon, may offer future cloud gaming systems flexibility to anticipate the likely action of a user, and ensure a speedy response ready for that potential eventuality.

In what universe does that make sense or even mean

This is Chris Roberts' dream: negative latency predicts the player's actions, so by playing the game you are really just watching his movie ...

trucutru
Jul 9, 2003

by Fluffdaddy

Rotten Red Rod posted:

or you could just play the game installed on a console or pc that is already in your house

Stadia is trying to solve a problem no one has and creating multitudes of other problems in the process

Sure, you can play games on your console, but that's not the point here, is it? Stadia does have a use case: Netflix for games. Yeah, it didn't work before, but they are betting that they can reduce the latency to levels low enough for players not to care.

quote:

i know of only one trick to really reduce input lag. the retroarch emulator has that. it runs two instances of a game, one is lagging a bit behind but you see the one "ahead" on your screen. your button press is then applied to the instance which is "behind", which really does reduce input lag. after the button press the redundant instance is discarded and yet another instance "behind" is created.

Yeah, this is the approach I mentioned


In any case, it should work for non-twitchy games, so it'll be perfect for waiting-for-my-space-jump-to-finish simulator. Goog has money to spare, so I am sure they'll be happy to lose a few billion trying to create a new monopoly.

TheDeadlyShoe
Feb 14, 2014

stream stadia ingame, so you can play star citizen piloting a fighter while sitting at a console in your whale barge

Kibner
Oct 21, 2008

Acguy Supremacy
There is a predictive peer-to-peer network code that works pretty similarly to what Stadia wants to accomplish. It even has fighting game fan approval:

https://twitter.com/Pond3r/status/1181800172184977408?s=20

It's not just plug-and-play, though:

https://twitter.com/Pond3r/status/1181822296618496001?s=20

https://twitter.com/pedrothedagger/status/1182027968803037184?s=20
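
The technique in those tweets is rollback netcode (GGPO is the best-known library): simulate immediately with a guess for the remote player's input, usually "same as last frame", and if the real input arrives late and differs, rewind to a snapshot and replay. A toy sketch with invented types, not GGPO's actual interface:

code:

#include <string.h>

/* Toy rollback netcode sketch. Everything here is invented for illustration. */

#define HISTORY 8   /* how many recent frames we can rewind across */

typedef struct { int p1_x, p2_x; } GameState;
typedef struct { int move; } Input;            /* -1, 0 or +1 */

static void step(GameState *s, Input local, Input remote)
{
    s->p1_x += local.move;
    s->p2_x += remote.move;
}

static GameState snapshot[HISTORY];      /* state at the start of each recent frame */
static Input     local_in[HISTORY];
static Input     remote_guess[HISTORY];  /* what we assumed the remote player pressed */

/* Advance one frame now, guessing the remote input ("same as last frame"). */
void simulate_frame(GameState *s, int frame, Input local)
{
    int i = frame % HISTORY;
    snapshot[i]     = *s;
    local_in[i]     = local;
    remote_guess[i] = remote_guess[(frame + HISTORY - 1) % HISTORY];
    step(s, local, remote_guess[i]);
}

/* Called when the remote input for an older frame finally arrives. */
void confirm_remote_input(GameState *s, int frame, int current_frame, Input actual)
{
    int i = frame % HISTORY;
    if (memcmp(&remote_guess[i], &actual, sizeof actual) == 0)
        return;                              /* guessed right: nothing to redo */

    /* Misprediction: rewind to the snapshot and replay every frame since.
       (A real implementation would also re-predict the still-unconfirmed
       frames from the corrected input rather than reusing the old guesses.) */
    remote_guess[i] = actual;
    *s = snapshot[i];
    for (int f = frame; f < current_frame; f++)
        step(s, local_in[f % HISTORY], remote_guess[f % HISTORY]);
}

The catch the later tweets point at is that the whole game state has to be cheap to copy and the simulation fast enough to re-run several frames within one frame's budget, which is why retrofitting this onto an existing engine is not plug-and-play.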

LazyMaybe
Aug 18, 2013

oouagh
it's not really what Stadia wants to accomplish, since Stadia aims to eliminate user ownership/local hardware and make users subscription-dependent

Kibner
Oct 21, 2008

Acguy Supremacy

IronicDongz posted:

it's not really what Stadia wants to accomplish, since Stadia aims to eliminate user ownership/local hardware and make users subscription-dependent

:hmmyes:

Pixelate
Jan 6, 2018

"You win by having fun"
Predictive renders of the future:


CATERPILLAR FRONT TURRET ENTRY ANIMATION IS INCORRECT



quote:

You can see how one leg is turned 180 degrees upwards and one leg is turned 45 degrees. This is obviously incorrect.


PORT OLLISAR - DOOR TO CASABA GRAPHIC BUG




MILES ECKHARDT - BEGINS ELEVATING WHEN USING HIS MOBI

Kosumo
Apr 9, 2016


And what's the bug? All gruff bad arse space dudes with 3 or more coats on do this! It's true fidelity.

It's the Best Dam Space Sim Ever;

commando in tophat
Sep 5, 2019

google posted:

bullshit

This is either some galaxy brain poo poo, or complete bullshit. So you have some latency between your input and google, and then latency between google and your display (if you play like a caveman on your own hardware none of it is there). Now if they are brute forcing every possible input, having no latency requires them to send images for every possible input to you, which will never happen (you're basically streaming a zillion videos simultaneously). Now also add all inputs that happen within that latency and it just cannot happen.
The other option is prediction (haha, good luck with an FPS trying to predict some random looking around). I would think this has such limited usage that it is not even worth doing. Even if they correctly predicted 90% of user inputs, the remaining 10% would have some latency, and it would just be annoyingly inconsistent. There is a super old Gamasutra article about netcode for Age of Empires 2, where they tried adjusting latency as fast as they could and everybody hated it. So they ended up with longer but consistent latency, rather than lower latency that sometimes suddenly jumped higher and back.
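
To put a rough number on "zillion": even crudely counting distinct gamepad states per frame lands in the billions. Quick arithmetic, where the button count and stick quantisation are generic assumptions rather than anything Google has published:

code:

#include <stdio.h>

/* Back-of-the-envelope count of distinct controller states per frame.
   15 digital buttons and 16 steps per analogue axis are assumptions. */
int main(void)
{
    const unsigned long button_states = 1ul << 15;            /* 2^15 = 32768 */
    const unsigned long stick_states  = 16ul * 16 * 16 * 16;  /* 2 sticks x 2 axes */

    printf("distinct pad states per frame: %lu\n", button_states * stick_states);
    /* ~2.1 billion, i.e. that many pre-rendered video streams per frame
       if you really tried to brute-force "every possible input". */
    return 0;
}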

Kikas
Oct 30, 2012

trucutru posted:

Sure, you can play games on your console, but that's not the point here, is it? Stadia does have a use case: Netflix for games. Yeah, it didn't work before, but they are betting that they can reduce the latency to levels low enough for players not to care.

Except it doesn't adopt what made Netflix successful - single subscription for everything. You still have to pay for every separate game, on top of the subscription. You'll also need a controller, dunno how the 3rd party support is gonna work for that. And the games are expected to be full price.


The pricing model is what will be the nail in the coffin, not some imaginary algorithms.

I really really want a detailed explanation on why stuff like this KEEPS loving HAPPENING. Why does the leg go loving wild and bend in stupid ways instead of actually clipping through stuff?

AbstractNapper
Jun 5, 2011

I can help
Enemy sighted at 5 o' clock.

Jonny Shiloh
Mar 7, 2019
You 'orrible little man

AbstractNapper posted:

Enemy sighted at 5 o' clock.

It's the new improved CIG turret animation.

quote:

"Enemy sighted at eleventy fourteen o'clock commando!"
"On it sir!"

*twists legs to eleventy fourteen o'clock - glitches through turret and dies*

Rotten Red Rod
Mar 5, 2002

Kikas posted:

Except it doesn't adopt what made Netflix successful - single subscription for everything. You still have to pay for every separate game, on top of the subscription. You'll also need a controller, dunno how the 3rd party support is gonna work for that. And the games are expected to be full price.


The pricing model is what will be the nail in the coffin, not some imaginary algorithms.


Yeah, this. The numerous subscription services - Xbox's in particular - prove the model already works, but that's not the model Stadia is using. Having to buy the games means all you're gaining is the streaming technology. MAYBE you could argue it's good that the initial hardware is cheaper, but you still also need a high-speed internet connection and the subscription fee. I just don't see who this is for.

Sanya Juutilainen
Jun 19, 2019

by Jeffrey of YOSPOS
I can already do the Stadia stuff. I watch a speedrun on YouTube/Twitch while randomly pressing keys. Works 100% of the time, I am the best speedrunner.

Jobbo_Fett
Mar 7, 2014

Slava Ukrayini

Clapping Larry
Oh how I long to be bulldozed off the deck of a space carrier...

Tokyo Sexwale
Jul 30, 2003

commando in tophat posted:

This is either some galaxy brain poo poo, or complete bullshit. So you have some latency between your input and google, and then latency between google and your display (if you play like a caveman on your own hardware none of it is there). Now if they are brute forcing every possible input, having no latency requires them to send images for every possible input to you, which will never happen (you're basically streaming a zillion videos simultaneously). Now also add all inputs that happen within that latency and it just cannot happen.
The other option is prediction (haha, good luck with an FPS trying to predict some random looking around). I would think this has such limited usage that it is not even worth doing. Even if they correctly predicted 90% of user inputs, the remaining 10% would have some latency, and it would just be annoyingly inconsistent. There is a super old Gamasutra article about netcode for Age of Empires 2, where they tried adjusting latency as fast as they could and everybody hated it. So they ended up with longer but consistent latency, rather than lower latency that sometimes suddenly jumped higher and back.

Yeah, I've been reading about this too, but for operating systems - people have been finding it's preferable to have consistent and predictable CPU processing (burst?) times rather than widely varying times where the fastest is very fast but it might also get very slow.

Mirificus
Oct 29, 2004

Kings need not raise their voices to be heard
https://twitter.com/Wiborg1978/status/1182222135923789824
https://twitter.com/Wiborg1978/status/1182222151719571456
https://twitter.com/Wiborg1978/status/1182222156786274305
https://twitter.com/Wiborg1978/status/1182222161605484544

Mirificus fucked around with this message at 14:09 on Oct 10, 2019

CrazyLoon
Aug 10, 2015

"..."
Is this year's CitCon gonna coincide with Brexit (as in happening right at the same time it finally goes down)?

Because if so, lols will ensue.

Mirificus
Oct 29, 2004

Kings need not raise their voices to be heard
https://twitter.com/TarkaRoshe/status/1171940758766653442

Daztek
Jun 2, 2006



CrazyLoon posted:

Is this year's CitCon gonna coincide with Brexit (as in happening right at the same time it finally goes down)?

Because if so, lols will ensue.

Brexit happens the 31st of October, if it happens

Citizencon is like 23 November?

Quavers
Feb 26, 2016

You clearly don't understand game development

:chloe:

skeletors_condom
Jul 21, 2017

Another fun crobearism by Google:

quote:

“Ultimately, we think in a year or two we’ll have games that are running faster and feel more responsive in the cloud than they do locally,” Bakar says to Edge, “regardless of how powerful the local machine is.”

They even have a variation on "answer the call 201X." Although I doubt they will keep up with "just wait till N+1," they will probably just shut it down in a year.

Aramoro
Jun 1, 2012





For people not in the loop, leaving without a deal will mean people will die. That's the immediate part he's keen to get past.

Scruffpuff
Dec 23, 2015

Fidelity. Wait, was I'm working on again?

skeletors_condom posted:

Another fun crobearism by Google:


They even have a variation on "answer the call 201X." Although I doubt they will keep up with "just wait till N+1," they will probably just shut it down in a year.

That's a great quote, my favorite is "regardless of how powerful the local machine is." How exactly does that work, since "the cloud" is also just machines, local to wherever they're located? If you're running the cloud datacenters, do the machines in front of you, which are part of that cloud, count as "local machines" laboring under that statement, while other machines in remote datacenters magically become faster?

This is part of that insane "more = faster" mindset that we're still laboring under with regard to cloud computing. It's only faster when you're sharing an immense computational task amongst hundreds of CPUs. There are no gaming applications where the number of CPUs is the problem. 99% of the work being done in hyperconvergence is minimizing the significant speed overhead it incurs. It's bottlenecks inside of bottlenecks.

Aramoro
Jun 1, 2012




Scruffpuff posted:

Sounds great until you factor in that there will need to be a check after every player action checking whether or not the code guessed correctly, and if it didn't, it will have to recalculate to compensate for the move the player actually made, which is even loving slower than just doing what the player is doing.

Having a conversation with someone who tries to finish every one of your sentences is not faster than having a conversation with someone who shuts the gently caress up and listens when it's your turn to talk.

It sounds like expanding on what processors already do; it's called branch prediction. Basically, say you've got a list of numbers between 1 and 100 and you're running a loop with a condition in it that says if it's < 50 do A, or >= 50 do B. If your list of numbers is sorted 1 to 100, it will run faster than if the list is unsorted, due to the processor predicting what the outcome of the condition will be based on previous results.


Here's one you can run in a browser

https://jsperf.com/sorted-loop/2

It's trivial to write one in Java or C which will show the same thing.

Doing this level of prediction is obviously much more complex in something like a game vs a basic loop but the idea is the same. So not as far fetched as it sounds.
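
Here is the same demo in C rather than the jsperf JavaScript, for anyone who wants to see it outside a browser. Compile with modest optimisation (e.g. gcc -O1); at higher levels the compiler may turn the branch into a conditional move and flatten the difference, and exact timings will vary by CPU:

code:

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Sorted vs. unsorted branch prediction demo: same data, same work,
   but the sorted pass makes the "< 50 or >= 50" branch predictable. */

#define N      32768
#define PASSES 20000

static long long sum_over_threshold(const int *data)
{
    long long sum = 0;
    for (int p = 0; p < PASSES; p++)
        for (int i = 0; i < N; i++)
            if (data[i] >= 50)          /* the branch the predictor learns */
                sum += data[i];
    return sum;
}

static int cmp(const void *a, const void *b)
{
    return *(const int *)a - *(const int *)b;
}

int main(void)
{
    static int data[N];
    for (int i = 0; i < N; i++)
        data[i] = rand() % 100;

    clock_t t0 = clock();
    long long unsorted_sum = sum_over_threshold(data);   /* mispredicts often */
    clock_t t1 = clock();

    qsort(data, N, sizeof data[0], cmp);

    clock_t t2 = clock();
    long long sorted_sum = sum_over_threshold(data);     /* near-perfect prediction */
    clock_t t3 = clock();

    printf("unsorted: %.2fs  sorted: %.2fs  (sums: %lld %lld)\n",
           (double)(t1 - t0) / CLOCKS_PER_SEC,
           (double)(t3 - t2) / CLOCKS_PER_SEC,
           unsorted_sum, sorted_sum);
    return 0;
}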

Bofast
Feb 21, 2011

Grimey Drawer

Scruffpuff posted:

It's bullshit at every level. It doesn't even pass the smallest effort at scrutiny. Say you're playing Pac-Man - the game will predict which way you'll go to make it faster! So let's say if there's ghosts coming from down, up, and left, it predicts you'll turn right. It's the most logical move, the other 3 will result in death.

But no matter what it predicts, here we go:

1) It takes more CPU cycles to run a prediction algorithm than it takes to just translate your input directly
2) If you go right, it still has to take your input (he went right) and compare it to what the AI chose for you (he should go right) - that's twice as much effort as just saying (he went right)
3) If you go in any other direction, it STILL takes your input (he went down), compares it to what the AI chose (he should go right), determines you did something unexpected, and takes your loving input anyway.

I'm convinced it's 100% for the Star Citizen type minds out there. It's complete bullshit. If I lived in any reality except the parody one we're currently inhabiting, I'd call it parody.

I'm not sure the CPU cycle demands are all that high for input prediction itself, given that a lot of game clients already predict what will happen in a game while they are waiting for the server to respond with what actually happened. It's more that the only ways I can see them hypothetically even implementing it would require them to either have multiple (constantly synchronizing) game sessions running in parallel, so the video stream can jump to an alternate source if the prediction was wrong, or somehow roll back the entire memory state to the moment where prediction and actual input diverge. The latter might have been okay on a SNES emulator, but it's hardly reasonable for a full-on PC game and would probably introduce a lot of rubber-banding-style disjointed moments on top of the network latency issues.

Scruffpuff
Dec 23, 2015

Fidelity. Wait, was I'm working on again?

Aramoro posted:

It sounds like expanding on what processors already do; it's called branch prediction. Basically, say you've got a list of numbers between 1 and 100 and you're running a loop with a condition in it that says if it's < 50 do A, or >= 50 do B. If your list of numbers is sorted 1 to 100, it will run faster than if the list is unsorted, due to the processor predicting what the outcome of the condition will be based on previous results.


Here's one you can run in a browser

https://jsperf.com/sorted-loop/2

It's trivial to write one in Java or C which will show the same thing.

Doing this level of prediction is obviously much more complex in something like a game vs a basic loop but the idea is the same. So not as far fetched as it sounds.

It works for sorting, or financial predictions, or probability calculations, or random simulations - this is true. But they're claiming this can be used to predict player actions in a video game. Think of the infinite number of actions you can perform in even a relatively simple game. How much computational power will it take for a machine to predict every possible player outcome, or even every likely outcome, or poo poo even a single random outcome, in a game like Red Dead Redemption 2? Thank god their tech predicted I would pull out my pistol on a whim, try to shoot a water bucket and miss, hop on a nearby horse and knock myself out on a low-hanging branch because I fumbled the controller swatting at a fly. You can really feel that half-frame the predictive algorithm saved us from having to render.

A player predictive algorithm is, by definition, true AI. Indistinguishable from a player because it's attempting to be a player.


Scruffpuff
Dec 23, 2015

Fidelity. Wait, was I'm working on again?

Bofast posted:

I'm not sure the CPU cycle demands are all that high for input prediction itself, given that a lot of game clients already predict what will happen in a game while they are waiting for the server to respond with what actually happened. It's more that the only ways I can see them hypothetically even implementing it would require them to either have multiple (constantly synchronizing) game sessions running in parallel, so the video stream can jump to an alternate source if the prediction was wrong, or somehow roll back the entire memory state to the moment where prediction and actual input diverge. The latter might have been okay on a SNES emulator, but it's hardly reasonable for a full-on PC game and would probably introduce a lot of rubber-banding-style disjointed moments on top of the network latency issues.

The predictions in current client-server games have extremely brief deltas. If you're running forward in World of Warcraft, the server is going to assume you're still running forward every 50ms until it receives data that you stopped running. It's also not really "predicting" anything - it's maintaining the previously received input - "last known good." Assume the player is continuing to do the last thing they started doing until you get data that says otherwise.

This poo poo they're talking about is on a whole other level. Like you said maybe in a super-simple game, but as soon as people did something weird, the game would lag out and feel like poo poo. "Frank, you're not playing right!"
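
For contrast, the "last known good" extrapolation described above fits in a few lines. A sketch with made-up structs, not any particular game's netcode:

code:

/* "Last known good" extrapolation: between client packets the server just
   keeps applying the most recent movement it was told about. Made-up types. */

typedef struct { double x, y; } Vec2;

typedef struct {
    Vec2   pos;
    Vec2   last_velocity;   /* from the most recent client packet */
    double last_update_s;   /* when that packet arrived */
} PlayerState;

/* Runs every server tick, whether or not a new packet came in. */
void server_tick(PlayerState *p, double dt_s)
{
    /* Not prediction in any interesting sense: assume the player is still
       doing whatever they were last seen doing. A real server would also
       time out stale inputs here. */
    p->pos.x += p->last_velocity.x * dt_s;
    p->pos.y += p->last_velocity.y * dt_s;
}

/* Runs when a movement packet actually arrives (e.g. every ~50 ms). */
void on_client_update(PlayerState *p, Vec2 velocity, double now_s)
{
    p->last_velocity = velocity;    /* new "last known good" */
    p->last_update_s = now_s;
}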
