Klyith
Aug 3, 2007

GBS Pledge Week

K8.0 posted:

Fighting games were the last genre to get it. Half-Life pioneered it in 1999 or 2000 and almost every non-fighting game made since at least 2004 has it.

Say what?

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

TheFluff posted:

reminder that everyone on a 60hz monitor that's using v-sync (which is to say most people) likely has ~90ms input lag already

Yeah, but a good chunk of that is the frame waiting in the buffer and then actually being sent to the monitor, processed there, and displayed. All of which is after the frame is actually rendered, which means it'd be there regardless of whether it's your local GPU or Stadia rendering the frame. The frametime render target is still sub-17ms even for 60Hz/60FPS, which is a hell of a challenge if you're spending half of that at a bare minimum just for network transfer. And by "challenge" I mean you basically can't do it in any meaningful sense, hence their discussion of predictive algorithms. Which then raises all sorts of questions about what happens with an input "miss".
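For a sense of the arithmetic here, a quick back-of-the-envelope sketch (the 8 ms network figure is a round number assumed for illustration, not a Stadia measurement):

```python
# Frame budget math for streamed rendering at 60 Hz.
refresh_hz = 60
frame_budget_ms = 1000 / refresh_hz      # ~16.7 ms per frame at 60 FPS
network_ms = 8                           # assumed one-way transfer cost
render_budget_ms = frame_budget_ms - network_ms

print(f"total budget:  {frame_budget_ms:.1f} ms")
print(f"after network: {render_budget_ms:.1f} ms left to render + encode")
```

Half the budget gone before a single pixel is shaded, which is the point being made above.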

eames
May 9, 2009

TheFluff posted:

reminder that everyone on a 60hz monitor that's using v-sync (which is to say most people) likely has ~90ms input lag already

True, but a lot of it comes from the same screen that Stadia will also run on.
I get what you're saying, though. There's a huge difference between the input lag of an average PC and the setups on esports stages.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

Klyith posted:

Say what?

Without digging super deep into all the ways it's implemented and optimized, clients are always lagging behind the server. When you shoot someone in an FPS, you shoot them where they used to be. The server rewinds time and checks if this was valid, then proceeds from there. This is why you sometimes die behind a wall. You actually got shot in the open, but all the latency involved gave you time to get behind a wall before you died.
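A minimal Python sketch of that rewind check (the snapshot layout, tick counts, and hit_test callback are all illustrative, not any engine's actual API):

```python
from dataclasses import dataclass

@dataclass
class Snapshot:
    tick: int
    positions: dict        # player_id -> (x, y, z) at that tick

class LagCompensator:
    """Server-side position history so shots can be validated against
    where targets *were* when the shooter fired, not where they are now."""

    def __init__(self, max_ticks=64):
        self.history = []              # oldest first, newest last
        self.max_ticks = max_ticks

    def record(self, snap):
        self.history.append(snap)
        if len(self.history) > self.max_ticks:
            self.history.pop(0)

    def validate_shot(self, latency_ticks, target_id, hit_test):
        # Rewind by the shooter's latency. This is the "die behind a
        # wall" case: the test runs against the target's past position,
        # where they were still out in the open.
        idx = max(0, len(self.history) - 1 - latency_ticks)
        past = self.history[idx]
        return hit_test(past.positions[target_id])

comp = LagCompensator()
comp.record(Snapshot(tick=100, positions={7: (10.0, 0.0, 5.0)}))
hit = comp.validate_shot(latency_ticks=3, target_id=7,
                         hit_test=lambda pos: pos[0] < 12.0)
print(hit)   # True: validated against the historical position
```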

Klyith
Aug 3, 2007

GBS Pledge Week

K8.0 posted:

Without digging super deep into all the ways it's implemented and optimized, clients are always lagging behind the server. When you shoot someone in an FPS, you shoot them where they used to be. The server rewinds time and checks if this was valid, then proceeds from there. This is why you sometimes die behind a wall. You actually got shot in the open, but all the latency involved gave you time to get behind a wall before you died.

I've always understood these effects to just be consequences of standard client-side prediction. You get hit when you're apparently around a corner because, on the server tick where everything got simulated, you weren't really around the corner according to the authoritative server positioning. Not the server rolling back to re-calculate where everyone was at the time a shot was fired.

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS
Coming off a 1080Ti at 1440p, I can't imagine any games that will make me want to upgrade to a 2180Ti, at least for 2 years.
The upcoming big games like Cyberpunk are for current-gen consoles and not PC-exclusive, so I would expect them to run well on current PC hardware.
Games that used RTX, like Battlefield, were very much not worth it in terms of performance loss. Games that struggle, like BL3, do so because of poo poo optimization and don't even run well on a 2080Ti.

I'm more likely to upgrade my CPU when DDR5 hits and migrate the 1080Ti over.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Here's Valve's wiki entry on it, for example.

There's a bunch more stuff that they aren't covering there, some of which may not be implemented in Source but definitely is in other games.

Cygni
Nov 12, 2005

raring to post

i guess the upside is the next gen consoles are clearly going to want to market 4k VRR to the max, so hopefully that means more pressure on the big developers to not leave their engines in complete dogshit lazily programmed states cause even 4k/60 is a pretty big ask on the rumored hardware specs

ah who am i kidding, Asscreed still gonna run like trash

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
It'll be a lot easier when they completely skip any attempt at rendering at 4k and instead render at like 1080p/30 and upscale.

Almost no engine rendered at 1080p internally for the Xbox/PS4 at launch, and most still don't. No reason to expect any difference for 4k, and actual 4k@60 just isn't in the cards at all.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

DrDork posted:

Almost no engine rendered at 1080p internally for the Xbox/PS4 at launch, and most still don't.
Um...what

Are you like counting lower-precision render targets for things like DOF/Godrays against 1080p? Because the above is absolutely false. There's dynamic res, upscaling, etc., but there are plenty of 1080p games on PS4 - the Xbox One is routinely 1600x900, but this was known at launch.

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.
I hope a 21 series is far off for entirely selfish reasons, I literally just bought a 2070 Super in prep for CoD and Cyberpunk :(

Klyith
Aug 3, 2007

GBS Pledge Week

K8.0 posted:

Here's Valve's wiki entry on it, for example.

There's a bunch more stuff that they aren't covering there, some of which may not be implemented in Source but definitely is in other games.

This is a lot less complex than GGPO rollback, and I don't think it's worth saying "other games already did it". It's a one-way event -- there's no decision to make about whether the other player did something that blocks the bullet, and no retroactive undo for player 1's shot. GGPO rewinds both players and re-runs their input to determine what really happened.

Different game styles need different types and amounts of lag compensation, and compensating for input lag on a streaming platform needs something different from both. I don't think Stadia having anti-lag is impossible, but I have doubts about it being done without special versions of the game made just for Stadia with hooks for it.

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

Fauxtool posted:

Coming off a 1080Ti at 1440p, I can't imagine any games that will make me want to upgrade to a 2180Ti, at least for 2 years.
The upcoming big games like Cyberpunk are for current-gen consoles and not PC-exclusive, so I would expect them to run well on current PC hardware.
Games that used RTX, like Battlefield, were very much not worth it in terms of performance loss. Games that struggle, like BL3, do so because of poo poo optimization and don't even run well on a 2080Ti.

I'm more likely to upgrade my CPU when DDR5 hits and migrate the 1080Ti over.

From what I’ve played ray tracing has only been worth it for metro exodus and control. I’ve just started control but the reflections are really great just based on the environment. That said it’s about a 35-40% performance loss to use them and I haven’t decided if I’ll play all the way through with medium RTX on, this is with a 2080 ti.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
It's actually enormously more complex than GGPO. All GGPO has to do is buffer the last X frames of gamestate and input for extremely computationally simple two-player games, and re-simulate from the timestamp when input is received. Other kinds of games have to do much more complex things, for numerous reasons: the game states are many orders of magnitude more complex, the number of actions the player can take is essentially infinite, and the number of players is greater than 2.

They do in fact do a lot more than just consider where player models were when the timestamp was received. You can go read through the Source SDK if you really care to get an idea of how much goes into it in just Source.
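The GGPO-style loop described here can be sketched in a few lines (a toy model: state is just two positions and inputs are +1/-1; the frame buffer and re-simulation are the point, everything else is made up for illustration):

```python
def step(state, inputs):
    # One simulation tick: each player's position moves by their input.
    return [pos + inp for pos, inp in zip(state, inputs)]

class Rollback:
    def __init__(self, initial_state):
        self.states = [initial_state]   # states[f] = gamestate at frame f
        self.inputs = []                # inputs[f] = [p1, p2] used at frame f

    def advance(self, local_input, predicted_remote=0):
        # Predict the remote input (e.g. "same as last frame") and advance.
        self.inputs.append([local_input, predicted_remote])
        self.states.append(step(self.states[-1], self.inputs[-1]))

    def confirm_remote(self, frame, actual_remote):
        # A late remote input arrives: if the prediction was wrong,
        # rewind to that frame and re-simulate forward to the present.
        if self.inputs[frame][1] == actual_remote:
            return
        self.inputs[frame][1] = actual_remote
        for f in range(frame, len(self.inputs)):
            self.states[f + 1] = step(self.states[f], self.inputs[f])

game = Rollback([0, 0])
for _ in range(4):
    game.advance(local_input=+1)        # remote input predicted as 0
game.confirm_remote(frame=1, actual_remote=-1)
print(game.states[-1])                  # [4, -1]: history quietly rewritten
```

The re-simulation loop is trivial here precisely because the step function is trivial; that's the cost asymmetry with more complex games.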

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

DrDork posted:

I have high quality FTTH, and even then average RTT to local datacenters is in the 8-10ms range. Most of my gaming targets sub-15ms frame times. With just the network layer taking half or more of the time it takes my local system to render an entire frame, good luck ever getting latency to compare.

The only way I could see this being true is if they cherry picked the scenario so you had someone on fiber sitting next to their datacenter but playing on some potato computer with an iGPU. Then maybe it would technically be faster because their local processing capacity would be almost nil, but that's not what their clipping said, so :shrug:

Wouldn't be the first tech startup to outright lie about the capabilities of their product, though.

I have no doubts whatsoever that first-gen streaming services are not going to be as good as a high-end gaming PC delivering 144+ Hz from a box right next to you.

If it ends up being something that lives in every telco/cable company local office, then maybe, but I think you're right that traversing the network stack is just too time-consuming. It's like the difference between Ethernet and RDMA networking: even if the physical layer is just as fast, the software slows it down too much.

Cygni posted:

i guess the upside is the next gen consoles are clearly going to want to market 4k VRR to the max, so hopefully that means more pressure on the big developers to not leave their engines in complete dogshit lazily programmed states cause even 4k/60 is a pretty big ask on the rumored hardware specs

ah who am i kidding, Asscreed still gonna run like trash

VRR is overdue and is definitely going to be a big thing this generation. I think LG's new OLEDs have it now? (e.g. the 55C9) Pretty much the only TV worth paying attention to IMO.

Paul MaudDib fucked around with this message at 23:59 on Oct 11, 2019

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Can't wait for every developer to drop their framerate target from 30 FPS to 25.

Seriously though, the input latency benefits to VRR on consoles should be pretty significant. It's definitely one of the areas you can really feel at times.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Happy_Misanthrope posted:

Um...what

Are you like counting lower-precision render targets for things like DOF/Godrays against 1080p? Because the above is absolutely false. There's dynamic res, upscaling, etc., but there are plenty of 1080p games on PS4 - the Xbox One is routinely 1600x900, but this was known at launch.

If you include the PS4 Pro, sure. But the launch PS4 hit 1080p@30 only in less complex games early on: BF4, COD BOPs, AssCreed Syndicate and Unity, and a slew of others all rendered internally at less than 1080p. Over time, as engines improved, more and more targeted 1080p@30, and now some of the less demanding (or very well optimized) games target 1080p@60. But you really need the Pro to hit 1080p@60 for most of them, and you still have a bunch that won't. The Xbox was even worse at launch.

It's gonna be the same story for 4k: launch editions will render most complex games at less than 4k@30 and upscale from there, and then over time we'll see internal resolution creep up. With VRR we might not actually see as much of a push to 60 FPS, though, since it helps make 30 FPS feel a lot smoother anyhow. For TV gaming that's honestly a perfectly reasonable approach, but people expecting native 4k@60 for whatever AAA FPS is in the launch lineup are gonna be disappointed.
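The pixel math makes the appeal obvious (a quick illustration; shading cost scaling linearly with pixel count is a simplification that ignores fixed per-frame costs):

```python
# Pixel-count arithmetic behind "render lower, upscale".
resolutions = {"900p": (1600, 900), "1080p": (1920, 1080), "4k": (3840, 2160)}
native_w, native_h = resolutions["4k"]
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / (native_w * native_h):.0%} of 4k pixel count")
# 900p is ~17% and 1080p is 25% of the 4k pixel count -- why internal
# resolution plus upscaling is so tempting on fixed console hardware.
```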

ufarn
May 30, 2009

Paul MaudDib posted:

VRR is overdue and is definitely going to be a big thing this generation. I think LG's new OLEDs have it now? (eg 55C9) Pretty much the only TV worth paying attention to IMO.
VRR has this standardization issue where TVs like the C9 only have VRR at framerates above 40, which doesn't mean a whole lot on the consoles you're likely to play your games on.

This is going to be HDR all over again, where games either just won't work, don't have HDR with Game Mode on, or lack the brightness (nits) to give you actual HDR.

It's HD Ready and Full HD all over again.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
Nvidia is the sole culprit behind the VRR landscape being a loving mess. Just adopt the HDMI 2.1 standards, you shower of bastards.

Budzilla
Oct 14, 2007

We can all learn from our past mistakes.

DrDork posted:

Wouldn't be the first tech startup to outright lie about the capabilities of their product, though.
Google is no longer a startup, it's quite a successful company now. Stadia is more like Valve's PowerPlay.

Having said that, I would like to have a system like Stadia, but I doubt it will happen soon.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

Zedsdeadbaby posted:

Nvidia is the sole culprit behind the VRR landscape being a loving mess. Just adopt the HDMI 2.1 standards, you shower of bastards.

They do support it now.

ufarn posted:

VRR has this standardization issue where TVs like the C9 only have VRR at framerates above 40, which doesn't mean a whole lot on the consoles you're likely to play your games on.

This is going to be HDR all over again, where games either just won't work, don't have HDR with Game Mode on, or lack the brightness (nits) to give you actual HDR.

It's HD Ready and Full HD all over again.

40-120 is a wide enough window for frame doubling to work just fine, so it doesn't really matter how low the framerate goes.
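A sketch of that frame-doubling idea, often called low framerate compensation (the function and window numbers are illustrative, not any scaler's actual implementation):

```python
def vrr_refresh(game_fps, vrr_min=40, vrr_max=120):
    """Map a game frame rate into a VRR window by repeating frames.

    If the game drops below the window's floor, each frame is scanned
    out 2x, 3x, ... so the panel's refresh rate stays in range.
    """
    multiplier = 1
    while game_fps * multiplier < vrr_min:
        multiplier += 1
    refresh = game_fps * multiplier
    if refresh > vrr_max:
        raise ValueError("can't map this frame rate into the VRR window")
    return refresh, multiplier

print(vrr_refresh(25))   # (50, 2): a 25 FPS game scans out at 50 Hz
print(vrr_refresh(18))   # (54, 3): even 18 FPS stays inside 40-120
```

Because the 40-120 window spans a 3x ratio, any plausible game frame rate maps cleanly, which is why the floor at 40 matters less than it sounds.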

Optimus Subprime
Mar 26, 2005

Ideas are more powerful than guns. We would not let our enemies have guns, why should we let them have ideas?

Has anyone here ever had an issue with a 144Hz monitor where the color dynamic range output switches from Full to Limited, and the output color format switches out of RGB? It won't let me swap settings back to RGB and full dynamic range unless I drop the output to 100Hz or lower. Sometimes replugging the DisplayPort cable fixes it, but it'll change again later. I'm getting the feeling that it may be a cable or a port issue.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Optimus Subprime posted:

Has anyone here ever had an issue with a 144Hz monitor where the color dynamic range output switches from Full to Limited, and the output color format switches out of RGB? It won't let me swap settings back to RGB and full dynamic range unless I drop the output to 100Hz or lower. Sometimes replugging the DisplayPort cable fixes it, but it'll change again later. I'm getting the feeling that it may be a cable or a port issue.

If it sometimes works, sometimes doesn't, that absolutely sounds like a cable or port issue, and would be where I'd start my troubleshooting.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
Yeah, reminds me of HDMI cables with insufficient bandwidth for like 4k 4:4:4. Maybe try a different cable. I’ve had the worst drat luck with DisplayPort.

Optimus Subprime
Mar 26, 2005

Ideas are more powerful than guns. We would not let our enemies have guns, why should we let them have ideas?

It's strange because I've used this cable for years without issue, but it seems to be giving up the ghost. The color output will be throttled after I wake it up from sleep or turn it on, but if I shut the PC down, unplug both sides of the cable, replug, and turn it on, it goes back to full color output at 144Hz again.

eames
May 9, 2009

I can think of a number of reasons why that could happen (corroded connectors, broken solder joints, a bent cable, corrosion inside the cable, etc.). I'd also suggest buying a new DisplayPort 1.4 certified cable, maybe from a known brand.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

If I have to care (too much) what brand my cable is, your spec is bad

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE

Optimus Subprime posted:

Has anyone here ever had an issue with a 144Hz monitor where the color dynamic range output switches from Full to Limited, and the output color format switches out of RGB? It won't let me swap settings back to RGB and full dynamic range unless I drop the output to 100Hz or lower. Sometimes replugging the DisplayPort cable fixes it, but it'll change again later. I'm getting the feeling that it may be a cable or a port issue.
I have had something similar happen with some regularity. I have an XV273K (4k 120Hz) and sometimes when resuming from sleep the Nvidia display driver decides to set the output to YCbCr 422 instead of RGB, which the monitor doesn't actually support, so I get "invalid input format" and no picture. Fortunately I have a second monitor running off of the iGPU so I can drag the Nvidia control panel over there and set the output to 2560x1440 RGB, because if I try to set it to 4K RGB that doesn't work and it just resets back to YCbCr. Then I need to log out and log in again, and then it'll let me set the output back to 4K RGB. It's super obnoxious and happens maybe once or twice a month or so. Hasn't happened in a while though so maybe they fixed it in the driver, I dunno.

This happened at least once with a regular 4K 60hz monitor before I got this one, too, but that one at least supported YCbCr input so it didn't go blank when it happened. Also that time it seemed impossible to get back to RGB without nuking the drivers with DDU. :iiam:

TheFluff fucked around with this message at 17:53 on Oct 12, 2019

Optimus Subprime
Mar 26, 2005

Ideas are more powerful than guns. We would not let our enemies have guns, why should we let them have ideas?

Yeah, pretty much my situation, and I swapped the cable for a new VESA-certified one but can recreate the problem, so I'll try rolling back the drivers to an older set and see if that does anything.

edit: I've just given up and swapped to using an HDMI cable instead, with no issues since I only need to push 1080p 144Hz.

Optimus Subprime fucked around with this message at 19:11 on Oct 12, 2019

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Statutory Ape posted:

If I have to care (too much) what brand my cable is, your spec is bad

You just have to care whether it’s certified, which is the point of the spec.

Freakazoid_
Jul 5, 2013


Buglord

Statutory Ape posted:

If I have to care (too much) what brand my cable is, your spec is bad

Still using DVI in 2019 :smug:

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

Freakazoid_ posted:

Still using DVI in 2019 :smug:

I was until I got my 2070 Super, but it only has DisplayPorts and an HDMI port :mad:

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
lmao if you're not using a converter to an RF adaptor

Optimus Subprime
Mar 26, 2005

Ideas are more powerful than guns. We would not let our enemies have guns, why should we let them have ideas?

Out of curiosity, I rolled my GeForce drivers back to 430.86, and I am able to get my dynamic range back to RGB Full at 2560x1080 144Hz with my DisplayPort output. Guess something went bad from 431 onward.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
So I've been waiting for the C9 GSync update (supposedly it's dropping this month) and I couldn't be more excited! From what I can tell, C9 GSync compatibility makes it the best gaming display on the market besides the super-expensive Nvidia big-format gaming displays.

Of course, a common refrain from the community is "it's not REAL GSync, just Freesync!". I get this, but I've seen some opposing viewpoints that I would appreciate some clarification on.

1) OLED has absurdly responsive pixels. This causes stutter on low-fps sources (which include Blu-rays at 24Hz). However, word on the street is that this fast pixel response, which frankly sucks for movies, is going to be a huge boon to GSync, as it essentially removes the need for variable overdrive.

My understanding is that this feature is one of the main differentiating points of Freesync vs "real" GSync.

This seems to be corroborated by RTings, which gives the C9 a perfect 10/10 for response time:

quote:

80% Response Time : 0.2 ms
100% Response Time : 2.4 ms
Like all OLED TVs, the LG C9 has a nearly-instantaneous response time. There is some very subtle overshoot in near-black scenes, but this shouldn't be very noticeable.

This extremely fast response time can cause the image to stutter, which may bother some people.

Given this, many people are making the argument that, though the C9 doesn't have onboard GSync, it's roughly as good as onboard since it has no need for variable overdrive. So in practice you wouldn't really be able to tell the difference. Does anyone have an opinion on this? It seems true from what I've seen (and you can't really argue with those numbers afaik) but I'm no expert so I was curious about the wider viewpoint.

2) The C9 GSync will be using an HDMI 2.1 VRR implementation akin to the one currently enjoyed by the Xbox One. It's 40-120Hz, which covers PC gaming just fine. My other question is: are there any benefits of HDMI 2.1 VRR that are worth discussing? Or is it ultimately all the same in the end, within its FPS range?

Taima fucked around with this message at 04:04 on Oct 13, 2019

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
There are Freesync displays with good variable overdrive implementations. That said, no LCD responds as fast as an OLED. You get horrible artifacting if you try to get even close. I wouldn't buy one as a monitor unless your "desk" is at least 5-6 feet deep and you only want to keep it a few years at most, but if you want a TV/living room monitor, it's a good choice.

K8.0 fucked around with this message at 04:13 on Oct 13, 2019

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

K8.0 posted:

There are Freesync displays with good variable overdrive implementations. That said, no LCD responds as fast as an OLED. You get horrible artifacting if you try to get even close. I wouldn't buy one as a monitor unless your "desk" is at least 5-6 feet deep and you only want to keep it a few years at most, but if you want a TV/living room monitor, it's a good choice.

Yeah I got you. I'm not trying to use it as a monitor.

Taima fucked around with this message at 04:41 on Oct 13, 2019

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

Optimus Subprime posted:

Out of curiosity, I rolled my GeForce drivers back to 430.86, and I am able to get my dynamic range back to RGB Full at 2560x1080 144Hz with my DisplayPort output. Guess something went bad from 431 onward.

lol, this has been my experience for the last three years. I couch surf using a 4k TV and it's a crapshoot whether the next Nvidia driver release supports 4:4:4 or not. Sometimes I get a pink screen! Sometimes it's green! Sometimes it's washed out! Who the gently caress knows!

B-Mac
Apr 21, 2003
I'll never catch "the gay"!
Digital Foundry has a video up about the ray-traced emissive lighting used in the new Metro Exodus DLC; it looks fantastic.

https://youtu.be/PBhnTVuD31I

Excited for Nvidia and AMD's next-gen GPUs to see how ray tracing performance improves. Metro Exodus's ray-traced global illumination was great and the ray-traced reflections in Control are really good as well; it will be great to see games start to incorporate several of these options with a lesser performance hit. DLSS in Control is the best implementation I've played with too, and it's nice that you can also edit an INI file and set a manual DLSS resolution. I'm using a DLSS resolution of 2176x1226 on my 1440p monitor; it's hard to tell the difference and I get a decent performance bump.
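For scale, the arithmetic on that custom resolution (just illustrating the numbers in the post above, not any DLSS internals):

```python
# Internal DLSS render resolution vs. native output, per the post above.
internal = (2176, 1226)
native = (2560, 1440)

scale_x = internal[0] / native[0]
scale_y = internal[1] / native[1]
pixel_ratio = (internal[0] * internal[1]) / (native[0] * native[1])

print(f"per-axis scale: {scale_x:.2f} x {scale_y:.2f}")  # ~0.85 x 0.85
print(f"pixels shaded:  {pixel_ratio:.0%} of native")    # ~72%
```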

Mr.PayDay
Jan 2, 2004
life is short - play hard

Thelonius Van Funk posted:

I have a GTX 1080 Ti that I am pretty pleased with, but the prices for the RTX 2080 Ti have finally started dropping in Sweden. Is there any point in upgrading or should I just hold out for the new cards in 2020?

I have 2 gaming buddies at work who actually stepped up from their 1080Ti to a 2080Ti several months ago, because a 2180Ti or 3080Ti before summer 2020 is highly unlikely.

Even the beast 1080Ti already struggles in some games at 1440p ultra settings to reach or keep 60 avg fps. The 2080Ti adds a serious performance punch and additional FPS headroom above the 1080Ti, plus the raytracing option as an additional perk/feature.

If
1.) you are an "all sliders at ultra/extreme settings" junkie and want the currently fastest raytracing option (for only some games tho)
2.) you play at 1440p or 3440x1440 or even 4K
3.) you sometimes even like to add additional ReShade filters that cost fps
4.) you prefer a faster GPU instead of, for example, a bigger SSD or a new monitor
5.) you don't want to wait another 9-15 months for a 2180Ti or 3080Ti
6.) you want to play RDR2 and Cyberpunk in all their graphical glory with the fastest GPU
(or if you have that disposable income anyway)

... then the 2080Ti is the right step up to replace the 1080Ti.
The 2080Ti is still far ahead of all other AMD and Nvidia GPUs fps-wise.

Whether it's worth it is your decision. My irrational, fps-infected mind says: buy that beautiful monster Framefernale that is the 2080Ti.

Mr.PayDay fucked around with this message at 17:32 on Oct 13, 2019
