eames
May 9, 2009

Taima posted:

So the 2019 LG OLEDs are getting support for HDMI 2.1 VRR Gsync when paired with a 2000 series Nvidia card.

This is a huge deal. Is this the very first implementation we know of re: HDMI 2.1 GSYNC?

https://www.engadget.com/2019/09/09/g-sync-lg-oled/

In case anyone else is wondering what kind of G-Sync this is talking about, it's G-Sync Compatible, so Nvidia certified VRR without the FPGA. (I can't open engadget)

I imagine we'll see a lot more of these announcements leading up to the launch of the next gen consoles.


craig588
Nov 19, 2005

by Nyc_Tattoo
Nvidia driver 436.30 released https://www.geforce.com/drivers/results/151275
Lots of minor changes, nothing major

Worf
Sep 12, 2017

If only Seth would love me like I love him!

Isn't laptop monitor gsync done via HDMI standard

Riflen
Mar 13, 2009

"Cheating bitch"
Bleak Gremlin

Statutory Ape posted:

Isn't laptop monitor gsync done via HDMI standard

No. It's achieved using embedded DisplayPort and LVDS.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

Riflen posted:

No. It's achieved using embedded DisplayPort and LVDS.

Ty, I must have misunderstood what I read (or misremembered)

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

Taima posted:

So the 2019 LG OLEDs are getting support for HDMI 2.1 VRR Gsync when paired with a 2000 series Nvidia card.

This is a huge deal. Is this the very first implementation we know of re: HDMI 2.1 GSYNC?

https://www.engadget.com/2019/09/09/g-sync-lg-oled/

If this move puts an end to DisplayPort and all its bullshit I will buy Nvidia GPUs until the end of time.

Zarin
Nov 11, 2008

I SEE YOU

K8.0 posted:

If this move puts an end to DisplayPort and all its bullshit I will buy Nvidia GPUs until the end of time.

Rookie question here, but what is bad about DisplayPort?

The only thing I can think of is the computer thinking the monitor was disconnected when it goes to sleep/shuts off. Which is annoying, but something I've just managed to get used to, I guess.

Sure would be nice to get un-used to that, though. Anything else?

wolrah
May 8, 2006
what?

Zarin posted:

Rookie question here, but what is bad about DisplayPort?

The only thing I can think of is the computer thinking the monitor was disconnected when it goes to sleep/shuts off. Which is annoying, but something I've just managed to get used to, I guess.

Sure would be nice to get un-used to that, though. Anything else?

That's not DisplayPort, that's the monitor. If the monitor stops sending the "I'm here" signal the graphics card stops reporting that it's there. The same signal exists in HDMI, DVI, even VGA. That behavior is your monitor doing something weird, not some inherent nature of the protocol.

There seems to be a popular display controller IC (or line of them) with this quirk, because there are a few manufacturers where a large portion of the product line does what you describe, but there are also many that don't.

Indiana_Krom
Jun 18, 2007
Net Slacker
Look for an HDMI/DisplayPort/etc. "deep sleep" feature/toggle in your monitor's OSD; it is responsible for the monitor disconnecting when in sleep mode. Disable it and the monitor stops disconnecting when it is in sleep/off modes, so your desktop and all its icons will stay in place.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

Zarin posted:

Rookie question here, but what is bad about DisplayPort?

The only thing I can think of is the computer thinking the monitor was disconnected when it goes to sleep/shuts off. Which is annoying, but something I've just managed to get used to, I guess.

Sure would be nice to get un-used to that, though. Anything else?

The point of DisplayPort is that it's supposed to be HDMI but better, because it's just for monitors. The problem is that, because the standards are always lagging way behind, it's HDMI but worse. HDMI 2.1 standardized enough bandwidth for proper 4K high refresh along with VRR in November 2017. DP 2.0, which finally brings enough bandwidth for 4K high refresh, was only just ratified almost two years later. We've had two years of extremely lovely and limited high refresh 4K monitors, not because the panels aren't good enough, but because DisplayPort is rear end.

Similarly, you see all the wake/sleep bugs and similar protocol issues crop up constantly with DP and not HDMI because it's not as widespread and frequently not implemented as well. It just needs to loving die, it has nothing to offer.

K8.0 fucked around with this message at 22:59 on Sep 10, 2019

Geemer
Nov 4, 2010



Indiana_Krom posted:

Look for an HDMI/DisplayPort/etc. "deep sleep" feature/toggle in your monitor's OSD; it is responsible for the monitor disconnecting when in sleep mode. Disable it and the monitor stops disconnecting when it is in sleep/off modes, so your desktop and all its icons will stay in place.

Unless your monitor goes :downs: when you press the power button again to start it and disconnects for half a second, making most of your windows 640x480 in the top-left corner because the defaults for Windows' virtual monitor are hot garbage.
At least it doesn't do it when waking from sleep.


Also, DP cable connectors loving suck with their little retaining latch that serves zero purpose other than annoying the user. I've never had a connector fall out on its own with HDMI, or even DVI and VGA (w/o tightening the screws on those).

wargames
Mar 16, 2008

official yospos cat censor

Taima posted:

So the 2019 LG OLEDs are getting support for HDMI 2.1 VRR Gsync when paired with a 2000 series Nvidia card.

This is a huge deal. Is this the very first implementation we know of re: HDMI 2.1 GSYNC?

https://www.engadget.com/2019/09/09/g-sync-lg-oled/

VRR is standard in HDMI 2.1, so AMD cards can do it with ease. Note that all the consoles will be using AMD hardware, so TVs have to support the consoles' VRR.

Also, G-Sync Compatible is just FreeSync by another name.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
All I know is that I'm about to have Nvidia tested and approved HDMI 2.1 VRR on my C9 OLED with my 2080 soon, and it's going to be great.

Can an AMD graphics card do that?

greasyhands
Oct 28, 2006

Best quality posts,
freshly delivered

wargames posted:

VRR is standard in HDMI 2.1, so AMD cards can do it with ease. Note that all the consoles will be using AMD hardware, so TVs have to support the consoles' VRR.

Also, G-Sync Compatible is just FreeSync by another name.

Yeah, but doesn't it have to be whitelisted by Nvidia to work? So being specifically listed as G-Sync Compatible "matters".

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
NVidia approved "GSync Compatible" just means they've tested it and it actually works right. You can still run FreeSync on non-approved monitors, it just may not function well. Or it might be perfectly fine. NVidia just didn't bother to test it.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

DrDork posted:

NVidia approved "GSync Compatible" just means they've tested it and it actually works right. You can still run FreeSync on non-approved monitors, it just may not function well. Or it might be perfectly fine. NVidia just didn't bother to test it.

How common is it that consumer monitors are untested but would otherwise pass? Nvidia had already tested over 500 different consumer monitors, and that was as of 6 months ago. They said only 6% of those monitors passed.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
You shouldn't really care about Nvidia's testing results, you should just care about buying known-good monitors. Tons of monitors work just fine but won't pass due to odd, specific requirements of Nvidia's certification process, and given how time-consuming their tests are, the list is probably more a measure of how much a manufacturer was willing to pay for priority in certification than anything else.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
That's fair. Do you know what they're testing that these monitors fail on, but that a consumer wouldn't notice?

I do find that a little confusing because unless I'm missing something (which is very possible) Nvidia shouldn't have any vested interest in writing certification criteria that disqualify monitors for no reason.

craig588
Nov 19, 2005

by Nyc_Tattoo
They really want Gsync to sell so all of their time and money wasn't wasted developing it.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
A lot of FreeSync monitors can only actually execute the VRR functions across a fairly narrow band, say 40-70 FPS, which I imagine would itself be enough to not meet NVidia's requirements, even though they still work just fine within the limitations of the monitor.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Yeah, if you want “it works as well as an AMD card would with this FreeSync monitor” there are a lot more monitors than meet NVIDIA’s higher bar of “works roughly as well as this NVIDIA card would with a GSync monitor”.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
Oooh ok, understood thanks.

Forgive my naivete on this subject, I've been a pretty dedicated PC gamer since forever but the last few years I've preferred big screen TV gaming over smaller monitors, hence why VRR hasn't really been on my radar (and also why I'm extra stoked on this C9 announcement).

A few questions about the specifics if y'all don't mind:

1) I have a 2080 and really enjoy RTX on games that support it. Does VRR make RTX more feasible at higher resolutions, since you can have dips in the FPS and not "feel" it as much? I played through Exodus on a native 1080p TV with full RTX and it was a blast, but I did miss the resolution as I'm usually gaming at 4k/60.

2) What happens if you go over or under the FPS "range" that a given VRR display utilizes? Like, for instance, DrDork posted that a theoretical range could be 40-70fps. Sometimes you would think that the FPS would go above 70 given, say, a scene that was particularly GPU non-intensive. What happens then?

I assume that FPS limiters wouldn't work, as my understanding is that FPS limits are essentially VSYNC? Though I could be wrong on that for sure.

3) Are there any other benefits of VRR that I'm missing beyond the elimination of tearing and the ability to not feel FPS dips as much?

Picardy Beet
Feb 7, 2006

Singing in the summer.

Taima posted:

How common is it that consumer monitors are untested but would otherwise pass? Nvidia had already tested over 500 different consumer monitors, and that was as of 6 months ago. They said only 6% of those monitors passed.

Last I looked, the AW2518HF wasn't listed. But after using it in G-Sync Compatible mode, I know the AW2518HF works perfectly this way. As it's the FreeSync version of the AW2518H (the one with the GSync module and the Nvidia tax), maybe, just maybe, neither Nvidia nor Dell really wants to advertise the AMD solution.

sauer kraut
Oct 2, 2004

Taima posted:

2) What happens if you go over or under the FPS "range" that a given VRR display utilizes? Like, for instance, DrDork posted that a theoretical range could be 40-70fps. Sometimes you would think that the FPS would go above 70 given, say, a scene that was particularly GPU non-intensive. What happens then?

I assume that FPS limiters wouldn't work, as my understanding is that FPS limits are essentially VSYNC? Though I could be wrong on that for sure.

Hitting the upper limit just enables vsync. I don't actually know what happens when you crash through the low end on a narrow-band lovely FreeSync display; that is the reason why they are pointless.
On a proper one with, say, ~48 to 120/144fps, when you hit the lower end it switches instantly to frame doubling, so your 50fps become 100 at the output. If you crash even lower it can triple or even quadruple frames as needed.
That is a pretty huge hurdle for cheapo displays, as rapid changes in output fps like that manifest as perceived brightness changes (i.e. flickering or pulsating) if not compensated for. I would bet a modest sum that's the reason most monitors fail Nvidia's tests, even if they have the range to support Low Framerate Compensation.

Frame limiting with xsync to, say, 140fps on a 144Hz display can be a good idea (if your limiter is reliable), since it avoids hitting the upper end and enabling vsync with a sudden input lag spike. If you're not sensitive to that, don't bother.

sauer kraut fucked around with this message at 13:33 on Sep 11, 2019
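
If it helps to picture the frame-doubling behaviour sauer kraut describes, here's a rough Python sketch of the Low Framerate Compensation idea. It's purely illustrative: the 48-120 range is an example, and the exact point at which real drivers start multiplying (some reportedly double a bit early, and they smooth the transition to avoid the flicker mentioned above) is not something this toy captures.
code:
# Toy sketch of Low Framerate Compensation (LFC): when the game frame rate falls
# below the display's minimum VRR refresh, repeat each frame enough times that
# the output rate lands back inside the supported range.

def lfc_output(game_fps, vrr_min=48, vrr_max=120):
    """Return (multiplier, output_hz) for a given game frame rate."""
    if game_fps >= vrr_min:
        return 1, min(game_fps, vrr_max)       # inside the range: 1:1 (capped at max)
    multiplier = 2
    while game_fps * multiplier < vrr_min:     # double, triple, quadruple... as needed
        multiplier += 1
    return multiplier, game_fps * multiplier

for fps in (100, 50, 30, 20, 12):
    mult, hz = lfc_output(fps)
    print(f"{fps:>3} fps -> x{mult} -> {hz} Hz at the panel")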

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

I don't know when nVidia added this, but in the control panel there's an option for Fast vsync, and it seems like it's awesome. Not as good as GSync or FreeSync technologically, I am sure, but it seems to eliminate frame tearing without the input lag issues. Some reading online suggests that it perhaps works well with some games but not others, and it's not super popular with reddit gamers apparently, but I don't see this stuttering some of them are insisting is just part of how Fast vsync works? It just seems to be better than the other vsync methods in some things I am playing right now, and that includes under 60 fps, not just, as some are claiming, "when FPS is a multiple of monitor refresh."

(On reading further, it seems the stuttering was more prevalent when Fast vsync first released and some things have improved. It also seems to work best with the low latency mode on Ultra, which hasn't even existed for more than about a month now; some users reported much better results previously when using the then-best "max 1 pre-rendered frames" setting.)

Would love to try one of the real sync monitors though.

Agreed fucked around with this message at 14:52 on Sep 11, 2019

Saukkis
May 16, 2003

Unless I'm on the inside curve pointing straight at oncoming traffic the high beams stay on and I laugh at your puny protest flashes.
I am Most Important Man. Most Important Man in the World.

K8.0 posted:

HDMI 2.1 standardized enough bandwidth for proper 4K high refresh along with VRR in November 2017. DP 2.0, which finally brings enough bandwidth for 4K high refresh, was only just ratified almost two years later. We've had two years of extremely lovely and limited high refresh 4K monitors, not because the panels aren't good enough, but because DisplayPort is rear end.

I think that's your error. It was DisplayPort 1.3 in 2014 that enabled 4K@120Hz. HDMI has been lagging behind this whole decade, and it wasn't until 2.1 that HDMI got slightly ahead, but DP 2.0 has again left it far behind.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Saukkis posted:

I think that's your error. It was DisplayPort 1.3 in 2014 that enabled 4K@120Hz. HDMI has been lagging behind this whole decade, and it wasn't until 2.1 that HDMI got slightly ahead, but DP 2.0 has again left it far behind.

Not without some form of compression, however, if you want 10-bit colour with HDR. Linus has a handy chart where you can plug in your needs for a monitor and see the max refresh rate each standard would support. At 4K with 10-bit colour, RGB/YCbCr and no compression, DisplayPort 1.4 gives you a max of 97 Hz. HDMI 2.1 will give you 155 Hz.

DP was largely up to the task of delivering high refresh rates at 4K+ well before HDMI, yes. And while HDR is not really a concern for me in a PC monitor, it (or at least 10-bit colour) has become somewhat of a bullet-point item in a high-end display, so DP could indeed be limiting for a 120 Hz display.

And yes, DP 2.0 goes far ahead of HDMI 2.1. It's definitely not going away as a primary PC connection anytime soon.
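
For anyone curious where figures like the 97 Hz / 155 Hz come from, here's a back-of-the-envelope check in Python. The encoding overheads (8b/10b for DP 1.4 HBR3, 16b/18b for HDMI 2.1 FRL) are part of the specs, but the flat ~8% blanking allowance is my own rough stand-in for CVT-R2 timing, so treat the results as approximate rather than authoritative.
code:
# Max refresh ~= usable link rate / bits per frame (pixels * 3 channels * bpc),
# padded by a rough blanking allowance. Approximation only.

def max_refresh_hz(usable_gbps, h=3840, v=2160, bpc=10, blanking=1.08):
    bits_per_frame = h * v * 3 * bpc * blanking   # RGB, no compression
    return usable_gbps * 1e9 / bits_per_frame

dp14_usable   = 32.4 * 8 / 10    # 4 lanes x 8.1 Gbps, 8b/10b  -> ~25.9 Gbps
hdmi21_usable = 48.0 * 16 / 18   # 4 lanes x 12 Gbps, 16b/18b -> ~42.7 Gbps

print(f"DP 1.4  : ~{max_refresh_hz(dp14_usable):.0f} Hz")    # ~96 Hz (chart: 97)
print(f"HDMI 2.1: ~{max_refresh_hz(hdmi21_usable):.0f} Hz")  # ~159 Hz (chart: 155)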

repiv
Aug 13, 2009

Happy_Misanthrope posted:

And yes, DP 2.0 goes far ahead of HDMI 2.1.

Although to make things even more complicated, DP 2.0 is split up into three speed tiers, and only the slowest one (which is similar to HDMI 2.1 speed) works with conventional cables.

The two faster modes will probably require short cables that are permanently attached to the display in order to meet the insanely tight signaling requirements.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

repiv posted:

Although to make things even more complicated, DP 2.0 is split up into three speed tiers, and only the slowest one (which is similar to HDMI 2.1 speed) works with conventional cables.

The two faster modes will probably require short cables that are permanently attached to the display in order to meet the insanely tight signaling requirements.

oh ffs

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

It's not as stupid as it sounds: passive copper cables will still get you ~40Gbps bandwidth, which is around where HDMI 2.1 is targeting with their passive copper cables, too (~48Gbps for them, IIRC). DP 2.0's lowest bandwidth tier is literally TB3, but with all four channels used to push data in one direction, instead of the normal bi-directional 2/2 setup. Modern passive copper cabling more or less taps out around 40-50Gbps, and anything above that for either spec is going to require fancy solutions, most likely involving active cables or a switch to fiber at some point in the future.

The other part is that DP 2.0/HDMI 2.1's copper cabling can still support 4k@120@30b without compression, so the monitors that would actually need the special-snowflake tethered cables are going to be some pretty bleeding-edge / niche products for a while. Hopefully by the time a >4k >120Hz monitor costs something any of us here could actually afford, DP or HDMI will have figured out a better cabling option.
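
As a rough sanity check on the "4k@120@30b without compression still fits over copper" point, using the same approximate blanking allowance as the earlier sketch (so, ballpark numbers only):
code:
# Required rate for 4K @ 120 Hz @ 30 bits per pixel vs. the usable rate of
# DP 2.0's lowest tier (UHBR10, 128b/132b coding) and HDMI 2.1 FRL (16b/18b).

needed   = 3840 * 2160 * 120 * 30 * 1.08 / 1e9   # ~32 Gbps with rough blanking
uhbr10   = 4 * 10 * 128 / 132                    # ~38.8 Gbps usable
hdmi_frl = 48 * 16 / 18                          # ~42.7 Gbps usable

print(f"needed : {needed:.1f} Gbps")
print(f"UHBR10 : {uhbr10:.1f} Gbps usable")
print(f"HDMI2.1: {hdmi_frl:.1f} Gbps usable")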

Worf
Sep 12, 2017

If only Seth would love me like I love him!

Can't I just get solid gold cables

Then I could at least justify the monitor as an investment

Geemer
Nov 4, 2010



Statutory Ape posted:

Can't I just get cryogenic superconductor cables

Then I could at least justify the monitor as an investment

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

Taima posted:

Oooh ok, understood thanks.

Forgive my naivete on this subject, I've been a pretty dedicated PC gamer since forever but the last few years I've preferred big screen TV gaming over smaller monitors, hence why VRR hasn't really been on my radar (and also why I'm extra stoked on this C9 announcement).

A few questions about the specifics if y'all don't mind:

1) I have a 2080 and really enjoy RTX on games that support it. Does VRR make RTX more feasible at higher resolutions, since you can have dips in the FPS and not "feel" it as much? I played through Exodus on a native 1080p TV with full RTX and it was a blast, but I did miss the resolution as I'm usually gaming at 4k/60.

2) What happens if you go over or under the FPS "range" that a given VRR display utilizes? Like, for instance, DrDork posted that a theoretical range could be 40-70fps. Sometimes you would think that the FPS would go above 70 given, say, a scene that was particularly GPU non-intensive. What happens then?

I assume that FPS limiters wouldn't work, as my understanding is that FPS limits are essentially VSYNC? Though I could be wrong on that for sure.

3) Are there any other benefits of VRR that I'm missing beyond the elimination of tearing and the ability to not feel FPS dips as much?

1. Maybe. I can't really tell you how sensitive you are to bad FPS. 58 FPS/60hz with vsync is a heinous experience to some people and totally fine to others.

2. If you go over or under the FPS range (frame time really, it's on a per frame basis), VRR is off and you get the standard behavior you would get with either vsync or no sync. This is why capping your framerate (with RTSS or the few good in-game limiters, the other ways suck) is in fact an integral part of using VRR.

3. You almost eliminate the massive latency penalty of vsync, while avoiding potential reaction time increases from your brain having issues processing images with tearing.

K8.0 fucked around with this message at 20:56 on Sep 11, 2019
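
A tiny sketch of what point 2 means in practice: the driver's decision is per frame time, and a cap a few fps under the panel's maximum keeps every frame inside the VRR window. The 48-144 range and the 3 fps margin are just example numbers, and a display with working LFC would handle the too-slow case by frame doubling as discussed above.
code:
# Per-frame check: does this frame time fall inside the VRR window?

VRR_MIN_HZ, VRR_MAX_HZ = 48, 144
cap_fps = VRR_MAX_HZ - 3                        # e.g. an RTSS limit at 141 fps

def vrr_handles(frame_time_ms):
    hz = 1000 / frame_time_ms                   # instantaneous rate for this frame
    return VRR_MIN_HZ <= hz <= VRR_MAX_HZ

for ft in (5.0, 1000 / cap_fps, 12.0, 25.0):    # 200, 141, ~83, 40 fps frames
    state = "VRR" if vrr_handles(ft) else "fallback (vsync or tearing)"
    print(f"{ft:5.1f} ms frame -> {state}")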

Indiana_Krom
Jun 18, 2007
Net Slacker

Statutory Ape posted:

Can't I just get solid gold cables

Then I could at least justify the monitor as an investment

No.

(Because copper is a better conductor than gold, we just gold plate the connections because it doesn't tarnish like the bare copper would.)

sauer kraut
Oct 2, 2004
Reviews for the MSI Gaming X 5700XT are in, and of course they flicked the power target up 50W and vCore to a laughable 1200mV.
For a measly 1-2 fps gain :smith: Don't buy overclocked AMD anything, please.

Cavauro
Jan 9, 2008

Actually, gold has a lower hardness than copper so it's easier for electricity to flow through it.

Indiana_Krom
Jun 18, 2007
Net Slacker

Cavauro posted:

Actually, gold has a lower hardness than copper so it's easier for electricity to flow through it.

No poo poo? I must be thinking about thermal conductivity...

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Indiana_Krom posted:

No poo poo? I must be thinking about thermal conductivity...

No, you're right. Silver actually has the best electrical conductivity, followed very closely by copper. Gold is the next step down. The IACS standard has pure copper as 100, silver as 105, and gold as 70.

Gold is used on connectors because, unlike copper and silver, it's non-reactive, and thus does not corrode or tarnish, which would cause connectivity issues.
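
Those IACS figures roughly check out against textbook resistivities, if anyone wants to see the arithmetic (published resistivity values vary a little by source, so the percentages are approximate):
code:
# %IACS = resistivity of the IACS reference copper / resistivity of the metal.

RHO_IACS = 1.7241e-8                                  # ohm-metre, 100% IACS by definition
resistivity = {"silver": 1.59e-8, "copper": 1.68e-8, "gold": 2.44e-8}

for metal, rho in resistivity.items():
    print(f"{metal:6s}: {100 * RHO_IACS / rho:.0f}% IACS")
# silver ~108%, copper ~103% (modern annealed copper slightly beats the old
# reference), gold ~71% -- in line with the 105 / 100 / 70 quoted above.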

Indiana_Krom
Jun 18, 2007
Net Slacker
I knew silver was better than copper and gold is a fair step back for thermal conductivity (diamond beats them all by a significant amount) but I wasn't sure on electrical conductivity.


Cavauro
Jan 9, 2008

Sorry for my post
