xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Rabid Snake posted:

That's what I was thinking; I can always use it as a normal 75Hz ultrawide if I switch to an Nvidia card. I love my 290 (except for the power draw), so I might stick with AMD if they can release some good competitive cards.

Right now NV is the only game in town for top performance. AMD is holding out for HBM2 to release its top two chips, but that means its 1080 Ti equivalent should arrive about the same time NV starts selling that tier to consumers rather than data centers. If you don't mind flipping cards, an NV card in the interim may be a good call. I'm personally committed to riding out my 290 unless Polaris is better than expected and viable for me on an XR341CK, but I don't mind turning down a few settings and running sub-60 with FreeSync on for demanding games.

froody guy
Jun 25, 2013

This belongs in the GPU Megathread, but it's worth crossposting:

https://www.youtube.com/watch?v=WpUX8ZNkn2U

Frakkin' amazing setting; makes me wish I had a high refresh rate monitor (w/o G-Sync!)

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

froody guy posted:

This belongs in the GPU Megathread, but it's worth crossposting:

https://www.youtube.com/watch?v=WpUX8ZNkn2U

Frakkin' amazing setting; makes me wish I had a high refresh rate monitor (w/o G-Sync!)

But fast sync matters when your render rate is higher than your refresh rate, so having a high refresh rate display makes fast sync *less* valuable, and makes vsync less of a latency issue.

fozzy fosbourne
Apr 21, 2010

Yeah, I've seen a few people jumping to the conclusion that fast sync obsoletes G-Sync, but they don't seem to be solving the same problem. From reading PCPer, it sounds like there would be judder at lower framerates, and the feature would really shine when you are rendering many more frames than your monitor's refresh (like CS:GO or a map game or whatever).

I'm curious whether fast sync works with ULMB; I assume it does, and that might be a (subtle) enhancement for less demanding games where you are pegged at 120Hz.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

fozzy fosbourne posted:

Yeah, I've seen a few people jumping to the conclusion that fast sync obsoletes G-Sync, but they don't seem to be solving the same problem. From reading PCPer, it sounds like there would be judder at lower framerates, and the feature would really shine when you are rendering many more frames than your monitor's refresh (like CS:GO or a map game or whatever).

I'm curious whether fast sync works with ULMB; I assume it does, and that might be a (subtle) enhancement for less demanding games where you are pegged at 120Hz.

The Nvidia dude in the video actually goes into the G-Sync/Fast Sync relationship; basically, he says that ideally you want both, because G-Sync helps at low frame rates while Fast Sync is the opposite, removing the tearing seen at very high frame rates beyond your monitor's refresh rate.

Also, Fast Sync should be coming to most Nvidia cards, including past ones; it's not a 1080/1070 thing, it's just a driver thing.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Rabid Snake posted:

Everyone's been talking about the Acer X34, but how is the FreeSync version, the XR341CK? I know the biggest knock is 100Hz -> 75Hz.

$590 in the Acer Recertified store is tempting... but I was thinking about upgrading to an Nvidia 1080 card, so this monitor might not be the best choice vs a G-Sync version.

It's a fantastic monitor. First, obviously, is the price. It's worth remembering that Acer caught on to people buying the recertified X34s and jacked those up to $1,000, so the X34 is almost double the price of the XR34. Second, there seem to be fewer QA issues with them. A lot of that's more or less anecdotal, and it may also be a case of people being pickier about their $1,300 monitor than their $600 one, but it is what it is. Third, 75Hz is generally not a hard limit: many people report being able to push it into the high 80s or low 90s, at which point you're really very close to the X34 to begin with, especially since not all X34s will ever hit 100Hz (Acer only says "up to 100Hz" and does not guarantee it in any way, though you could probably play monitor roulette if getting stuck at 95Hz really offended you).

I mean, if price is no object and you have an Nvidia card, then by all means go with the X34. But if $400+ is something that matters to you, or you have or think you will have an AMD card, the XR34 looks mighty attractive.

fozzy fosbourne
Apr 21, 2010

AVeryLargeRadish posted:

The Nvidia dude in the video actually goes into the Gsync/Fast sync relationship, basically he says that ideally you want both because Gsync helps at low frame rates and Fast sync is the opposite helping remove tearing seen at very high frame rates beyond your monitor's refresh rate.

Also Fast sync should be coming to most Nvidia cards including past ones, it's not a 1080/70 thing, it's just a driver thing.

Interesting. I've read about people capping their fps a few frames below their refresh rate to keep G-Sync enabled; I wonder if it's still worth doing that in some contexts, since it sounds like fast sync still adds a frame of latency.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

DrDork posted:

I mean, if price is no object and you have an Nvidia card, then by all means go with the X34. But if $400+ is something that matters to you, or you have or think you will have an AMD card, the XR34 looks mighty attractive.

The XR341CK is one of those happy FreeSync monitors (usually a low-end phenomenon) that are really compelling even without FreeSync turned on. A lot of the reason I got one is that it made it really easy not to worry about vendor lock-in.

froody guy
Jun 25, 2013

Even in cases where your GPU render rate is below the monitor's refresh rate, fast sync will show the "last rendered frame" in its entirety, so no tearing, no stuttering. It's not dynamic like G-Sync, but it sounds like gold to me.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

froody guy posted:

Even in cases where your GPU render rate is below the monitor's refresh rate, fast sync will show the "last rendered frame" in its entirety, so no tearing, no stuttering. It's not dynamic like G-Sync, but it sounds like gold to me.

It also doesn't cost you an extra $200-$300, so there's that going for it, too.

fozzy fosbourne
Apr 21, 2010

froody guy posted:

Even in cases where your GPU render rate is below the monitor's refresh rate, fast sync will show the "last rendered frame" in its entirety, so no tearing, no stuttering. It's not dynamic like G-Sync, but it sounds like gold to me.

In a context where rendering framerate is lower than display refresh rate, isn't that pretty much standard (double buffered) vsync, though? If the GPU can't render frames faster than your monitor wants to display them, the system will only ever be able to buffer one complete frame, and it will have the same limitations with regard to stuttering and latency as vsync. It seems like you would need to sometimes hit 2x your refresh rate in order to do better than vsync (and even then, it seems like it would affect latency but not judder).

e: clarified the context in first sentence

fozzy fosbourne fucked around with this message at 23:59 on May 17, 2016

froody guy
Jun 25, 2013

The principle is the same as triple buffering, but by detaching the rendering from the display you don't create latency and don't risk filling up the buffer and creating drops or a queue on the render/GPU side. Basically, with triple buffering, once the buffer is full the rendering stops; this creates latency and lots of overhead. With fast sync the rendering keeps going as fast as it can (as with vsync off), so you'll always have the best and newest "last rendered frame" available to display, and zero talking between the display and the GPU as in "hey, you can start filling up the buffer again, I'm done with the previous frame". Hope it makes sense; the guy in the vid actually answers the same question at a certain point.

froody guy fucked around with this message at 23:26 on May 17, 2016
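
To make that buffer dance concrete, here's a minimal Python sketch of the "last rendered frame" idea as described above (a hypothetical illustration, not Nvidia's actual driver code): the renderer never blocks, and at each refresh the scan-out latches the newest complete frame.

```python
# Minimal sketch of a fast-sync-style "last rendered frame" swap.
# Hypothetical illustration only, not Nvidia's driver code.

class FastSyncBuffers:
    def __init__(self):
        self.front = None          # frame currently being scanned out
        self.last_complete = None  # newest fully rendered frame

    def render_done(self, frame):
        # The renderer never blocks: it finishes a frame, marks it as
        # the newest complete frame, and immediately starts the next
        # one in a free buffer. Older completed frames are overwritten.
        self.last_complete = frame

    def vblank(self):
        # At each refresh tick the display latches the newest complete
        # frame; anything rendered since the last tick but already
        # superseded is simply dropped.
        if self.last_complete is not None:
            self.front = self.last_complete
        return self.front

# Renderer running at ~4x the refresh rate: frames 1-3 are dropped,
# only the newest frame (4) ever reaches the screen.
bufs = FastSyncBuffers()
for frame in range(1, 5):
    bufs.render_done(frame)
print(bufs.vblank())  # -> 4
```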

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Triple buffering should work like fast sync, but instead it usually works like double buffering with an extra linear queue entry. Fast sync actually works; it works with DX and not just GL, and it doesn't require application awareness of buffer state for doing things that need access to the previous frame. (I'm not sure how TXAA works if previous frames can be dropped, actually, but I can't be bothered to draw a diagram and try to figure it out right now!)

fozzy fosbourne
Apr 21, 2010

Subjunctive posted:

Triple buffering should work like fast sync, but instead it usually works like double buffering with an extra linear queue entry. Fast sync actually works; it works with DX and not just GL, and it doesn't require application awareness of buffer state for doing things that need access to the previous frame. (I'm not sure how TXAA works if previous frames can be dropped, actually, but I can't be bothered to draw a diagram and try to figure it out right now!)

Yeah, that makes sense to me. But what I'm not clear on is how you take advantage of that when your rendering can't keep up with the display rate (in comparison to the usual double buffered vsync, not gross DirectX triple buffering :P). It seems that if your display rate is faster than you can render frames, then you'll still need to re-display the last displayed frame frequently, and you'd have the same latency and judder from showing a stale frame as double buffered vsync. You can't display a partial frame (tearing), and you aren't able to fill the buffer with more than one complete frame since you can't render them at a rate of 2x the display, so you'd pretty much always have a buffer of 0-1.9 completely drawn frames, right? Maybe it provides advantages at lower frame rates if your rendering pipeline is occasionally able to catch up and fill the buffer, which would never happen with double buffered vsync?

Around this timestamp, the presenter mentions that this is a technology for games that are rendering well in excess of the refresh rate: https://www.youtube.com/watch?v=WpUX8ZNkn2U&t=847s. Also, at the end of the presentation, when comparing this to G-Sync, he mentions that fast sync is useful when your render rate is a multiple of your display rate, which seems to jibe with my understanding that the technique isn't very useful unless you can typically render more than one frame per refresh. Not trying to be FUDdy here, but I want to make sure I understand this right.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

fozzy fosbourne posted:

Maybe it provides advantages at lower frame rates if your rendering pipeline is occasionally able to catch up and fill the buffer, which would never happen with double buffered vsync?

Also, at the end of the presentation, when comparing this to G-Sync, he mentions that fast sync is useful when your render rate is a multiple of your display rate, which seems to jibe with my understanding that the technique isn't very useful unless you can typically render more than one frame per refresh. Not trying to be FUDdy here, but I want to make sure I understand this right.

Fast sync only helps when render >> refresh, by reducing latency for vsync-on (by keeping the renderer from backing up on the back buffer). Gsync only helps when render < refresh (by varying the actual refresh so that it happens when the frame is ready rather than waiting until the next "tick").

Both avoid tearing that you get with vsync-off, in their respective refresh ranges.
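
To put rough numbers on that split, a back-of-the-envelope helper (the thresholds are my own illustration, not anything from Nvidia's presentation):

```python
# Which sync technique addresses your situation, per the summary above.
# The 2x threshold is an illustrative rule of thumb, not an Nvidia spec.

def which_sync_helps(render_fps: float, refresh_hz: float) -> str:
    if render_fps >= 2 * refresh_hz:
        return "fast sync: drop stale frames, avoid vsync backpressure"
    if render_fps < refresh_hz:
        return "G-Sync/FreeSync: refresh when the frame is ready"
    return "borderline: cap just under refresh, or plain vsync"

for fps in (300, 90, 45):
    print(f"{fps} fps on a 100Hz panel -> {which_sync_helps(fps, 100)}")
```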

fozzy fosbourne
Apr 21, 2010

Subjunctive posted:

Fast sync only helps when render >> refresh, by reducing latency for vsync-on (by keeping the renderer from backing up on the back buffer). Gsync only helps when render < refresh (by varying the actual refresh so that it happens when the frame is ready rather than waiting until the next "tick").

Both avoid tearing that you get with vsync-off, in their respective refresh ranges.

OK, yeah, that's my understanding.

Still think it will be pretty sweet if you can somehow peg a game at 120+ fps and enable ULMB.

e: Also, it seems like you are still better off capping your framerate just under your refresh and using G-Sync, if it's available, so you don't have the (however small) sampling misses and latency that they referred to in the video presentation. The crazies on Blur Busters seem to think the same.

fozzy fosbourne fucked around with this message at 01:20 on May 18, 2016

Yaoi Gagarin
Feb 20, 2014

How is the frame rate limit actually implemented in code? Does the driver simply block a present call until the "right" amount of time has passed? That seems like it would still introduce some latency even if the monitor supports *sync.

fozzy fosbourne
Apr 21, 2010

VostokProgram posted:

How is the frame rate limit actually implemented in code? Does the driver simply block a present call until the "right" amount of time has passed? That seems like it would still introduce some latency even if the monitor supports *sync.

Not sure, good question. This old Blur Busters article measured display lag with uncapped and capped G-Sync in CS:GO and found some interesting results, but didn't really have an explanation:
http://www.blurbusters.com/gsync/preview2/

fozzy fosbourne
Apr 21, 2010

Found this, where Durante describes the GeDoSaTo implementation of frame capping: http://blog.metaclassofnil.com/?p=715

Yaoi Gagarin
Feb 20, 2014

fozzy fosbourne posted:

Found this, where Durante describes the GeDoSaTo implementation of frame capping: http://blog.metaclassofnil.com/?p=715

That's interesting, and confirms what I was thinking. Frame rate capping seems to have the same input lag problem as vsync when you're rendering too fast, except the interval you wait for is configured by the user. Definitely seems like uncapped frame rate with fast sync is the way to go.

That predictive waiting system is pretty cool. Also makes me wonder if phones and laptops can save some power by putting the GPU in a lower-power state during the waiting period.


e: Good god, it's impossible to read the slides in the fast sync video. People should be banned from making slides with anything other than black-on-white contrast for text.

Yaoi Gagarin fucked around with this message at 05:39 on May 18, 2016
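
For reference, the general pattern Durante's post describes can be sketched as: sleep away most of the remaining frame interval (which is also where the power-saving idea above would kick in, since the CPU and GPU can idle), then busy-wait the last slice for precision. A hypothetical Python illustration, not GeDoSaTo's actual code:

```python
# Sketch of a frame limiter with coarse sleep plus a short busy-wait.
# Hypothetical illustration of the pattern, not GeDoSaTo's actual code.
import time

def limit_frame(prev_present: float, target_fps: float,
                spin_margin: float = 0.002) -> float:
    frame_time = 1.0 / target_fps
    deadline = prev_present + frame_time
    # Coarse wait: OS sleep is cheap (lets the CPU/GPU idle) but has
    # roughly millisecond granularity, so stop short of the deadline.
    remaining = deadline - time.perf_counter()
    if remaining > spin_margin:
        time.sleep(remaining - spin_margin)
    # Fine wait: spin for the last couple of milliseconds for precision.
    while time.perf_counter() < deadline:
        pass
    return time.perf_counter()  # the moment present() would be called

# Usage: call once per frame around your present/swap call.
t = time.perf_counter()
for _ in range(3):
    t = limit_frame(t, target_fps=60.0)
```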

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

You might be able to get more consistent frame times with a cap, at the cost of that input lag, but I bet that's highly implementation-dependent and usually broken in a way that negates the potential gains.

Mikojan
May 12, 2010

Hi guys, I just started building a new PC and am waiting to buy a GTX 1080 for some high-end gaming.

While waiting, I'm looking for the best monitor I can get in the $700-900 range.

For now I'm torn between the Asus PG27AQ and the Acer Predator XB271HU.

IPS and G-Sync are a must, and it basically comes down to:

4K (163 PPI) at 60Hz
vs
1440p (108 PPI) at 165Hz

One of my hobbies is photography, and that 163 PPI is really, really tempting for some insane-looking pictures. On the other hand, I'm afraid the GTX 1080 is going to be just barely not good enough for gaming at 4K, according to the latest benchmarks.
Or does G-Sync really make gaming at <60 FPS more acceptable? (I've never used G-Sync.)

I'm not even sure 165Hz is worth it for non-competitive gaming (read: at or below 60 FPS)?

Is the 4K one more 'future-proof', considering I could just game at lower resolutions until appropriate GPUs are out?

Mikojan fucked around with this message at 11:23 on May 18, 2016

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Mikojan posted:

Hi guys, I just started building a new PC and am waiting to buy a GTX 1080 for some high-end gaming.

While waiting, I'm looking for the best monitor I can get in the $700-900 range.

For now I'm torn between the Asus PG27AQ and the Acer Predator XB271HU.

IPS and G-Sync are a must, and it basically comes down to:

4K (163 PPI) at 60Hz
vs
1440p (108 PPI) at 165Hz

One of my hobbies is photography, and that 163 PPI is really, really tempting for some insane-looking pictures. On the other hand, I'm afraid the GTX 1080 is going to be just barely not good enough for gaming at 4K, according to the latest benchmarks.
Or does G-Sync really make gaming at <60 FPS more acceptable? (I've never used G-Sync.)

I'm not even sure 165Hz is worth it for non-competitive gaming (read: at or below 60 FPS)?

Is the 4K one more 'future-proof', considering I could just game at lower resolutions until appropriate GPUs are out?

According to everyone I have heard from who has used G-Sync, it really helps out a ton at lower frame rates. However, I would probably go with the higher refresh monitor, because from what I have heard the difference in smoothness going from 60Hz to 100Hz+ is huge, and you will see much better frame rates in general. So basically: high refresh + higher frame rates + G-Sync > higher DPI + lower frame rates compensated for by G-Sync.

Mikojan
May 12, 2010

AVeryLargeRadish posted:

According to everyone I have heard from who has used G-Sync, it really helps out a ton at lower frame rates. However, I would probably go with the higher refresh monitor, because from what I have heard the difference in smoothness going from 60Hz to 100Hz+ is huge, and you will see much better frame rates in general. So basically: high refresh + higher frame rates + G-Sync > higher DPI + lower frame rates compensated for by G-Sync.

After delving a bit further into the DPI topic, I found that the 1440p monitor viewed from 32 inches has the same apparent sharpness as the 4K screen viewed from 21 inches.

I don't see myself sitting anywhere near 21 inches from a 27-inch screen, so that is off the table.

XB271HU it is, then.

Mikojan fucked around with this message at 13:28 on May 18, 2016
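
The arithmetic behind that checks out: apparent sharpness scales with pixels per degree of visual angle, which for small angles is roughly proportional to PPI times viewing distance. A quick sanity check, assuming both are 27-inch panels:

```python
# Pixels per degree: for small angles, roughly PPI * distance * pi/180.
import math

def ppd(h_px: int, v_px: int, diag_in: float, dist_in: float) -> float:
    ppi = math.hypot(h_px, v_px) / diag_in      # diagonal pixel density
    return ppi * dist_in * math.pi / 180.0      # small-angle approximation

print(round(ppd(3840, 2160, 27, 21), 1))  # 27" 4K at 21 in    -> ~60 PPD
print(round(ppd(2560, 1440, 27, 32), 1))  # 27" 1440p at 32 in -> ~61 PPD
```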

PRADA SLUT
Mar 14, 2006

Inexperienced,
heartless,
but even so
How is the Dell U2412M? I'm looking at work monitors but have to order from CDW.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

PRADA SLUT posted:

How is the Dell U2412M? I'm looking at work monitors but have to order from CDW.

Get the Dell U2415 instead; it is the newer and better version of the U2412M, and it gets rid of some problems that monitor had, like replacing the nasty screen-door anti-glare coating with a much better one that is almost unnoticeable. I have the U2415 myself and it is a really nice monitor.

Shaocaholica
Oct 29, 2002

Fig. 5E
What's a 23-24" 16:9 or 16:10 IPS display that has component input and isn't terribly old?

Something newer than a Dell 2408WFP or HP LP2475w???

Shaocaholica fucked around with this message at 00:16 on May 19, 2016

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


If you need component for some ungodly reason, buy a separate converter and buy whatever monitor you want.

Component is a dead standard and it isn't going to be found on anything approaching a modern monitor.

Shaocaholica
Oct 29, 2002

Fig. 5E

bull3964 posted:

If you need component for some ungodly reason, buy a separate converter and buy whatever monitor you want.

Component is a dead standard and it isn't going to be found on anything approaching a modern monitor.

Yeah, I know it's dead. Just trying to find out what the last run with them was.

Tony Montana
Aug 6, 2005

by FactsAreUseless
So I did some cable testing last night and I confirmed it.

Over the 2m DP cable that came with my X34 I can overclock to 100Hz without a problem. I just set it in the monitor OSD and it switches; I left it sitting like that for half an hour and there wasn't a flicker or anything. Great. So the monitor can do it; my X34 enjoys 100Hz, and that's a grand thing.

However, with the 5m or even the 10m DP cable it doesn't work. When I make the change in the OSD the monitor turns itself off, or perhaps displays the new resolution for a second and then flickers off. The 10m cable cost me $100! I just like having the PC across the room next to the TV, cabled back to a desk where I do my desk-related stuff.

Either it's the length, and pushing the bandwidth of 3440x1440@100Hz over anything longer than 2m isn't possible, or it's the cables, but they are brand-name cables and I paid a lot for them.

Anyone got some experience with this?

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Tony Montana posted:

So I did some cable testing last night and I confirmed it.

Over the 2m DP cable that came with my X34 I can overclock to 100Hz without a problem. I just set it in the monitor OSD and it switches; I left it sitting like that for half an hour and there wasn't a flicker or anything. Great. So the monitor can do it; my X34 enjoys 100Hz, and that's a grand thing.

However, with the 5m or even the 10m DP cable it doesn't work. When I make the change in the OSD the monitor turns itself off, or perhaps displays the new resolution for a second and then flickers off. The 10m cable cost me $100! I just like having the PC across the room next to the TV, cabled back to a desk where I do my desk-related stuff.

Either it's the length, and pushing the bandwidth of 3440x1440@100Hz over anything longer than 2m isn't possible, or it's the cables, but they are brand-name cables and I paid a lot for them.

Anyone got some experience with this?

Yeah, I had a similar experience trying to daisy-chain; it seemed like I was only getting two lanes over the 10-foot cable. That works fine normally, but you'd need all four lanes for >60Hz adventures. AMD's drivers show the lane count; no idea if you can see it in NV's drivers. I've heard hearsay about cables that can do all four lanes at a 10-foot length, but I don't remember them off the top of my head, and they may not actually work or may be the exception.
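
The bandwidth arithmetic backs up the length/lane theory (rough numbers; the blanking overhead is approximated, not exact CVT-RB timing):

```python
# Rough DisplayPort bandwidth estimate for 3440x1440 @ 100Hz, 24bpp.
active = 3440 * 1440 * 100 * 24        # active pixel data, bits/s
with_blanking = active * 1.15          # ~15% blanking overhead (approx.)
hbr2_usable = 4 * 5.4e9 * (8 / 10)     # 4 lanes * 5.4 Gbit/s, 8b/10b coding

print(f"payload: {with_blanking / 1e9:.1f} Gbit/s "
      f"of {hbr2_usable / 1e9:.2f} Gbit/s usable")
# -> payload: ~13.7 Gbit/s of 17.28 Gbit/s usable. Little margin for a
#    marginal long cable, and nowhere near enough if only two lanes
#    train successfully (2 lanes = ~8.6 Gbit/s usable).
```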

Tony Montana
Aug 6, 2005

by FactsAreUseless
Right. So 10 feet is about 3 meters. That's a short cable! My 5m cable is over 16 feet and my 10m cable is almost 33 feet!

The point being, my cables are much longer than the 10-foot cable you've noticed this behavior with. Sounds like cable length is the cause; maybe you can get the data over a 3m cable, but it sounds like it's certainly not going to happen over a 5m or 10m one.

Time to move the whole bloody lounge room around so my PC can be 2 meters from the monitor. That's OK, it's worth it for that glorious monitor.

Evil Fluffy
Jul 13, 2009

Scholars are some of the most pompous and pedantic people I've ever had the joy of meeting.
What's the major difference between the XB270HU and XB271HU at this point, other than that the latter is newer and overclocks to 165Hz? I've been looking around more lately in my impatience for the 1070/1080 to go on sale, and I have seen more refurbished XB270HU monitors for around $450.

Etrips
Nov 9, 2004

Having Teemo Problems?
I Feel Bad For You, Son.
I Got 99 Shrooms
And You Just Hit One.

Evil Fluffy posted:

What's the major difference between the XB270HU and XB271HU at this point, other than that the latter is newer and overclocks to 165Hz? I've been looking around more lately in my impatience for the 1070/1080 to go on sale, and I have seen more refurbished XB270HU monitors for around $450.

Just get the refurb!

Tony Montana
Aug 6, 2005

by FactsAreUseless
Everyone seems to be waiting for the 1080. It's going to be really expensive, as the top tier always is. Isn't it going to be another incremental upgrade? Why not buy a GTX 980? I have had one for something like 18 months now and it's great.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Tony Montana posted:

Everyone seems to be waiting for the 1080. It's going to be really expensive, as the top tier always is. Isn't it going to be another incremental upgrade? Why not buy a GTX 980? I have had one for something like 18 months now and it's great.

Well, sure. But people who are waiting for the 1080 are well aware of that. The major advantages over the 980 Ti are (1) it's faster, (2) much better VR performance, (3) presumably better DX12/Vulkan performance going forward, and (4) it's newer and therefore more-better. It also has some features which won't do much for you now, but might down the road a bit (DP 1.4, etc.).

In a thread filled with people buying/thinking of buying/lusting over $1300+ monitors, it seems odd to turn around and question spending an extra $200-$300 on a GPU.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

Tony Montana posted:

Why not buy a GTX 980? I have had one for something like 18 months now and it's great.

Because the 1070 is faster and cheaper.

Etrips
Nov 9, 2004

Having Teemo Problems?
I Feel Bad For You, Son.
I Got 99 Shrooms
And You Just Hit One.

DrDork posted:

In a thread filled with people buying/thinking of buying/lusting over $1300+ monitors, it seems odd to turn around and question spending an extra $200-$300 on a GPU.

To be fair, monitors have a longer life cycle compared to GPUs.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Etrips posted:

To be fair, monitors have a longer life cycle compared to GPUs.

Sure, but you look me in the eye and tell me that $1300 for an X34 is a sensible and economically sound purchase over a $600 monitor because you're gonna keep it "long term."

People buy this sort of stuff because they have the money to do so and want to, and much like the X34 offers things that no other monitor does (hence its price), so too with the 1080.

VulgarandStupid
Aug 5, 2003
I AM, AND ALWAYS WILL BE, UNFUCKABLE AND A TOTAL DISAPPOINTMENT TO EVERYONE. DAE WANNA CUM PLAY WITH ME!?

Evil Fluffy posted:

What's the major difference between the XB270HU and XB271HU at this point, that the latter is newer and overclocks to 165hz? I've been looking around more lately in my impatience for the 1070/1080 to go on sale and have seen more refurbished XB270HU monitors for around $450.

A lot of XB270HUs will OC to 165Hz; it all depends on the manufacturing date.
