KOTEX GOD OF BLOOD
Jul 7, 2012

SwissArmyDruid posted:

Yes.



Look at that water mesh. Well done. They should have had a function to scale back the amount of tessellation the farther away it gets from the camera, but it's serviceable.

EXCEPT WAIT



WATER TESSELLATION MESH ALL THE TIME, NO CULLING, NO EXCEPTIONS

TL;DR Crytek are poo poo and I'm not really surprised they nearly went bankrupt and shed people to Star Citizen.

I am also not optimistic on Star Citizen.
Crysis 3, while having a stupid, stupid singleplayer mode, does still have the best multiplayer FPS gameplay of anything out there. Unfortunately all the servers are 100% level 50s and hackers these days.


CharlieFoxtrot
Mar 27, 2007

organize digital employees



AVeryLargeRadish posted:

Asus's hardware is generally well made and engineered, but Asus is also renowned for their awful customer service, so YMMV. As far as whether that card will fit, you would need to measure, especially in a mITX build where space is at a premium.

Thanks! GPU "thickness" doesn't seem to be a spec that's universally available, but luckily I was able to find a completed build on PCPartPicker with the same case/mobo and a video card that's just as thick, so things are looking up.

Otakufag
Aug 23, 2004
"A 1070 is overkill for 1080p, you should have gotten a 1060 instead kill yourself" is a common statement around here, but does it hold if you are also planning on getting a 144hz 1080p gsync monitor?

Otakufag fucked around with this message at 21:19 on Sep 3, 2016

Phlegmish
Jul 2, 2011



Otakufag posted:

"A 1070 is overkill for 1080p, you should have gotten a 1060 instead kill yourself" is a common statement around here, but does it hold if you are also planning on getting a 144hz gsync monitor?

No; I have a 1070 and a 1080p/144Hz monitor myself. It's not like you're getting 144 FPS on the highest settings in the newest games; even with some older games it's not a given.

sauer kraut
Oct 2, 2004

Otakufag posted:

"A 1070 is overkill for 1080p, you should have gotten a 1060 instead kill yourself" is a common statement around here, but does it hold if you are also planning on getting a 144hz gsync monitor?

Not many people would say that; a 1060 is precisely what you need right now to play new games (with sane, high-quality settings) at 1080p/60fps.
There's no telling how soon you'd have to reduce settings. The performance/price scaling up to the 1070 is linear, so you get exactly what you pay for. The 1080 has a moderate premium tax attached.

Klyith
Aug 3, 2007

GBS Pledge Week

CharlieFoxtrot posted:

If I am staying above 60fps, the benefit to Gsync would be minimal, right?

Benefits of gsync aren't minimal, but 144hz does close a lot of the gap compared to a 60hz monitor. The gsync tax is only worth paying if the rest of the components in your build are already plenty good and you still have an extra $200 to throw at the monitor. Saving that cash in favor of other stuff (or banking it toward the next upgrade) is probably a good idea if you're on the fence about affordability.

Also the only thing that makes a gsync monitor even remotely a good purchase is the fact that a good monitor can last 5-8 years. Getting a refurb with no warranty seems like a terrible idea in that case.


Otakufag posted:

"A 1070 is overkill for 1080p, you should have gotten a 1060 instead kill yourself" is a common statement around here, but does it hold if you are also planning on getting a 144hz 1080p gsync monitor?
Pshaw there are plenty of people in this thread for whom overkill is a meaningless word!

A 1080 would be overkill for that monitor. A 1070 is more like "heh I like keeping the eye candy maxed out and I can afford it." Get it with clear conscience if $400 is fine for you.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Klyith posted:

Benefits of gsync aren't minimal, but 144hz does close a lot of the gap compared to a 60hz monitor.

How does having a higher maximum refresh rate on the display provide any of the benefits of gsync?

Klyith
Aug 3, 2007

GBS Pledge Week

Subjunctive posted:

How does having a higher maximum refresh rate on the display provide any of the benefits of gsync?

Less delay between a frame being ready and being drawn on the screen, and less noticeable skipped refreshes when the framerate is not an even multiple of the refresh rate. The ideal is zero delay and zero skips, and that's what free/gsync do*. But higher and higher refresh rates get closer to that ideal. A theoretical 300hz monitor would probably make *sync pointless.

But it would be much easier to just make adaptive sync part of the standard than try to make 300hz monitors (and DP cables).

*in theory; in the real world there's still signal & processing time, but whatever



edit: For an example, a 60hz monitor displaying a constant 50fps (w/ vsync) will average 6ms of extra delay just from ready frames waiting on the next refresh. A 144hz monitor only adds 3ms.
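
(A throwaway sketch of the toy model behind those figures, if anyone wants to play with the numbers - it assumes frames finish at a fixed rate and each one is shown at the next refresh tick, and it ignores the back-pressure vsync puts on the renderer, so it's illustrative rather than how any driver actually schedules flips.)

```python
import math
from fractions import Fraction

def avg_vsync_wait_ms(fps, refresh_hz, n_frames=10_000):
    """Average time a finished frame sits waiting for the next refresh tick.

    Toy model: the GPU finishes a frame every 1/fps seconds and the panel can
    only show it on the next 1/refresh_hz tick. Vsync back-pressure on the
    renderer is ignored, so treat the output as illustrative only.
    """
    frame = Fraction(1, fps)
    tick = Fraction(1, refresh_hz)
    total = Fraction(0)
    for i in range(n_frames):
        ready = i * frame
        next_tick = math.ceil(ready / tick) * tick
        total += next_tick - ready
    return float(total / n_frames) * 1000

print(avg_vsync_wait_ms(50, 60))    # ~6.7 ms - the "6ms" figure above
print(avg_vsync_wait_ms(50, 144))   # ~3.3 ms - the "3ms" figure
```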

Klyith fucked around with this message at 22:46 on Sep 3, 2016

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Klyith posted:

Benefits of gsync aren't minimal, but 144hz does close a lot of the gap compared to a 60hz monitor. The gsync tax is only worth paying if the rest of the components in your build are already plenty good and you still have an extra $200 to throw at the monitor. Saving that cash in favor of other stuff (or banking it toward the next upgrade) is probably a good idea if you're on the fence about affordability.

Also the only thing that makes a gsync monitor even remotely a good purchase is the fact that a good monitor can last 5-8 years. Getting a refurb with no warranty seems like a terrible idea in that case.

Pshaw, there are plenty of people in this thread for whom overkill is a meaningless word!

A 1080 would be overkill for that monitor. A 1070 is more like "heh, I like keeping the eye candy maxed out and I can afford it." Get it with a clear conscience if $400 is fine for you.

144 Hz and GSync are totally different things. Faster refresh rates make games smoother (if you can push the framerates). GSync fixes judder/stuttering/frame-pacing and tearing. They work well together, especially since many people cannot push 144 Hz in most games and GSync covers up the juddering and tearing issues that result, but they are not the same - you can be running >60fps and still have judder that makes the game feel less smooth.

GSync is actually pretty important across the board because it negates the impact of framerate variability, but you will see the most significant gains when you are using hardware that is really just too weak for what you're trying to do with it. If your hardware can only push 45 fps in a game, then that's perfectly playable on GSync.

The "GSync tax" is oversold. You can pick up a Dell S2716DG refurb for $350. You're going to pay a minimum of $300 for a new 27" 1440p IPS monitor anyway, even if you're buying a Korean monitor with a seconds-grade panel (unless you go for one of the really cheapo panels with a PWM backlight that causes eyestrain) or a 60 Hz model. Or if you insist on buying new, BestBuy has had the XB240H for $300 according to PcPartPicker. So the "GSync tax" is more like $50-100 nowadays.

There's nothing wrong with getting refurbs, especially if it's a Dell. Acer has been hit-or-miss in the past but they've stepped it up recently (especially their packaging, they're no longer shipping monitors in unpadded cardboard boxes).

Honestly the 1080 is not overkill for anything the 1070 is not overkill for. It's only 20% faster, that's not going to make or break you, or allow you to step up to 4K ultra gaming, or anything like that. If a 1070 gets 60 fps then the 1080 is only going to get 72 fps, so it's like being able to turn a setting or two a notch higher, for a cost of $200. The reason the 1080 is a bad recommendation for most people is because it's 50% more expensive for that minimal performance increase.
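
Back-of-envelope version of that math, if it helps (the ~$400 and ~$600 prices are assumed street prices, and the 20% uplift is the figure from this post rather than a benchmark):

```python
# Assumed street prices and the ~20% uplift quoted above - not benchmark data.
p1070, p1080 = 400, 600
fps1070 = 60
fps1080 = fps1070 * 1.2              # ~72 fps

extra_fps = fps1080 - fps1070        # ~12 fps
extra_cost = p1080 - p1070           # $200
print(f"~${extra_cost / extra_fps:.0f} per extra frame")            # ~$17
print(f"${p1070 / fps1070:.1f}/fps vs ${p1080 / fps1080:.1f}/fps")  # 6.7 vs 8.3
```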

Given the choice between spending $200 extra on going from a 1070 to a 1080 and buying a GSync monitor I would go GSync all the way.

Paul MaudDib fucked around with this message at 23:16 on Sep 3, 2016

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Also GSync/Freesync is probably about as big a deal as IPS, so choosing between the two, if it's a good TN like the Dell, is actually a real choice no matter how hard the thread chants IPS.

Anime Schoolgirl
Nov 28, 2002

VA is better :ssh:

CharlieFoxtrot
Mar 27, 2007

organize digital employees



The Asus MG279Q is $489 for a 27 inch 1440p 144Hz monitor without GSync. The Asus PG279Q is $799 for a 27 inch 1440p 144Hz monitor with GSync, which means that the one feature costs $310 at that level. I mean if you're made of money perhaps that is a no-brainer decision but

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Klyith posted:

Less delay between a frame being ready and being drawn on the screen, and less noticeable skipped refreshes when the framerate is not an even multiple of the refresh rate. The ideal is zero delay and zero skips, and that's what free/gsync do*. But higher and higher refresh rates get closer to that ideal. A theoretical 300hz monitor would probably make *sync pointless.

But it would be much easier to just make adaptive sync part of the standard than try to make 300hz monitors (and DP cables).

*in theory; in the real world there's still signal & processing time, but whatever



edit: For an example, a 60hz monitor displaying a constant 50fps (w/ vsync) will average 6ms of extra delay just from ready frames waiting on the next refresh. A 144hz monitor only adds 3ms.

Averages aren't how things work in reality. The reality is you are trying to draw to a device that expects a new image exactly every 16ms with a GPU that can only produce an image every 20ms. Under VSync you can only put an image on screen at a refresh, and only when one is ready, so if you can only draw at 50fps then those images will only be displayed at 30fps. It also effectively adds a ton of input lag.

If you are not using VSync then the average framerate will be better, but not every frame will be displayed for the same amount of time. You will have to display every fifth image twice. That's called "judder".

Increasing the refresh rate will help decrease the impact of both problems, but GSync/Freesync fix it entirely. The refresh rate is whatever your framerate is, problem solved (at least until you can push more frames than your refresh rate).
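
Here's a tiny sketch of that "every fifth image twice" cadence under one simplified model (every refresh shows the newest finished frame, no back-pressure on the renderer - an assumption, not a statement about how every driver behaves):

```python
import math
from collections import Counter
from fractions import Fraction

def refreshes_per_frame(fps, refresh_hz, n_frames=1000):
    """How many refresh ticks each frame stays on screen for, under a toy
    model where each tick shows the newest finished frame and the renderer
    is never throttled by vsync."""
    first_tick = [math.ceil(Fraction(i * refresh_hz, fps)) for i in range(n_frames + 1)]
    return Counter(first_tick[i + 1] - first_tick[i] for i in range(n_frames))

print(refreshes_per_frame(50, 60))   # roughly {1: 800, 2: 200} - every fifth frame shown twice
print(refreshes_per_frame(50, 144))  # roughly {3: 880, 2: 120} - an uneven 2-vs-3 tick cadence
```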

Paul MaudDib fucked around with this message at 23:36 on Sep 3, 2016

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

CharlieFoxtrot posted:

The Asus MG279Q is $489 for a 27 inch 1440p 144Hz monitor without GSync. The Asus PG279Q is $799 for a 27 inch 1440p 144Hz monitor with GSync, which means that the one feature costs $310 at that level. I mean if you're made of money perhaps that is a no-brainer decision but

Oh so we're just assuming that a single company's pricing is representative of all monitors everywhere?

"The Acer GN246HL is $200 for a 24" 1080p 144 Hz monitor without GSync. The Acer XB240H is $300 for a 24" 1080p 144 Hz monitor with GSync, which means that one feature costs $100."

"The Acer XF270HU is $550 for a 27" 1440p 144 Hz monitor without GSync. The Acer XB270HU is $700 for a 27" 1440p 144 Hz monitor with GSync, which means that one feature costs $150".

Welp theory disproven, case closed.

Paul MaudDib fucked around with this message at 23:40 on Sep 3, 2016

CharlieFoxtrot
Mar 27, 2007

organize digital employees



I dunno what's up with that snark but I want a monitor at those specs and those are my options. I know people masturbate over GSync but every time I see that kind of attitude I wonder how much of a cult mentality is involved

Klyith
Aug 3, 2007

GBS Pledge Week

Paul MaudDib posted:

GSync is actually pretty important across the board because it negates the impact of framerate variability, but you will see the most significant gains when you are using hardware that is really just too weak for what you're trying to do with it. If your hardware can only push 45 fps in a game, then that's perfectly playable on GSync.

G & freesync are both very cool. I never said that 144hz is just as good, only that it makes up some ground. I would like adaptive refresh to be a standard thing, on all monitors & video cards. But not with vendor lock-in and super-high patent license costs.

The fact that you have to go to very specific scenarios to maximize the impact of *sync, for example 45fps on a 60hz monitor, says a lot. 45fps on a 144hz monitor is a whole lot less variable -- it's not about having "hardware that's too weak", it's just the math of common denominators.

quote:

The "GSync tax" is oversold. You can pick up a Dell S2716DG refurb for $350. You're going to pay a minimum of $300 for a new 27" 1440p IPS monitor anyway
Compare apples with apples please. Where can you get that dell for $350, is that deal available all the time or once 5 months ago? A $300 IPS screen is not the same as a TN screen, some people do stuff other than play video games and TN vs IPS is not the same for them. A 90 day warranty is not the same as 3 years.

When a monitor is available that is the same panel from the same OEM and it has freesync or gsync, the gsync version costs about $100-200 more. That's not trivial. So maybe it's more like $150 than $200, but still. People have budgets.

quote:

There's nothing wrong with getting refurbs, especially if it's a Dell.

Except the lack of warranty. I have bought refurb stuff, and generally it's been fine. But personally my risk tolerance goes down the more money something costs and the more I expect to use it.

do you even know what math is? 1/144 = 0.007

SlayVus
Jul 10, 2009
Grimey Drawer

Klyith posted:

do you even know what math is? 1/144 = 0.007

The post that he quoted was talking about 60Hz. Aka 1/60 = 0.0167s, or about 16.7ms.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Samsung is finally making those with freesync. Ultrawide 144hz with 125% sRGB, oh baby. :neckbeard:

Klyith
Aug 3, 2007

GBS Pledge Week

CharlieFoxtrot posted:

I dunno what's up with that snark but I want a monitor at those specs and those are my options. I know people masturbate over GSync but every time I see that kind of attitude I wonder how much of a cult mentality is involved

basically there are two types of people who blow tons of money on expensive kit:
"wow this poo poo is sweet I love it"
"wow this poo poo is sweet everyone else should love it"

the second type, when hearing talk of cheaper alternatives that are also capable of being good, tend to get very unhappy unless everything is prefaced with a disclaimer that their stuff is obviously the best.

SlayVus posted:

The post that he quoted was talking about 60Hz. Aka 1/60=0.0166ms

So since we're talking about the improvements of 144hz over 60hz, I guess we can consider that a solid reason that 144hz is better.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Klyith posted:

The fact that you have to go to very specific scenarios to maximize the impact of *sync, for example 45fps on a 60hz monitor, says a lot. 45fps on a 144hz monitor is a whole lot less variable -- it's not about having "hardware that's too weak", it's just the math of common denominators.

A minimum framerate of 45 fps is not very uncommon, and every game has areas that are more intensive than others. You can be running 65+ fps in most of Witcher 3 but drop to 45 in Novigrad, or 55-60 fps in FO4 down to 40fps in Boston.

It's particularly going to be the case when you're in the use-case I mentioned - using hardware that is really too weak to keep the minimum framerate up for what you're trying to do with it.

quote:

Compare apples with apples, please. Where can you get that Dell for $350 - is that deal available all the time, or once 5 months ago? A $300 IPS screen is not the same as a TN screen; some people do stuff other than play video games, and TN vs IPS is not the same for them. A 90 day warranty is not the same as 3 years.

It's available whenever the Dell outlet has refurbs in stock, yes.

The difference between a good-quality Dell TN and an IPS is real, but small. Here's my S2716DG next to my P2715Q - the P2715Q is slightly contrastier and more vivid but it's hard to tell unless you have them side by side.



Right now there are no good AMD cards to choose from if you need more performance than a 390X can give, so GSync is the default way to go for gaming. Fury has stutter issues and too little memory, and the RX 480 is no faster than the 390. For 1440p you really need something in the 980 Ti/1070 performance class.

If you are buying more for work than for games, then knock yourself out. If you insist on having a single very-top-of-the-line monitor that can handle both, and you absolutely must have it brand new with warranty, then you pay the price; it's as simple as that. If you can budge even slightly on any of the constraints then there's a very minimal price difference.

I hope you have a colorimeter, by the way. Otherwise you're way behind the curve on this stuff; an uncalibrated IPS is not going to do much for you, and a good colorimeter is at least another $150. An extra $100 for GSync kinda disappears into the whole thing after you're spending $500 on a brand new IPS monitor, $400 on a GPU, and $150 on a colorimeter.

quote:

do you even know what math is? 1/144 = 0.007

I was using your figures of 50fps @ 60 Hz there, buddy. 1/(60 Hz) is about 16.7ms.

VSync essentially locks the framerate to integer divisors of the refresh rate. So if you can only push 50 fps then you are still effectively locking it to 36 Hz (division by 4) instead of 30 Hz. So we are talking about a difference between a 33ms frametime at 60 Hz and a 28 ms frametime at 144 Hz, or a 16% difference. It's not that big an impact.

Nobody really recommends VSync for performance-sensitive applications because it kinda sucks for a variety of reasons; this is one of them. You will have better luck without it on a non-GSync monitor, but there will still be some judder from the non-integer pulldown.

Paul MaudDib fucked around with this message at 00:14 on Sep 4, 2016

Craptacular!
Jul 9, 2001

Fuck the DH

CharlieFoxtrot posted:

I don't think I can do it. I want a 1440p 144Hz IPS and I don't think I can pay the GSync tax. It would be in budget at Acer Recertified but I don't really feel comfortable with only a 90 day warranty on something that I need to last years and that has already been returned once, especially reading some of the bad stories here and on Reddit with faulty monitors. Also it never seems to be in stock there anyway.

If I am staying above 60fps, the benefit to Gsync would be minimal, right? It seems with a 1070 now I could probably do that on High for most things, and I guess I could just revisit this in a couple of years if Nvidia decides to accept Freesync or AMD has better options.

Get a Freesync monitor and an AMD card now; the future cost of selling your AMD used and getting either an NVidia that supports Freesync (if that universe comes) or a better AMD model (if it doesn't) is still less than the GSync tax.

The 470s/480s perform worse than the 1060 by a margin, but the threat of having to replace a video card in a couple of years because it wasn't the best performer in its day isn't worth "I guess I'll just ignore this technology completely."

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
The RX 480 doesn't really have enough horsepower for 1440p and the Fury doesn't have enough memory for 1440p, but the good news there is that Freesync will help compensate for AMD's lackluster GPU lineup. Like I said, the ideal use-case for GSync is when you really don't have enough GPU horsepower for what you're trying to push, and FreeSync is mostly the same. Go for the RX 480 because the Fury has some microstutter issues that FreeSync can't fix.

I say "mostly" because the catch is that GSync actually extends to lower framerates in many cases, and always supports framerate doubling when you drop below that, so you do need to check the specific FreeSync monitor and make sure that (a) it extends low enough, or (b) that its maximum is at least 2.5x its minimum to enable framerate doubling. The 1440p/144 Hz monitors are typically pretty solid FreeSync implementations though.

Don't count on resale value too much though, because when the latest buttcoin mining craze burns out the market will be swamped with used 480s.

Paul MaudDib fucked around with this message at 00:21 on Sep 4, 2016

Setset
Apr 14, 2012
Grimey Drawer

Paul MaudDib posted:

The RX 480 doesn't really have enough horsepower for 1440p and the Fury doesn't have enough memory for 1440p, but the good news there is that Freesync will help compensate for AMD's lackluster GPU lineup.


Like where are you getting your info from? Fury catches up to a lot of more powerful cards in higher resolutions like 1440p. If anything, 1440p is one of the Fury's strengths, not a weakness.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Ninkobei posted:

Like where are you getting your info from? Fury catches up to a lot of more powerful cards in higher resolutions like 1440p. If anything, 1440p is one of the Fury's strengths, not a weakness.

If you are only looking at average framerates, and only at settings levels that keep below 4 GB of VRAM utilization, sure.

The Fury has a huge problem with microstutter. Its frame-pacing is highly variable even before it hits the VRAM limit, and once it hits the VRAM limit it's all over the map.

4 GB of VRAM is only just enough for 1440p even at this point, and it's not enough for 4K.

http://www.extremetech.com/gaming/213069-is-4gb-of-vram-enough-amds-fury-x-faces-off-with-nvidias-gtx-980-ti-titan-x

Bear in mind that right now AMD is tweaking the games' VRAM utilization using their drivers (DX11 and prior). Once we move into the DX12 and Vulkan generation, AMD will no longer be tweaking VRAM utilization and you can expect it to blow right on by the 4 GB limit. Some games are even blowing right by it now - for example, Doom will refuse to run Nightmare Quality on a 4 GB card.

It's fine if you keep in mind that you will be lowering texture quality and tweaking other options to keep VRAM utilization under control, but the fact is that the Fury just has too little memory for its performance level. It's been a known danger of that design right from the start and now that DX12/Vulkan are here it's pretty much a bad recommendation compared to other cards. Right now you need at least 6 GB and preferably 8 GB if you are planning on playing 1440p or higher and planning to keep the card for a long period of time (say, longer than a year).
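
For a rough sense of why resolution alone pushes VRAM usage up, here's some back-of-envelope math - the render-target count and per-pixel sizes are assumptions for illustration, and real usage is dominated by textures, so read these as lower bounds rather than predictions:

```python
def render_target_mb(width, height, bytes_per_pixel=4, num_targets=6):
    """Rough VRAM for the screen-sized buffers a renderer keeps around
    (G-buffer layers, depth, post-processing targets - the count of 6 and
    4 bytes/pixel are assumptions, not measurements)."""
    return width * height * bytes_per_pixel * num_targets / 2**20

for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    print(f"{name}: ~{render_target_mb(w, h):.0f} MB of render targets")
# ~47 MB at 1080p, ~84 MB at 1440p, ~190 MB at 4K - the buffers scale with
# resolution, but the bulk of a 4 GB card still goes to textures, which is why
# texture quality is the first knob you end up turning down.
```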

Paul MaudDib fucked around with this message at 00:47 on Sep 4, 2016

penus penus penus
Nov 9, 2014

by piss__donald

Ninkobei posted:

Like where are you getting your info from? Fury catches up to a lot of more powerful cards in higher resolutions like 1440p. If anything, 1440p is one of the Fury's strengths, not a weakness.

It has a smaller deficit at higher resolutions compared to other cards. The numbers are the numbers though; some wouldn't prefer the fps it can produce, other Fury-specific quirks aside. They are viable now because of the fairly astonishing price cuts, considering it's still in production (I think).

For the gsync pricing, that 1440p Dell was $380 new for like two days in June :( if I had only known

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

THE DOG HOUSE posted:

It has a smaller deficit at higher resolutions compared to other cards. The numbers are the numbers though; some wouldn't prefer the fps it can produce, other Fury-specific quirks aside. They are viable now because of the fairly astonishing price cuts, considering it's still in production (I think).

For the gsync pricing, that 1440p Dell was $380 new for like two days in June :( if I had only known

Exactly. If you were considering an RX 480 4GB model at the $240 price they've been going for, the Fury is an obvious step upwards at $275. But it just doesn't have enough VRAM to be a very good recommendation above 1080p.

People have stockholm syndrome because AMD has yet to release any card faster than a 390X with more than 4 GB of VRAM, and once Vega launches I think people are going to turn on Fury pretty hard.

Klyith
Aug 3, 2007

GBS Pledge Week

Paul MaudDib posted:

I was using your figures of 50fps @ 60 Hz there, buddy. 1/(60 Hz) is about 16.7ms.

VSync essentially locks the framerate to integer divisors of the refresh rate. So if you can only push 50 fps then you are still effectively locking it to 36 Hz (division by 4) instead of 30 Hz. So we are talking about a difference between a 33ms frametime at 60 Hz and a 28 ms frametime at 144 Hz, or a 16% difference. It's not that big an impact.

You are doing dumb things with numbers that do not mean the things your words say they mean.

An FPS rate that, on a 60hz monitor, produces a worst case pulldown effect like this:
16/16/16/33/16/16/16/33/16...
will on a 144hz monitor produce this:
21/14/14/21/14/14/14/21/14/14/21...

The thing that produces noticeable judder isn't the average frametime. It's the deviation between the average frametime and the long frametime. The deviation in that 60hz sequence is 13ms; the deviation on the 144hz is only 5ms. That's what pulldown is, not your nonsense numbers.
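
For what it's worth, a quick sketch of that pulldown cadence under one set of assumptions (newest finished frame shown at each tick, no vsync back-pressure); the exact sequences depend on phase and on when the renderer starts each frame, so it won't reproduce the hand-worked numbers above exactly, but the 144hz swing comes out much smaller either way:

```python
import math
from fractions import Fraction

def display_times_ms(fps, refresh_hz, n_frames=12):
    """Toy pulldown model: a frame stays on screen until the first refresh
    tick after its successor is ready (no vsync back-pressure, no drops)."""
    ticks = [math.ceil(Fraction(i * refresh_hz, fps)) for i in range(n_frames + 1)]
    period_ms = 1000 / refresh_hz
    return [round((ticks[i + 1] - ticks[i]) * period_ms) for i in range(n_frames)]

for hz in (60, 144):
    seq = display_times_ms(50, hz)
    print(f"{hz} Hz: {seq}  swing: {max(seq) - min(seq)} ms")
# 60 Hz swings between ~17 and ~33 ms on screen; 144 Hz only between ~14 and ~21 ms.
```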



OTOH all that only considers constant framerates, and I totally concede that running a game with highly variable fps performance shows the most g/freesync advantage. Bad performance is made worse by vsync, and hidden by g/freesync. In such cases, people who don't have gsync have to play around with settings until they find a personal optimum between eye candy and noticeable judder.

(Though I also note that judder is a thing that not everyone is sensitive to, just like not everyone notices tearing. One thing I remember a lot of reviewers saying when gsync first hit was that it was one of those "now I can never go back" things. I was recently watching a friend demonstrate something in a game and was super distracted by the huge tearing he had going on. But after I found that he didn't see it, I was like no don't look for it, pretend I never said anything. Better not to know.)

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Klyith posted:

You are doing dumb things with numbers that do not mean the things your words say they mean.

An FPS rate that, on a 60hz monitor, produces a worst case pulldown effect like this:
16/16/16/33/16/16/16/33/16...
will on a 144hz monitor produce this:
21/14/14/21/14/14/14/21/14/14/21...

The thing that produces noticeable judder isn't the average frametime. It's the deviation between the average frametime and the long frametime. The deviation in that 60hz sequence is 13ms; the deviation on the 144hz is only 5ms. That's what pulldown is, not your nonsense numbers.

Judder is not a problem with VSync. The problem with VSync is very low effective framerate and long input lag, as I was discussing in the part you quoted. You were the one who brought VSync up, not me, see:

Klyith posted:

edit: For an example, a 60hz monitor displaying a constant 50fps (w/ vsync) will average 6ms of extra delay just from ready frames waiting on the next refresh. A 144hz monitor only adds 3ms.

High refresh is going to help judder a bit when vsync is disabled though, just by lowering the magnitude of the problem as you show there.

quote:

OTOH all that only considers constant framerates, and I totally concede that running a game with highly variable fps performance shows the most g/freesync advantage. Bad performance is made worse by vsync, and hidden by g/freesync. In such cases, people who don't have gsync have to play around with settings until they find a personal optimum between eye candy and noticeable judder.

(Though I also note that judder is a thing that not everyone is sensitive to, just like not everyone notices tearing. One thing I remember a lot of reviewers saying when gsync first hit was that it was one of those "now I can never go back" things. I was recently watching a friend demonstrate something in a game and was super distracted by the huge tearing he had going on. But after I found that he didn't see it, I was like no don't look for it, pretend I never said anything. Better not to know.)

Yeah, if you don't know what you're looking for then definitely don't start looking for it. But I too found GSync/FreeSync noticeably smoother and tearing is now super noticeable to me.

I'm honestly fairly sensitive to judder and other issues because I used to do a lot of video encoding where I'd be dealing with 24fps sources on a 60 Hz panel.

Paul MaudDib fucked around with this message at 04:21 on Sep 4, 2016

Klyith
Aug 3, 2007

GBS Pledge Week

Paul MaudDib posted:

4 GB of VRAM is only just enough for 1440p even at this point, and it's not enough for 4K.

http://www.extremetech.com/gaming/213069-is-4gb-of-vram-enough-amds-fury-x-faces-off-with-nvidias-gtx-980-ti-titan-x

quote:

While we do see some evidence of a 4GB barrier on AMD cards that the NV hardware does not experience, provoking this problem in current-generation titles required us to use settings that rendered the games unplayable on any current GPU.

quote:

The most we can say of a specific 4GB issue at 4K is that gamers who want to play at 4K will have to do some fine-tuning to keep frame rates and resolutions balanced, but that’s not unique to any vendor. If you’re a gamer who wants 4K and ultra-high quality visual settings, none of the current GPUs on the market are going to suit you.

I think paul mauddib is having a spice vision into the Other World, where everything means the reverse of what it says

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Klyith posted:

I think paul mauddib is having a spice vision into the Other World, where everything means the reverse of what it says

No, it's more that I am able to actually read the charts and interpret them myself. Like I said earlier, the Fury has microstutter issues even before it hits its VRAM limit, and once it hits the VRAM limit it's all over the map.

100-120 ms is an absolutely absurd amount of microstutter, that's blowing 7 frames - that's not even microstutter, that's pretty much just the game freezing. And the Fury is regularly hitting frametimes upwards of 50ms even at 1440p.







GTA:V is the only game where the Fury has reasonable microstutter even at 1440p. And at 4K it just totally loses it. (note: even in Far Cry, that one 300 ms spike from the Fury is screwing up the Y-axis, those are still 80-140ms spikes during the rest)







That microstutter problem is only going to continue to get worse and worse, especially once AMD is no longer able to optimize games to keep VRAM utilization under 4 GB. Which, again, is already happening, see: Doom.

Again, it's kind of a shame because if it had more VRAM and less stutter it would be good in Crossfire for 4K.

Paul MaudDib fucked around with this message at 03:04 on Sep 4, 2016

SlayVus
Jul 10, 2009
Grimey Drawer

Paul MaudDib posted:

.

VSync essentially locks the framerate to integer divisors of the refresh rate. So if you can only push 50 fps then you are still effectively locking it to 36 Hz (division by 4) instead of 30 Hz. So we are talking about a difference between a 33ms frametime at 60 Hz and a 28 ms frametime at 144 Hz, or a 16% difference. It's not that big an impact.

Nobody really recommends VSync for performance-sensitive applications because it kinda sucks for a variety of reasons, this is one of them. You will have better luck without it on a non-GSync monitor, but there will still be some judder from the non-integer pulldown.

I think this is a major point in Nvidia's favor on the monitor tech side of things. With GSync and FastSync (can AMD even do this?) you remove stuttering and bad experiences at low frame rates and almost eliminate input lag at high frame rates. Once AMD can come out with something like Fast Sync they'll be more competitive with Nvidia. My only issue with FreeSync is the ranges the monitors provide, but this isn't AMD's fault.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Paul MaudDib posted:

Yeah, if you don't know what you're looking for then definitely don't start looking for it. But I too found GSync/FreeSync noticeably smoother and tearing is now super noticeable to me.

I'm honestly fairly sensitive to judder and other issues because I used to do a lot of video encoding where I'd be dealing with 24fps sources on a 60 Hz panel.

I didn't think I was. Then I got freesync running and 40 fps or so became so much smoother.

Setset
Apr 14, 2012
Grimey Drawer

Paul MaudDib posted:

No, it's more that I am able to actually read the charts and interpret them myself. Like I said earlier, the Fury has microstutter issues even before it hits its VRAM limit, and once it hits the VRAM limit it's all over the map.

100-120 ms is an absolutely absurd amount of microstutter, that's blowing 7 frames - that's not even microstutter, that's pretty much just the game freezing. And the Fury is regularly hitting frametimes upwards of 50ms even at 1440p.

GTA:V is the only game where the Fury has reasonable microstutter even at 1440p. And at 4K it just totally loses it. (note: even in Far Cry, that one 300 ms spike from the Fury is screwing up the Y-axis)



That microstutter problem is only going to continue to get worse and worse, especially once AMD is no longer able to optimize games to keep VRAM utilization under 4 GB. Which, again, is already happening, see: Doom.

Again, it's kind of a shame because if it had more VRAM and less stutter it would be good in Crossfire for 4K.

Got any articles that are more recent? That single review is the only one I can find and it's over a year old now. Hell microstutter could be caused by old drivers.

There are tons of reviews out there that show the 4GB HBM cards maintaining their performance ranks in games that are known to fill up more than 4gb of VRAM.




And here's one with a nice showing from a 1060, drat



And then the Fury catching back up at 2k

Setset fucked around with this message at 02:04 on Sep 4, 2016

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Ninkobei posted:

Got any articles that are more recent? That single review is the only one I can find and it's over a year old now. Hell microstutter could be caused by old drivers.

There are tons of reviews out there that show the 4GB HBM cards maintaining their performance ranks in games that are known to fill up more than 4gb of VRAM.




And here's one with a nice showing from a 1060, drat



AMD is using their driver stack to optimize VRAM consumption down (in DX11 titles - they will not be able to do this in DX12). If you look at the article I posted, some games and some resolutions are using more than 4 GB of VRAM on NVIDIA but use less than 4 GB on AMD (possibly just on Fury). Of course they cannot use more than 4 GB on a card that only has 4 GB, so some situations are actually memory-limited. So it's tough to instantly pick out which is which. But generally higher resolutions, higher texture quality, and more supersampling (effectively, more resolution) will all unavoidably tend to use more VRAM.

Also, average framerates or even minimum framerates do not fully capture the problem since "framerates" are always averaged over some time period. What you really need to look at are "FCAT graphs" or tables of 90/99/99.9-percentile frametimes. This allows you to capture both how bad the real extremes of the frametime distribution are, and how often those extremes are occurring.
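
For anyone who wants to pull those tail numbers out of their own captures, a minimal percentile sketch (nearest-rank method, and the sample data is made up to show how two runs with similar averages can have very different tails):

```python
def frametime_percentiles(frametimes_ms, percentiles=(50, 90, 99, 99.9)):
    """Nearest-rank percentiles of a list of per-frame times in ms - plenty
    good enough for eyeballing stutter, whatever tool captured the data."""
    data = sorted(frametimes_ms)
    out = {}
    for p in percentiles:
        idx = min(len(data) - 1, int(round(p / 100 * len(data))))
        out[p] = data[idx]
    return out

# Two made-up runs with similar average framerates but very different tails:
smooth = [16.7] * 990 + [20.0] * 10
spiky = [15.0] * 990 + [120.0] * 10
print(frametime_percentiles(smooth))  # 99.9th percentile ~20 ms
print(frametime_percentiles(spiky))   # 99.9th percentile ~120 ms - the stutter the average hides
```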

What I posted above are FCAT graphs. Ideally you want them to be smooth - past a certain point not even GSync or FreeSync will help, you can just see when a frametime jumps between 30ms and 100ms. Guru3d has some others with issues, but also some that are OK while ExtremeTech had problems - probably due to DX11-vs-DX12 issues.





Paul MaudDib fucked around with this message at 02:25 on Sep 4, 2016

Harik
Sep 9, 2001

From the hard streets of Moscow
First dog to touch the stars


Plaster Town Cop

Craptacular! posted:

Get a Freesync monitor and an AMD card now; the future cost of selling your AMD used and getting either an NVidia that supports Freesync (if that universe comes) or a better AMD model (if it doesn't) is still less than the GSync tax.

Ninkobei posted:

Like where are you getting your info from? Fury catches up to a lot of more powerful cards in higher resolutions like 1440p. If anything, 1440p is one of the Fury's strengths, not a weakness.

I guess the question comes down to: what settings level lets you realistically drive 1440p/144Hz? Obviously you can't run ultra, but ultra is often stupidly high cost for limited visible image quality gain. If I could run current AAA titles on high with a 480/freesync it would be great.

Anyone know reviewers who benchmark realistic settings? I get ultra for comparing raw horsepower between cards, but if you can get double the FPS on high then it'd be nice to know.


Jesus those are bad.

Random latency out of nowhere sucks. I'm fighting it in the batman games - my 270x runs Arkham City at average 55-60fps with whatever auto-tuning they do, but when it stutters it's down to 1-2FPS for a moment. Really painful, and made more so by how smoothly it runs when it's not stuttering.

Harik fucked around with this message at 02:56 on Sep 4, 2016

penus penus penus
Nov 9, 2014

by piss__donald
That's the "fury problem". Fyi if anybody wants to check for themselves use fraps to record benchmarks then frafs viewer to generate the graphs

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

Paul MaudDib posted:

That microstutter problem is only going to continue to get worse and worse, especially once AMD is no longer able to optimize games to keep VRAM utilization under 4 GB. Which, again, is already happening, see: Doom.

Again, it's kind of a shame because if it had more VRAM and less stutter it would be good in Crossfire for 4K.

Even when the 4GB VRAM limit isn't a bottleneck, it's incredibly baffling how AMD manages to still lose to Nvidia despite having over a 50% advantage in raw compute and VRAM bandwidth.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Paul MaudDib posted:

Don't count on resale value too much though, because when the latest buttcoin mining craze burns out the market will be swamped with used 480s.

The miners that I know are counting on being able to resell 470s/480s used for near MSRP to make the whole thing worthwhile in the end...

Harik
Sep 9, 2001

From the hard streets of Moscow
First dog to touch the stars


Plaster Town Cop
edit: N/M why did I hit submit on this other window.

For actual content: The problem with Arkham City was vsync. Turned that off and the lowest FPS I saw was mid 30s - and that's pretty rare. It was 100+ normally. Frafs showed some awful frames with vsync enabled too - 100+ms followed by .35ms followed by 100+ etc. So I appreciate the pointer to that, I would have thought it was a lovely videocard and not one setting.


Harik fucked around with this message at 11:18 on Sep 4, 2016


Saukkis
May 16, 2003

Unless I'm on the inside curve pointing straight at oncoming traffic the high beams stay on and I laugh at your puny protest flashes.
I am Most Important Man. Most Important Man in the World.

Twerk from Home posted:

The miners that I know are counting on being able to resell 470s/480s used for near MSRP to make the whole thing worthwhile in the end...

Can you give more info on why they've chosen RX 4?0 cards? I think the opinion in this thread has been that a GTX 10?0 would have been a preferable choice.
