Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

Whiskey A Go Go! posted:

I am still happy with my 1080Ti. It's gotten to the point where I am waiting for next gen Intel and AMD processors because my 8600k is struggling with its 6 non-hyperthreaded cores. Unless there are massive gains on the Nvidia front with the new cards, I'll probably skip this generation.

DLSS 2.0 alone means Turing completely owns Pascal on most graphically intensive games going forward... don’t need to wait for Ampere on that.

DLSS + Ampere is going to be a ridiculous performance increase barring some massive fuckup from Nvidia.

Taima fucked around with this message at 17:09 on May 11, 2020

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

Taima posted:

DLSS 2.0 means Turing completely owns Pascal on most graphically intensive games going forward... don’t need to wait for Ampere on that.

I mean theoretically yes, but let's see how developer support shapes up

ufarn
May 30, 2009

Taima posted:

DLSS 2.0 alone means Turing completely owns Pascal on most graphically intensive games going forward... don’t need to wait for Ampere on that.

DLSS + Ampere is going to be a ridiculous performance increase barring some massive fuckup from Nvidia.
you just had to jinx it like that didn't you

v1ld
Apr 16, 2012

I've complained about AMD driver support on the 5700 in this thread - turns out at least some significant chunk of that was my own overly aggressive overclocking. It's way more stable after spending some significant (and fun) time with Kombustor and 3DMark, which showed the specific crash and lockup scenarios I had seen earlier.

The overall changes weren't huge: had to drop GPU core freq from 2040 -> 2020 MHz, memory freq from 1900 -> 1850 MHz, and increase max power draw from +20% to +50%.

The +50% power is important because the XT bios flash hack increases TDP max to 300W and it was hitting that in Furmark and close to it in some games, at least as reported by Afterburner.

Overclocking on top of the XT bios flash on a 5700 should disallow me from being able to complain about anything related to the card, really.

v1ld fucked around with this message at 17:20 on May 11, 2020

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

Dogen posted:

I mean theoretically yes, but let's see how developer support shapes up

Devs have been consistently vocal about its ease of use. Do you have any specific reason to be skeptical? I've played multiple DLSS 2.0 games. It seems pretty real, and from what I've personally read, I see little reason to doubt its uptake.

I mean poo poo it's so powerful that even if it sucked to implement, devs would do it.


ufarn posted:

you just had to jinx it like that didn't you

oh god

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

Taima posted:

Devs have been consistently vocal about its ease of use. Do you have any specific reason to be skeptical? I've played multiple DLSS 2.0 games. It seems pretty real, and from what I've personally read, I see little reason to doubt its uptake.

I mean poo poo it's so powerful that even if it sucked to implement, devs would do it.

Mostly that we only have 2 major titles that support it right now, and there's at least one title that supports DLSS 1 (Metro Exodus) whose developers have said, "eh, we don't want to go back and add it in because it'd be too much work."

I'm not saying I'm not hopeful or even that I wouldn't make a reasonable wager on it being successful, but we just don't know for sure at this point.

Truga
May 4, 2014
Lipstick Apathy

v1ld posted:

I've complained about AMD driver support on the 5700 in this thread - turns out at least some significant chunk of that was my own overly aggressive overclocking. It's way more stable after spending some significant (and fun) time with Kombustor and 3DMark, which showed the specific crash and lockup scenarios I had seen earlier.

The overall changes weren't huge: had to drop GPU core freq from 2040 -> 2020 MHz, memory freq from 1900 -> 1850 MHz, and increase max power draw from +20% to +50%.

The +50% power is important because the XT bios flash hack increases TDP max to 300W and it was hitting that in Furmark and close to it in some games, at least as reported by Afterburner.

Overclocking on top of the XT bios flash on a 5700 should disallow me from being able to complain about anything related to the card, really.

man, i'll take being able to do this kind of galaxy brain bullshit over "your bios is signed, you can't use a custom one" any day, stability be damned.

still lmao every time that my stupid reference 980Ti draws ~425w in 3dmark and performs almost as well as a reference 1080. :v: i sure hope new amd cards don't do the signed bios bullshit, i *really* don't want to hardware voltmod my poo poo

orcane
Jun 13, 2012

Fun Shoe

Taima posted:

Devs have been consistently vocal about its ease of use. Do you have any specific reason to be skeptical? I've played multiple DLSS 2.0 games. It seems pretty real, and from what I've personally read, I see little reason to doubt its uptake.

I mean poo poo it's so powerful that even if it sucked to implement, devs would do it.


oh god
Yet it's still in only 5 games or so, and at least two have had pretty lovely native image quality to begin with.

By the time DLSS 2.0 is the killer feature to own all previous generations and the remaining competition, enthusiasts will want a new GPU anyway.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

Dogen posted:

Mostly that we only have 2 major titles that support it right now, and there's at least one title that supports DLSS 1 (Metro Exodus) whose developers have said, "eh, we don't want to go back and add it in because it'd be too much work."

I'm not saying I'm not hopeful or even that I wouldn't make a reasonable wager on it being successful, but we just don't know for sure at this point.

To be fair afaik they didn't say it would be too much work, the vibe was more that development had stopped and they didn't want to add anything to the game at all.


orcane posted:

By the time DLSS 2.0 is the killer feature to own all previous generations and the remaining competition, enthusiasts will want a new GPU anyway.

I completely agree but we were talking about the future efficacy of Pascal.

ufarn
May 30, 2009
I also think the lack of DLSS 1.0 adoption was partly due to DLSS not being very good, so I wouldn't extrapolate too much from how little usage it got.

Craptacular!
Jul 9, 2001

Fuck the DH
I have doubts about DLSS2 adoption, simply because even if the effort is small, I'm not sure I see developers giving big exclusive advantages to one side of a platform war without taking money. Developers know Nvidia has the cash to make their game a featured game or whatever, and will want that at minimum. DLSS has all the same downsides that people complained about with something like GameWorks or PhysX, but this time it's more useful than those parlor tricks. I don't see companies, especially AAA companies, rushing to volunteer to give Nvidia a big competitive advantage for Nvidia to crow about without some money moving across the table.

Nvidia used to spend money on making their logo appear when you started games, so I'm sure they'll have the cash to get a few big titles. But I also think it won't see much adoption beyond that and some indies.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

Taima posted:

To be fair afaik they didn't say it would be too much work, the vibe was more that development had stopped and they didn't want to add anything to the game at all.

The tweet said something like, "we would have to rework too many things," which given what's been said about how DLSS 2 is implemented didn't make a ton of sense.

But yeah that's different from putting it in a new game going forward. We'll just have to see what the uptake is, but really we don't know yet.

orcane
Jun 13, 2012

Fun Shoe

Taima posted:

To be fair afaik they didn't say it would be too much work, the vibe was more that development had stopped and they didn't want to add anything to the game at all.


I completely agree but we were talking about the future efficacy of Pascal.
Historically, a GTX 1080 Ti would land at roughly the performance of a hypothetical "4060", so it's not going to be the performance level of choice for enthusiasts in the first place. At the same time, I don't believe game devs can afford to make DLSS 2.0 a base requirement for playable framerates in the short term, and even whether it's a viable medium-term approach depends very much on where the cutoff for tensor cores is going to be in the next generation cards.

v1ld
Apr 16, 2012

Truga posted:

man, i'll take being able to do this kind of galaxy brain bullshit over "your bios is signed, you can't use a custom one" any day, stability be damned.

still lmao every time that my stupid reference 980Ti draws ~425w in 3dmark and performs almost as well as a reference 1080. :v: i sure hope new amd cards don't do the signed bios bullshit, i *really* don't want to hardware voltmod my poo poo

Yeah, couldn't agree more. It's fun to play with this stuff and, as you say, stability be damned, though I haven't done any hardware mods. Given AMD's recent woes, I didn't think my stability problems were partly self-inflicted, so I thought I'd set the record straight.

Aside: Considering taking a day off after my 3600 and x570 board arrive just so I can overclock this crazy memory to the insanity shown in these two buildzoid videos.

Fun tweak: seeing buildzoid run that fan over the memory in the second video (he has all 4 memory banks in use) made me wonder about more direct airflow over the memory slots in my old Corsair 400R case. Behold, the amazing 5.25" bay to 120mm fan adapter! So now a fan will blow directly over the memory and into the NH-D15 that arrived today. The 400R has the same mesh-like grill over those slots as over the front fans at the bottom, so airflow is fine through it. (Waiting on the Meshify version of the Define 7 for my next case, hence the surgery on this one.)

Totally have to take a day off for this. (I'm also moving groups at work, it's a good time for a break.)

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

Dogen posted:

The tweet said something like, "we would have to rework too many things," which given what's been said about how DLSS 2 is implemented didn't make a ton of sense.

But yeah that's different from putting it in a new game going forward. We'll just have to see what the uptake is, but really we don't know yet.

Yeah, this is true - I went back and looked and that is accurate. I still think the general vibe is "we just don't feel like it so here's an unspecific reason why" (or perhaps that their DLSS 1.0 has hacks bolted on that don't play nice with DLSS 2.0, or something of that nature).


orcane posted:

Historically, a GTX 1080 Ti would land at roughly the performance of a hypothetical "4060", so it's not going to be the performance level of choice for enthusiasts in the first place. At the same time, I don't believe game devs can afford to make DLSS 2.0 a base requirement for playable framerates in the short term, and even whether it's a viable medium-term approach depends very much on where the cutoff for tensor cores is going to be in the next generation cards.

I generally agree, my initial post was responding to Whiskey who mentioned that he won't upgrade unless Ampere has massive gains. My point was just, those gains are here (if you believe the promise of DLSS 2.0). I wasn't trying to say that Pascal is dead or anything :)

Worf
Sep 12, 2017

If only Seth would love me like I love him!

Taima posted:

DLSS 2.0 alone means Turing completely owns Pascal on most graphically intensive games going forward... don’t need to wait for Ampere on that.

DLSS + Ampere is going to be a ridiculous performance increase barring some massive fuckup from Nvidia.

These aren't my words but there's a place in my wallet for their truth

E: theoretically, idk how affordable anything is likely to be.

Cygni
Nov 12, 2005

raring to post

Game developers tend to rally to free performance techniques pretty quickly, especially if someone else already paid to develop the tools. Stuff like S3TC went from paid implementations to being in every major game within a year or two once DX6 bundled it. Free performance bumps let developers skip the expensive and tedious process of performance optimization and fixing their house-of-cards game engines ("we can't change the draw routines cause Dan wrote them before he quit three years ago after a fight at the Christmas party and we don't know how they work"), and instead just ship sloppy spaghetti code dogshit cause at least it hits 60fps. Now that's value for partners!

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

orcane posted:

At the same time, I don't believe game devs can afford to make DLSS 2.0 a base requirement for playable framerates in the short term, and even whether it's a viable medium-term approach depends very much on where the cutoff for tensor cores is going to be in the next generation cards.

I mean, hypothetically it would look a lot like RDR2. Graphically intensive game that gets 60-70 if you have the latest gen, 40-50 if you don't.

We can waffle around the definition of "playable framerate" but the short of it is that by modern standards, 60 fps is "playable" and RDR2 falls short of that on a lot of older hardware. The 1080 is kind of the quintessential 1440p card and it does 49 fps average, the "comparable" 2070 does 62 fps average, so there is a "generational difference in playability" there.

https://www.techspot.com/review/1939-red-dead-redemption-2-benchmarks/

Now, I think what amounts to right on the brink of 50 fps is not completely unplayable, and the evidence shows that people are willing to accept this if necessary. If things get down into the low 40s or into the 30s then that becomes a different story. And DLSS will only get you so much in the first place; it still needs some real frames to extrapolate motion data from, and so far it doesn't seem to let you double a framerate, so if you're doing less than 40 fps without DLSS then you're probably not doing more than 60 fps even with it.
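To put rough numbers on that last point, here's a quick Python sketch - the 1.4x uplift is just an assumed ballpark, not a measured DLSS figure:

code:
# Hedged sketch: assumes DLSS gives roughly a 1.4x fps uplift rather than a true 2x.
def fps_with_dlss(base_fps, uplift=1.4):
    """Estimate framerate with DLSS on, from the native framerate (uplift is assumed)."""
    return base_fps * uplift

for base in (30, 40, 50):
    print(f"{base} fps native -> ~{fps_with_dlss(base):.0f} fps with DLSS")
# 30 -> ~42, 40 -> ~56, 50 -> ~70: a sub-40 fps baseline still lands under 60.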

But I mean, an RDR2 style situation where only current gen cards are breaking 60 fps (for whatever their "appropriate/normal" resolution) and older cards are slumming in the 40 fps range... seems plausible enough to me.

Paul MaudDib fucked around with this message at 18:40 on May 11, 2020

orcane
Jun 13, 2012

Fun Shoe
Nah.

Taima posted:

I generally agree, my initial post was responding to Whiskey who mentioned that he won't upgrade unless Ampere has massive gains. My point was just, those gains are here (if you believe the promise of DLSS 2.0). I wasn't trying to say that Pascal is dead or anything :)
Oh yeah. I don't see how they'll end up in a situation where upgrading the two-generations-old top model isn't going to be worth it, even without DLSS. Even the current cards are significantly faster (but they're also more expensive) - a 2080 Ti is way, way faster than the 1080 Ti/2070 Super/2080 tier already.

redreader
Nov 2, 2009

I am the coolest person ever with my pirate chalice. Seriously.

Dinosaur Gum
Someone posted this a couple of pages back, saying (correct me please, I'm sure I got it wrong) that during the 20xx card release what was previously known as the 80's became the 70's and the 70's became the 60's? Or is that going to happen in the nvidia 30xx series? Surely if the 2080 wasn't a big improvement on the 1080 (also something I read here) then that's a new thing for the 30xx generation.

orcane
Jun 13, 2012

Fun Shoe
The jump at the top end was not far out of line compared to previous generations - a 1080 was not much faster than a 980 Ti before it either - so I don't think it's accurate to say that Nvidia moved the product stack, but people kept saying that to justify/explain the jump in prices (and the memory situation to some extent, because the GTX 1080 Ti had 11 GB of VRAM and the new, similarly fast RTX 2080 had only 8).

Although the 2070/2080 (non-Super) seemed like they landed a bit too low - the product stack looks more like old generations if you use the Super refresh: a 780 Ti was about as fast as a GTX 970 and a 980 Ti was roughly a 1070, so a 1080 Ti being about as fast as a 2070 Super is completely normal.

Cygni
Nov 12, 2005

raring to post

redreader posted:

Someone posted this a couple of pages back, saying (correct me please, I'm sure I got it wrong) that during the 20xx card release what was previously known as the 80's became the 70's and the 70's became the 60's? Or is that going to happen in the nvidia 30xx series? Surely if the 2080 wasn't a big improvement on the 1080 (also something I read here) then that's a new thing for the 30xx generation.

Remember that the names and tiers are completely made up and irrelevant in the first place. They will name it and price it in whatever way they think will extract the most money out of people.

v1ld
Apr 16, 2012

A lot of this is going to come down to what's in the next gen consoles as the next baseline. I hope they have the tensor cores to do the equivalent of DLSS 2.0, but there's a lot more than hardware involved in making DLSS 2 work. That Nvidia video was quite convincing though, given they're talking about better-than-native rendering at half-res on each axis (in lab conditions, batteries not included, objects in mirror are closer, yada yada).
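(For scale, "half-res on each axis" means shading a quarter of the pixels - quick Python check below, using the 3440x1440 resolution mentioned in the next paragraph as the example.)

code:
# Pixel count at native resolution vs. rendering at half resolution on each axis.
native_w, native_h = 3440, 1440
internal_w, internal_h = native_w // 2, native_h // 2
ratio = (native_w * native_h) / (internal_w * internal_h)
print(f"native: {native_w * native_h:,} px, internal: {internal_w * internal_h:,} px "
      f"({ratio:.0f}x fewer pixels shaded)")
# native: 4,953,600 px, internal: 1,238,400 px (4x fewer pixels shaded)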

Average and max frame rates are not that interesting to me now I finally have a monitor with Freesync. RDR2 gave me 33/68/96 min/ave/max on its internal benchmark yesterday (3440x1440 on a 5700). I'm most interested in whether I can move that 33 min to be well over the 48 Hz Freesync cutoff. Other performance numbers are less interesting. I bought the freesync monitor to make them less interesting.

v1ld fucked around with this message at 18:59 on May 11, 2020

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

redreader posted:

Someone posted this a couple of pages back, saying (correct me please, I'm sure I got it wrong) that during the 20xx card release what was previously known as the 80's became the 70's and the 70's became the 60's? Or is that going to happen in the nvidia 30xx series? Surely if the 2080 wasn't a big improvement on the 1080 (also something I read here) then that's a new thing for the 30xx generation.

Typically with a new generation, the old generation roughly matches up in performance to one slot down in the new generation's lineup. Eg a 1080 ~= 2070 +/- a bit. The 20xx line was weird because the price-points moved more than usual: the 1080 launched at $600, the 1070 launched at $380, while the 2070 launched at $500--so price-wise it was closer to the old x80 slot than it was to the old x70 slot.

The 30xx series may show out-sized gains against the 20xx series, as compared to the 20xx vs 10xx (at least at launch--the 20xx series has gotten comparatively faster thanks to driver improvements over time), because the 20xx series frankly wasn't all that much faster, especially considering the prices. People have chalked that up to Tensor cores or whatever, but between Tensor maturing and dropping to a new node, the 30xx series should be a considerable upgrade.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Cygni posted:

Game developers tend to rally to free performance techniques pretty quickly, especially if someone else already paid to develop the tools.

Yeah, if anything I think this will be what pushes adoption hard if NVidia enables DLSS 2.x across the entire 30xx stack (or at least starting with the 3050--no one gives a gently caress about how playable anything is with a xx30/40 part, let's be real). I don't think devs really are too concerned about not being fair to AMD, given their tiny market share, so if they can spend a few hours implementing DLSS and bump performance considerably for a good chunk of their player-base, that seems like a no-brainer.

AAA-titles might also be able to get NVidia to cough up some money to be "featured" or whatever, but if it's really as easy as they claim it is to implement, I think we'll see it a lot more often going forward. HairWorks died because it was a massive performance killer, and PhysX never really took off for similar reasons (and that it added dev time to create stuff that only a small fraction of the playerbase was going to get to experience). That's a lot different than "here's a 50% speedup for basically free."

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

Cygni posted:

Remember that the names and tiers are completely made up and irrelevant in the first place. They will name it and price it in whatever way they think will extract the most money out of people.

I said it and it's this.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
When talking about Turing's generational improvement, it's REALLY worth keeping in mind modern benches. Turing looked like trash when it debuted, but it has performed much better than Pascal in newer games, especially the more demanding ones. It's still Nvidia's weakest generation in a long time, but it's not nearly as bad as it's made out to be. A 2080 is a much more capable GPU than a 1080Ti these days, when at launch it was occasionally a downgrade. A 2070 Super is generally faster than a 1080 Ti in modern games, which is the sort of standard you have historically expected, just a half-generation late.


v1ld posted:

Average and max frame rates are not that interesting to me now I finally have a monitor with Freesync. RDR2 gave me 33/68/96 min/ave/max on its internal benchmark yesterday (3440x1440 on a 5700). I'm most interested in whether I can move that 33 min to be well over the 48 Hz Freesync cutoff. Other performance numbers are less interesting. I bought the freesync monitor to make them less interesting.

That's not how it works. If your monitor has a max refresh at least 2.5x minimum refresh (120hz for you), you get frame doubling and Freesync works no matter how low your framerate is. It's just still going to be a bad experience. VRR makes minor differences in framerate very difficult to tell apart, and slight frame pacing inconsistencies less annoying, but it's not going to make bad framerates look good.
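If it helps, here's a rough Python sketch of the frame-doubling (LFC) idea described above - the 48-144 Hz window is just the monitor from the quoted post, and real drivers handle the frame pacing more cleverly than this:

code:
# Hedged sketch of Low Framerate Compensation: when the game's framerate drops
# below the monitor's minimum VRR refresh, each frame is shown multiple times so
# the effective panel refresh lands back inside the supported window.
def lfc_refresh(fps, vrr_min=48, vrr_max=144):
    multiplier = 1
    while fps * multiplier < vrr_min:
        multiplier += 1
    return min(fps * multiplier, vrr_max), multiplier

for fps in (25, 35, 47, 60):
    rate, n = lfc_refresh(fps)
    print(f"{fps} fps -> panel refreshes at {rate} Hz ({n}x frame repeat)")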

K8.0 fucked around with this message at 19:49 on May 11, 2020

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
Just out of curiosity, as I'm no expert on the whole timeline of video cards - what other historical technologies have delivered a performance gain similar to or greater than DLSS 2.0?

Obviously it's super powerful but I have no idea about the histories of such technologies. Presumably there have been jumps similar to this.

repiv
Aug 13, 2009

K8.0 posted:

When talking about Turing's generational improvement, it's REALLY worth keeping in mind modern benches. Turing looked like trash when it debuted, but it has performed much better than Pascal in newer games, especially the more demanding ones. It's still Nvidia's weakest generation in a long time, but it's not nearly as bad as it's made out to be. A 2080 is a far more capable GPU than a 1080Ti these days.

Turing is set for #finewine too since it has feature parity with the new consoles. There's performance wins to be had from VRS and mesh shaders that have barely been exploited so far, but now the consoles benefit too they're certainly going to get used.

orcane
Jun 13, 2012

Fun Shoe
Outside of a literal handful of outliers (like the mentioned RDR2) the 2080/2070 Super is regularly 5-10% ahead of the 1080 Ti even in modern titles. While it's true that future games will tend to get comparatively better performance, calling it a "far more capable" card at this point is laughable.

It's far more capable if all you care about is playing RDR2 or Control, but for the vast majority of games it will not be a noticeably different experience worth upgrading to a $600+ card for right now.

v1ld
Apr 16, 2012

K8.0 posted:

That's not how it works. If your monitor has a max refresh at least 2.5x minimum refresh (120hz for you), you get frame doubling and Freesync works no matter how low your framerate is. It's just still going to be a bad experience. VRR makes minor differences in framerate very difficult to tell apart, and slight frame pacing inconsistencies less annoying, but it's not going to make bad framerates look good.

That's not how what works? I want frame rates over the freesync cutoff because those frame rates are better than, you know, lower rates. One point of having min rate over the freesync cutoff rate is to not have bad framerates in the first place. I'm not interested in LFC-as-compensated-Freesync because it's still low framerates rendered tear-free. I may be misunderstanding the intent of your post.

E: Maybe it helps to explain that I'm not very interested in the high end of the frame rate range because testing on this 144 Hz monitor has shown me very clearly that I cannot tell the difference very much between tear-free 70-90 Hz and 144 Hz when I'm playing a game. Even tear-free 50-60 Hz is good. So I enable Radeon Chill with a range of 72-140 Hz to move heat death back another few nanoseconds. It does a pretty good job of adapting frame rate to what's going on.

v1ld fucked around with this message at 21:29 on May 11, 2020

Mr.PayDay
Jan 2, 2004
life is short - play hard

orcane posted:

Outside of a literal handful of outliers (like the mentioned RDR2) the 2080/2070 Super is regularly 5-10% ahead of the 1080 Ti even in modern titles. While it's true that future games will tend to get comparatively better performance, calling it a "far more capable" card at this point is laughable.

It's far more capable if all you care about is playing RDR2 or Control, but for the vast majority of games it will not be a noticeably different experience worth upgrading to a $600+ card for right now.

Just for the record, the 5-10% were autumn 2018 results.
Right now Pascal GPUs lose serious fps compared to Turing in DX12.
IIRC the 1080Ti already trails the 2080 by 20% or even more fps in Division 2 or Metro Exodus, for example.

Even the 2070 starts matching 1080Ti fps in those scenarios.

Edit:
My brother went from a 1080Ti to a WF 2080Ti.
His fps at 3440x1440 with a 5930K, 1080Ti OC vs 2080Ti OC:
RDR2 ultra + manually maxed benchmark: 35 to 62
Forza Horizon 4: 75 to loving 108
Far Cry New Dawn Ultra: 75 to 102
Ghost Recon Wildlands Ultra: 49 to 65

It is - depending on the game and engine - roughly a 33% to 77% fps jump for him. And that, indeed, is a noticeable difference in every game.

So anything but a 2080Ti isn't worth the jump from a 1080Ti until the 3000 GPUs are launched.
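Quick sanity check on those deltas in Python, using only the fps figures listed above (nothing new measured):

code:
# Percent fps gains computed from the 1080Ti -> 2080Ti numbers quoted above.
results = {
    "RDR2 benchmark": (35, 62),
    "Forza Horizon 4": (75, 108),
    "Far Cry New Dawn": (75, 102),
    "GR Wildlands": (49, 65),
}
for game, (old, new) in results.items():
    print(f"{game}: {old} -> {new} fps (+{(new / old - 1) * 100:.0f}%)")
# RDR2 +77%, Forza Horizon 4 +44%, Far Cry New Dawn +36%, Wildlands +33%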

Mr.PayDay fucked around with this message at 21:41 on May 11, 2020

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Using demanding games where people actually care about their framerates and generational improvements ISN'T FAIR! We need to stick to benchmarking Quake 3 forever, if I can't feel the difference in framerate it's not an upgrade!

Yes there aren't many of those games yet. The point is that "a 1080Ti is just as good as a 2080" wasn't quite true at launch, but it's nowhere even close to true now, and the trend isn't slowing down.

v1ld posted:

That's not how what works? I want frame rates over the freesync cutoff because those frame rates are better than, you know, lower rates. One point of having min rate over the freesync cutoff rate is to not have bad framerates in the first place. I'm not interested in LFC-as-compensated-Freesync because it's still low framerates rendered tear-free. I may be misunderstanding the intent of your post.

E: Maybe it helps to explain that I'm not very interested in the high end of the frame rate range because testing on this 144 Hz monitor has shown me very clearly that I cannot tell the difference very much between tear-free 70-90 Hz and 144 Hz when I'm playing a game. Even tear-free 50-60 Hz is good. So I enable Radeon Chill with a range of 72-140 Hz to move heat death back another few nanoseconds. It does a pretty good job of adapting frame rate to what's going on.

My point is that there is no freesync cutoff for you. I think you're saying you understand that and you've just arbitrarily picked 48 FPS as the acceptable minimum framerate for you, which is fine.

K8.0 fucked around with this message at 21:51 on May 11, 2020

v1ld
Apr 16, 2012

K8.0 posted:

My point is that there is no freesync cutoff for you. I think you're saying you understand that and you've just arbitrarily picked 48 FPS as the acceptable minimum framerate for you, which is fine.

Ahh, ok - yeah. We're in sync, just like LFC with my monitor.

orcane
Jun 13, 2012

Fun Shoe

Mr.PayDay posted:

Just for the record, the 5-10% were autumn 2018 results.
No, they're recent Computerbase 2020 results (1440p frametimes/min. fps); they even revamped their test sequence earlier this year and threw out a lot of older games. The biggest advantages for the 2070 Super over the 1080 Ti they're listing are in Battlefield V (+11%), CoD:MW (2019) (+11%), Control (+15%) and RDR2 (+16%). The Division 2 sits at +9%, and that's noted as "Pascal does worse than usual" in their Warlords of NY article. (E: copied from the wrong graph).

Obviously test methodology can differ, but you really need to cherry pick settings and games to get those 20% as a generalization.

orcane fucked around with this message at 22:26 on May 11, 2020

Busy Bee
Jul 13, 2004
I currently have a 2080 Super with one LG 27GL850 27'' monitor (1440p 144hz IPS etc.) - https://www.lg.com/us/monitors/lg-27GL850-gaming-monitor

I want to buy a second monitor and have been looking at this one since it has the same frame as my current monitor while having lower resolution - LG 27GL63T-B.AUS 27 - https://www.lg.com/us/monitors/lg-27gl63t-b-gaming-monitor

I read online that some people have issues when using Nvidia G-Sync on their main display while running two different monitors. Is this something I should be concerned about with my 2080 Super?

MarsellusWallace
Nov 9, 2010

Well he doesn't WANT
to look like a bitch!

Zarin posted:

Does the Nvidia Control Panel have an option for adjust gamma (or something?) on a per-application basis?

I've been playing a game (The Forest) and there is no gamma adjustment in-game. I changed the "night-time and cave" brightness to maximum, but it didn't seem to do anything.

I've seen some YouTube videos that the users are in caves that look like full-day brightness. I'm not sure I need to go that far, but having completed the story, I've had enough "atmosphere" and would like to just stop stumbling around in the dark.

It looks like there is a per-monitor adjustment; I suppose I can just use that if I have to, and set it back when I'm done. Just curious if there's a better way.

It's an adjustment in the menu for the color profile. There's a bunch of different preset color profiles with weird names in the menu. Phoneposting or I'd look it up for you.

If anyone has a way of making Sea of Thieves brighter then I could play that before sundown. It detects and counteracts the Nvidia control panel settings because it wasn't loved as a child.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
go get your grains of salt ready, we have the final-final-totally-therealdeal rumors from Moore's Law on Ampere:

https://www.youtube.com/watch?v=oCPufeQmFJk

It's spicy. It's a pile of habaneros.

Anime Schoolgirl
Nov 28, 2002

Taima posted:

go get your grains of salt ready, we have the final-final-totally-therealdeal rumors from Moore's Law on Ampere:

https://www.youtube.com/watch?v=oCPufeQmFJk

It's spicy. It's a pile of habaneros.

https://twitter.com/dril/status/922321981

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

Cygni posted:

Remember that the names and tiers are completely made up and irrelevant in the first place. They will name it and price it in whatever way they think will extract the most money out of people.

People should have to hit accept on this statement in order to post in this thread.
