eames
May 9, 2009

It feels like DLSS 2.0 is the final version of what should have launched with Turing.
Do we know anything about if/how it scales to high framerates (>144 FPS) and if it adds any significant latency? The implications of this tech are pretty significant; it could be the end of native rasterization rendering for video games on platforms that support it. Somebody ITT pointed out that the next-gen Nintendo Switch might support this, and that’s a very interesting thought.

ShaneB
Oct 22, 2002


Dumb question: Why would NVidia want to extend the life of their hardware with DLSS2? Is it just to make people jump from AMD to them if they are buying something new?

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
At 11:50 in this presentation you can see Nvidia's claimed DLSS 2.0 processing times, which seem to be hugely improved from 1.0. They're low enough to be viable at any realistic framerate, and while DLSS itself does take time, in practice at the same output resolution your actual latency will be lower than native because the total time to generate and deliver a frame is shorter.

DLSS making sense on a SwitchU would depend in part on how power efficient it would be.

ShaneB posted:

Dumb question: Why would NVidia want to extend the life of their hardware with DLSS2? Is it just to make people jump from AMD to them if they are buying something new?

A 3080Ti is still going to smoke a 2080Ti. With 4k gaming becoming a thing, there is no risk that Nvidia is ever going to go "gently caress, our current GPUs are too good and no one is interested in the performance uplift of our next-gen GPUs."

Hell, that's half the point of the help Nvidia provides to game developers. It's 50% optimizing so the low end works / they beat AMD, and 50% helping implement demanding features that drive demand for the high end.

K8.0 fucked around with this message at 22:36 on Apr 14, 2020

repiv
Aug 13, 2009

eames posted:

Do we know anything about if/how it scales to high framerates (>144 FPS) and if it adds any significant latency?

It should scale up to higher framerates quite well. The GTC talk gave the cost of running DLSS on various GPUs at various resolutions:



If we take 1440p on the 2070S as an example (~1ms) then a game which runs at 144fps (7ms/frame) at some low base resolution should run at 125fps (8ms/frame) upscaled to 1440p, not a huge loss given the results.

In fact the difference may be even smaller than that, since DLSS replaces the game's native TAA: the new frametime isn't (base cost + DLSS cost), it's actually (base cost - TAA cost + DLSS cost).
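
For anyone who wants to poke at that arithmetic, here's a throwaway Python sketch; the DLSS cost is the ~1ms figure above, and the TAA cost is purely an assumed placeholder:

```python
# Back-of-the-envelope frametime math from the post above.
# The DLSS cost is the ~1 ms figure for 1440p on a 2070S; the TAA cost
# is an illustrative assumption, not a measured number.

def fps_to_ms(fps):
    return 1000.0 / fps

def ms_to_fps(ms):
    return 1000.0 / ms

base_fps = 144.0             # game running at some low base resolution
dlss_cost_ms = 1.0           # assumed DLSS pass cost
taa_cost_ms = 0.5            # assumed cost of the TAA pass DLSS replaces

base_ms = fps_to_ms(base_fps)                       # ~6.9 ms
naive_ms = base_ms + dlss_cost_ms                   # if DLSS were purely additive
actual_ms = base_ms - taa_cost_ms + dlss_cost_ms    # DLSS replaces the TAA pass

print(f"naive:  {ms_to_fps(naive_ms):.0f} fps")     # ~126 fps
print(f"actual: {ms_to_fps(actual_ms):.0f} fps")    # ~134 fps
```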

repiv fucked around with this message at 23:02 on Apr 14, 2020

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

ShaneB posted:

Dumb question: Why would NVidia want to extend the life of their hardware with DLSS2? Is it just to make people jump from AMD to them if they are buying something new?

They're not, really. The main charge against the 20-series has been that it was too expensive for the performance difference compared to the 10-series. This would actually give a lot of people a compelling reason to upgrade from the 10- (or even 9-) series cards that they've been hanging on to while waiting for something better.

Likewise, as K8.0 noted, it's not like the 30-series won't be a substantial performance jump over the 20-series. 50%ish gains with DLSS 2.0 are great, don't get me wrong, but that only applies to some games (a list that I'm sure will be expanding, but it's not large yet), and no one's going to turn down a 30+% performance bump from a hardware revision that works everywhere. And as noted, there are already games where even a 50% bump to a 2080Ti won't do 4K@144Hz, which is absolutely going to be a target once the next-gen video cards drop and finally support DP 2.0/HDMI 2.1.
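
For context on why 4K@144Hz needs the newer display links, a rough back-of-the-envelope Python calculation (it ignores blanking intervals and link-encoding overhead, so real requirements are a bit higher):

```python
# Rough uncompressed video bandwidth, ignoring blanking intervals and
# link-encoding overhead (real requirements are somewhat higher).

def gbit_per_s(width, height, refresh_hz, bits_per_channel, channels=3):
    return width * height * refresh_hz * bits_per_channel * channels / 1e9

print(f"4K144  8-bit: {gbit_per_s(3840, 2160, 144, 8):.1f} Gbit/s")   # ~28.7
print(f"4K144 10-bit: {gbit_per_s(3840, 2160, 144, 10):.1f} Gbit/s")  # ~35.8

# For comparison, approximate total link rates:
# HDMI 2.0 ~18 Gbit/s, DisplayPort 1.4 ~32.4 Gbit/s,
# HDMI 2.1 ~48 Gbit/s, DisplayPort 2.0 up to ~80 Gbit/s.
```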

Stickman
Feb 1, 2004

There's also a lot of room to add cool, expensive raytracing effects to games. It'll be easy for developers to eat any DLSS performance gains they get thrown.

ufarn
May 30, 2009
I wonder who gets raytraced sound right, because the demos sound very, very weird, even if it's just about getting used to it.

Ugly In The Morning
Jul 1, 2010
Pillbug

ufarn posted:

I wonder who gets raytraced sound right, because the demos sound very, very weird, even if it's just about getting used to it.

If I have to get a sound card for this poo poo, so help me...



Actually I wouldn't mind having to get a sound card, there's no other meaningful tweaks I can make to this computer.

ufarn
May 30, 2009

Ugly In The Morning posted:

If I have to get a sound card for this poo poo, so help me...



Actually I wouldn't mind having to get a sound card, there's no other meaningful tweaks I can make to this computer.
I have a pair of Sennheisers connected to an audio interface, so it's not for lack of good equipment; it's just weird how far the sound travels. Probably needs texture-specific optimizations like global illumination does.

Makes me wonder whether I need to take a second look at Dolby Atmos on PC.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

ufarn posted:

I wonder who gets raytraced sound right, because the demos sound very, very weird, even if it's just about getting used to it.

HRTF is pretty much solved, so the rest is likely just the math on different reflection properties of the geometry.
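
As a toy illustration of what that reflection math might look like (a single-bounce sketch, not how any particular audio engine actually does it):

```python
# Toy single-bounce "audio ray": reflect a direction vector off a surface,
# then attenuate by path length and a per-material absorption coefficient.
# Real engines trace many rays per frequency band and convolve the result
# with an HRTF; this is only meant to show the flavor of the math.

def reflect(direction, normal):
    # r = d - 2(d . n)n, with n assumed to be unit length
    d_dot_n = sum(d * n for d, n in zip(direction, normal))
    return tuple(d - 2 * d_dot_n * n for d, n in zip(direction, normal))

def reflected_gain(path_length_m, absorption):
    distance_falloff = 1.0 / max(path_length_m, 1.0)    # inverse-distance law
    return distance_falloff * (1.0 - absorption)        # quieter after the bounce

incoming = (0.707, -0.707, 0.0)            # heading down toward the floor
floor_normal = (0.0, 1.0, 0.0)
print(reflect(incoming, floor_normal))     # (0.707, 0.707, 0.0)
print(reflected_gain(12.0, absorption=0.3))
```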

Kraftwerk
Aug 13, 2011
i do not have 10,000 bircoins, please stop asking

Should I upgrade my 1070 GTX or wait until the next-gen consoles come out and build a Ryzen 3/3080 system from scratch? I’ve already been asking around in other threads. The general consensus is don’t try to future proof, because whatever $1800 GPU or CPU you have now will become hopelessly obsolete as soon as the PS5 hits store shelves.

Anyway I don’t want to future proof but I do want to pair my 7700k with a better GPU than the one I have now so I can have smooth frame rates until it’s time to build against the next gen consoles this fall. What’s the best cost to performance upgrade I can get right now so I can get solid frame rates at 1440p?

I’m not even sure if I should be building a PC anymore; based on the last few pages it sounds like some SSD magic is gonna make games poo poo on PC, since most new games are developed for consoles first.

OhFunny
Jun 26, 2013

EXTREMELY PISSED AT THE DNC
I would definitely wait for Nvidia's 3000 series, which is expected to launch by the end of the year.

Ugly In The Morning
Jul 1, 2010
Pillbug

Kraftwerk posted:

Should I upgrade my 1070 GTX or wait until the next-gen consoles come out and build a Ryzen 3/3080 system from scratch? I’ve already been asking around in other threads. The general consensus is don’t try to future proof, because whatever $1800 GPU or CPU you have now will become hopelessly obsolete as soon as the PS5 hits store shelves.

Anyway I don’t want to future proof but I do want to pair my 7700k with a better GPU than the one I have now so I can have smooth frame rates until it’s time to build against the next gen consoles this fall. What’s the best cost to performance upgrade I can get right now so I can get solid frame rates at 1440p?

I’m not even sure if I should be building a PC anymore; based on the last few pages it sounds like some SSD magic is gonna make games poo poo on PC, since most new games are developed for consoles first.

Nah, the 1070 still beats the price-to-performance king (the 1660 Super), and though it doesn’t have any of the fun 2000-series features, those will be on the 3000 series too, so unless you need ’em right now it’s best to hold off. I love my 2070S (it’s basically my favorite card since the Radeon 9800), but I also don’t mind playing on my laptop with a 1660 that’s a bit weaker than your desktop 1070.

The SSD magic will spill over to PC, and the information you’re hearing now is all promo stuff; I wouldn’t be surprised if it isn’t quite as great as the puff pieces say. The 30-series is likely to smoke whatever graphics hardware is in the next-gen consoles anyway, since they’re equivalent to cards that are already out.

Kraftwerk
Aug 13, 2011
i do not have 10,000 bircoins, please stop asking

Thanks! I appreciate that.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
I upgraded from a 1070 to a 2080S when the quarantine stuff started coming down, because I knew PC gaming was going to be a main focus for a few months. So balance that as you will. Under normal circumstances I'd definitely wait, but I am playing way more games now than I usually do, so the benefit is more than it'd usually be. Though I also really wanted to play Alyx, and the 1070 would have huffed and puffed a bit on that.

I would not count on the 3000 series until November and it might even be a little later.

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

Lockback posted:

I upgraded from a 1070 to a 2080S when the quarantine stuff started coming down, because I knew PC gaming was going to be a main focus for a few months. So balance that as you will. Under normal circumstances I'd definitely wait, but I am playing way more games now than I usually do, so the benefit is more than it'd usually be. Though I also really wanted to play Alyx, and the 1070 would have huffed and puffed a bit on that.

I would not count on the 3000 series until November and it might even be a little later.

I jumped on a 2060S when I found one that was $330 pretax, then sold my old 1070 for $150. The 1070 was already struggling on a 1440p150 XB271HU, so that was a no-brainer upgrade, especially when Pascal loses a lot of steam in async-compute-heavy games like Doom Eternal and RDR2.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Riflen posted:

People have messed around in the ini files and tried to see how low a base resolution is needed for a good experience. 512x288 is surprisingly acceptable and sadly 128x72 isn't enough. =)

512x288 seriously looked good enough to be playable and now I'm wondering how much of a performance gain was had from doing that.

eames
May 9, 2009

repiv posted:

It should scale up to higher framerates quite well. The GTC talk gave the cost of running DLSS on various GPUs at various resolutions:

Thanks, those numbers look low enough not to be a big concern but also high enough that there’s room for improvement. In other words, NVidia finally did it: Machine Learning/Tensor Core improvements in future chip generations could directly impact gaming performance.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Riflen posted:

People have messed around in the ini files and tried to see how low a base resolution is needed for a good experience. 512x288 is surprisingly acceptable and sadly 128x72 isn't enough. =)

....that's disgusting.

edit: I say that in genuine admiration of the tech.

SwissArmyDruid fucked around with this message at 11:49 on Apr 15, 2020

ConanTheLibrarian
Aug 13, 2004


dis buch is late
Fallen Rib
The 512*288 video is really impressive, especially when pausing to look at things like writing on a wall.

While GPUs might be able to output 4k at 144Hz, matching IPS monitors that also support HDR are still very expensive. Until they come down in price, I think most people will be missing out on at least some of the benefits of the 3 series cards.

Riflen
Mar 13, 2009

"Cheating bitch"
Bleak Gremlin

gradenko_2000 posted:

512x288 seriously looked good enough to be playable and now I'm wondering how much of a performance gain was had from doing that.

The poster says in the video description:

quote:

Internal resolutions used in this video:
128x72
256x144
512x288
1280x720 (this is what the game normally allows you to use with 1440p output resolution)
1280x720 (DLSS off)
Output resolution:
2560x1440

Running on a PC with AMD Ryzen 2600 CPU and NVIDIA GeForce RTX 2060 graphics card. Framerates were consistently above 90 in all resolutions except 720p (about 50-55fps on average). On 1440p native (not shown here) the framerates start getting pretty bad with lots of frame drops below the 10s in combat and about 25fps on average (just a guess, these aren't exact numbers).
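
To put those internal resolutions in perspective, here's a quick Python loop comparing pixel counts against the 1440p output; shading cost isn't perfectly linear in pixel count, so treat these as rough ratios:

```python
# Pixels shaded per frame at each internal resolution, relative to the
# 2560x1440 output.

output_pixels = 2560 * 1440
for w, h in [(128, 72), (256, 144), (512, 288), (1280, 720)]:
    internal = w * h
    print(f"{w}x{h}: {internal / output_pixels:.1%} of output pixels "
          f"({output_pixels // internal}x fewer)")

# 512x288 works out to 1/25th of the output pixel count, which goes a long
# way toward explaining the framerate headroom in the video.
```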

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Riflen posted:

The poster says in the video description:

good lord if a 2060 can manage everything turned on while maintaining over 60 FPS and looking that good (and 540p could probably be a sweet spot between 720p-DLSS and the hacked 288p) this tech is going to be loving incredible

ufarn
May 30, 2009
I still hope devs and Nvidia will do a better job of explaining the technology, because so many cool graphics features require people to know a lot in advance, even just to mess with basic settings like AA and AO.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

ufarn posted:

I still hope devs and Nvidia will do a better job of explaining the technology, because so many cool graphics features require people to know a lot in advance, even just to mess with basic settings like AA and AO.

I'd expect them to just roll it in to GeForce Experience. As much as people in this thread beat up on it for requiring a log-in and possible telemetry, Joe Consumer probably isn't as morally offended, and likes seeing the little green "Optimized!" check boxes on his games.

Otherwise, yeah, we're left with expecting Joe Consumer to actually read anything about the settings for his games, which is an iffy proposition. Still, "turn this on to get a basically free 50% performance bump" is probably going to be right at the top of any settings guides, so I'd hope that people would catch on quickly for the most part.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

DrDork posted:

I'd expect them to just roll it in to GeForce Experience. As much as people in this thread beat up on it for requiring a log-in and possible telemetry, Joe Consumer probably isn't as morally offended, and likes seeing the little green "Optimized!" check boxes on his games.

Otherwise, yeah, we're left with expecting Joe Consumer to actually read anything about the settings for his games, which is an iffy proposition. Still, "turn this on to get a basically free 50% performance bump" is probably going to be right at the top of any settings guides, so I'd hope that people would catch on quickly for the most part.

It'll just get rolled into the "Recommended" settings. Honestly there isn't much here to adjust. You can probably even get away with "adjust this bar to find the contrast you want" and there you go. Let the tweaker-class be the ones to disable it if they want true native res.

It's kind of like Anti-Aliasing. Way back when that was a "Pro" setting, and then it just got rolled into what the software suggests as a setting based on your card. This is the same thing but with the reverse performance impact.

Consoles need DLSS the most. A lot of the drawbacks become impossible to notice on the mediocre-response-rate TVs most consoles are plugged into. The suggestion of the next Switch using it heavily is brilliant.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

ConanTheLibrarian posted:

The 512*288 video is really impressive, especially when pausing to look at things like writing on a wall.

While GPUs might be able to output 4k at 144Hz, matching IPS monitors that also support HDR are still very expensive. Until they come down in price, I think most people will be missing out on at least some of the benefits of the 3 series cards.

What we need is DLSS running on the monitor, so we can send it 720p at 2500Hz and have it upscale.

eames
May 9, 2009

I feel like it could go either way and might turn out to be a two-edged sword, because if this catches on, native rendering output might be gone for good.

Yes, it's indubitably cool to get a free 50% performance gain without an immediately obvious difference in image quality, but I'm not sure I'll be as thrilled when <insert big AAA publisher> forces those tradeoffs for me and the locally rendered title displays video-compression-esque artifacts to achieve playable framerates.
I realize I'm overly negative again but :tinfoil:

ufarn
May 30, 2009
At least that's the one good thing about it being exclusive to Nvidia: publishers can't cripple a game like that for both Radeon and GeForce when only GeForce owners would have a fix.

Unless it's HairWorks

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

eames posted:

I feel like it could go either way and might turn out to be a two-edged sword, because if this catches on, native rendering output might be gone for good.

Yes, it's indubitably cool to get a free 50% performance gain without an immediately obvious difference in image quality, but I'm not sure I'll be as thrilled when <insert big AAA publisher> forces those tradeoffs for me and the locally rendered title displays video-compression-esque artifacts to achieve playable framerates.
I realize I'm overly negative again but :tinfoil:

Except wouldn't you rather play a game that is decently optimized and thus doesn't have those artifacts? There's nothing stopping companies from doing a lovely job now and releasing games with poor visuals; DLSS won't change that. The nice thing is you don't really design around DLSS; you design around a high res, and DLSS helps boost you closer to that.

Will it mean some games will push the envelope more and have options that are only going to work well on a high end GPU WITH DLSS? Sure, but again, that is how things like Hairworks and Ray tracing work today, and how high levels of AA used to work in the past.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Lockback posted:

Except wouldn't you rather play a game that is decently optimized and thus doesn't have those artifacts? There's nothing stopping companies from doing a lovely job now and releasing games with poor visuals; DLSS won't change that. The nice thing is you don't really design around DLSS; you design around a high res, and DLSS helps boost you closer to that.

Will it mean some games will push the envelope more and have options that are only going to work well on a high end GPU WITH DLSS? Sure, but again, that is how things like Hairworks and Ray tracing work today, and how high levels of AA used to work in the past.

I think he's saying the opposite: that rather than bother to spend the time trying to optimize their poo poo, lazy studios will instead rely on DLSS to get performance into an acceptable range and basically force you to accept any image degradation that might come with that. So instead of targeting 60FPS and using DLSS to bump it to 90 if you want, they're targeting 40FPS and assuming you'll turn on DLSS to get back to 60 and save themselves a few weeks/months of engine tuning and $$ in the process.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

DrDork posted:

I think he's saying the opposite: that rather than bother to spend the time trying to optimize their poo poo, lazy studios will instead rely on DLSS to get performance into an acceptable range and basically force you to accept any image degradation that might come with that. So instead of targeting 60FPS and using DLSS to bump it to 90 if you want, they're targeting 40FPS and assuming you'll turn on DLSS to get back to 60 and save themselves a few weeks/months of engine tuning and $$ in the process.

Right, but the companies that do put in that work will have better-looking games, either through better visuals outright or by letting you turn up graphics settings using the FPS "headroom" DLSS buys you. You can be lazy right now and release things unoptimized; DLSS won't really change that.

Basically "Lazy Developers will make ugly games" isn't any more or less of a risk with DLSS.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
You can argue that lazy devs will be allowed to be even lazier by being given a free 50% headroom bump, and that it could really gently caress over a big chunk of the player base because a substantial number of people aren't going to have DLSS-compatible cards for years.

That last point is why I, for one, don't really think it'll go that way: DLSS is super cool, but it's also very new. It takes years and years for tech to dribble down to enough of the install base that you can count on it being there. I mean, gently caress, we're still dealing with people not having DX12 cards because they've "only" been around since 2015.

pyrotek
May 21, 2004



eames posted:

I feel like it could go either way and might turn out to be a two-edged sword, because if this catches on, native rendering output might be gone for good.

Yes, it's indubitably cool to get a free 50% performance gain without an immediately obvious difference in image quality, but I'm not sure I'll be as thrilled when <insert big AAA publisher> forces those tradeoffs for me and the locally rendered title displays video-compression-esque artifacts to achieve playable framerates.
I realize I'm overly negative again but :tinfoil:

Look at it from the other way: why waste time rendering at needlessly high resolution when you can get the same or better quality using DLSS and use the savings to push the overall rendering quality higher?

Variable rate shading will similarly improve image quality by not wasting time shading every pixel at the highest rate when the differences wouldn't be visible, freeing up performance to render the pixels that matter most to the image at higher quality.
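
As a purely illustrative Python sketch of that idea (the detail metric and thresholds below are invented for the example; real VRS implementations key off motion, luminance variance, and so on):

```python
import random

# Toy variable rate shading: assign each 16x16 tile a shading rate based
# on a made-up "detail" score, then count how much shading work is saved.

TILE = 16
WIDTH, HEIGHT = 2560, 1440

def shading_rate(detail):      # detail in [0, 1], higher = more visible detail
    if detail > 0.6:
        return (1, 1)          # full rate
    if detail > 0.2:
        return (2, 2)          # one shade per 2x2 pixel block
    return (4, 4)              # one shade per 4x4 pixel block

full_rate = WIDTH * HEIGHT
invocations = 0
for _ in range((WIDTH // TILE) * (HEIGHT // TILE)):
    rx, ry = shading_rate(random.random())   # stand-in for a real per-tile detail map
    invocations += (TILE // rx) * (TILE // ry)

print(f"shader invocations: {invocations / full_rate:.0%} of full rate")
```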

Real-time rendering is all about making trade-offs between speed and quality. Rendering internally at your monitor's native resolution may be the next thing we have to accept isn't always the best idea, especially as monitor resolutions continue to climb.

If you really wanted to, you could use DSR to render at a resolution higher than your monitor and set the DLSS resolution to the same as your monitor. Would that count as native rendering output?

ConanTheLibrarian
Aug 13, 2004


dis buch is late
Fallen Rib

DrDork posted:

That last point is why I, for one, don't really think it'll go that way: DLSS is super cool, but it's also very new. It takes years and years for tech to dribble down to enough of the install base that you can count on it being there. I mean, gently caress, we're still dealing with people not having DX12 cards because they've "only" been around since 2015.
The new consoles are AMD-based too, so it's very unlikely devs would bet on DLSS being prevalent enough that they can ignore people without access to it.

repiv
Aug 13, 2009

ConanTheLibrarian posted:

The new consoles are AMD-based too, so it's very unlikely devs would bet on DLSS being prevalent enough that they can ignore people without access to it.

That doesn't mean the consoles aren't going to use reconstruction, though; checkerboard rendering and TAAU are already widely used in console games to fake high resolutions.

We might see CB/TAAU become more common on PC too, with a baseline expectation that you use reconstruction instead of native-resolution rendering, and an option to use DLSS instead where available.
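
For anyone curious how that kind of reconstruction works at a high level, here's a heavily simplified Python sketch of the temporal accumulation idea; real TAAU/checkerboard pipelines also reproject with motion vectors and clamp against neighborhood colors, none of which is shown here:

```python
# Each frame renders a jittered low-resolution sampling of a static scene
# and blends it into a full-resolution history buffer.

SCALE = 2                           # 2x upscale per axis (1/4 the pixels per frame)
HI_W, HI_H = 64, 64
LO_W, LO_H = HI_W // SCALE, HI_H // SCALE

def scene(x, y):                    # stand-in renderer: a smooth gradient
    return (x + y) / (HI_W + HI_H)

history = [[0.0] * HI_W for _ in range(HI_H)]
alpha = 0.1                         # exponential blend factor

for frame in range(128):
    jx, jy = frame % SCALE, (frame // SCALE) % SCALE    # cycle sub-pixel offsets
    for ly in range(LO_H):
        for lx in range(LO_W):
            hx, hy = lx * SCALE + jx, ly * SCALE + jy   # full-res pixel covered this frame
            history[hy][hx] += alpha * (scene(hx, hy) - history[hy][hx])

error = sum(abs(history[y][x] - scene(x, y))
            for y in range(HI_H) for x in range(HI_W)) / (HI_W * HI_H)
print(f"mean abs error vs ground truth after 128 frames: {error:.4f}")
```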

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl

DrDork posted:

I mean, gently caress, we're still dealing with people not having DX12 cards because they've "only" been around since 2015.

How much of that is from the video card market being totally fuckclobbered for a while by those shithead cryptocurrency miners? In 2016 I was griping in this thread about not being able to find a drat 1060 anywhere thanks to those clowns.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
The crypto poo poo-fest certainly didn't help anything, no. But the simple reality is that it probably wouldn't be all that much different by now: a 9-series card is still DX12 compatible, yet you have the 750Ti as #13 on the Steam Hardware Survey (I know, it's an iffy source for absolute numbers, but a ton of those cards are still out there).

My point mostly is that even if DLSS lives up to what it looks like it could be, it'll be a solid several years before a dev shop can realistically assume that the majority of their install base will have access to it. And that's saying nothing of what AMD is going to have to figure out in the meantime.

ConanTheLibrarian
Aug 13, 2004


dis buch is late
Fallen Rib
It will be interesting to see whether Nvidia adds tensor cores to its cheap GPUs. On the one hand they might be reserved as a premium feature; on the other, Nvidia could build low-end cards with far fewer shaders and compensate with tensor cores. Pushing DLSS to the largest segment of the market is one way to encourage developers to invest in it.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
It might depend heavily on how much they would cost to add. If they could add an appropriate amount for like $10-$20 and get a 30% performance bump out of it, that would absolutely crush AMD at the only price-points where they're reasonably competitive right now.

On the other hand, I could also absolutely see them using it as a potent gatekeeper to encourage people to pick up a 3060 instead of whatever the replacement for the 1650/1660 will be called.

eames
May 9, 2009

pyrotek posted:

Look at it from the other way: why waste time rendering at needlessly high resolution when you can get the same or better quality using DLSS and use the savings to push the overall rendering quality higher?

I’m not opposed to this technology and I think all your arguments are very valid, but I haven’t made up my mind yet on what the implications are.

The phrase "needlessly high resolution" reminds me a bit of the countless arguments about .wav/lossless audio vs. lossy compression. As of 2020 the latter seems to have won out (I consume all my music via normal iTunes/Spotify/Amazon streaming), but I also think that the defenders of lossless have valid arguments.

We’ll see where this road leads the industry; I for one am pretty excited to buy a next-gen NVidia card that supports the feature.

On a completely different note, I’d love to see some FPS per Watt numbers for various DLSS resolutions versus native rendering.
