Ugly In The Morning
Jul 1, 2010
Pillbug
DLSS 2.0 takes significantly less effort to implement than 1.0, at least. I wouldn’t be surprised to see it become a lot more common.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Ugly In The Morning posted:

DLSS 2.0 takes significantly less effort to implement than 1.0, at least. I wouldn’t be surprised to see it become a lot more common.

It should basically become standard in upcoming Unreal Engine games as the latest build has DLSS 2.0 as part of the engine. With so many big AAA companies using a single engine for most of their games, once DLSS 2.0 support is built in once, it seems like it should ~just work~ with everything using the engine from there on out.

repiv
Aug 13, 2009

Beautiful Ninja posted:

It should basically become standard in upcoming Unreal Engine games as the latest build has DLSS 2.0 as part of the engine.

Not quite: Nvidia is providing UE4 integration for DLSS 2.0, but it's not part of UE4 out of the box. Game developers need to pull in NV's patches to get it.

NV getting DLSS into UE4/Unity as a first-class feature would be a huge win, but they haven't convinced Epic/Unity to do it yet.

shrike82
Jun 11, 2005

Have there been any prior successful manufacturer-specific enhancements for games that boosted performance agnostically?

I don’t count stuff like PhysX.

repiv
Aug 13, 2009

AMD's Mantle, I guess? That was just a stepping stone to DX12/Vulkan, though; it only shipped in a few games.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

shrike82 posted:

Have there been any prior successful manufacturer-specific enhancements for games that boosted performance agnostically?

I don’t count stuff like PhysX.

Radeon Image Sharpening, the internal bit depth dropping that NVIDIA used to do, etc.

repiv
Aug 13, 2009

Paul MaudDib posted:

Radeon Image Sharpening, the internal bit depth dropping that NVIDIA used to do, etc.

The bit depth thing was ATI, unless there's another instance I forgot. Their drivers would swap out FP16 render targets (16 bits per channel) for R11G11B10 (11 or 10 bits per channel) because that format was much faster on their hardware at the time, but obviously some precision was lost.

https://www.geeks3d.com/20100916/fp16-demotion-a-trick-used-by-ati-to-boost-benchmark-score-says-nvidia/
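
For a rough sense of why that swap paid off, here's some back-of-envelope footprint math (my illustrative numbers, not from the linked article):

code:
# Back-of-envelope render target footprint for the FP16 "demotion" trick.
# FP16 RGBA = 4 channels x 16 bits = 64 bits/pixel
# R11G11B10 = 11 + 11 + 10 bits    = 32 bits/pixel (no alpha)
def target_megabytes(width, height, bits_per_pixel):
    return width * height * bits_per_pixel / 8 / 1e6

w, h = 1920, 1080
print(f"FP16 RGBA:  {target_megabytes(w, h, 64):.1f} MB")  # ~16.6 MB
print(f"R11G11B10:  {target_megabytes(w, h, 32):.1f} MB")  # ~8.3 MB
# Halving every HDR render target roughly halves the bandwidth spent
# reading and writing them, at the cost of alpha and mantissa precision.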

Geemer
Nov 4, 2010

K8.0 posted:

Whether you have the issue or not seems almost entirely random. It's absurd that Nvidia can't manage to fix it. It's loving 2020, who the gently caress buys a DGPU but still runs single monitor?

I've tried multi-monitor. Didn't like it. Single monitor 4 lyfe baybee!

pik_d
Feb 24, 2006

follow the white dove

TRP Post of the Month October 2021

Geemer posted:

I've tried multi-monitor. Didn't like it. Single monitor 4 lyfe baybee!

I have 2 monitors and now want 3

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

shrike82 posted:

It’s a lot less interesting than a raw boost in performance.

I'd actually disagree, on the grounds that a 20% raw boost in performance matters less than the +50% you can get out of DLSS. I agree that it's not a universal feature that can be expected to be everywhere going forward, but for those games that do support it, it'll be a huge benefit.

And while it does take some effort to implement, I think the enormous gains on offer will seriously incentivise studios to give it a shot if they're already using UE4.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
DLSS is so good that it should make it into major titles that need it. Of course not every game will enjoy its benefits, but they don’t have to. This is a key point that is often overlooked.

The games that really benefit from DLSS are the exact same AAA titles that can have it implemented AND get the white-glove Nvidia dev help that makes it much easier.

I’m guessing a lot of people who are currently iffy on DLSS 2.0 haven’t used it. To say it’s the real deal is almost an understatement. It’s basically incredible.

Putting it in the same box as PhysX or whatever is ridiculous. We’re talking about a full-fat 30-50% baseline performance increase with effectively no downside for the user.

If you think major AAA games are gonna sleep on that big of a gimme I don’t know what to say. It’s not some parlor trick.

DrDork posted:

I'd actually disagree, on the grounds that a 20% raw boost in performance matters less than the +50% you can get out of DLSS. I agree that it's not a universal feature that can be expected to be everywhere going forward, but for those games that do support it, it'll be a huge benefit.

And while it does take some effort to implement, I think the enormous gains on offer will seriously incentivise studios to give it a shot if they're already using UE4.

Yes. Preferring raw clocks over all else is a shallow and outdated way to think about GPU performance. And again, we don’t need every game to support it; just the ones that need it, which ultimately is a much smaller slice of the pie.

Taima fucked around with this message at 22:55 on Jun 21, 2020

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Taima posted:

Yes. Preferring raw clocks over all else is a shallow and outdated way to think about GPU performance. And again, we don’t need every game to support it; just the ones that need it, which ultimately is a much smaller slice of the pie.

Yeah, that's a good point. Stardew Valley won't support DLSS, but it doesn't need to, either, because even a potato can push decent frames.

Though I do wonder how easy it is to bolt on to non-UE4 engines, particularly because some of the perennial performance-crushing games are built on things like REDengine (TW3, Cyberpunk 2077) or whatever bizarro-engine Bethesda is using at a given point in time.

Ugly In The Morning
Jul 1, 2010
Pillbug
I’m pretty certain Cyberpunk is going to have DLSS considering how much cross promotion has been going on between that game and NVidia.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
I believe it will largely be a moot point. I will eat my loving hat if 2077 doesn't support DLSS 2.0 directly out of the box. We know that Nvidia has been balls deep in Cyberpunk 2077 development for at least a year:

https://www.nvidia.com/en-us/geforc...0PROJEKT%20RED.

And releasing bespoke editions of their poo poo:

https://www.pcgamer.com/cyberpunk-2077-rtx-2080-ti-ebay-listings/

DLSS implementation in REDengine is assured as far as I'm concerned.

Nvidia knows how good DLSS is and they know where their bread is buttered. They will bend over backwards to get DLSS support into major titles.

Ugly In The Morning posted:

I’m pretty certain Cyberpunk is going to have DLSS considering how much cross promotion has been going on between that game and NVidia.

poo poo you beat my post. Yes, exactly.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Ugly In The Morning posted:

I’m pretty certain Cyberpunk is going to have DLSS considering how much cross promotion has been going on between that game and NVidia.

You're probably right. I just mean that, as a general statement, some of the bigger dev studios have been running their own franken-engines for years, and I'm curious how much work it'll take to get them on board with DLSS. Which isn't to say that I don't think they'll make the effort--the gains are just too good to ignore if it's something that can be implemented without having to go back and rewrite huge chunks of the engine. I mean, where else do you get to pull 50% performance out of your hat for free?

Sure, it only works for a small segment of the market right this moment, but I think when the 30-series drops from NVidia we'll see a lot of people upgrade to something that supports DLSS (10xx cards right now hold the top 7 slots on the Steam Hardware Survey--in another year I could see those being replaced by 20xx/30xx cards), so it'll just get wider application over time.

Riflen
Mar 13, 2009

"Cheating bitch"
Bleak Gremlin

DrDork posted:

Yeah, that's a good point. Stardew Valley won't support DLSS, but it doesn't need to, either, because even a potato can push decent frames.

Though I do wonder how easy it is to bolt on to non-UE4 engines, particularly because some of the perennial performance-crushing games are built on things like REDengine (TW3, Cyberpunk 2077) or whatever bizarro-engine Bethesda is using at a given point in time.

If the developer is implementing TAA, then DLSS is trivial for them to include. Or as close to trivial as anything gets these days. But yes, the engine will need some changes made to it and there are definitely scenarios where the developers making the game are not in a position to modify the engine.
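
To make that concrete, here's a rough sketch of the data flow (hypothetical names, nothing to do with the actual DLSS SDK): a TAA-ready renderer already produces everything a temporal upscaler wants, and only the resolve step changes.

code:
# Illustrative sketch only - these names are hypothetical, not the DLSS SDK.
# A TAA-ready renderer already produces the inputs a temporal upscaler needs:
# jittered colour, per-pixel motion vectors, depth, and the jitter offset.
from dataclasses import dataclass
import numpy as np

@dataclass
class FrameInputs:
    color: np.ndarray           # jittered low-res colour for this frame
    motion_vectors: np.ndarray  # screen-space motion, per pixel
    depth: np.ndarray           # low-res depth buffer
    jitter: tuple               # sub-pixel camera offset used this frame

def taa_resolve(frame: FrameInputs, history: np.ndarray) -> np.ndarray:
    """Hand-tuned blend of the current frame with reprojected history."""
    ...

def dlss_resolve(frame: FrameInputs, history: np.ndarray) -> np.ndarray:
    """Same inputs fed to a learned model that also upscales to output res."""
    ...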

repiv
Aug 13, 2009

Riflen posted:

If the developer is implementing TAA, then DLSS is trivial for them to include.

and what engine doesn't have TAA nowadays, it's the de facto standard outside of VR

shrike82
Jun 11, 2005

DrDork posted:

I'd actually disagree, on the grounds that a 20% raw boost in performance matters less than the +50% you can get out of DLSS. I agree that it's not a universal feature that can be expected to be everywhere going forward, but for those games that do support it, it'll be a huge benefit.

And while it does take some effort to implement, I think the enormous gains on offer will seriously incentivise studios to give it a shot if they're already using UE4.

People were rumor-mongering a raw 50% boost in compute. If the 3000 series ends up running a lot hotter on an architectural shift, with a lot more CUDA cores and a higher base clock, yet hitting a fairly mediocre bump in performance - I'd count that as a disappointment.

DLSS is interesting but I think people are pretending that it’s a “lossless” boost in performance and I’m skeptical that even a majority of games released in the future will ever support it.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
Have you used the tech in question, yes or no?

shrike82
Jun 11, 2005

Yes, both versions - which is why I think people are overstating its benefits.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
What do you dislike about it or find to be lacking?

4K DLSS 2.0 set on high quality is excellent. And I’m pretty discerning - I game on a 65-inch 4K G-Sync OLED from like 5 feet away. If anyone is going to notice poo poo being off, it’s me.

I personally find the tech to be incredible, and would in fact say it’s extremely close to native 4K while providing a pretty astonishing framerate increase (again, this is at the high quality setting).

shrike82
Jun 11, 2005

There is absolutely a loss of quality at least when I tried it on Control on the highest quality setting. We can argue about how (in)significant it is but then we're making similar arguments to "if you turn down these settings from Ultra to High, you'll never notice it in-game".

Malcolm XML
Aug 8, 2009

I always knew it would end like this.
imo dlss is so good that nvidia ought to figure out a way to make it automatic. perhaps some kind of extension that hooks into taa passes, or such. 25% speed for no quality loss is bonkers

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

shrike82 posted:

There is absolutely a loss of quality at least when I tried it on Control on the highest quality setting. We can argue about how (in)significant it is but then we're making similar arguments to "if you turn down these settings from Ultra to High, you'll never notice it in-game".


I suppose. All I can say is that I’m a consumerist dickwad with more money than sense who is used to pretty excellent image quality at all times, and it didn’t strike me like I was giving up much of anything (and getting a huge performance benefit in return).

Point taken on the subjectivity argument though.

Personally I’ll be using DLSS on all games that support it :shrug:

Taima fucked around with this message at 01:30 on Jun 22, 2020

Ugly In The Morning
Jul 1, 2010
Pillbug
The only time I really notice DLSS 2.0 losing image quality is on some characters’ hair, or if I crank the setting all the way to Performance - and even then the degradation isn’t terribly severe and is well worth the extra ~40 FPS I get with ray tracing on.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Malcolm XML posted:

imo dlss is so good that nvidia ought to figure out a way to make it automatic. perhaps some kind of extension that hooks into taa passes, or such. 25% speed for no quality loss is bonkers

It could easily be done via drivers, the same way NVidia already ships per-game preset defaults that differ from the global settings. Or the game itself could just default to having that option switched on when it autodetects a 20/30-series card. But from what I understand it's not so generic that they can simply apply it to TAA-enabled games that didn't take the time to hook into DLSS properly.

CaptainSarcastic
Jul 6, 2013



I wish sites showing benchmarks and framerates would do a better job indicating what resolution they are using for the tests. Way too many times I have to hunt to figure out what resolution they are testing at to contextualize the fps they are reporting. I just went through a Techradar article on DLSS and had to find the one reference in the text that said they were testing at 3860x1440 in order to get a better sense of their results.

Gyrotica
Nov 26, 2012

Grafted to machines your builders did not understand.

Riflen posted:

If the developer is implementing TAA, then DLSS is trivial for them to include. Or as close to trivial as anything gets these days. But yes, the engine will need some changes made to it and there are definitely scenarios where the developers making the game are not in a position to modify the engine.

Like Frostbite, where institutional knowledge of the engine is spread across multiple studios, all of them infamous for treating their people like replaceable cogs and none of them particularly renowned for their documentation.

shrike82
Jun 11, 2005

Has AMD made any noise about developing their own implementation of DLSS? Seems like it'd be in their interest for boosting their PC and console hardware.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

shrike82 posted:

Has AMD made any noise about developing their own implementation of DLSS? Seems like it'd be in their interest for boosting their PC and console hardware.

The closest they've got is Radeon Image Sharpening (RIS), which is...not close. They'd have to be insane not to be working on something similar, but how long it'll take them to get it out the door is a huge question, and AMD hasn't been forthcoming with any details. The best we can hope for is that something reasonably similar will be baked into RDNA 2, because DLSS-like options would be huge for consoles. But we have absolutely zero indication so far that it'll be there.

repiv
Aug 13, 2009

shrike82 posted:

Has AMD made any noise about developing their own implementation of DLSS? Seems like it'd be in their interest for boosting their PC and console hardware.

nope

replicating nvidia's approach probably isn't even tenable on their hardware without any ML acceleration - having an equivalent algorithm isn't much use if it ends up eating half the frame budget

the problem is that non-ML temporal upscaling has already been extensively researched by tons of developers and nobody has come up with anything even close to DLSS - amd can try throwing more R&D at the problem but it would be grasping at straws
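
for a sense of scale, some made-up illustrative numbers (not measurements of anything):

code:
# Purely illustrative numbers - the point is the fixed cost vs. frame budget.
frame_budget_ms = 1000 / 60  # ~16.7 ms per frame at 60 fps
for label, upscale_ms in [("tensor-core path", 1.5), ("shader-only path", 6.0)]:
    left = frame_budget_ms - upscale_ms
    share = 100 * upscale_ms / frame_budget_ms
    print(f"{label}: {upscale_ms:.1f} ms on upscaling "
          f"({share:.0f}% of budget), {left:.1f} ms left for everything else")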

repiv fucked around with this message at 02:35 on Jun 22, 2020

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
The only bad thing about DLSS 2.0 is that it's Nvidia-specific.

It's the closest thing to black magic in GPU tech in years, and it just had to be vendor-specific. It really sucks when that happens. In another timeline it would have been innovated by some platform-agnostic developer with the code open-sourced, a bit like how SMAA is.

Cygni
Nov 12, 2005

raring to post

If Nvidia is making the Switch 2, ya gotta think it's gonna rely on DLSS heavily, which would be mondo for adoption of the tech. Could be real tasty for that sort of use case.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

gradenko_2000 posted:

500 would still be an absolute steal for the sheer amount of hardware you'd be getting. I don't think it's possible to build a computer of those specs for that price. Hell, even twice the price feels like it'd be pushing it. What's a 5700XT go for nowadays? 400 by itself for something that still isn't as powerful as the GPU on the PS5?

Well yeah, that's always been the case at a new console's launch though - remember these aren't expected to be on shelves until late November this year, and likely in limited quantities at that. So it won't be compared to a 5700 XT, it will be compared to the 6700 XT (?). I expect the mid-tier cards available from both Nvidia and AMD will be comparable, if not superior, GPU-wise to these machines (mid-tier meaning around $400).

An Xbox One X is around a GTX 1660 Super in pure GPU throughput, and it can often be had for $300 as a system vs. $230 for the card alone. The difference, though, is that the $230 card - while certainly not a great 'value' compared to a 'full' console - can run at 4K with the same settings as the One X (and far higher than the PS4 Pro) even with a middling processor, and can provide a significantly better experience in some games simply because it's paired with a much better CPU and storage. If you don't like 30fps, you can just play at 1440p/1080p for 60, an option you just don't get with a lot of multiplatform titles on the Pro/X because the CPU is holding them back.

That's not going to be the case with this gen, though. There will be games with 120fps modes. Storage speed will of course be a huge improvement, and with things like saved game states, getting in and out of games will actually be faster than on the PC. PC versions having 'better textures' will likely be a thing of the past too, as they'll just exist on storage and be streamed in. Many of these improvements will likely carry over to the cheaper Series X variant as well, so much like a PC, the underlying architecture is very similar, just paired with a lower-end GPU and a little less RAM.

So it is a case of diminishing returns IMO, and that's why I don't agree with the argument that there's little overlap between the GPU market and consoles; they may not be in direct competition, but there is some. The PC has always had an immediate advantage over consoles in at least one of CPU, GPU, or storage - not every area at once at launch, but almost always one - and this is the first gen where that's not necessarily true (maybe ray tracing performance?).

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
I was watching the Digital Foundry review of The Outer Worlds on Switch and I kept thinking that's the kind of game that would have really benefited from DLSS on that platform: you can keep the internal resolution (really) low so you don't have to reduce the detail to the point where it looks like smeared poo poo, the FPS can remain stable during combat, and then you upscale to a resolution that's actually decent.

Shame about the entire rest of the gameplay and the plot though!

pyrotek
May 21, 2004



shrike82 posted:

There is absolutely a loss of quality at least when I tried it on Control on the highest quality setting. We can argue about how (in)significant it is but then we're making similar arguments to "if you turn down these settings from Ultra to High, you'll never notice it in-game".

I think it looks better in some ways and worse in others compared to native rendering.

https://www.youtube.com/watch?v=tMtMneugt0A

That video convinced me that some of the things I thought were flaws are actually closer to the 32-sample-per-pixel reference they trained the algorithm with. We're just used to the types of image flaws that current rendering brings, and DLSS does look different from that.

I certainly understand people having different opinions on the perceptual quality of the image, but for such massive performance gains I'll live with "pretty close".

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

Cygni posted:

If Nvidia is making the Switch 2, ya gotta think it's gonna rely on DLSS heavily, which would be mondo for adoption of the tech. Could be real tasty for that sort of use case.

Absolutely, and that's the magic of DLSS 2.0 - it immediately provides value to high-end users trying to get max frames with ray tracing, i.e. the people who will actually pay for the architecture. Meanwhile, it scales ridiculously well with older GPUs performing basic operations at 1440p and under.

I get why old-school GPU heads don't believe the hype, but this is next-generation technology that goes beyond simple generational comparisons. And the fact is, Nvidia will spend as much as it has to to drive adoption.

My only worry is that Big Navi won't have an answer, and that will sink that architecture, along with Intel's future GPUs. Nvidia spent two years making DLSS good, and we're just now moving towards actual competition in the segment. I will be sad if DLSS crushes that competition.

shrike82
Jun 11, 2005

Hopefully we'll see an open-source implementation of this (including pretrained models) that leverages commodity hardware as well as Nvidia tensor cores. Super-resolution is a big topic in CV research, so it's not like Nvidia has a monopoly on it.
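
For a sense of how ordinary the building blocks are, here's a toy single-frame super-resolution net (assuming PyTorch; this is nothing like DLSS, which is temporal and consumes motion vectors):

code:
# Toy ESPCN-style single-frame super-resolution net in PyTorch.
# NOT a DLSS equivalent - just a reminder that the core ops, convolutions
# plus a sub-pixel shuffle, are ordinary, vendor-neutral building blocks.
import torch
import torch.nn as nn

class TinySR(nn.Module):
    def __init__(self, scale: int = 2, channels: int = 3):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, 64, 5, padding=2), nn.ReLU(),
            nn.Conv2d(64, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, channels * scale * scale, 3, padding=1),
            nn.PixelShuffle(scale),  # rearranges channels into a 2x larger image
        )

    def forward(self, x):
        return self.body(x)

lr = torch.rand(1, 3, 540, 960)   # a 960x540 frame, NCHW
print(TinySR()(lr).shape)         # torch.Size([1, 3, 1080, 1920])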

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Zedsdeadbaby posted:

The only bad thing about DLSS 2.0 is that it's Nvidia-specific.

It's the closest thing to black magic in GPU tech in years, and it just had to be vendor-specific. It really sucks when that happens. In another timeline it would have been innovated by some platform-agnostic developer with the code open-sourced, a bit like how SMAA is.

repiv posted:

nope

replicating nvidia's approach probably isn't even tenable on their hardware without any ML acceleration - having an equivalent algorithm isn't much use if it ends up eating half the frame budget

the problem is that non-ML temporal upscaling has already been extensively researched by tons of developers and nobody has come up with anything even close to DLSS - amd can try throwing more R&D at the problem but it would be grasping at straws

it's the best thing since... variable refresh rate gaming, which was also pioneered by NVIDIA.

People really don't appreciate how much work NVIDIA puts into pushing gaming tech, and GPU tech in general. They invented GPGPU compute as we know it, too.

Huang was absolutely right, 11 years ago (!), when he said "NVIDIA is a software company". People laughed.

There's a reason Huang is one of the only (possibly the only?) OG 90s tech CEOs who is still sitting in his job at the helm of the company he founded.

God, I wish we could have seen Huang at the helm of AMD, at the peak of their AMD64 dominance, with NVIDIA graphics IP and an x86 license. Bulldozer would not have happened, AMD would not have spent 5 years in the wilderness, and Intel would then have bought ATI and dumped a shitload of cash into fixing their poo poo. It would have been so much better that way. The AMD board just couldn't bring themselves to swallow their pride.

Paul MaudDib fucked around with this message at 06:47 on Jun 22, 2020

shrike82
Jun 11, 2005

Nvidia still isn't a software company, though. They've been trying to bind the ML stack to their hardware, but Google successfully pushing researchers and ML frameworks towards its TPUs is a good example of the limits of that.

If anything, everything Nvidia has done on the services/software side has been pretty bad - GeForce Now, GeForce Experience, their cloud compute platform, etc. It's one reason why I'm skeptical they'll be able to make DLSS a universal supersampling solution for games.
