Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

gradenko_2000 posted:

a sub-200 dollar RTX card would be cool. I'll use it to run stuff at 540 upscaled to 1080 :nyoron:
IMO what Nvidia showed back then with scaling from 540p to 1080p was probably just the very best cases, so I wouldn't hold my breath. They overpromised DLSS v1, too.

DLSS2 probably works best when the scale factor is lower, meaning it has to make up fewer details, and when the output resolution is at the higher end, which implies a) less high-frequency detail that's hard to extrapolate, and b) artifacts from bad predictions will be less noticeable.
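
As a rough sketch of that scale-factor point (the mode names and internal resolutions below are common figures assumed for illustration, not anything taken from a specific game):

```python
# Back-of-the-envelope: fraction of a 4K output frame that actually gets rendered
# at a few assumed DLSS-style internal resolutions. The lower the scale factor,
# the less detail the reconstruction has to invent.
output_w, output_h = 3840, 2160

assumed_modes = {
    "Quality (1.5x per axis)": (2560, 1440),
    "Balanced (~1.7x per axis)": (2227, 1253),
    "Performance (2x per axis)": (1920, 1080),
}

for name, (w, h) in assumed_modes.items():
    ratio = (w * h) / (output_w * output_h)
    print(f"{name}: {ratio:.0%} of output pixels rendered per frame")
```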

I would love there to be a bullshit mode. Scaling 1440p up to 4K and then downsampling it back to 1440p for superawesome antialiasing.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

Combat Pretzel posted:


I would love there to be a bullshit mode. Scaling 1440p up to 4K and then downsampling it back to 1440p for superawesome antialiasing.

I don't think it's as good as TAA. There are a few games where you can do exactly this, and as an alternative method of anti-aliasing it's still worse for image stability. A game with strictly no AA, supersampled at 4K on a 1440p display, looks worse than a game just running at 1440p with TAA, and the performance cost is more than double (2.25x the pixels).

Supersampling is horribly expensive and doesn't fix the biggest issues with lower resolutions: stair-stepping jaggies and shimmering.

It's most obvious in a game like Destiny 2, where you can go all the way up to 200% resolution even at 4K and still experience horrific shimmering and instability on specular surfaces and foliage.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
I thought the neat thing about DLSS2 was that it automatically generates antialiased imagery, because the ground truth for the trained network was something like 32x supersampled? The idea is to create a DLSS2-antialiased image, not just a plain upscale, and then downsample it back to the original resolution. Go from 1440p to 4K, create nice imagery, downsample it back to 1440p, which should smooth out the remaining jaggies. Optional sharpening filter. The point being better detail at the end than going from 1080p to 1440p via DLSS2.
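
A minimal sketch of that last downsample step, assuming you simply had the DLSS output as a 4K image file (Pillow's Lanczos filter is a stand-in for whatever filter the driver would actually use, and the filenames are made up):

```python
from PIL import Image

# Hypothetical: take a captured 3840x2160 DLSS output frame and resample it
# back down to 2560x1440, using the extra resolution purely as antialiasing.
frame_4k = Image.open("dlss_output_4k.png")        # assumed capture, not a real file
frame_1440p = frame_4k.resize((2560, 1440), Image.LANCZOS)
frame_1440p.save("downsampled_1440p.png")
```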

Indiana_Krom
Jun 18, 2007
Net Slacker
Just running DLSS but starting at the panel's native resolution should be at the very least equal to TAA, but more likely will be comfortably superior (it will be something like combining MSAA+TAA).

repiv
Aug 13, 2009

Nvidia talked about using DLSS 1.0 without upscaling early on (they called it DLSS2x) but ended up never shipping anything with that mode, and they haven't mentioned the idea again since DLSS 2.0 came out :shrug:

I guess their messaging is that DLSS 2.0 in quality mode is already so close to ground truth that you don't need native resolution input, but it would be nice to compare and see how true that is.

repiv fucked around with this message at 12:37 on Jul 14, 2020

shrike82
Jun 11, 2005

Well, I just realized AC Valhalla is another Ryzen-partner game. I wonder if it's a big enough game that Ubisoft would implement DLSS, especially since it's probably launching right after the Ampere releases. If not, it's a pretty canny deal by AMD.

repiv
Aug 13, 2009

DLSS 2.0 is confirmed for Ubisoft's Watch Dogs Legion, including the sharpness slider that was missing from the earlier implementations, but that doesn't mean much for other Ubisoft franchises given they all run on different engines lol

Ubisoft has what, four different open world engines in active use?

repiv fucked around with this message at 12:51 on Jul 14, 2020

shrike82
Jun 11, 2005

Wonder if it's theoretically possible for a future DLSS version to support all games without the need for additional development work.
That'd be a huge win.

Riflen
Mar 13, 2009

"Cheating bitch"
Bleak Gremlin

repiv posted:

DLSS 2.0 is confirmed for Ubisoft's Watch Dogs Legion, including the sharpness slider that was missing from the earlier implementations, but that doesn't mean much for other Ubisoft franchises given they all run on different engines lol

Ubisoft has what, four different open world engines in active use?

I know of Anvil Next 2.0 (AssCreed, Ghost Recon), Disrupt (W_D) and Dunia (Far Cry). I suppose Snowdrop counts too.

Ugly In The Morning
Jul 1, 2010
Pillbug

repiv posted:


Ubisoft has what, four different open world engines in active use?

Beats EA's "EVERYTHING RUNS ON FROSTBITE!" stuff.

shrike82 posted:

Well, I just realized AC Valhalla is another Ryzen-partner game. I wonder if it's a big enough game that Ubisoft would implement DLSS, especially since it's probably launching right after the Ampere releases. If not, it's a pretty canny deal by AMD.

That would be a bummer; the AC games are gorgeous, but they're among the few I can't max out at 60 FPS/1440p.

repiv
Aug 13, 2009

Timestamped to DLSS discussion:

https://www.youtube.com/watch?v=ggnvhFSrPGE&t=960s

its good folks

ufarn
May 30, 2009

repiv posted:

Timestamped to DLSS discussion:

https://www.youtube.com/watch?v=ggnvhFSrPGE&t=960s

its good folks
Oh man, those Ryzen frametimes are not great, hopefully that's ironed out in HZD.

v1ld
Apr 16, 2012

I hope that engine is easy to mod, both for visuals and gameplay. Game looks great as is of course.

Good to see DLSS being incorporated even in a PC port. That augurs well for native PC games.

shrike82
Jun 11, 2005

ufarn posted:

Oh man, those Ryzen frametimes are not great, hopefully that's ironed out in HZD.

What is it about Ryzen that makes them perform worse than Intel chips for gaming? I thought they were approaching parity in IPC.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

shrike82 posted:

What is it about Ryzen that makes them perform worse than Intel chips for gaming? I thought they were approaching parity in IPC.

about 500 MHz

redeyes
Sep 14, 2002

by Fluffdaddy

shrike82 posted:

What is it about Ryzen that makes them perform worse than Intel chips for gaming? I thought they were approaching parity in IPC.

I think they were, and then Intel jacked their frequencies up past 5 GHz.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

shrike82 posted:

What is it about Ryzen that makes them perform worse than Intel chips for gaming? I thought they were approaching parity in IPC.

Zen 2 has worse memory latency than Intel, and games are a latency-sensitive workload. The way to improve latency is through RAM speed, but Ryzen realistically caps out at DDR4-3800 because the Infinity Fabric can't clock higher than that. Intel already has lower latency, and you can slap really fast RAM on it to make it even better, so it just scales better in latency-sensitive workloads.

Increasing clock speed doesn't even improve performance on Zen 2 past the clocks they already ship at. People have tested Zen 2 at 5 GHz using exotic cooling and found no real benefit; the bottleneck is the Infinity Fabric/RAM performance.
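
Roughly why DDR4-3800 ends up being the practical ceiling, as a toy calculation (the ~1900 MHz FCLK cap is a typical figure for Zen 2 samples, not a hard spec):

```python
# Zen 2 rule of thumb: MEMCLK = DDR transfer rate / 2, and Infinity Fabric (FCLK)
# wants to run 1:1 with MEMCLK. Past roughly 1900 MHz the fabric can't keep up,
# so the memory controller has to be decoupled (run at half MEMCLK), adding latency.
FCLK_CAP_MHZ = 1900  # typical ceiling, varies per chip

for ddr_rate in (3200, 3600, 3800, 4000, 4400):
    memclk = ddr_rate // 2
    if memclk <= FCLK_CAP_MHZ:
        note = "1:1 coupled, low latency"
    else:
        note = "fabric can't follow, divider kicks in, latency penalty"
    print(f"DDR4-{ddr_rate}: MEMCLK {memclk} MHz -> {note}")
```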

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

shrike82 posted:

What is it about Ryzen that makes them perform worse than Intel chips for gaming? I thought they were approaching parity in IPC.

They are *better* per clock in some workloads, but their clocks are lower, and inter-CCX latency is still an issue.

Truga
May 4, 2014
Lipstick Apathy

shrike82 posted:

What is it about Ryzen that makes them perform worse than Intel chips for gaming? I thought they were approaching parity in IPC.

if there's more than a % or 2 difference between intel/amd the issue is either the game and/or windows scheduler taking a dump, not the cpu itself, seeing how most games reach roughly the same fps on comparable ryzens.

v1ld
Apr 16, 2012

ufarn posted:

Oh man, those Ryzen frametimes are not great, hopefully that's ironed out in HZD.

The frametime spike was only visible on the 12-core Ryzen part, not on the 6-core Intel part. This could well be the game not dealing well with scheduling on more than the 6-8 cores it was optimized for, given its console history. Not to say it's not a Ryzen problem, but it would have been nice to see the frametime graph on a 3600 or 3700X.

TacticalHoodie
May 7, 2007

Truga posted:

if there's more than a % or 2 difference between intel/amd the issue is either the game and/or windows scheduler taking a dump, not the cpu itself, seeing how most games reach roughly the same fps on comparable ryzens.

I am really getting the feeling that most of the performance jumps we used to see years ago in the CPU space aren't really there anymore when it comes to gaming, especially at 1440p. I looked at some comparisons for a friend who just built a desktop for VR gaming, and the difference from my 8600K to most CPUs is 7% at best. Is there a rush of people just buying more cores to future-proof for the new console generation? I am not really seeing anything compelling enough to upgrade to in the near future, other than waiting to see the results of the Nvidia 3000 series and having my bank account weep at the Canadian prices when they are announced next month, though I will do it anyway for Cyberpunk 2077.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

Indiana_Krom posted:

Just running DLSS but starting at the panel's native resolution should be at the very least equal to TAA, but more likely will be comfortably superior (it will be something like combining MSAA+TAA).
Oh right, I almost forgot about DSR. Make the game think there's a 4K display and have the card scale it back down. --edit: The blindly naive assumption being that the 32x SS ground truth is remotely achievable, and that stacked with the 2.25x SS from DSR you'd get something equivalent to 72x SS.
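
The naive arithmetic behind that, for what it's worth:

```python
# Entirely naive stacking of "equivalent" sample counts, as described above.
dlss_ground_truth_ss = 32                      # claimed supersampling of the training target
dsr_factor = (3840 * 2160) / (2560 * 1440)     # 2.25x pixels from 4K DSR on a 1440p panel
print(dlss_ground_truth_ss * dsr_factor)       # 72.0
```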

Combat Pretzel fucked around with this message at 16:40 on Jul 14, 2020

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
There are a bunch of reasons to want more cores - non-gaming workloads, other loads while gaming, and, perhaps most relevant to what you're talking about, the upcoming consoles. The base PS4 and Xbone run on a 1.6 GHz mobile-derived CPU that is an absolute piece of poo poo. It's so bad that for the majority of games you've been able to use a 4-core CPU and have it outrun them, even in many games that utilize multithreading well. This is not true for the upcoming consoles, which have 3.5+ GHz Zen 2 cores. That is a gigantic difference in performance, and it accounts for why people are valuing core count more - it's no longer going to be trivial to get several times the performance of the console CPUs, so packing a bunch of threads designed around the consoles into fewer desktop cores is going to rapidly become less viable.

That said, there's still a big potential difference between Intel and AMD in gaming performance. While the actual CPU cores can trade blows with Intel's depending on the workload, memory access and cross-core access are not as good on AMD. A lot of games are either not sensitive to this or are well optimized to deal with it, but it's not surprising to see worse performance drops on AMD than on Intel even when AMD has a core-count advantage. Zen 3 may show significant improvement in this regard, or it may not.

Indiana_Krom posted:

Just running DLSS but starting at the panel's native resolution should be at the very least equal to TAA, but more likely will be comfortably superior (it will be something like combining MSAA+TAA).

TAA is already doing temporal multisampling - combining results across multiple frames with jittered pixel offsets. That's why it has ghosting, but also good temporal stability in low motion. What DLSS 2.0 does is much better temporal multisampling, by making better use of the motion vectors and potentially some AI magic. It would be nice to see support for a native base resolution, but I think Nvidia is looking ahead to the move to 4K and knows that native-resolution input isn't going to be realistic there, so they're either disabling or strongly discouraging native-resolution DLSS 2.0 support.
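
A toy version of what that temporal multisampling boils down to (numpy sketch; real TAA also reprojects the history buffer along motion vectors and clamps it against the current frame's neighborhood, which is skipped here):

```python
import numpy as np

def taa_resolve(history, current, alpha=0.1):
    """Toy TAA resolve: exponentially blend the current jittered frame into the
    accumulated history. Real implementations reproject and clamp the history
    first, which is what keeps ghosting in check."""
    return (1.0 - alpha) * history + alpha * current

rng = np.random.default_rng(0)
history = rng.random((8, 8, 3)).astype(np.float32)      # stand-in for a frame buffer
for _ in range(16):
    # each frame would be rendered with a different sub-pixel jitter offset
    current = rng.random((8, 8, 3)).astype(np.float32)
    history = taa_resolve(history, current)
```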

K8.0 fucked around with this message at 16:44 on Jul 14, 2020

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.
AMD's probably not pleased that Nvidia pulled DLSS 2.0 out and actually made the tech work, pretty much salvaging the silicon used on first gen RTX. With Cyberpunk 2077 having DLSS 2.0 support and being the game that will drive PC upgrades for probably the next 2 years, it's going to be tough for AMD to convince people to buy price equivalent AMD cards. I get the feeling RDNA 1 cards are going to age like Kepler cards did.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
what was AMD's cross-core communications method before the Infinity Fabric? I hear about IF a lot, and of Intel's ring bus (and mesh bus, to a lesser extent), but never of what came before Ryzen

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
HyperTransport.

Cygni
Nov 12, 2005

raring to post

The area between Phenom II and Ryzen is missing on the charts, filled only with a drawing of a dragon and a warning to mariners to stay away.

repiv
Aug 13, 2009

https://wccftech.com/death-stranding-and-dlss-2-0-gives-a-serious-boost-all-around/



big lol at that site which claimed fidelityfx worked better than dlss

playing now and i really wish they added an option to render depth of field at full resolution, cutscene backgrounds being a janky aliased mess lets down an otherwise great presentation

repiv fucked around with this message at 17:27 on Jul 14, 2020

Ugly In The Morning
Jul 1, 2010
Pillbug

repiv posted:

https://wccftech.com/death-stranding-and-dlss-2-0-gives-a-serious-boost-all-around/



big lol at that site which claimed fidelityfx worked better than dlss

playing now and i really wish they added an option to render depth of field at full resolution, cutscene backgrounds being a janky aliased mess lets down an otherwise great presentation

Except for some jaggies in the tower in the middle, the DLSS 2.0 one actually looks better than either. That's nuts.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

Ugly In The Morning posted:

Except for some jaggies in the tower in the middle, the DLSS 2.0 one actually looks better than either. That's nuts.

And that tower will probably look a lot better with some sharpness settings, honestly. That still seems to be the biggest remaining problem with DLSS, but it seems pretty solvable, or at least mitigable.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Ugly In The Morning posted:

Except for some jaggies in the tower in the middle, the DLSS 2.0 one actually looks better than either. That's nuts.

I was thinking the same thing, honestly. What a time to be alive.

repiv
Aug 13, 2009

Lockback posted:

And that tower will probably look a lot better with some sharpness settings, honestly. That still seems to be the biggest remaining problem with DLSS, but it seems pretty solvable, or at least mitigable.

There's a sharpness knob in the DLSS SDK; the first batch of game integrations just didn't expose it to the user for whatever reason.

Watch Dogs Legion does, though.

repiv
Aug 13, 2009

Oh apparently the sharpness setting wasn't ready for production when they first shipped DLSS 2.0



Must be working now if Legion is shipping with it

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
I doubt sharpening is something you're really going to want to adjust on DLSS anyway. I don't want to write a giant loving novel about why, but imagine a static camera and jittered subpixels over many frames. When you're trying to create a high-res image, you aren't going to look at pixels in your "output" from a frame and naively upscale them the way you would with a still image. Instead you want to choose the samples which are closest to the pixel you're trying to create, probably also weighted by recency and motion. You're going to prefer the more rightward samples that could naively be attributed to the "native rendering pixel" to your left, the more leftward samples from the pixel to your right, etc. You're not creating an image from another image of square pixels, but from a network of jittered and motion-adjusted samples that (hopefully) has a very high density, much higher than a native image. Because you're not doing conventional upscaling, you're not going to want a ton of conventional sharpening, because the artifacts upscaling normally introduces aren't going to be there in the same ways with this more advanced type of method.
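
A toy illustration of that weighting idea (pure sketch; the distance and recency weights here are made up and say nothing about how DLSS actually scores its samples):

```python
import math

def reconstruct_pixel(target_xy, samples, sigma_dist=0.5, sigma_age=2.0):
    """Weight accumulated jittered samples by how close they land to the output
    pixel centre and how recent they are. Each sample is (x, y, value, age_in_frames);
    the Gaussian weights are arbitrary."""
    tx, ty = target_xy
    total_weight = accum = 0.0
    for x, y, value, age in samples:
        dist2 = (x - tx) ** 2 + (y - ty) ** 2
        weight = math.exp(-dist2 / (2 * sigma_dist ** 2)) * math.exp(-age / sigma_age)
        total_weight += weight
        accum += weight * value
    return accum / total_weight if total_weight else 0.0

# A few jittered, motion-adjusted samples scattered around the output pixel at (10, 10):
samples = [(9.8, 10.1, 0.9, 0), (10.4, 9.7, 0.7, 1), (10.9, 10.8, 0.2, 3)]
print(reconstruct_pixel((10.0, 10.0), samples))
```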

K8.0 fucked around with this message at 18:25 on Jul 14, 2020

repiv
Aug 13, 2009

It might not be sharpness in the usual sense of applying a sharpening filter to the reconstructed image after the fact; maybe they're tuning the weights of the reconstruction itself to favor sharper output at the expense of letting some aliasing through.

We'll see when Watch Dogs comes out :shrug:
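
For contrast, "sharpening in the usual sense" would be a post-process pass over the finished frame, something like an unsharp mask; a minimal sketch below (scipy's Gaussian blur is just a stand-in, no claim that this is what the SDK actually does):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(image, amount=0.5, radius=1.0):
    """Classic post-process sharpen: add back a scaled copy of the detail that a
    Gaussian blur removes. `amount` is what a sharpness slider would drive."""
    blurred = gaussian_filter(image, sigma=radius)
    return np.clip(image + amount * (image - blurred), 0.0, 1.0)

frame = np.random.default_rng(1).random((270, 480)).astype(np.float32)  # stand-in frame
sharpened = unsharp_mask(frame, amount=0.7)
```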

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

repiv posted:

https://wccftech.com/death-stranding-and-dlss-2-0-gives-a-serious-boost-all-around/



big lol at that site which claimed fidelityfx worked better than dlss

What site was that? I know Hardware Unboxed claimed that with DLSS 1.9, and they were right. But they've since praised DLSS 2.0 like everyone else - is someone actually saying FidelityFX is better than DLSS 2.0?

Riflen
Mar 13, 2009

"Cheating bitch"
Bleak Gremlin

Happy_Misanthrope posted:

What site was that? I know Hardware Unboxed claimed that with DLSS 1.9, and they were right. But they've since praised DLSS 2.0 like everyone else - is someone actually saying FidelityFX is better than DLSS 2.0?

It was Ars Technica.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

repiv posted:

It might not be sharpness in the usual sense of applying a sharpening filter to the reconstructed image after the fact; maybe they're tuning the weights of the reconstruction itself to favor sharper output at the expense of letting some aliasing through.

We'll see when Watch Dogs comes out :shrug:

That's a very good point. I really can't wait until either someone cracks DLSS open or someone else creates a demo that works the same, because it'd be cool as gently caress to have a bunch of sliders to play with to tune the various parameters in real time.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Oh, I thought you meant DLSS in general; yeah, I remember reading that. The Digital Foundry guy replied to him earlier saying it definitely wasn't his experience. How in god's name was the Ars guy evaluating it?

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

repiv posted:

It might not be sharpness in the usual sense of applying a sharpening filter to the reconstructed image after the fact; maybe they're tuning the weights of the reconstruction itself to favor sharper output at the expense of letting some aliasing through.

We'll see when Watch Dogs comes out :shrug:
Yeah, that's what I was thinking - for example, in Control I think it's actually slightly oversharpened in DLSS 2.0 mode, which is perhaps why some of the text on billboards and such (one of the few weak points of its implementation) can look a little noisy.
