|
gradenko_2000 posted:a sub-200 dollar RTX card would be cool. I'll use it to run stuff at 540 upscaled to 1080
DLSS2 probably works optimally when the scale factor is lower, meaning it has to make up fewer details, and when the screen resolution is at the higher end, which implies a) less high-frequency detail that's hard to extrapolate, and b) artifacts from bad predictions will be less noticeable. I would love there to be a bullshit mode: scaling 1440p up to 4K and then downsampling it back to 1440p for superawesome antialiasing.
|
# ? Jul 14, 2020 10:02 |
|
|
Combat Pretzel posted:
I don't think it's as good as TAA. There are a few games where you can do exactly this, and as an alternative method of anti-aliasing it's still worse for image stability. A game with strictly no AA being supersampled to 4k from 1440p looks worse than a game just running at 1440p with TAA, and the performance requirement is more than double (2.25x the pixels). Supersampling is horribly expensive and doesn't fix the biggest issue with lower resolutions, the stair-casing jaggies and shimmering. It's most obvious in a game like Destiny 2, where you can go all the way up to 200% resolution even at 4k and you still experience horrific shimmering and instability on specular surfaces and foliage.
|
# ? Jul 14, 2020 10:14 |
|
I thought the neat thing about DLSS2 was that it automatically generates antialiased imagery, because the ground truth for the trained network was something like 32x supersampled? The idea is to create a DLSS2-antialiased image, not just a plain upscale, and then downsample it back to the original resolution. Go from 1440p to 4K, create nice imagery, downsample it back to 1440p, which should smooth out remaining jaggies. Optional sharpening filter. The idea being better detail at the end than going 1080p to 1440p via DLSS2.
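For scale, the factors involved in that chain versus plain 1080p-to-1440p DLSS work out like this (pure arithmetic sketch, nothing DLSS-specific; resolution names are just the usual 16:9 modes):

```python
# Scale factors for "render 1440p -> DLSS to 4K -> downsample back to 1440p"
# versus "render 1080p -> DLSS to 1440p". Just numbers, no rendering.

RES = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

def linear_scale(src: str, dst: str) -> float:
    """Linear upscale factor between two resolutions, by height."""
    return RES[dst][1] / RES[src][1]

def pixel_ratio(src: str, dst: str) -> float:
    """Total pixel-count ratio between two resolutions."""
    sw, sh = RES[src]
    dw, dh = RES[dst]
    return (dw * dh) / (sw * sh)

print(linear_scale("1440p", "4K"))     # 1.5x linear, 2.25x the pixels
print(linear_scale("1080p", "1440p"))  # ~1.33x linear, ~1.78x the pixels
```

So the "bullshit mode" chain asks DLSS for a gentler 1.5x reconstruction than the 1080p-to-1440p case's 1.33x would suggest by pixel count, then burns the extra detail as antialiasing on the way back down.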
|
# ? Jul 14, 2020 10:36 |
|
Just running DLSS but starting at the panel's native resolution should be at the very least equal to TAA, but more likely will be comfortably superior (it will be something like combining MSAA+TAA).
|
# ? Jul 14, 2020 12:13 |
|
Nvidia talked about using DLSS 1.0 without upscaling early on (they called it DLSS2x) but ended up never shipping anything with that mode, and they haven't mentioned the idea again since DLSS 2.0 came out. I guess their messaging is that DLSS 2.0 in quality mode is already so close to ground truth that you don't need native resolution input, but it would be nice to compare and see how true that is. repiv fucked around with this message at 12:37 on Jul 14, 2020 |
# ? Jul 14, 2020 12:32 |
|
Well, I just realized AC Valhalla is another Ryzen-partner game. I wonder if it's a big enough game that Ubisoft would implement DLSS especially since it's launching probably right after the Ampere releases. If not, pretty canny deal by AMD.
|
# ? Jul 14, 2020 12:37 |
|
DLSS 2.0 is confirmed for Ubisoft's Watch Dogs Legion, including the sharpness slider that was missing from the earlier implementations, but that doesn't mean much for other Ubisoft franchises given they all run on different engines lol Ubisoft has what, four different open world engines in active use? repiv fucked around with this message at 12:51 on Jul 14, 2020 |
# ? Jul 14, 2020 12:45 |
|
Wonder if it's theoretically possible for a future DLSS version to support all games without the need for additional development work. That'd be a huge win.
|
# ? Jul 14, 2020 13:03 |
|
repiv posted:DLSS 2.0 is confirmed for Ubisoft's Watch Dogs Legion, including the sharpness slider that was missing from the earlier implementations, but that doesn't mean much for other Ubisoft franchises given they all run on different engines lol
I know of Anvil Next 2.0 (AssCreed, Ghost Recon), Disrupt (W_D) and Dunia (Far Cry). I suppose Snowdrop counts too.
|
# ? Jul 14, 2020 13:07 |
|
repiv posted:
Beats EA’s "EVERYTHING RUNS ON FROSTBITE!" stuff.
shrike82 posted:Well, I just realized AC Valhalla is another Ryzen-partner game. I wonder if it's a big enough game that Ubisoft would implement DLSS especially since it's launching probably right after the Ampere releases. If not, pretty canny deal by AMD.
That would be a bummer, the AC games are gorgeous but one of the few I can’t max out at 60/1440.
|
# ? Jul 14, 2020 13:14 |
|
Timestamped to DLSS discussion: https://www.youtube.com/watch?v=ggnvhFSrPGE&t=960s it's good folks
|
# ? Jul 14, 2020 14:21 |
|
repiv posted:Timestamped to DLSS discussion:
|
# ? Jul 14, 2020 14:51 |
|
I hope that engine is easy to mod, both for visuals and gameplay. Game looks great as is of course. Good to see DLSS being incorporated even in a PC port. That augurs well for native PC games.
|
# ? Jul 14, 2020 14:59 |
|
ufarn posted:Oh man, those Ryzen frametimes are not great, hopefully that's ironed out in HZD. What is it about Ryzen that makes them perform worse than Intel chips for gaming? I thought they were approaching parity in IPC.
|
# ? Jul 14, 2020 15:18 |
|
shrike82 posted:What is it about Ryzen that makes them perform worse than Intel chips for gaming? I thought they were approaching parity in IPC. about 500 MHz
|
# ? Jul 14, 2020 15:22 |
|
shrike82 posted:What is it about Ryzen that makes them perform worse than Intel chips for gaming? I thought they were approaching parity in IPC. I think they were, and then Intel jacked their frequencies up past 5GHz.
|
# ? Jul 14, 2020 15:23 |
|
shrike82 posted:What is it about Ryzen that makes them perform worse than Intel chips for gaming? I thought they were approaching parity in IPC.
Zen 2 has worse latency than Intel, and games are a latency-sensitive workload. The way to improve latency is through RAM speed, but Ryzen realistically caps out at DDR4-3800 because the infinity fabric can't clock higher than that. Intel's latency is lower to begin with, and you can slap really fast RAM on it to make it even better, so it just scales better in latency-sensitive workloads. Increasing clock speed doesn't even improve performance on Zen 2 past the clocks they already run at; people have tested Zen 2 at 5GHz using exotic cooling and found no real benefit, because the bottleneck is in the infinity fabric/RAM performance.
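The DDR4-3800 ceiling mentioned above comes from Zen 2 wanting the Infinity Fabric clock (FCLK) to run 1:1 with the memory controller clock; since DDR4 is double data rate, the real clock is half the transfer rate. A quick back-of-envelope:

```python
# FCLK needed for 1:1 (coupled) operation at a given DDR4 transfer rate.
# DDR4 moves data twice per clock, so DDR4-3800 means a 1900 MHz clock.

def fclk_for_ddr4(transfer_rate_mt_s: int) -> float:
    """FCLK in MHz required to stay 1:1 with a DDR4-<rate> kit."""
    return transfer_rate_mt_s / 2

print(fclk_for_ddr4(3800))  # 1900.0 MHz, around the practical FCLK ceiling on Zen 2
print(fclk_for_ddr4(3200))  # 1600.0 MHz, comfortably within spec
```

Past ~1900 MHz FCLK most Zen 2 chips can't hold 1:1, the fabric drops to a 2:1 divider, and latency gets worse instead of better.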
|
# ? Jul 14, 2020 15:25 |
|
shrike82 posted:What is it about Ryzen that makes them perform worse than Intel chips for gaming? I thought they were approaching parity in IPC. They are *better* per clock in some workloads, but their clocks are lower, and inter-ccx latency is still an issue
|
# ? Jul 14, 2020 15:27 |
|
shrike82 posted:What is it about Ryzen that makes them perform worse than Intel chips for gaming? I thought they were approaching parity in IPC. if there's more than a % or 2 difference between intel/amd the issue is either the game and/or windows scheduler taking a dump, not the cpu itself, seeing how most games reach roughly the same fps on comparable ryzens.
|
# ? Jul 14, 2020 15:54 |
|
ufarn posted:Oh man, those Ryzen frametimes are not great, hopefully that's ironed out in HZD. The frametime spike was only visible on the 12-core Ryzen part, not on the Intel 6-core part. This could be the game not dealing well with scheduling on more than the 6-8 cores it was optimized for, due to its console history. Not to say it's not a Ryzen problem, but it would have been nice to see the frametime graph on a 3600 or 3700x.
|
# ? Jul 14, 2020 16:25 |
Truga posted:if there's more than a % or 2 difference between intel/amd the issue is either the game and/or windows scheduler taking a dump, not the cpu itself, seeing how most games reach roughly the same fps on comparable ryzens.
I am really getting the feeling that most of the performance jumps we used to see years ago in the CPU space aren't really there when it comes to gaming, especially at 1440p. I looked at some comparisons for a friend who just built a desktop for VR gaming, and the difference from my 8600k to most CPUs is 7% at best. Is there a rush of people just buying for the future-proofing of more cores with the new console generation? I am not really seeing anything too compelling to upgrade to in the near future, other than waiting to see the results of the Nvidia 3000 series and having my bank account weep at the Canadian prices when they are announced next month, but I will do it anyways for Cyberpunk 2077.
|
|
# ? Jul 14, 2020 16:27 |
|
Indiana_Krom posted:Just running DLSS but starting at the panels native resolution should be at the very least equal to TAA, but more likely will be comfortably superior (it will be something like combining MSAA+TAA). Combat Pretzel fucked around with this message at 16:40 on Jul 14, 2020 |
# ? Jul 14, 2020 16:32 |
|
There are a bunch of reasons to want more cores - non-gaming workloads, other loads while gaming, and perhaps most relevant to what you're talking about, the upcoming consoles. The base PS4 and Xbone run on a 1.6ghz mobile-derived CPU that is an absolute piece of poo poo. It's so bad that for the majority of games you've been able to use a 4 core CPU and have it outrun them, even in many games that utilize multithreading well. This is not true for the upcoming consoles, which have 3.5+ghz Zen2 cores. That is a gigantic difference in performance, and it accounts for why people are valuing core count more - it's no longer going to be trivial to get several times the performance of console CPUs, so packing a bunch of threads designed around console into fewer desktop cores is going to rapidly become less viable.

That said, there's still a big potential difference between Intel and AMD in gaming performance. While the actual CPU cores can trade blows with Intel depending on the workload, memory access and cross-core access is not as good on AMD. A lot of games are either not sensitive or well optimized to deal with this, but it's not surprising to see worse performance drops on AMD than Intel even when AMD has a core count advantage. Zen3 may show significant improvement in this regard, or it may not.

Indiana_Krom posted:Just running DLSS but starting at the panels native resolution should be at the very least equal to TAA, but more likely will be comfortably superior (it will be something like combining MSAA+TAA).
TAA is already temporally multisampling - combining results across multiple frames with jittered pixel offsets. That's why it has ghosting, but also good temporal stability with low motion. What DLSS 2.0 does is much better temporal multisampling by better utilizing the motion vectors, and also potentially some AI magic.

It would be nice to see support for native base resolution, but I think Nvidia is looking ahead to the move to 4K, knowing that it's not going to be realistic at that resolution, so they're either disabling or strongly discouraging native-resolution DLSS 2.0 support. K8.0 fucked around with this message at 16:44 on Jul 14, 2020 |
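The "jittered pixel offsets" part of TAA is usually a low-discrepancy sequence. A minimal sketch of how a renderer might generate those per-frame sub-pixel offsets (the (2,3) Halton sequence is a common choice, e.g. in Unreal's TAA, though any given engine may differ):

```python
# Sub-pixel jitter offsets for temporal multisampling: a (2,3) Halton
# sequence gives well-distributed points in [0,1)^2, which the renderer
# recenters to [-0.5, 0.5) pixel offsets applied per frame.

def halton(index: int, base: int) -> float:
    """Radical inverse of `index` in `base` (standard Halton term)."""
    f, result = 1.0, 0.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

# 8-frame jitter cycle, one (x, y) offset per frame
jitter = [(halton(i, 2) - 0.5, halton(i, 3) - 0.5) for i in range(1, 9)]
print(jitter[0])  # first frame offset, roughly (0.0, -0.167)
```

Because each frame samples slightly different sub-pixel positions, accumulating frames over time approximates supersampling - which is also why static scenes resolve so cleanly and fast motion is where TAA (and DLSS) have to work hardest.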
# ? Jul 14, 2020 16:38 |
|
AMD's probably not pleased that Nvidia pulled DLSS 2.0 out and actually made the tech work, pretty much salvaging the silicon used on first gen RTX. With Cyberpunk 2077 having DLSS 2.0 support and being the game that will drive PC upgrades for probably the next 2 years, it's going to be tough for AMD to convince people to buy price equivalent AMD cards. I get the feeling RDNA 1 cards are going to age like Kepler cards did.
|
# ? Jul 14, 2020 16:40 |
|
what was AMD's cross-core communications method before the Infinity Fabric? I hear about IF a lot, and of Intel's ring bus (and mesh bus, to a lesser extent), but never of what came before Ryzen
|
# ? Jul 14, 2020 16:45 |
|
HyperTransport.
|
# ? Jul 14, 2020 16:49 |
|
The area between Phenom 2 and Ryzen is missing on the charts, filled only with a drawing of a dragon and warning to mariners to stay away.
|
# ? Jul 14, 2020 16:50 |
|
https://wccftech.com/death-stranding-and-dlss-2-0-gives-a-serious-boost-all-around/
big lol at that site which claimed fidelityfx worked better than dlss
playing now and i really wish they added an option to render depth of field at full resolution, cutscene backgrounds being a janky aliased mess lets down an otherwise great presentation
repiv fucked around with this message at 17:27 on Jul 14, 2020 |
# ? Jul 14, 2020 17:21 |
|
repiv posted:https://wccftech.com/death-stranding-and-dlss-2-0-gives-a-serious-boost-all-around/ Except for some jaggies in the tower in the middle, the DLSS 2.0 one actually looks better than either. That's nuts.
|
# ? Jul 14, 2020 17:29 |
|
Ugly In The Morning posted:Except for some jaggies in the tower in the middle, the DLSS 2.0 one actually looks better than either. That's nuts. And that tower will probably look a lot better with some sharpness settings, honestly. That still seems to be the biggest remaining problem with DLSS, but it seems pretty solvable, or at least mitigatable.
|
# ? Jul 14, 2020 17:45 |
|
Ugly In The Morning posted:Except for some jaggies in the tower in the middle, the DLSS 2.0 one actually looks better than either. That's nuts. I was thinking the same thing, honestly. What a time to be alive.
|
# ? Jul 14, 2020 17:46 |
|
Lockback posted:And that tower will probably look a lot better with some sharpness settings, honestly. That still seems to be the biggest remaining problem with DLSS but it seems pretty solveable, or at least mitigatable. There's a sharpness knob in the DLSS SDK, the first batch of game integrations just didn't expose it to the user for whatever reason. Watch Dogs Legion does though:
|
# ? Jul 14, 2020 17:48 |
|
Oh, apparently the sharpness setting wasn't ready for production when they first shipped DLSS 2.0. Must be working now if Legion is shipping with it.
|
# ? Jul 14, 2020 18:08 |
|
I doubt sharpening is something you're really going to want to adjust on DLSS anyway. I don't want to write a giant loving novel about why, but imagine a static camera and jittered subpixels over many frames. When you're trying to create a high-res image, you aren't going to look at pixels in your "output" from a frame and naively upscale them the way you would with a still image. Instead you want to choose the samples which are closest to the pixel you're trying to create, probably also weighted by recency and motion. You're going to prefer the more rightward samples that could naively be attributed to the "native rendering pixel" to your left, the more leftward samples from the pixel to your right, etc. You're not creating an image from another image of square pixels, but from a network of jittered and motion-adjusted samples that (hopefully) has a very high density, much higher than a native image. Because you're not doing conventional upscaling, you're not going to want a ton of conventional sharpening, because the artifacts upscaling normally introduces aren't going to be there in the same ways with this more advanced type of method.
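The sample-weighting idea above can be sketched as a toy reconstruction: given accumulated samples at sub-pixel positions, an output pixel is built from nearby samples weighted by distance and recency. Every name here is made up for illustration; DLSS's actual internals are not public.

```python
# Toy reconstruction from a pool of jittered, motion-adjusted samples:
# weight each sample by a Gaussian of its distance to the target pixel,
# decayed by its age in frames. Purely illustrative, not real DLSS.
import math

def reconstruct(px, py, samples, sigma=0.5, age_decay=0.8):
    """samples: list of (x, y, age_in_frames, value) in pixel coordinates."""
    num = den = 0.0
    for x, y, age, value in samples:
        d2 = (x - px) ** 2 + (y - py) ** 2
        w = math.exp(-d2 / (2 * sigma * sigma)) * (age_decay ** age)
        num += w * value
        den += w
    return num / den if den else 0.0

# A nearby, recent sample dominates an older, farther one:
val = reconstruct(0.0, 0.0, [(0.1, 0.0, 0, 1.0), (0.9, 0.0, 3, 0.0)])
```

The point is that the output is assembled directly from the sample network rather than from a finished low-res image of square pixels, so slapping a conventional unsharp-mask pass on top is attacking artifacts that were never introduced in the first place.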
K8.0 fucked around with this message at 18:25 on Jul 14, 2020 |
# ? Jul 14, 2020 18:20 |
|
It might not be sharpness in the usual sense of applying a sharpening filter to the reconstructed image after the fact; maybe they're tuning the weights of the reconstruction itself to favor sharper output at the expense of letting some aliasing through. We'll see when Watch Dogs comes out.
|
# ? Jul 14, 2020 18:30 |
|
repiv posted:https://wccftech.com/death-stranding-and-dlss-2-0-gives-a-serious-boost-all-around/ What site was that? I know Hardware Unboxed claimed that with DLSS 1.9, and they were right. But they've since praised DLSS 2.0 like everyone - is someone actually saying it's better than DLSS 2.0?
|
# ? Jul 14, 2020 18:47 |
|
Happy_Misanthrope posted:What site was that? I know Hardware Unboxed claimed that with DLSS 1.9, and they were right. But they've since praised DLSS 2.0 like everyone - is someone actually saying it's better than DLSS 2.0? It was arstechnica.
|
# ? Jul 14, 2020 18:51 |
|
repiv posted:It might not be sharpness in the usual sense of applying a sharpening filter to the reconstructed image after the fact, maybe they tuning the weights of the reconstruction itself to favor sharper output at the expense of letting some aliasing through. That's a very good point. I really can't wait until either someone cracks DLSS open or someone else creates a demo that works the same, because it'd be cool as gently caress to have a bunch of sliders to play with to tune the various parameters in real time.
|
# ? Jul 14, 2020 18:51 |
|
Riflen posted:It was arstechnica. Oh, I thought you meant DLSS in general, yeah I remember reading that. The Digital Foundry guy replied to him earlier saying it definitely wasn't his experience; how in god's name was the Ars guy evaluating it?
|
# ? Jul 14, 2020 18:57 |
|
|
repiv posted:It might not be sharpness in the usual sense of applying a sharpening filter to the reconstructed image after the fact, maybe they tuning the weights of the reconstruction itself to favor sharper output at the expense of letting some aliasing through.
|
# ? Jul 14, 2020 19:00 |