|
repiv posted:Apparently FFXIV still has a DX9 renderer, maybe you were still using that and somehow got switched over to the DX11 renderer when you changed cards? This is likely! The launcher had said DX11, but then the game doesn't like Radeon cards.
|
# ? Nov 1, 2020 15:11 |
|
|
The Grumbles posted:I mean I understand the reaction besides the weird accusations that it's all in my head. More reasonable assumptions would be - FF14 has problems with Radeon drivers (which is already something pretty well established - the game crashes a LOT on Radeon cards for people, and AMD drivers/software can be pretty miserable at the best of times) and/or there was something wild happening with my game that's unaccounted for. Yes, I take everyone's point that there's probably nothing innate to the cards themselves causing a different level of graphical quality, but it's stupid for that poster to jump to 'you're imagining every bit of it'.

i don't think you're imagining it, but you probably set something in the radeon control panel one time or another and forgot about it, or it was running on dx9. there are a ton of options you can choose there that override game settings, while your new gpu has default settings now where game settings take priority. i think it's likely that.

The Grumbles posted:This is likely! Launcher had said DX11 but then the game doesn't like Radeon cards.

this is silly, because i played the game on dual 6950 radeons for over a year, and it gave me an almost 100% crossfire fps boost (i went from an average of ~33 fps on a single card to a vsync-locked 60fps at the 99th percentile), and it crashed on me maybe once. it's old anecdata, but the game's clearly capable of running super nicely even with the dumbest setups, and the dx11 client hasn't changed since heavensward

Truga fucked around with this message at 15:21 on Nov 1, 2020 |
# ? Nov 1, 2020 15:19 |
|
Zedsdeadbaby posted:considering the difference in visual quality between DLSS and Fidelity contrast adaptive sharpening is light years apart There is definitely a difference in image quality, but saying it's light years apart is kinda hyperbolic. Techspot did a nice review of it when AMD initially released it a while back, and they've got some nice 4K and other-res pics comparing the two so you can see for yourself. https://www.techspot.com/article/1873-radeon-image-sharpening-vs-nvidia-dlss/ For what it is, it's not bad at all. They were getting something like a ~27% performance improvement in some games when using it to upscale 1800p to 4K on a 5700XT, and it looked fairly close to native 4K. Mind you, they were comparing to DLSS 1.whatever on an RTX 2070, so the RTX 3000 series DLSS 2.x will be a step up from this, but they seemed to like it just fine and seemed to think it was better overall vs the DLSS implementation that NV had at the time.
|
# ? Nov 1, 2020 15:41 |
|
Boxman posted:I asked this in the PC Building thread and was reminded this thread exists, but it's possible my question was indirectly answered: I can't think of any brands that are genuinely bad, at least not in the US market right now. They have all had subpar models at some point in their history, but I haven't seen a single bad 3070 or 3080 review to be honest. PC LOAD LETTER posted:They're comparing to the DLSS 1.whatever DLSS 2 is wildly better than DLSS 1; comparing something to DLSS 1 and declaring victory when DLSS 2 is the standard now for both RTX 20 and 30 is kind of silly.
|
# ? Nov 1, 2020 15:49 |
|
That's not the current iteration of DLSS though; the one in games such as Death Stranding is vastly superior and has been available for some time. It's wrong to use DLSS 1.0 as an example - it's an obsolete version that is no longer being included in new games. efb
|
# ? Nov 1, 2020 15:50 |
|
PC LOAD LETTER posted:Techspot did a nice review of it when AMD initially released it a while back and they've got some nice 4K and other res pics comparing the 2 so you can see for yourself. Screenshots don't convey how "dumb" non-temporal upscalers like FFX CAS exacerbate aliasing though. It's extremely obvious in Death Stranding where the CAS mode has far more shimmering than native, which in turn has more shimmering than DLSS.
|
# ? Nov 1, 2020 15:54 |
|
Yeah, I noted it was an older version, guys. The review was done in 2019 and was not declaring victory over current DLSS. I thought I was clear about that... That being said, it still looks fairly decent vs the native resolution shots in the review. Not seeing those "light years" if it looks fine vs native.
|
# ? Nov 1, 2020 15:55 |
|
PC LOAD LETTER posted:Yeah I noted it was a older version guys. The review was done in 2019 and was not declaring victory over current DLSS. I thought I was clear about that... See the post above yours. Screenshots aren't gameplay. repiv posted:Screenshots don't convey how "dumb" non-temporal upscalers like FFX CAS exacerbate aliasing though. It's extremely obvious in Death Stranding where the CAS mode has far more shimmering than native, which in turn has more shimmering than DLSS.
|
# ? Nov 1, 2020 15:55 |
|
repiv posted:Screenshots don't convey how "dumb" non-temporal upscalers like FFX CAS exacerbate aliasing though. True, and they mentioned that in the review too, but they also said it wasn't a big deal to them. There are videos out there of CAS in action and there is some shimmering, but it's not like the image is a mess of it either. Like I said, it doesn't look bad at all for what it is.
|
# ? Nov 1, 2020 15:58 |
|
gradenko_2000 posted:the old series of consoles were also already AMD based Since they said New Zealand, I would imagine they mean NZD, which would be ~740 USD? Doesn't seem bad at all for a zotac 3080 you can actually buy.
|
# ? Nov 1, 2020 16:01 |
|
Well, shimmering is a big YMMV factor; some people don't notice it as much as others. It's worth bearing in mind the ratios involved too: even if you grant that FFX CAS can make a reasonably stable 4K image out of an 1800p input, that's in contrast to DLSS making an even more stable 4K image out of a 1080p-1440p input.
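The ratio point is easy to put rough numbers on. A quick back-of-the-envelope sketch (the resolutions are illustrative, not benchmark figures):

```python
# Input pixel budget as a fraction of the 4K output image.
four_k = 3840 * 2160
cas_input = (3200 * 1800) / four_k   # 1800p input for CAS
dlss_input = (1920 * 1080) / four_k  # 1080p input at DLSS's low end
print(round(cas_input, 2), round(dlss_input, 2))  # 0.69 0.25
```

So even granting that both produce a stable 4K image, CAS starts with roughly 70% of the output pixels already rendered, while DLSS at the low end of that range reconstructs from about a quarter - around a third of the data CAS gets.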
|
# ? Nov 1, 2020 16:01 |
|
repiv posted:It's worth bearing in mind the ratios involved too, even if you grant that FFX CAS can make a reasonably stable 4K image out of an 1800p input, that's still nothing compared to DLSS making an even more stable 4K image out of a 1080p input But did I say it did? My OP was in response to the guy saying they looked 'light years' apart in difference, and I was pretty clear that the review was comparing an older version of DLSS as well. Did you guys get all triggered up on anything involving CAS or what? Is it really so incredibly controversial to say it's not bad for what it is? Given that it can get you a "free" 20-30%-ish performance improvement for what could be considered a minor image quality hit (to some, I guess), I don't really see how you could see it as bad at all. edit: The worst thing about it to me is that it doesn't work for DX11 games FWIW. edit2: "significant", yeah I can see that. "Light years" is something else entirely though; CAS would have to be so completely unusable in comparison that no one could sensibly use it for that comment to pan out, and it still seems hyperbolic to me. If you want me to drop it, fair enough. \/\/\/\/\/\/ PC LOAD LETTER fucked around with this message at 16:27 on Nov 1, 2020 |
# ? Nov 1, 2020 16:10 |
|
I think it's time we moved on. I stand by my view that there is a significant difference between DLSS and FCAS, and I don't feel that people are getting 'triggered' by the conversation. I dislike the trend of saying people are triggered whenever a debate occurs. You just have to accept that there are different viewpoints and they will come up in conversation. Personally I think that's a probe-worthy post because it's an insult to people struggling with mental illness
|
# ? Nov 1, 2020 16:20 |
|
I'm just trying to clarify what's actually going on with these techniques - FFX CAS by the nature of how it works can't add extra detail to the image, so if you can't tell the difference between CAS 1800p and native 4K then targeting 4K in that game on your display size/distance was overkill to begin with. That principle only holds at extremely high resolutions like 4K though, once you step back to the more reasonable 1440p there isn't any detail headroom to throw away, every pixel counts, and upscaling through something like CAS will always have an obvious quality hit. repiv fucked around with this message at 16:42 on Nov 1, 2020 |
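The "can't add extra detail" point follows from CAS being a pure per-pixel filter over already-rendered samples. A simplified CPU-side sketch of the contrast-adaptive idea (this is not AMD's actual FidelityFX CAS shader - the exact weight formula here is illustrative, but the structure matches: a cross-shaped tap pattern whose sharpening strength is scaled down where local contrast is already high):

```python
import numpy as np

def cas_sharpen(img, sharpness=0.5):
    """Toy contrast-adaptive sharpen on a 2D float image in [0, 1].

    Flat regions pass through unchanged, and full-range edges get no
    extra sharpening (no headroom), so the filter only reweights
    existing pixels - it cannot invent subpixel detail.
    """
    h, w = img.shape
    out = img.copy()
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # cross-shaped neighborhood around the center pixel
            n = img[y - 1, x]; s = img[y + 1, x]
            e = img[y, x + 1]; wv = img[y, x - 1]
            c = img[y, x]
            mn = min(n, s, e, wv, c)
            mx = max(n, s, e, wv, c)
            # adaptive amount: strongest where there is headroom both
            # below the local min and above the local max
            amp = np.sqrt(max(0.0, min(mn, 1.0 - mx) / max(mx, 1e-5)))
            wgt = -amp * sharpness * 0.25  # negative lobe on the cross taps
            out[y, x] = np.clip(
                (c + wgt * (n + s + e + wv)) / (1.0 + 4.0 * wgt), 0.0, 1.0)
    return out
```

Note that a perfectly flat region comes out identical: every tap equals the center, so the weighted sum collapses back to the input value. That's the whole argument in miniature - sharpening redistributes contrast that was already rendered.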
# ? Nov 1, 2020 16:22 |
|
Comparing anything to DLSS 1.x is disingenuous because it's literally not used anymore. The point of comparison is DLSS 2.x, because that's what is actually being put in games now, and what is supported on both RTX 20 and RTX 30 series cards. That's the alternative to whatever AMD offers.
|
# ? Nov 1, 2020 16:33 |
|
... weren't they comparing CAS to nothing?
|
# ? Nov 1, 2020 16:38 |
|
sean10mm posted:Comparing anything to DLSS 1.x is disingenuous because it's literally not used anymore. Except in F1 2020 which shipped DLSS 1.0 (copy pasted from their implementation in F1 2019) long after DLSS 2.0 shipped 😬 Nvidia keeps DLSS on such a tight leash that I have no idea why they allowed them to do that
|
# ? Nov 1, 2020 16:47 |
|
Penpal posted:... weren't they comparing CAS to nothing? They compared it to native res and the DLSS available at the time. They even labelled the pics and stuff, so I'm not sure where you see they're comparing it to nothing? sean10mm posted:Comparing anything to DLSS 1.x is disingenuous because it's literally not used anymore. The article compared to native res too though.
|
# ? Nov 1, 2020 17:06 |
|
'Light years ahead' is a bit of a weird phrase when it's more like 'literally two years ahead; also, NVIDIA spends a hell of a lot more on software, so it's unclear if they are gaining or losing ground in this space'.
|
# ? Nov 1, 2020 17:09 |
|
PC LOAD LETTER posted:The article compared to native res too though. DLSS 2.0 looks better than native in low-motion still shots like the ones they're using to compare. Especially in that context, it is reasonable to call it "light years ahead" of any sort of upscaler. Any decent temporal upsampler should be. There's a huge difference between resampling on raw samples and resampling after losing all the subpixel precision data, which is the best you can do with an upscaler independently working off TAA output. This DF video has a good comparison of Native vs CAS vs DLSS 2.0, which is a followup to the DLSS stuff starting around 15:40. DF is focusing on temporal aliasing more than detail there, but in terms of pure detail there are other examples that are even more stark - almost any time you have fine, high-contrast detail like text at small pixel sizes, DLSS looks significantly better than native. The real question is, how soon do we get good, vendor-agnostic applications in every game? Besides raytracing, this is the biggest thing to happen in computer graphics since programmable pixel shaders. It's the future of real-time rendering; everything is going to be done this way. It's just a question of when, and how many good games get vendor-locked to Nvidia before that happens.
|
# ? Nov 1, 2020 17:37 |
|
The Grumbles posted:One thing I'd never really considered in terms of the ol' AMD vs Nvidia thing, until I switched this weekend from my 5600xt to a 3070 FE, is optimisation. Regardless of performance, most games just look plain better on this new Nvidia card than they ever did on the Radeon card. I play a lot of FF14, and it looks like a whole new game. Particle effects, shaders, lighting, texture mapping stuff - a lot was just not there on the Radeon card, but swapping to this new card makes the whole game just look better. Same for a lot of games I've tried. Interestingly, I had the opposite experience last time I switched from an Nvidia card to an AMD - when I built this current PC, I went from an Intel/Nvidia setup (1060 6gb) to all AMD (5700 XT) and I was immediately blown away by how much nicer games looked, just vibrant and really rich colors. I was playing a lot of Witcher 3 at the time and it was very apparent to me at the time, and CDPR and Nvidia partner together a lot... Or maybe it's all placebo effect and we were both justifying our purchases to ourselves
|
# ? Nov 1, 2020 17:43 |
|
Significantly higher frame rates can make everything feel nicer, and you could just be misdirecting that feeling. I noticed the same thing with a few games when I upgraded to my 1070 from... honestly I forget, I think it was a 600 series.
|
# ? Nov 1, 2020 17:51 |
|
One day I'll be able to upgrade from the 7850 to the 3080, and I'll be sure to let everyone know if it's a revelatory experience.
|
# ? Nov 1, 2020 18:06 |
|
K8.0 posted:DLSS 2.0 looks better than native in low-motion still shots no it doesn't
|
# ? Nov 1, 2020 18:13 |
|
I'd hesitate to say DLSS is better than native in general because "native" is a moving target; games' native TAA implementations vary in quality. It's certainly better in some cases though, as in Death Stranding, and close enough in others.
|
# ? Nov 1, 2020 18:16 |
|
i'd argue that at high enough resolutions, still shots look better at native and no AA at all than whatever blurry bullshit TAA implementations often seem to be doing, so yeah, native is a moving target that'll heavily depend on the game and your settings
|
# ? Nov 1, 2020 18:21 |
|
K8.0 posted:DLSS 2.0 looks better than native in low-motion still shots like they're using to compare.
|
# ? Nov 1, 2020 18:25 |
|
That was pure bullshit though, you can't magic extra detail out of thin air. The difference is that DLSS (and presumably AMD's upscaler too?) doesn't just make up extra detail; it accumulates real details over time, as does native rendering, since everything uses TAA nowadays. It just comes down to which approach can more accurately reassemble the details. repiv fucked around with this message at 18:33 on Nov 1, 2020 |
# ? Nov 1, 2020 18:27 |
|
And now my watch has ended! (not really, still want a Strix 3080, but this will do for now and then a friend is gonna snag this from me for cost minus 15% whenever I have a 3080 in hand)
|
# ? Nov 1, 2020 18:30 |
|
The Grumbles posted:One thing I'd never really considered in terms of the ol' AMD vs Nvidia thing, until I switched this weekend from my 5600xt to a 3070 FE, is optimisation. Regardless of performance, most games just look plain better on this new Nvidia card than they ever did on the Radeon card. I play a lot of FF14, and it looks like a whole new game. Particle effects, shaders, lighting, texture mapping stuff - a lot was just not there on the Radeon card, but swapping to this new card makes the whole game just look better. Same for a lot of games I've tried. I've had NVIDIA's Game-Ready drivers change my settings in Destiny 2, seemingly entirely unprompted because I have no memory of telling it to, but the game was suddenly rendering at 4k one day.
|
# ? Nov 1, 2020 18:44 |
|
are you mlb hall of famer tim raines???
|
# ? Nov 1, 2020 18:44 |
|
Truga posted:no it doesn't DLSS results in an image with additional edge sharpening compared to the single-pass full resolution render. This can look sharper or better than the native image in some cases. It's arguable whether this additional sharpening is "real" or "fake" detail, but since the entire image is rendered in the first place and you can apply all the blurring or sharpening filters you like to either image, I'm gonna say it doesn't matter
|
# ? Nov 1, 2020 18:51 |
|
We've had this discussion before but rendering at native resolution doesn't automatically give a perfect image, you're at the mercy of the anti-aliasing solution to capture sub-pixel details. If an upscaling algorithm is sufficiently smart it can reconstruct those subpixels better than the native TAA solution and wind up with a genuinely superior image, not just a more sharpened one. The line between anti-aliasing and upscaling is hard to pin down at this point, conceptually TAA is already "upscaling" native resolution to a greater than native resolution then scaling it back down to native. These fancy temporal upscalers just take it one step further by starting with a sub-native resolution and (hopefully) arriving at a similar result through smarter use of the input samples. repiv fucked around with this message at 19:17 on Nov 1, 2020 |
# ? Nov 1, 2020 18:54 |
|
I've resigned myself to just waiting for my EVGA Step Up to eventually happen at this point. Gonna install the 2060 Super I grabbed for it probably this week and forgo cancelling the step up and returning the card to Newegg. Gonna be a long winter.
|
# ? Nov 1, 2020 19:12 |
|
DLSS 2.0 good.
|
# ? Nov 1, 2020 19:13 |
|
Sagebrush posted:DLSS results in an image with additional edge sharpening compared to the single-pass full resolution render. This can look sharper or better than the native image in some cases. DLSS looks like it has artificial sharpening, but unless Nvidia was straight up lying in their GTC presentation, it doesn't. It's just more accurate. They prove this by comparing it to a 32X supersampled version of the same shot, which is obviously a very idealized version of the image. The things that look like sharpening artifacts are present in the 32X supersampled images as well, and the DLSS 2.0 image is far more conformant to the supersampled image than the TAA image is. It's just a matter of de-training your brain from the TAA-induced blur we've unfortunately had to put up with, which as you said is somewhat subjective, and unlike detail, you can always add blur if you want it. Obviously in scenes with a lot of motion / de-occlusion DLSS is not going to be able to temporally accumulate as much data for everything, but the fact that there are more frames is going to generally be preferable even if high-motion objects don't look as good. If the framerate is high enough, then the occlusion artifacting is only going to be a few pixels, and I'll be damned if I can see a few pixels of possible aliasing on newly de-occluded background. Still, I'd like to see some good analysis, but it's hard to do comparisons because we don't really have a way to generate 1:1 shots of motion where the frame timing is identical in any DLSS 2.0 game that exists today.
|
# ? Nov 1, 2020 19:34 |
|
What benchmarks and temp checks should I do on my current setup before I put in the 3080? I want to quantitatively know how much sweeter I am.
|
# ? Nov 1, 2020 19:35 |
|
KingKapalone posted:What benchmarks and temp checks should I do on my current setup before I put in the 3080? I want to quantitatively know how much sweeter I am. virt-a-mate on ultra settings with six female and one male model loaded
|
# ? Nov 1, 2020 19:37 |
|
Cabbages and Kings posted:
Was this the order that was delayed? Mine hasn’t shipped so you’re giving me hope.
|
# ? Nov 1, 2020 19:43 |
|
|
K8.0 posted:If the framerate is high enough, then the occlusion artifacting is only going to be a few pixels and I'll be damned if I can see a few pixels of possible aliasing on newly de-occluded background. Yeah framerate is an underappreciated aspect with this stuff, the better quality TAA implementations out there can mostly avoid disocclusion artifacts even in 30fps console builds, so why are we wasting frametime feeding as many samples into them at >100fps? Fewer samples per frame but with more frames per second should cancel out and result in similar image quality. repiv fucked around with this message at 19:47 on Nov 1, 2020 |
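The framerate point is also easy to put rough numbers on (illustrative figures, not measurements): the raw sample rate fed into the temporal filter ends up in the same ballpark either way.

```python
# Samples per second fed into the temporal filter, roughly.
native_30fps = 3840 * 2160 * 30      # native 4K at a 30fps console target
quarter_100fps = 1920 * 1080 * 100   # quarter-res input at >100fps
print(round(quarter_100fps / native_30fps, 2))  # 0.83
```

A quarter-resolution input at 100fps delivers about 83% of the raw samples per second of native 4K at 30fps, which is why fewer samples per frame at more frames per second can plausibly cancel out to similar image quality.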
# ? Nov 1, 2020 19:43 |