The Grumbles
Jun 5, 2006

repiv posted:

Apparently FFXIV still has a DX9 renderer, maybe you were still using that and somehow got switched over to the DX11 renderer when you changed cards?

This is likely! Launcher had said DX11 but then the game doesn't like Radeon cards.


Truga
May 4, 2014
Lipstick Apathy

The Grumbles posted:

I mean I understand the reaction besides the weird accusations that it's all in my head. More reasonable assumptions would be - FF14 has problems with Radeon drivers (which is already something p well established - the game crashes a LOT on Radeon cards for people. And AMD drivers/software can be pretty miserable at the best of times) and/or there was something wild that's happening with my game that's unaccounted for. Yes, I take everyone's point that there's probably nothing innate to the cards themselves causing a different level of graphical quality, but it's stupid for that poster to jump to 'you're imagining every bit of it'.

i don't think you're imagining it, but you probably set something in the radeon control panel one time or another and forgot about it, or it was running on dx9. there's a ton of options you can choose there that override game settings, while your new gpu has default settings now where game settings take priority. i think it's likely that.

The Grumbles posted:

This is likely! Launcher had said DX11 but then the game doesn't like Radeon cards.

this is silly, because i played the game on dual 6950 radeons for over a year, and it gave me an almost 100% crossfire fps boost (i went from an average of like 33 fps on a single card to 60fps vsync locked at the 99th percentile), and it crashed on me maybe once. it's old anecdata but the game's clearly capable of running super nicely even with the dumbest setups, and the dx11 client hasn't changed since heavensward

Truga fucked around with this message at 15:21 on Nov 1, 2020

PC LOAD LETTER
May 23, 2005
WTF?!

Zedsdeadbaby posted:

considering the difference in visual quality between DLSS and Fidelity contrast adaptive sharpening is light years apart

There is definitely a difference in image quality but saying it's light years apart is kinda hyperbolic.

Techspot did a nice review of it when AMD initially released it a while back and they've got some nice 4K and other res pics comparing the 2 so you can see for yourself.

https://www.techspot.com/article/1873-radeon-image-sharpening-vs-nvidia-dlss/

For what it is, it's not bad at all.

They were getting something like a ~27% performance improvement in some games when using it to upscale 1800p to 4k resolution on a 5700XT and it looked fairly close to native 4k.

They're comparing to the DLSS 1.whatever on an RTX 2070, so the RTX 3000 series' DLSS 2.x will be a step up from this, mind you, but they seemed to like it just fine and seemed to think it was better overall vs the DLSS implementation that NV had at the time.
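For scale, that ~27% figure is roughly what you'd expect from pixel counts alone. A quick back-of-envelope check (resolutions taken from the review's setup):

```python
# Pixel-count sanity check on the Techspot numbers: rendering at 1800p
# instead of native 4K cuts the shaded pixel count by about 31%, which is
# in the same ballpark as the ~27% fps gain they measured (fps rarely
# scales perfectly linearly with pixel count).
native_4k = 3840 * 2160   # 8,294,400 pixels
res_1800p = 3200 * 1800   # 5,760,000 pixels
saving = 1 - res_1800p / native_4k
print(f"{saving:.0%} fewer pixels shaded per frame")  # → 31% fewer
```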

sean10mm
Jun 29, 2005

It's a Mad, Mad, Mad, MAD-2R World

Boxman posted:

I asked this in the PC Building thread and was reminded this thread exists, but it's possible my question was indirectly answered:


Is there any brand worth actively avoiding? I'd like to get a 3070, and I have access to a Microcenter, which means I can probably pick one up at the store a little easier than average (I noticed the Zotac was in stock a couple mornings ago), but it'll probably end up being luck of the draw on brand. Is there any brand that's shoddy enough that I should just stay home? There's no urgency to this pickup; I'm currently playing a lot of Hades and Apex Legends, which my 1060 is handling well enough.

I can't think of any brands that are genuinely bad, at least not in the US market right now. They have all had subpar models at some point in their history but I haven't seen a single bad 3070 or 3080 review to be honest.

PC LOAD LETTER posted:

They're comparing to the DLSS 1.whatever

DLSS 2 is wildly better than DLSS 1, comparing something to DLSS 1 and declaring victory when DLSS 2 is the standard now for both RTX 20 and 30 is kind of silly.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
That's not the current iteration of DLSS though, the one in games such as Death Stranding is vastly superior and has been available for some time.

It's wrong to use DLSS 1.0 as an example; it's an obsolete version that is no longer being included in new games.

efb

repiv
Aug 13, 2009

PC LOAD LETTER posted:

Techspot did a nice review of it when AMD initially released it a while back and they've got some nice 4K and other res pics comparing the 2 so you can see for yourself.

Screenshots don't convey how "dumb" non-temporal upscalers like FFX CAS exacerbate aliasing though. It's extremely obvious in Death Stranding where the CAS mode has far more shimmering than native, which in turn has more shimmering than DLSS.
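The "dumb" spatial approach repiv means can be sketched in a few lines. This is a toy, illustrative contrast-adaptive sharpen in numpy, not AMD's actual FidelityFX CAS shader, but it shows the key limitation: it only redistributes contrast that is already in the frame, so it can't recover sub-pixel detail and does nothing about frame-to-frame shimmer.

```python
import numpy as np

def toy_cas(img, sharpness=0.5):
    """Toy contrast-adaptive sharpen on a 2D grayscale image in [0, 1].
    Like FidelityFX CAS in spirit (not AMD's real shader): the local 3x3
    min/max picks a per-pixel sharpening weight, so low-contrast areas get
    pushed harder than already-contrasty edges. Purely spatial: no detail
    is reconstructed, existing contrast is just reshaped."""
    out = img.copy()
    h, w = img.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            n = img[y - 1:y + 2, x - 1:x + 2]
            mn, mx = n.min(), n.max()
            # adaptive weight from the headroom above/below the local range
            amount = min(1.0, np.sqrt(min(mn, 1.0 - mx) / max(mx, 1e-6)))
            w_neg = -sharpness * amount / 8.0
            total = 1.0 + 4.0 * w_neg
            cross = img[y - 1, x] + img[y + 1, x] + img[y, x - 1] + img[y, x + 1]
            out[y, x] = np.clip((img[y, x] + w_neg * cross) / total, 0.0, 1.0)
    return out
```

Note that a full-contrast edge already has no headroom, so the adaptive weight drops to zero and the filter leaves it alone; the aliasing on that edge is untouched, which is exactly the shimmer screenshots hide.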

PC LOAD LETTER
May 23, 2005
WTF?!
Yeah, I noted it was an older version, guys. The review was done in 2019 and was not declaring victory over current DLSS. I thought I was clear about that...

That being said it still looks fairly decent vs native resolution shots in the review.

Not seeing those "light years" if it looks fine vs native.

sean10mm
Jun 29, 2005

It's a Mad, Mad, Mad, MAD-2R World

PC LOAD LETTER posted:

Yeah, I noted it was an older version, guys. The review was done in 2019 and was not declaring victory over current DLSS. I thought I was clear about that...

That being said it still looks fairly decent vs native resolution shots in the review.

Not seeing those "light years" if it looks fine vs native.

See the post above yours. Screenshots aren't gameplay.

repiv posted:

Screenshots don't convey how "dumb" non-temporal upscalers like FFX CAS exacerbate aliasing though. It's extremely obvious in Death Stranding where the CAS mode has far more shimmering than native, which in turn has more shimmering than DLSS.

PC LOAD LETTER
May 23, 2005
WTF?!

repiv posted:

Screenshots don't convey how "dumb" non-temporal upscalers like FFX CAS exacerbate aliasing though.

True, and they mentioned that in the review too, but also said it wasn't a big deal to them. There are videos out there of CAS in action and there is some shimmering, but it's not like the image is a mess of it either.

Like I said it doesn't look bad at all for what it is.

VorpalFish
Mar 22, 2007
reasonably awesometm

gradenko_2000 posted:

the old series of consoles were also already AMD based


1,110 USD seems like significantly higher than the MSRP, is the thing

Since they said New Zealand, I would imagine they mean NZD, which would be ~740 USD? Doesn't seem bad at all for a zotac 3080 you can actually buy.

repiv
Aug 13, 2009

Well shimmering is a big YMMV factor, some people don't notice it as much as others

It's worth bearing in mind the ratios involved too, even if you grant that FFX CAS can make a reasonably stable 4K image out of an 1800p input, that's in contrast to DLSS making an even more stable 4K image out of a 1080p-1440p input

PC LOAD LETTER
May 23, 2005
WTF?!

repiv posted:

It's worth bearing in mind the ratios involved too, even if you grant that FFX CAS can make a reasonably stable 4K image out of an 1800p input, that's still nothing compared to DLSS making an even more stable 4K image out of a 1080p input

Absolutely.

But did I say it did?

My OP was in response to the guy saying they looked 'light years' apart in difference, and I was pretty clear that the review was done comparing an older version of DLSS as well.

Did you guys get all triggered up on anything involving CAS or what?

Is it really so incredibly controversial to say it's not bad for what it is? Given that it can get you a "free" 20-30%-ish performance improvement for what could be considered a minor image quality hit (to some, I guess) I don't really see how you could see it as bad at all. edit: The worst thing about it to me is that it doesn't work for DX11 games FWIW.

edit2: "significant" yeah I can see that. "Light years" is something else entirely though, like CAS would have to be so completely unusable in comparison that no one could sensibly use it for that comment to pan out, and it still seems hyperbolic to me. If you want me to drop it fair enough. \/\/\/\/\/\/

PC LOAD LETTER fucked around with this message at 16:27 on Nov 1, 2020

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
I think it's time we moved on. I stand by my view that there is a significant difference between DLSS and FCAS, and I don't feel that people are getting 'triggered' by the conversation. I dislike the trend of saying people are triggered whenever a debate occurs. You just have to accept that there are different viewpoints and they will come up in conversation.

Personally I think that's a probe-worthy post because it's an insult to people struggling with mental illness

repiv
Aug 13, 2009

I'm just trying to clarify what's actually going on with these techniques - FFX CAS by the nature of how it works can't add extra detail to the image, so if you can't tell the difference between CAS 1800p and native 4K then targeting 4K in that game on your display size/distance was overkill to begin with.

That principle only holds at extremely high resolutions like 4K though, once you step back to the more reasonable 1440p there isn't any detail headroom to throw away, every pixel counts, and upscaling through something like CAS will always have an obvious quality hit.

repiv fucked around with this message at 16:42 on Nov 1, 2020

sean10mm
Jun 29, 2005

It's a Mad, Mad, Mad, MAD-2R World
Comparing anything to DLSS 1.x is disingenuous because it's literally not used anymore.

The point of comparison is DLSS 2.x because that's what is actually being put in games now, and what is supported on both RTX 20 and RTX 30 series cards. That's the alternative to whatever AMD offers.

forest spirit
Apr 6, 2009

Frigate Hetman Sahaidachny
First to Fight Scuttle, First to Fall Sink


... weren't they comparing CAS to nothing?

repiv
Aug 13, 2009

sean10mm posted:

Comparing anything to DLSS 1.x is disingenuous because it's literally not used anymore.

Except in F1 2020 which shipped DLSS 1.0 (copy pasted from their implementation in F1 2019) long after DLSS 2.0 shipped 😬

Nvidia keeps DLSS on such a tight leash that I have no idea why they allowed them to do that

PC LOAD LETTER
May 23, 2005
WTF?!

Penpal posted:

... weren't they comparing CAS to nothing?

They compared it to native res and the DLSS available at the time.

They even labelled the pics and stuff, so I'm not sure where you see they're comparing it to nothing?

sean10mm posted:

Comparing anything to DLSS 1.x is disingenuous because it's literally not used anymore.

The article compared to native res too though.

Alchenar
Apr 9, 2008

'Light years ahead' is a bit of a weird phrase when it's more like 'Literally two years ahead, also NVIDIA spends a hell of a lot more on software so it's unclear if they are gaining or losing ground in this space'.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

PC LOAD LETTER posted:

The article compared to native res too though.

DLSS 2.0 looks better than native in low-motion still shots like they're using to compare. Especially in that context, it is reasonable to call it "light years ahead" of any sort of upscaler. Any decent temporal upsampler should be. There's a huge difference between resampling on raw samples and resampling after losing all the subpixel precision data, which is the best you can do with an upscaler independently working off TAA output.

This DF video has a good comparison of Native vs CAS vs DLSS 2.0, a followup to the DLSS stuff, starting around 15:40. DF is focusing on temporal aliasing more than detail there, but in terms of pure detail there are other examples that are even more stark - almost any time you have fine, high-contrast detail like text at small pixel sizes, DLSS looks significantly better than native.

The real question is, how soon do we get good, vendor-agnostic applications in every game? Besides raytracing, this is the biggest thing to happen in computer graphics since programmable pixel shaders. It's the future of real-time rendering, everything is going to be done this way. It's just a question of when, and how many good games get vendor-locked to Nvidia before that happens.

Azuren
Jul 15, 2001

The Grumbles posted:

One thing I'd never really considered in terms of the ol' AMD vs Nvidia thing, until I switched this weekend from my 5600xt to a 3070 FE, is optimisation. Regardless of performance, most games just look plain better on this new Nvidia card than they ever did on the Radeon card. I play a lot of FF14, and it looks like a whole new game. Particle effects, shaders, lighting, texture mapping stuff - a lot was just not there on the Radeon card, but swapping to this new card makes the whole game just look better. Same for a lot of games I've tried.

Interestingly, I had the opposite experience last time I switched from an Nvidia card to an AMD one - when I built this current PC, I went from an Intel/Nvidia setup (1060 6gb) to all AMD (5700 XT) and I was immediately blown away by how much nicer games looked, just vibrant and really rich colors. I was playing a lot of Witcher 3 and it was very apparent to me at the time, and CDPR and Nvidia partner together a lot... Or maybe it's all placebo effect and we were both justifying our purchases to ourselves :v:

hobbesmaster
Jan 28, 2008

Significantly higher frame rates can make everything feel nicer and you could just be misdirecting that feeling. I noticed the same thing with a few games when I upgraded to my 1070 from... I honestly forget, I think it was a 600 series.

Unhappy Meal
Jul 27, 2010

Some smiles show mirth
Others merely show teeth

One day I'll be able to upgrade from the 7850 to the 3080, and I'll be sure to let everyone know if it's a revelatory experience.

Truga
May 4, 2014
Lipstick Apathy

K8.0 posted:

DLSS 2.0 looks better than native in low-motion still shots

no it doesn't

repiv
Aug 13, 2009

I'd hesitate to say DLSS is better than native in general because "native" is a moving target, games' native TAA implementations vary in quality

It's certainly better in some cases though, as in Death Stranding, and close enough in others

Truga
May 4, 2014
Lipstick Apathy
i'd argue that at high enough resolutions, still shots look better at native and no AA at all than whatever blurry bullshit TAA implementations often seem to be doing, so yeah, native is a moving target that'll heavily depend on the game and your settings :v:

Inept
Jul 8, 2003

K8.0 posted:

DLSS 2.0 looks better than native in low-motion still shots like they're using to compare.

repiv
Aug 13, 2009


That was pure bullshit though, you can't magic extra detail out of thin air. The difference is DLSS (and presumably AMD's upscaler too?) don't just make up extra detail, they accumulate real details over time, as does native since everything uses TAA nowadays. It just comes down to which approach can more accurately reassemble the details.

repiv fucked around with this message at 18:33 on Nov 1, 2020
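The "accumulate real details over time" idea is easy to sketch. This is an illustrative exponential history blend, the core of TAA-style accumulation, not any vendor's actual implementation:

```python
import numpy as np

def temporal_accumulate(frames, alpha=0.1):
    """Blend each new frame into a running history (an exponential moving
    average). Detail that is really in the scene shows up frame after
    frame and survives the blend; single-frame noise and aliasing average
    out. Nothing is invented - the recovered detail is real samples,
    accumulated over time."""
    history = frames[0].astype(float)
    for frame in frames[1:]:
        history = (1.0 - alpha) * history + alpha * frame
    return history
```

With enough frames the blend converges on the underlying signal even though every individual frame is noisy, which is exactly why a temporal method can beat a single-frame one without making anything up.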

Cabbages and VHS
Aug 25, 2004

Listen, I've been around a bit, you know, and I thought I'd seen some creepy things go on in the movie business, but I really have to say this is the most disgusting thing that's ever happened to me.


And now my watch has ended!

(not really, still want a Strix 3080, but this will do for now and then a friend is gonna snag this from me for cost minus 15% whenever I have a 3080 in hand)

Rubellavator
Aug 16, 2007

The Grumbles posted:

One thing I'd never really considered in terms of the ol' AMD vs Nvidia thing, until I switched this weekend from my 5600xt to a 3070 FE, is optimisation. Regardless of performance, most games just look plain better on this new Nvidia card than they ever did on the Radeon card. I play a lot of FF14, and it looks like a whole new game. Particle effects, shaders, lighting, texture mapping stuff - a lot was just not there on the Radeon card, but swapping to this new card makes the whole game just look better. Same for a lot of games I've tried.

I don't know if this will change with the new series of consoles being AMD based?

I've had NVIDIA's Game-Ready drivers change my settings in Destiny 2, seemingly entirely unprompted because I have no memory of telling it to, but the game was suddenly rendering at 4k one day.

bus hustler
Mar 14, 2019

Ha Ha Ha... YES!
are you mlb hall of famer tim raines???

Sagebrush
Feb 26, 2012

ERM... Actually I have stellar scores on the surveys, and every year students tell me that my classes are the best ones they’ve ever taken.

Truga posted:

no it doesn't

DLSS results in an image with additional edge sharpening compared to the single-pass full resolution render. This can look sharper or better than the native image in some cases.

It's arguable whether this additional sharpening is "real" or "fake" detail, but since the entire image is rendered in the first place and you can apply all the blurring or sharpening filters you like to either image, I'm gonna say it doesn't matter

repiv
Aug 13, 2009

We've had this discussion before but rendering at native resolution doesn't automatically give a perfect image, you're at the mercy of the anti-aliasing solution to capture sub-pixel details.

If an upscaling algorithm is sufficiently smart it can reconstruct those subpixels better than the native TAA solution and wind up with a genuinely superior image, not just a more sharpened one.

The line between anti-aliasing and upscaling is hard to pin down at this point, conceptually TAA is already "upscaling" native resolution to a greater than native resolution then scaling it back down to native. These fancy temporal upscalers just take it one step further by starting with a sub-native resolution and (hopefully) arriving at a similar result through smarter use of the input samples.

repiv fucked around with this message at 19:17 on Nov 1, 2020
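The "one step further" part - starting below native resolution and reconstructing upward from jittered samples - can be shown with a toy 1D renderer. This is a sketch of the principle only, nothing like a real DLSS/TAAU pipeline: each frame samples the scene at half resolution with a different sub-pixel jitter, and the samples are scattered into a double-resolution history buffer.

```python
import numpy as np

def jittered_reconstruct(scene, low_res, n_frames):
    """Each frame takes low_res point samples of `scene` (a function on
    [0, 1)) at an alternating sub-pixel jitter and accumulates them into
    a buffer at 2x resolution. After a few frames every high-res pixel
    has received a *real* sample, so the result matches a native 2x
    render - detail is reassembled from actual samples, not hallucinated."""
    hi_res = low_res * 2
    acc = np.zeros(hi_res)
    cnt = np.zeros(hi_res)
    for i in range(n_frames):
        jitter = 0.25 if i % 2 == 0 else 0.75   # alternating sub-pixel offset
        pos = (np.arange(low_res) + jitter) / low_res
        idx = (pos * hi_res).astype(int)        # target high-res pixel
        np.add.at(acc, idx, scene(pos))
        cnt[idx] += 1
    return np.divide(acc, cnt, out=np.zeros_like(acc), where=cnt > 0)
```

With this particular jitter sequence, two frames of half-res samples land exactly on the 2x-resolution pixel centres, so the reconstruction equals a native render; real pipelines face motion, disocclusion, and shading changes between frames, which is where the quality differences come from.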

Fuzz
Jun 2, 2003

Avatar brought to you by the TG Sanity fund
I've resigned myself to just waiting for my EVGA Step Up to eventually happen, at this point. Gonna install the 2060 Super I grabbed for it probably this week and forgo cancelling the step up and returning the card to Newegg.

Gonna be a long winter.

B-Mac
Apr 21, 2003
I'll never catch "the gay"!
DLSS 2.0 good.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

Sagebrush posted:

DLSS results in an image with additional edge sharpening compared to the single-pass full resolution render. This can look sharper or better than the native image in some cases.

It's arguable whether this additional sharpening is "real" or "fake" detail, but since the entire image is rendered in the first place and you can apply all the blurring or sharpening filters you like to either image, I'm gonna say it doesn't matter

DLSS looks like it has artificial sharpening, but unless Nvidia was straight up lying in their GTC presentation, it doesn't. It's just more accurate. They prove this by comparing it to a 32X supersampled version of the same shot, which is obviously a very idealized version of the image. The things that look like sharpening artifacts are present in the 32X supersampled images as well, and the DLSS 2.0 image is far more conformant to the supersampled image than the TAA image is. It's just a matter of de-training your brain from the TAA-induced blur we've unfortunately had to put up with, which as you said is somewhat subjective; and unlike detail, blur is something you can always add back if you want it.

Obviously in scenes with a lot of motion / de-occlusion DLSS is not going to be able to temporally accumulate as much data for everything, but the fact that there are more frames is going to generally be preferable even if high-motion objects don't look as good. If the framerate is high enough, then the occlusion artifacting is only going to be a few pixels and I'll be damned if I can see a few pixels of possible aliasing on newly de-occluded background. Still, I'd like to see some good analysis, but it's hard to do comparisons because we don't really have a way to generate 1:1 shots of motion where the frame timing is identical in any DLSS 2.0 game that exists today.
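That conformance comparison is straightforward to make quantitative. A hedged sketch with tiny synthetic stand-ins (not real captures): score each technique by its error against a heavily supersampled reference.

```python
import numpy as np

def mse(candidate, reference):
    """Mean squared error against the reference; lower = more conformant."""
    return float(np.mean((candidate - reference) ** 2))

# Synthetic stand-ins: a 1D intensity ramp as the "32x supersampled" ideal,
# a box-blurred copy playing the role of TAA softening, and a sharp but
# slightly biased copy playing the role of a temporal reconstruction.
reference = np.linspace(0.0, 1.0, 64)
taa_like = np.convolve(reference, np.ones(5) / 5.0, mode="same")
recon_like = np.clip(reference + 0.01, 0.0, 1.0)

print(mse(recon_like, reference) < mse(taa_like, reference))  # → True
```

The point of the metric: "looks sharper" is subjective, but distance from a supersampled ground truth is not, and a reconstruction can be closer to that truth than a blurred native-res image.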

KingKapalone
Dec 20, 2005
1/16 Native American + 1/2 Hungarian = Totally Badass
What benchmarks and temp checks should I do on my current setup before I put in the 3080? I want to quantitatively know how much sweeter I am.

Cabbages and VHS
Aug 25, 2004

Listen, I've been around a bit, you know, and I thought I'd seen some creepy things go on in the movie business, but I really have to say this is the most disgusting thing that's ever happened to me.

KingKapalone posted:

What benchmarks and temp checks should I do on my current setup before I put in the 3080? I want to quantitatively know how much sweeter I am.

virt-a-mate on ultra settings with six female and one male model loaded

Rolo
Nov 16, 2005

Hmm, what have we here?

Cabbages and Kings posted:



And now my watch has ended!

(not really, still want a Strix 3080, but this will do for now and then a friend is gonna snag this from me for cost minus 15% whenever I have a 3080 in hand)

Was this the order that was delayed? Mine hasn’t shipped so you’re giving me hope.


repiv
Aug 13, 2009

K8.0 posted:

If the framerate is high enough, then the occlusion artifacting is only going to be a few pixels and I'll be damned if I can see a few pixels of possible aliasing on newly de-occluded background.

Yeah framerate is an underappreciated aspect with this stuff, the better quality TAA implementations out there can mostly avoid disocclusion artifacts even in 30fps console builds, so why are we wasting frametime feeding them just as many samples per frame at >100fps? Fewer samples per frame but more frames per second should cancel out and result in similar image quality.

repiv fucked around with this message at 19:47 on Nov 1, 2020
