B-Mac
Apr 21, 2003
I'll never catch "the gay"!
Then again, DLSS 2.0 sounds and looks great, so perhaps it's not wasted space.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
While it does sound like tensor cores are probably gonna be in the 3000-series (or whatever they're gonna call them), I wouldn't expect things to be as bad as with the 2000-series. For one, they'll get the node shrink to 7nm, so that's a substantial performance gain even if they do basically nothing else. The first iteration of any new technology is also always the worst, so presumably now that they've had ~2 years to iterate on it, they can either use the cores for something more useful, or at least have figured out how to stick them on there with less of a negative impact on cost.

The 2000-series was always going to be "priming the pump" on tensor-related stuff. 3000 is where we should see what NVidia actually plans on doing with them for realsies.

Either way, it's not like the 3000 series is going to end up being a worse price:perf deal (temporary issues due to COVID notwithstanding), since by all accounts it looks like just a heavily optimized and shrunk 2000-series arch, without any big new thing to complicate production.

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

26 series, with the non RTX cards being the 18 series.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
What nvidia needs more than anything else (well I say 'nvidia needs' but I really mean 'consumers need') is a £300 card with 1080ti-level performance. That was supposed to happen with the 2 series; everybody expected it to, but it never did. The 2 series were a gigantic, underwhelming rip-off.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
Tensor cores weren't why the 2000 series was expensive; lack of competition was. And the 2000 series wasn't even that far off the curve adjusting for inflation. People forget that things always get more expensive.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
Yes/no. While I agree with you that the lack of meaningful competition on the high end meant that NVidia didn't even have to try to constrain costs, and inflation is A Thing, the Tensor cores also absolutely added to costs. Die costs are a large part of any GPU, and the 2000-series has utterly enormous dies:

1080: 314 mm²
1080Ti: 471 mm²
2070: 445 mm²
2080: 545 mm²
2080Ti: 754 mm²

Last I checked no one really had been able to pin down how much space is actually used by the Tensor cores, but even if it's only 10%, that's a significant cost factor when dealing with dies that large. That said, I don't think it's as bad as some make it out to be--I don't think the 2080Ti would be $200 cheaper without Tensor cores, for example. Maybe $50. Who knows. The real gripe with the 2000-series is they added new tech and bumped prices but didn't bump performance to match in many cases. Hopefully that'll be resolved here with the next set.
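
To put some very rough numbers on why die size matters so much, here's a toy calculation with completely made-up wafer cost and defect density (nothing NVidia or TSMC have published, it's just to show the trend):

code:

# Back-of-envelope die cost scaling. All inputs are made-up round numbers,
# not real foundry figures; the point is the trend, not the dollar values.
import math

WAFER_COST = 6000.0      # assumed cost of a 300mm wafer, USD (illustrative)
WAFER_DIAMETER = 300.0   # mm
DEFECT_DENSITY = 0.001   # assumed defects per mm^2 (illustrative)

def dies_per_wafer(die_area_mm2: float) -> float:
    """Standard dies-per-wafer estimate, accounting for edge loss."""
    d = WAFER_DIAMETER
    return (math.pi * (d / 2) ** 2) / die_area_mm2 - (math.pi * d) / math.sqrt(2 * die_area_mm2)

def poisson_yield(die_area_mm2: float) -> float:
    """Classic Poisson yield model: yield falls off exponentially with area."""
    return math.exp(-DEFECT_DENSITY * die_area_mm2)

def cost_per_good_die(die_area_mm2: float) -> float:
    return WAFER_COST / (dies_per_wafer(die_area_mm2) * poisson_yield(die_area_mm2))

for name, area in [("1080", 314), ("1080Ti", 471), ("2080", 545), ("2080Ti", 754)]:
    print(f"{name}: ~${cost_per_good_die(area):.0f} per good die")

Under those toy numbers the 2080Ti die comes out around 4x the per-die cost of the 1080 despite being only ~2.4x the area, because yield drops off hard as dies get bigger, which is also why trimming even 10% of the area off a ~750 mm² die saves noticeably more than 10% of the cost.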

LASER BEAM DREAM
Nov 3, 2005

Oh, what? So now I suppose you're just going to sit there and pout?

DrDork posted:

Hopefully that'll be resolved here with the next set.

I don't drop into this thread very often. Do we have a rough idea of when Nvidia will start their next line? Obviously, current circumstances are going to affect the release date.

Edit: I recently bought an LG B9 and it has me wanting to do native 4K more than ever. It would also be nice to drive ultra settings on my 3440x1440 screen.

LASER BEAM DREAM fucked around with this message at 20:52 on Mar 26, 2020

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
If they didn't have Tensor cores they'd have charged the same amount, or at least really close. If charging more for the board gets them more profit and still hits the sales numbers, they won't charge less.

Production cost is not a huge driving factor in MSRP; maximizing profit is the biggest driving factor. The rising board size is definitely a big factor in why you're seeing slower release cycles and a bigger reliance on refreshes, but card pricing is mostly set at the point of highest projected profit. Another reason, in this case in particular, is that R&D costs are a huge part of the overall cost and the tensor cores are already existing tech for Nvidia.

Totally different industry, but the McDonald's example is a good illustration. A medium Coke, medium fries, and a cheeseburger all cost wildly different amounts to produce (about 6c, 25c, and almost a dollar respectively), but they all cost about $1 for the consumer. This is because they know they can WAY overcharge for soda, and even if they lose money on a cheeseburger they'll make it up on the sides. Production costs are not very relevant.
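
If you want to see the pricing logic in toy form (made-up demand numbers, obviously), the profit-maximizing price is anchored by what buyers will pay and only partially tracks the build cost:

code:

# Toy profit maximization with a made-up linear demand curve: every extra dollar
# of MSRP loses some buyers. Note how the best price only moves by about half of
# any change in production cost; demand is what anchors it.
def best_price(unit_cost: float, base_demand: float = 100_000, lost_per_dollar: float = 70):
    best = None
    for price in range(400, 2001, 10):            # candidate MSRPs to test
        units = max(0.0, base_demand - lost_per_dollar * price)
        profit = (price - unit_cost) * units
        if best is None or profit > best[1]:
            best = (price, profit)
    return best

for cost in (200, 300, 400):                       # hypothetical per-card build costs
    price, profit = best_price(cost)
    print(f"build cost ${cost}: best MSRP ~${price}, projected profit ~${profit / 1e6:.1f}M")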

LASER BEAM DREAM posted:

I don't drop into this thread very often. Do we have a rough idea of when Nvidia will start their next line? Obviously, current circumstances are going to affect the release date.

Edit: I recently bought an LG B9 and it has me wanting to do native 4K more than ever. It would also be nice to drive ultra settings on my 3440x1440 screen.

No idea. The good bet was late summer/early fall before all this happened, and at this point I imagine things will be delayed. There has been no concrete info in any direction and some of the telltale "signs" that things are 4-6 months out have not been seen yet, so "soon" seems very unlikely.

sauer kraut
Oct 2, 2004

DrDork posted:

Last I checked no one really had been able to pin down how much space is actually used by the Tensor cores, but even if it's only 10%, that's a significant cost factor when dealing with dies that large. That said, I don't think it's as bad as some make it out to be--I don't think the 2080Ti would be $200 cheaper without Tensor cores, for example. Maybe $50. Who knows. The real gripe with the 2000-series is they added new tech and bumped prices but didn't bump performance to match in many cases. Hopefully that'll be resolved here with the next set.

It's more like 5%. You conveniently left out the 1060 6GB/1660Ti die size comparison.
Turing was a 30-40% performance increase across the board on basically the same node, and it leaves Pascal in the dust with very recent games/VR stuff.
If you don't wanna pay for it, there's always the loving embrace of the Radeon Technologies Group.

eames
May 9, 2009

Nvidia is going to sell rebadged 2080tis as 3070 for $1200 and people will love it. 40% performance increase! Leaves Turing in the dust! :confuoot:

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

LASER BEAM DREAM posted:

I don't drop into this thread very often. Do we have a rough idea of when Nvidia will start their next line? Obviously, current circumstances are going to affect the release date.

Mostly rumors and speculation, but probably not before Q4/holiday season from the looks of it. Then again, they were supposed to have a "big press announcement" a bit ago that got canceled due to all the COVID stuff, and no one really knows what it is they were planning to announce.

Lockback posted:

If they didn't have Tensor cores they'd have charged the same amount, or at least really close. If charging more for the board gets them more profit and still hits the sales numbers, they won't charge less.

Production cost is not a huge driving factor in MSRP; maximizing profit is the biggest driving factor. The rising board size is definitely a big factor in why you're seeing slower release cycles and a bigger reliance on refreshes, but card pricing is mostly set at the point of highest projected profit. Another reason, in this case in particular, is that R&D costs are a huge part of the overall cost and the tensor cores are already existing tech for Nvidia.

Totally different industry, but the McDonald's example is a good illustration. A medium Coke, medium fries, and a cheeseburger all cost wildly different amounts to produce (about 6c, 25c, and almost a dollar respectively), but they all cost about $1 for the consumer. This is because they know they can WAY overcharge for soda, and even if they lose money on a cheeseburger they'll make it up on the sides. Production costs are not very relevant.

The McDonald's bit isn't terribly relevant, since their model is predicated on the assumption that most people don't just order a single cheeseburger and walk out; they can afford to make less on one item based on the assumption you'll buy other, higher-margin items. Same applies to loss-leader sales. That only somewhat applies to graphics cards, where they charge higher margins on halo products and much less on lower-tier products. McDonald's makes most of their profit off of soda and fries with high margins, but similar volume to burgers; AMD/NVidia make most of theirs on mid-tier cards with lower margins but enormously higher volume than the halo products.

You're right that R&D costs are also important, but also remember that NVidia is still actively developing wtf to do with Tensor cores for gaming tech: that's not free, either. So whether you want to consider the extra dev costs, the extra die space costs, or both, the end result is that adding Tensor cores to cards is not free, and in the 2000-series there's little to nothing to show for it. Which, as I said, should have been more or less expected since it was a first-gen of brand new tech, and I'll be very interested to see what they do with them with the next-gen cards: it would be crazy to continue incurring costs for features that don't actually provide any benefit, so presumably they have a plan here.

treasure bear
Dec 10, 2012

DLSS2 is pretty impressive in Control and to me looks a bit better than native. Wish it hadn't taken this long. Hopefully it gets included in more games.

repiv
Aug 13, 2009

It should show up in games more regularly now, since the new model is universal; they no longer have to tailor it for every game.

Plus the integration looks like it should be straightforward for engines that already have TAA, since the inputs are the same.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
I'm still suspicious of some things. I wonder if the reason Nvidia doesn't show off 1080p/1440p except in quality mode is that the tensor cores can't keep up and bottleneck the framerate. To me, not being able to use it to push proper framerates (>90 FPS) would make it kinda useless. I really hope someone does some testing with Control at those types of settings to see how image quality and performance shake out.

TorakFade
Oct 3, 2006

I strongly disapprove


Hmm, after the last couple of driver updates (Nvidia, I have an EVGA 1080 FTW bought at the end of 2018) RDR2 seems to have some weird graphics glitch, with random red and white flashes. Other games don't seem to have any similar problem, I googled but can't seem to find anyone having the same issue.

Here's a short video, it's annoying as hell. The GPU is not overclocked or anything, fans are working correctly, I have disabled any changes I made in Precision X1 (EVGA's own afterburner-like program) so it should be at default values now.

https://www.youtube.com/watch?v=WIG5GjsqlOM

Is this a known bug, or what could it be? Is it my GPU showing signs of dying? I hope it's not that, because being in lockdown with no videogaming scares me :ohdear:

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.
Tried out DLSS 2.0 in Control on my 2080 Ti. At 3440x1440, my average FPS went from around 40 with everything enabled at native resolution up to around 70 FPS using the highest quality DLSS setting. So about a 75% increase in performance there, upwards of 100% if I used one of the other quality settings. Noticing no difference in image quality between native and DLSS. Turing's actually starting to shape up here: full DX12 Ultimate support so it'll have feature parity with the new consoles, and DLSS 2.0 being the black magic NV claimed it would be at launch. DLSS also makes raytracing a far more viable option on even the lowest-end RTX cards; Control can run at 4K 30 FPS on an RTX 2060 now, and 4K 60 FPS on a 2080 Ti with DLSS.

Makes me excited for stuff like Cyberpunk, I may realistically be able to use all the RTX goodies and have DLSS cover the performance loss with no IQ penalty.

Unreal Engine also seems to have DLSS implemented in the engine now, so expect a sharp rise in DLSS games given how ubiquitous UE is. There's a 10 dollar game on Steam called Bright Memory, made by one guy, that has DLSS 2.0 support and raytracing, as an example of how easily this tech can be implemented in UE now.

Now the real question is whether the next-gen Switch can get in on the DLSS game. It's rumored to be Volta-based, which would mean it could have access to tensor cores and theoretically DLSS as well.

Ganondork
Dec 26, 2012

Ganondork
RE: DLSS 2.0 - I wouldn’t get your hopes up. The DLSS dev page only mentions access for Unreal Engine. Given the need for motion vectors, it seems like DLSS isn’t a general solution, and requires support from the engine and game dev.

https://developer.nvidia.com/dlss

Given that few games implement DLSS 1.0, I’d expect the same for 2.0. Hopefully, we’ll see wider adoption, but given gaming’s past with proprietary tech, and consoles consistently going AMD...I doubt it.

repiv
Aug 13, 2009

Ganondork posted:

Given that few games implement DLSS 1.0, I’d expect the same for 2.0.

That's not really a useful comparison given how DLSS has changed - 1.0 required them to generate a huge corpus of training data for each game, then spend who knows how much CPU/GPU time training a one-off model based on that data, ultimately for underwhelming results. 2.0 on the other hand is a universal drop-in model that apparently requires no per-game tweaking.

It does require engine integration, yeah, and it's not going to be the default choice as long as it can't run on AMD or older NV cards, but given the ubiquity of TAA and how DLSS uses the same jittered framebuffer and motion vector inputs that TAA does, it should slot in fairly easily next to the TAA option most games already have. Plus as mentioned they have a UE4 fork with the integration already done.

I think what happened between 1.0 and 2.0 is they gave up on the original idea of having the neural network hallucinate fake detail out of thin air, which is why it originally needed per-game training, and 2.0 is closer to a conventional TAA approach except they're using deep learning to handle the awkward edge cases rather than the usual hand-tuned heuristics.
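
For anyone wondering what those hand-tuned heuristics look like, the classic one is neighborhood clamping: clamp the reprojected history to the range of colors around the current pixel so stale history can't ghost. Rough numpy sketch (illustrative only, not any shipping implementation):

code:

# Minimal sketch of the classic TAA "neighborhood clamping" heuristic, i.e. the
# kind of hand-tuned rule a learned resolve can replace. Illustrative only.
# A real implementation would first reproject `history` using motion vectors.
import numpy as np

def taa_resolve(current: np.ndarray, history: np.ndarray, blend: float = 0.1) -> np.ndarray:
    """current, history: (H, W, 3) float arrays in linear color."""
    pad = np.pad(current, ((1, 1), (1, 1), (0, 0)), mode="edge")
    h, w = current.shape[:2]
    # Min/max of each pixel's 3x3 neighborhood in the current frame.
    neighbors = np.stack([pad[dy:dy + h, dx:dx + w] for dy in range(3) for dx in range(3)])
    lo, hi = neighbors.min(axis=0), neighbors.max(axis=0)
    # Clamping history to that range kills ghosting, but also throws away detail
    # and causes flicker in exactly the tricky cases a network can handle better.
    clamped_history = np.clip(history, lo, hi)
    return blend * current + (1.0 - blend) * clamped_history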

ufarn
May 30, 2009
Also the release of AMD's FidelityFX/CAS blew up most of the selling points for DLSS 1.0.

Stickman
Feb 1, 2004

Given that people are reporting 60%-ish performance boosts with DLSS 2.0, it seems like it would still have to involve some upscaling somewhere. I think turning off TAA is usually only a 5-10% performance boost?

I guess that doesn’t mean that the deep learning bit is involved in the upscaling, but zero loss of graphical fidelity is pretty impressive!

repiv
Aug 13, 2009

Stickman posted:

Given that people are reporting 60%-ish performance boosts with DLSS 2.0, it seems like it would still have to involve some upscaling somewhere. I think turning off TAA is usually only a 5-10% performance boost?

I guess that doesn’t mean that the deep learning bit is involved in the upscaling, but zero loss of graphical fidelity is pretty impressive!

Yeah, DLSS 2.0 is still rendering at a lower res then upscaling to native. I should have said it's like TAAU (temporal upscaling) rather than TAA, since the low-res input frames are accumulated into a higher-res buffer.

Same principle though: TAAU requires ad-hoc heuristics to work around various artifacts, and Nvidia are instead just letting a neural network figure out the tricky cases.
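
Bare-bones version of that accumulation idea, if it helps to picture it (illustrative numpy; real TAAU also reprojects with motion vectors and handles disocclusion, which is where all the heuristic pain lives):

code:

# Bare-bones temporal upscaling sketch: each low-res frame is rendered with a
# different sub-pixel jitter and its samples are splatted into a full-res history
# buffer, which converges toward a full-res image over a few frames.
# Illustrative only: no motion vectors, no disocclusion handling.
import numpy as np

def accumulate(history: np.ndarray, low_res: np.ndarray, jitter: tuple, blend: float = 0.2):
    """history: (H, W, 3) full-res buffer, low_res: (h, w, 3) jittered frame,
    jitter: this frame's sub-pixel offset in low-res pixels. Returns new history."""
    H, W, _ = history.shape
    h, w, _ = low_res.shape
    # Where each low-res sample lands in the full-res buffer for this jitter.
    ys = np.clip(((np.arange(h) + 0.5 + jitter[0]) * H / h).astype(int), 0, H - 1)
    xs = np.clip(((np.arange(w) + 0.5 + jitter[1]) * W / w).astype(int), 0, W - 1)
    out = history.copy()
    idx = np.ix_(ys, xs)
    # Exponential blend: mostly keep history, fold in the new samples.
    out[idx] = (1.0 - blend) * history[idx] + blend * low_res
    return out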

repiv fucked around with this message at 15:32 on Mar 27, 2020

Stickman
Feb 1, 2004

I’d forgotten they’d started incorporating upscaling in TAA, thanks!

repiv
Aug 13, 2009

Just tried out DLSS 2.0 in Control (960p -> 1440p) and the results are pretty impressive, though there are some oversharpening artifacts in places. I wonder if the sharpening is an explicit pass or something the NN inferred by itself.
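
(For reference, an "explicit pass" would usually just be something like a plain unsharp mask bolted on after the upscale; sketch below, with no claim that this is what the DLSS resolve actually does.)

code:

# What a separate sharpening pass typically looks like: a simple unsharp mask.
# Illustrative only; not claiming this is what DLSS does internally.
import numpy as np

def unsharp_mask(img: np.ndarray, amount: float = 0.5) -> np.ndarray:
    """img: (H, W, 3) in [0, 1]. Add back the difference between the image and a
    3x3 box-blurred copy; push `amount` too high and you get exactly the kind of
    ringing/halo artifacts people are spotting."""
    pad = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode="edge")
    h, w = img.shape[:2]
    blur = sum(pad[dy:dy + h, dx:dx + w] for dy in range(3) for dx in range(3)) / 9.0
    return np.clip(img + amount * (img - blur), 0.0, 1.0)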

vvvv UI is native, at least in Control's integration. The engine determines what exactly gets upscaled, so it's possible that other engines might upscale their UI if they half-rear end the integration.

repiv fucked around with this message at 16:55 on Mar 27, 2020

Sininu
Jan 8, 2014

Is the UI rendered native or does DLSS upscale it too?

VelociBacon
Dec 8, 2009

Can anyone speak to how different DLSS 2.0 is from 1.0 in Control? I thought 1.0 was pretty good.

5436
Jul 11, 2003

by astral
I have a GTX 1660 Super but it can only play overwatch at like low settings without dropping under 120fps, how can I debug this issue?

Intel® Core™ i5-8400 CPU @ 2.80GHz × 6
Memory: 15.6 GiB
GFX: GeForce GTX 1660 SUPER/PCIe/SSE2

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

5436 posted:

I have a GTX 1660 Super but it can only play overwatch at like low settings without dropping under 120fps, how can I debug this issue?

Intel® Core™ i5-8400 CPU @ 2.80GHz × 6
Memory: 15.6 GiB
GFX: GeForce GTX 1660 SUPER/PCIe/SSE2

Is your CPU or GPU pegged? MSI Afterburner or maybe Task Manager on a separate screen will tell you.
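
If you'd rather watch it from a terminal instead, something like this works (assumes the stock nvidia-smi that ships with the driver plus the psutil package; run it on a second screen while you play):

code:

# Quick bottleneck check: print CPU and GPU utilization once a second while the
# game runs. Needs `pip install psutil`; nvidia-smi comes with the NVidia driver.
import subprocess
import psutil

def gpu_util() -> str:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader,nounits"],
        capture_output=True, text=True)
    return out.stdout.strip()

while True:
    cpu_avg = psutil.cpu_percent(interval=1.0)        # averaged over the last second
    hottest = max(psutil.cpu_percent(percpu=True))    # one pegged core = CPU bound
    print(f"CPU avg {cpu_avg:5.1f}%  hottest core {hottest:5.1f}%  GPU {gpu_util()}%")

If the GPU sits well under ~95% while one core is pinned, you're CPU/memory bound rather than GPU bound, which would line up with Overwatch at high framerates.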

repiv
Aug 13, 2009

VelociBacon posted:

Can anyone speak to how different DLSS 2.0 is from 1.0 in Control? I thought 1.0 was pretty good.

The version of DLSS that Control shipped with wasn't really DLSS at all. There's actually been three versions so far:

DLSS 1.0: Used in Metro, Battlefield 5, Final Fantasy XV, etc. Upscales using a neural network on the tensor cores. Pretty bad.
faux-DLSS: Used in Control at launch. Actually a fairly standard temporal upscaler running on the normal shader cores. Apparently required a lot of hand-tuning to get the results it did, and was only meant as a stop-gap until DLSS 2.0 was ready.
DLSS 2.0: Used in Control (post DLC patch), Youngblood, etc. Second iteration of neural network upscaling, pretty good this time.

Nvidia posted some comparisons between faux-DLSS and DLSS 2.0 in Control; aside from being sharper, it's also much more stable around moving objects. Pay attention to the big fan:

https://www.youtube.com/watch?v=KwDs6LrocR4

repiv fucked around with this message at 17:31 on Mar 27, 2020

Craptacular!
Jul 9, 2001

Fuck the DH

TorakFade posted:

Hmm, after the last couple of driver updates (Nvidia, I have an EVGA 1080 FTW bought at the end of 2018) RDR2 seems to have some weird graphics glitch, with random red and white flashes.

442.7x has been a garbage driver for in game flashing in various titles. DDU and install 442.59.

SHAQ4PREZ
Dec 21, 2004

How I Learned to Stop Worrying and Love the Economy Car
I have an EVGA 1080ti SC2 that has 38 days left on the warranty.

I've had some minor graphics issues that I've never really taken the time to troubleshoot, can anyone suggest a good torture test so I can confirm my card is in good working order before the warranty runs out?

VelociBacon
Dec 8, 2009

SHAQ4PREZ posted:

I have an EVGA 1080ti SC2 that has 38 days left on the warranty.

I've had some minor graphics issues that I've never really taken the time to troubleshoot, can anyone suggest a good torture test so I can confirm my card is in good working order before the warranty runs out?

OCCT, FurMark, and the 3DMark stress tests are all good for testing.

You should really really send that in while it's still in warranty even if you can't reproduce the error. It's extremely likely they'll just send you another card and you run a significant chance of getting a 2080 in the mail.

TorakFade
Oct 3, 2006

I strongly disapprove


Craptacular! posted:

442.7x has been a garbage driver for in game flashing in various titles. DDU and install 442.59.

Will try, thank you!

orcane
Jun 13, 2012

Fun Shoe
Run some Witcher 3 or a different demanding game if you can, because the GPU stress tests are good for specific things (like testing whether your PSU or cooling is good enough), but actual games have been shown to trip over hardware issues and unstable OCs while stress tests still show no problems. There are also ways to test VRAM specifically, but honestly, if you suspect anything is wrong I'd send in the card as long as you're still in the warranty window. Troubleshooting GPUs can be a huge pain, and if you're unlucky, later drivers or increased usage by newer software might push the card far enough that a tiny problem turns into regular crashes in another year or w/e.

Arzachel
May 12, 2012

5436 posted:

I have a GTX 1660 Super but it can only play overwatch at like low settings without dropping under 120fps, how can I debug this issue?

Intel® Core™ i5-8400 CPU @ 2.80GHz × 6
Memory: 15.6 GiB
GFX: GeForce GTX 1660 SUPER/PCIe/SSE2

I remember hearing that Overwatch is very sensitive to memory speed at high frame rates.

Anime Schoolgirl
Nov 28, 2002

ItBreathes posted:

26 series, with the non RTX cards being the 18 series.
I don't think I like this numbering scheme

pyrotek
May 21, 2004



Ganondork posted:

RE: DLSS 2.0 - I wouldn’t get your hopes up. The DLSS dev page only mentions access for Unreal Engine. Given the need for motion vectors, it seems like DLSS isn’t a general solution, and requires support from the engine and game dev.

https://developer.nvidia.com/dlss

Given that few games implement DLSS 1.0, I’d expect the same for 2.0. Hopefully, we’ll see wider adoption, but given gaming’s past with proprietary tech, and consoles consistently going AMD...I doubt it.

Neither Control nor Wolfenstein: Youngblood uses Unreal, so hopefully it will have wider adoption than you think. It really looks great in person. Light halos and other related sharpening artifacts really annoy me, so I'm normally not a fan of sharpening techniques, but the effects seem really minimal with DLSS 2.0 and the performance gains compared to native rendering are massive.

Ganondork
Dec 26, 2012

Ganondork

pyrotek posted:

Neither Control nor Wolfenstein: Youngblood uses Unreal, so hopefully it will have wider adoption than you think. It really looks great in person. Light halos and other related sharpening artifacts really annoy me, so I'm normally not a fan of sharpening techniques, but the effects seem really minimal with DLSS 2.0 and the performance gains compared to native rendering are massive.

Fair point, hopefully more will adopt!

repiv
Aug 13, 2009

Speaking of the UE4 DLSS fork, its documentation confirms that configurable sharpness is being worked on for a future version:

quote:

Why doesn't r.NGX.DLSS.Sharpness seem to have any effect?
We are currently hard at work calibrating the user-adjustable sharpness setting to combine well with the internal sharpness value produced by DLSS's deep neural networks, in order to consistently deliver a high-quality output while still giving the user a significant level of flexibility over the amount of sharpening they want applied. It is currently available as a debug feature in non-production DLSS builds, but is disabled by default.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Having spent some time looking at user screenshots of DLSS 2.0 in Control, aside from the sharpening artifacts I think quality mode actually looks better than native despite being rendered at a lower resolution, which is absolutely insane. If Nvidia can further improve that without major compromise elsewhere, I think DLSS becomes the sort of thing you just always enable. Personally I probably already would: the performance difference is so huge that I'd put up with the artifacts to get it. As some user testing has shown, the tensor cores really aren't bottlenecking performance the way they were in DLSS 1.0, so it's a viable option for pushing high refresh rates.

Craptacular!
Jul 9, 2001

Fuck the DH
I thought the original DLSS was discovered to not be using the tensor cores for anything, causing us all to wonder why they were there adding to the cost.

If DLSS wasn't something that had to be supported by specific titles, it maybe wouldn't bother the hell out of me so much.

Craptacular! fucked around with this message at 09:55 on Mar 29, 2020
