Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Craptacular! posted:

I thought original DLSS was discovered not to be using the tensor cores for anything, causing us all to wonder why they were there, adding to the cost.

If DLSS wasn't something that had to be supported by specific titles, it maybe wouldn't bother the hell out of me so much.

DLSS 1.0 did use the tensors. The only DLSS implementation that didn't was Control's original unique DLSS implementation.


OhFunny
Jun 26, 2013

EXTREMELY PISSED AT THE DNC
Can a 650W PSU handle a GTX 1080 and a GTX 670? I want to slot the latter in and have it join the former in folding work units from Folding@Home.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

OhFunny posted:

Can a 650W PSU handle a GTX 1080 and a GTX 670? I want to slot the latter in and have it join the former in folding work units from Folding@Home.

Probably yes, but barely. You're talking 180W nominal with spikes up to ~250W for the 1080 (assuming it's not overclocked at all) and another 170W for the 670. A normal non-OC'd "rest of the system" number is usually ~200W under load, so you're already brushing up against that PSU's limits. If you were thinking about OC'ing anything, you'd almost certainly be over.

If you paused F@H before launching a game, or if you only plan on playing indie titles or whatever that don't tax the 1080, you'd be OK. But if you want serious gaming while F@H is running, I'd get a ~800W PSU.
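
As a quick sanity check on those figures (these are the ballpark nominal numbers quoted above, not measurements), the sum looks like this:

```python
# Back-of-the-envelope PSU headroom check using the ballpark wattages above.
gtx_1080_spike = 250   # W, transient spikes under load for a stock GTX 1080
gtx_670        = 170   # W, stock GTX 670 under load
rest_of_system = 200   # W, typical non-OC'd CPU/board/drives under load

worst_case = gtx_1080_spike + gtx_670 + rest_of_system
print(f"Worst case: {worst_case} W on a 650 W PSU")  # 620 W -- almost no headroom
```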

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Craptacular! posted:

I thought original DLSS was discovered not to be using the tensor cores for anything, causing us all to wonder why they were there, adding to the cost.
?? No, that was only Control's first implementation, which was an outlier; it was 'DLSS' in name only. DLSS in other titles, and DLSS 2.0 in Control, uses the tensor cores.

e:f,b

Lambert
Apr 15, 2018

by Fluffdaddy
Fallen Rib
Pretty sure the assertion was that it was all marketing BS and the tensor cores didn't seem to be taxed by DLSS running, but I don't know whether that's true or not.

repiv
Aug 13, 2009

Beautiful Ninja posted:

DLSS 1.0 did use the tensors. The only DLSS implementation that didn't was Control's original unique DLSS implementation.

and lest there be any doubt, control with the dlss 2.0 update is lighting up the tensor cores



that spike wasn't there when i originally tested it

Party Boat
Nov 1, 2007

where did that other dog come from

who is he


DLSS 2.0 is only on RTX cards, right?

ActionExpress
Dec 28, 2002
Yes.

A comment I posted on Reddit:


Played a little tonight (Control); it's way better than the first iteration. Putting all settings to max (2080S, 3440x1440, 100Hz) nets you around 50-60 FPS. All medium with maxed RTX is about ~70 FPS. Turning off one of the 5 RTX settings will net you a big boost, depending on which one you disable. You can also choose between 3 DLSS resolutions, which is really cool.

Big update is the visual clarity. 1.0 looked a bit blurry and washed out. It’s very hard to tell the difference in 2.0.

Big game changer IMO.

Party Boat
Nov 1, 2007

where did that other dog come from

who is he



Aw beans. Another plus for when I finally upgrade, then.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
DLSS in MechWarrior 5 is really good too; it was just added in a patch a few days ago.

SwissArmyDruid
Feb 14, 2014

by sebmojo
But is it significantly better than the FidelityFX ripoff that Nvidia hastily ported over when it was clear that what AMD was doing killed DLSS 1.0 without the need for any special hardware?

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

SwissArmyDruid posted:

But is it significantly better than the FidelityFX ripoff that Nvidia hastily ported over when it was clear that what AMD was doing killed DLSS 1.0 without the need for any special hardware?

DLSS 2.0 is absolutely legit; pretty much every review site I've seen, plus my own experience with Control, has it as a real game changer. It's 50% more FPS without having to sacrifice IQ, unlike DLSS 1.0 or FidelityFX, which absolutely traded IQ for performance. DLSS 2.0 can actually increase IQ by adding detail that doesn't exist at native resolution, thanks to the training it does at much higher resolutions; this is something mentioned in every review I've seen so far as well. It's not 100% perfect and you can find instances where DLSS 2.0 makes some things look worse, but overall IQ is the same or better.

I've tried it out on Control on my desktop with a 2080 Ti running at 3440x1440 and I can't tell the difference between DLSS 2.0 and native, other than the massive increase in performance. I can now do 3440x1440 at around 70 FPS with all settings max, including raytracing, using the highest quality DLSS setting.

On my gaming laptop with a 2070 running at 1080p, DLSS 2.0 is actually a straight-up IQ improvement over native, IMO. This is in large part because 1080p looks naturally blurry in Control: like most games now, it uses TAA for AA, and TAA really needs a high resolution to counter the blur it induces. DLSS 2.0 solves that while increasing performance; something like FidelityFX could fix the IQ of TAA with sharpening, but it wouldn't also increase my FPS.

exquisite tea
Apr 21, 2007

Carly shook her glass, willing the ice to melt. "You still haven't told me what the mission is."

She leaned forward. "We are going to assassinate the bad men of Hollywood."


Is there a lineup of games receiving DLSS 2.0 support in the near future? I'm very pleased with the results in Control.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

exquisite tea posted:

Is there a lineup of games receiving DLSS 2.0 support in the near future? I'm very pleased with the results in Control.

I don't think so, but from the sound of things, it shouldn't be hard to support in games that use TAA. The one game NV probably wants to have support more than anything else right now is Cyberpunk; it would be a huge coup to be able to claim equivalent NV GPUs are 50% faster in that game thanks to DLSS 2.0.

OhFunny
Jun 26, 2013

EXTREMELY PISSED AT THE DNC

DrDork posted:

Probably yes, but barely. You're talking 180W nominal with spikes up to ~250W for the 1080 (assuming it's not overclocked at all) and another 170W for the 670. A normal non-OC'd "rest of the system" number is usually ~200W under load, so you're already brushing up against that PSU's limits. If you were thinking about OC'ing anything, you'd almost certainly be over.

If you paused F@H before launching a game, or if you only plan on playing indie titles or whatever that don't tax the 1080, you'd be OK. But if you want serious gaming while F@H is running, I'd get a ~800W PSU.

Thanks for the info. I wasn't planning to game on the GTX 1080; I just want it to keep folding F@H WUs as well, which is what it's doing now.

Ganondork
Dec 26, 2012

Ganondork

Beautiful Ninja posted:

I don't think so, but from the sound of things, it shouldn't be hard to support in games that use TAA.

I'm curious, what are the similarities between DLSS 2.0 and TAA? I've seen people make comments like this before, but I'm not sure how they're related.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Ganondork posted:

I'm curious, what are the similarities between DLSS 2.0 and TAA? I've seen people make comments like this before, but I'm not sure how they're related.

It's something to do with the motion vectors used to make TAA work. Basically, if your game supports TAA, apparently Nvidia can use the information that TAA uses to train DLSS. Per-object motion blur is also supposed to use the same kind of information.
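
Roughly what "using the motion vectors" means in practice: each pixel records where its surface point sat last frame, so the upscaler or AA pass can fetch that pixel's history instead of assuming the image is static. A toy sketch of that reprojection step (my own names and numpy framing, not anything from Nvidia):

```python
import numpy as np

def reproject_history(history, motion_vectors):
    """Fetch last frame's color for each pixel by following its motion vector.

    history:        (H, W, 3) previous frame's output
    motion_vectors: (H, W, 2) per-pixel (dy, dx) screen-space motion since last frame
    """
    h, w, _ = history.shape
    ys, xs = np.indices((h, w))
    # Where was this pixel's surface point last frame? (nearest-neighbor lookup
    # here; real implementations filter the history buffer)
    prev_y = np.clip(np.rint(ys - motion_vectors[..., 0]).astype(int), 0, h - 1)
    prev_x = np.clip(np.rint(xs - motion_vectors[..., 1]).astype(int), 0, w - 1)
    return history[prev_y, prev_x]
```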

ufarn
May 30, 2009
Is that information used per-user, or does it have to get uploaded to Nvidia's AI hive mind for them to update DLSS 2.0 later on?

repiv
Aug 13, 2009

Ganondork posted:

I'm curious, what are the similarities between DLSS 2.0 and TAA? I've seen people make comments like this before, but I'm not sure how they're related.

https://www.nvidia.com/en-us/geforce/news/nvidia-dlss-2-0-a-big-leap-in-ai-rendering/

The way NV describes it sounds a lot like TAA/TAAU - they're using temporal feedback to incrementally build up detail over multiple frames, using motion vectors to compensate for object motion, and applying a subpixel jitter to the camera each frame to force unique samples into the algorithm even if the camera and/or objects aren't actually moving. That's standard TAA stuff, the difference is that DLSS 2.0 is using neural network magic to blend the frames together instead of the usual hand-crafted algorithms.
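
A toy sketch of that loop, under my own simplifying assumptions (not Nvidia's actual pipeline): jitter where each frame's samples land, then exponentially blend the reprojected history with the new frame so detail builds up over time.

```python
import numpy as np

def halton(index, base):
    """Low-discrepancy sequence commonly used for per-frame subpixel camera jitter."""
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

def temporal_blend(reprojected_history, current_frame, alpha=0.1):
    """Exponential accumulation: keep most of the reprojected history, mix in the new frame.

    A hand-written TAA/TAAU pass would also clamp the history against the current
    frame's neighborhood to reject stale samples; per the article, DLSS 2.0 swaps
    that hand-tuned blend/reject logic for a neural network.
    """
    return (1.0 - alpha) * reprojected_history + alpha * current_frame

# Subpixel jitter (in pixels) for frame n -- a different offset each frame forces
# new sample positions even when nothing on screen is moving.
n = 7
jitter_x, jitter_y = halton(n, 2) - 0.5, halton(n, 3) - 0.5
```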

The approach to sharpening also sounds like it's inspired by FidelityFX CAS, except with AI. CAS adaptively varies the amount of sharpening at each pixel based on the local contrast, while DLSS 2.0 adaptively varies the sharpening at each pixel based on what the black box neural network decides will look best. Who knows how it decides on that value but it works pretty well.
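
The contrast-adaptive sharpening idea can be sketched in a few lines as well (a simplified stand-in for illustration, not AMD's actual CAS shader): estimate local contrast from a pixel's neighbors and scale an unsharp mask inversely with it, so flat areas get a boost while hard edges don't ring.

```python
import numpy as np

def adaptive_sharpen(img, strength=0.4):
    """Toy contrast-adaptive sharpen on an (H, W) grayscale image in [0, 1]."""
    p = np.pad(img, 1, mode="edge")
    n, s = p[:-2, 1:-1], p[2:, 1:-1]
    w, e = p[1:-1, :-2], p[1:-1, 2:]

    # 4-neighborhood min/max as a cheap local-contrast estimate
    local_min = np.minimum.reduce([img, n, s, w, e])
    local_max = np.maximum.reduce([img, n, s, w, e])
    contrast = local_max - local_min

    # Per-pixel weight: full strength in flat areas, tapering toward zero on strong edges
    weight = strength * (1.0 - np.clip(contrast, 0.0, 1.0))

    blur = (n + s + w + e) / 4.0                    # cheap unsharp-mask base
    return np.clip(img + weight * (img - blur), 0.0, 1.0)
```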

GRINDCORE MEGGIDO
Feb 28, 1985


How much is AI simply a buzzword, in this case?

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

GRINDCORE MEGGIDO posted:

How much is AI simply a buzzword, in this case?

100%, just like it almost always is. It's not "intelligent" in any meaningful sense, it's just a pretty standard neural network being applied to a specific problem set to optimize results. Your GPU is not going to gain sentience out of this, but being able to run the NN in hardware like this means it's actually fast enough to be useful.

ufarn
May 30, 2009
It's more about mimicking certain features of the human brain than (re)creating straight up sentience. I miss when AI just referred to pathfinding and making enemies take cover and use flanks in games.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

ufarn posted:

It's more about mimicking certain features of the human brain than (re)creating straight up sentience. I miss when AI just referred to pathfinding and making enemies take cover and use flanks in games.

Well, yeah, obviously there's a lot of space south of straight-up sentience--that was mostly sarcasm on my part. Normally, though, "AI" when used seriously is intended to convey that the system is producing non-trivial novel solutions to a problem; systems inventing new languages to communicate with each other are good examples, as is the nifty system that figured out how to use a long trace on its system board as a radio antenna. DLSS isn't doing that, it's just using a NN to dynamically adjust weightings and implementation details for the algorithms NVidia cooked up (as far as I can gather from what they've released so far).

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

DrDork posted:

100%, just like it almost always is. It's not "intelligent" in any meaningful sense, it's just a pretty standard neural network being applied to a specific problem set to optimize results. Your GPU is not going to gain sentience out of this, but being able to run the NN in hardware like this means it's actually fast enough to be useful.

If you're using a neural network to build the output, it's AI. AI doesn't mean sentience; it means you are letting your software make decisions rather than programmatically deciding them.

So AI is accurate in this case. You can't use a neural net to make decisions and then say AI is 100% a buzzword.

Rigged Death Trap
Feb 13, 2012

BEEP BEEP BEEP BEEP

I'd prefer it if people used Machine Learning rather than AI.
But Machine Learning doesn't get big investor bucks

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

Rigged Death Trap posted:

I'd prefer it if people used Machine Learning rather than AI.
But Machine Learning doesn't get big investor bucks

Machine Learning is a subset of AI.

Deep learning (which this is, according to NVidia) is a subset of machine learning, and definitely gets big investor bucks.

latinotwink1997
Jan 2, 2008

Taste my Ball of Hope, foul dragon!


My CPU is a neural net processor; a learning computer.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
If anything, ML gets more money than honest AI, because ML is much easier to apply to existing problem sets and produce useful results. Image recognition, content filtering, etc., are all a lot easier to build a business case around than trying to explain why some of the more esoteric areas of AI are worth R&D dollars.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
DLSS 2.0 is incredible. This is a giant, giant deal. I don't know what else to say. I'm flabbergasted that they actually made it work.

Craptacular!
Jul 9, 2001

Fuck the DH
The reports of DLSS 2.0 make me sad, since it looks like after years of hair follicle physics and extraneous sparkle effects that you'd be happy turning off, we've reached a point where major games are going to offer a drastically pronounced improvement when played with a partnered graphics card.

Somebody's got to buy Radeon cards for the sake of the rest of us, guys. :smith:

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

Craptacular! posted:

The reports of DLSS 2.0 make me sad, since it looks like after years of hair follicle physics and extraneous sparkle effects that you'd be happy turning off, we've reached a point where major games are going to offer a drastically pronounced improvement when played with a partnered graphics card.

Somebody's got to buy Radeon cards for the sake of the rest of us, guys. :smith:


yeah i bet on radeon and the drivers sucked rear end for 4mo

nvidia all the way baybee lets see RTX 3000 asap

eames
May 9, 2009

Sounds like DLSS 2.0 is what most of us expected DLSS 1.0 to be. Pretty cool. Wonder if they can even squeeze out some additional power efficiency in mobile chipsets by rendering at a lower resolution.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Craptacular! posted:

Somebody's got to buy Radeon cards for the sake of the rest of us, guys. :smith:

Well, AMD/Radeon has survived on mid-tier parts for...how long now? So unless the 3000-series trickles tensor cores down into whatever they'll call the replacements for the 1660/1650 parts, Radeons will still have a market.

But yeah, if they can't come up with an AMD-analogue and DLSS continues to improve, they can pretty much say goodbye to even wishing they could compete above the xx70 level.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
It was on amd hardware that convincing checkerboarding and upscaling was first conceived, was it not?

I'm sure amd is working on their own solutions for making lower resolutions look better; they did put out a drastically better alternative to dlss 1.0 in short order, and I would be surprised if they left it at that.

dlss 2.0 still has to work on a per-game basis; if amd can make one that works across the board yet is only in the ballpark (much like what they did to compete with dlss 1.0), that's all they need imo.

Zedsdeadbaby fucked around with this message at 22:38 on Mar 30, 2020

Craptacular!
Jul 9, 2001

Fuck the DH

DrDork posted:

Well, AMD/Radeon has survived on mid-tier parts for...how long now? So unless the 3000-series trickles tensor cores down into whatever they'll call the replacements for the 1660/1650 parts, Radeons will still have a market.

But yeah, if they can't come up with an AMD-analogue and DLSS continues to improve, they can pretty much say goodbye to even wishing they could compete above the xx70 level.

It seems like the lower-tier cards would be where DLSS would be most useful? If you're on a high-end card and saying "this tweak lets me run at ultra settings instead of high," that doesn't seem like much of an accomplishment when those settings are barely visible and require a quality tradeoff in other respects. For an xx70, DLSS might simply be a way to get the game to run at your desired framerate, period.

ShaneB
Oct 22, 2002


Man this makes me want a 2060 Super...

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Craptacular! posted:

It seems like the lower-tier cards would be where DLSS would be most useful? If you're on a high-end card and saying "this tweak lets me run at ultra settings instead of high," that doesn't seem like much of an accomplishment when those settings are barely visible and require a quality tradeoff in other respects. For an xx70, DLSS might simply be a way to get the game to run at your desired framerate, period.

Well, some of these performance increases are on the order of 50%, which is pretty significant regardless of how you want to frame it: sure, in some cases that might be roughly going from High to Ultra, but certainly at 4k there are a ton of games that even at moderate settings are very hard to push >60 FPS, and even a 2080Ti can't do 144+ in many situations. Drop down the stack to a 2070 and you're already turning things down to Medium or less to hit 60. A 50% increase there is huge.

You're right that it'd be even more meaningful further down the stack, but NVidia didn't include tensor cores that far down. It'll be interesting to see if they repeat the same segmentation with the next series, or if they try to really stick it to AMD by pushing them out across the full lineup.

repiv
Aug 13, 2009

Zedsdeadbaby posted:

dlss 2.0 still has to work on a per-game basis; if amd can make one that works across the board yet is only in the ballpark (much like what they did to compete with dlss 1.0), that's all they need imo.

The difficulty is, something that works across the board is limited to simple ReShade-like filters. That's good enough to add extra sharpening like they did with CAS but not sufficient to bolt on a high quality upscaler

Temporal integration is the backbone of all the quality upscalers in modern games (not just DLSS) and there's no practical way to pull that off without cooperation from the engine

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Craptacular! posted:

The reports of DLSS 2.0 make me sad, since it looks like after years of hair follicle physics and extraneous sparkle effects that you'd be happy turning off, we've reached a point where major games are going to offer a drastically pronounced improvement when played with a partnered graphics card.

Somebody's got to buy Radeon cards for the sake of the rest of us, guys. :smith:

I like AMD cards for Radeon Chill and VSR


Looten Plunder
Jul 11, 2006
Grimey Drawer
I'm wanting to upgrade to Windows 10, and I'm currently weighing up some upgrades to my system too, as I dread having to reformat a second time should I want to do it in a few months. I'm currently running one of those Core 2 Duos that everyone loved back in the day (9600k or something?), which I assume is getting long in the tooth by now, along with a GTX 970.

How much of a performance bump would I get out of a 1650 Super or a 1660 Super? I really can't afford anything more, and I'm just wondering if there's a ton of upside with those options or whether I'm better off sticking with my current GPU and waiting a while for price drops on something better.

This is currently a gaming rig running 1440p on one of those overclocked 96hz Korean monitors.
