K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

shrike82 posted:

tbh, people should be following Microsoft more than AMD for a viable DLSS alternative -
this is what they had to say about their console's RDNA2 GPU

But what specifically does that mean? Like you were saying earlier, every GPU is an "ML accelerator" in some sense, because 3D graphics is matrix math. It would seem to imply that they have included some sort of inference-specific optimizations rather than just optimizing for graphics and throwing ML workloads at the hardware like everything pre-Turing. I suppose we'll find out more meaningful information soon when they do their architecture reveal. I have a feeling AMD has a few more reveals up their sleeve that are going to be promising, but not in the short term, where it likely matters for these current GPUs.

Regardless, AMD is going to be well behind Nvidia on getting implementation in the real world, but as with everything else Nvidia has pioneered I'm sure that eventually we'll move away from vendor-locked scenarios. The sooner the better too, the last thing we want to be dealing with 10 years from now is 3-4 years of oddly vendor-locked implementations.

Truga posted:

give me supersampling or give me death.

*runs quake 3 at 4x supersample* see, runs at 700fps, works fine!

DLSS is supersampling, over time. So is TAA, for that matter; TAA is just a dramatically inferior way of doing it, and everyone understands that we're inevitably moving in the direction of DLSS-style temporal supersampling.
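The "supersampling over time" idea can be sketched in a few lines: each frame is rendered with a slightly jittered camera, so a pixel sees a different sub-pixel sample every frame, and an exponential history blend accumulates them toward the supersampled result. This is a toy illustration of the TAA-style accumulation core (DLSS replaces the hand-tuned blend with a network), not any engine's actual implementation:

```python
import numpy as np

def taa_accumulate(frames, alpha=0.05):
    """Blend a stream of jittered 1-sample frames into one history
    buffer -- the 'supersampling over time' idea behind TAA/DLSS.
    Each new frame contributes alpha; history keeps (1 - alpha)."""
    history = frames[0].astype(np.float64)
    for frame in frames[1:]:
        history = (1.0 - alpha) * history + alpha * frame
    return history

# Toy example: the true pixel coverage is 0.5, but each 1-sample frame
# sees it as 0 or 1 depending on sub-pixel jitter. Accumulating many
# frames converges toward the true coverage.
rng = np.random.default_rng(0)
frames = [(rng.random((4, 4)) < 0.5).astype(float) for _ in range(200)]
result = taa_accumulate(frames)
print(result.mean())  # close to the true coverage of 0.5
```

The alpha parameter is the usual TAA trade-off: a small alpha converges to a cleaner image but ghosts more under motion.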


Cream-of-Plenty
Apr 21, 2010

"The world is a hellish place, and bad writing is destroying the quality of our suffering."

Fuzz posted:

So who has actually gotten a 3080 step up card, and when did you sign up?

Multiple pages ago somebody said that they had signed up around 10:20 AM on the first day and hadn’t moved to step 2 yet, I believe. There’s a pretty good chance they’ll move on next week, from what I can tell.

I signed up about 12 hours after they started taking step ups for ampere cards, so I have a ways to go I think.

redreader
Nov 2, 2009

I am the coolest person ever with my pirate chalice. Seriously.

Dinosaur Gum
I went and worked out and came back, and my bot had opened best buy about 15 times, but all of the pages said out of stock (maybe the stock status is javascript that updates after you load the page, and it had already gone out of stock). Yesterday I was also able to see my bot open a nonexistent asus page.

Either way if my bot is finding stock (even if it's not loading when my bot sends my browser to the page) it might mean stock is starting to get better. I saw no bot pings for the last week or so. And yeah, last time I said this, I said "I am not checking for zotac because it won't fit in my case" and someone told me my bot was bad because there was zotac stock every day. So anyway, I am only checking for evga/founders/asus 3080's.
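For what it's worth, the javascript guess is plausible: a checker that only fetches raw HTML can read a stale stock marker if the retailer fills it in client-side after load. A minimal sketch of the static-fetch approach (the URL and the "Sold Out" marker are made up for illustration; seeing the rendered state would need a headless browser like Selenium):

```python
import urllib.request

def html_in_stock(html, marker="Sold Out"):
    """Decide stock status from raw HTML by looking for an
    out-of-stock marker string. If the retailer injects the stock
    status with JavaScript after page load, this static check can
    report 'in stock' even when the rendered page says otherwise."""
    return marker not in html

def check(url, marker="Sold Out"):
    """Fetch a product page and run the static marker check on it."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return html_in_stock(resp.read().decode("utf-8", "replace"), marker)

# Marker logic on canned HTML (no network needed):
print(html_in_stock("<button>Add to Cart</button>"))  # True
print(html_in_stock("<p>Sold Out</p>"))               # False
```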

Truga
May 4, 2014
Lipstick Apathy

K8.0 posted:

DLSS is supersampling, over time. So is TAA, for that matter; TAA is just a dramatically inferior way of doing it, and everyone understands that we're inevitably moving in the direction of DLSS-style temporal supersampling.
i dunno man, even as good as dlss2.x is, i can notice the artifacts it produces on youtube's lovely compressed videos, never mind the real loving thing. not a big problem in high speed games like quake, but it would be a giant loving turnoff in a chill game like minecraft, to me.

and don't even get me started again on TAA, there's been zero implementations i've seen so far that didn't blow

vvv: counterpoint: it's the same loving card in 99% use cases today, but also works on linux

Truga fucked around with this message at 02:33 on Oct 29, 2020

BurritoJustice
Oct 9, 2012

I'm confused that someone who has $650 to spend on a luxury graphics card doesn't have $50 to stretch to one that'll make their games look better as well as faster. If it had a raster advantage I'd understand, but like, delay buying ONE triple-A until steam sale time and use that saving to get a 3080

Sagebrush
Feb 26, 2012

Truga posted:

i dunno man, even as good as dlss2.x is, i can notice the artifacts it produces on youtube's lovely compressed videos, never mind the real loving thing. not a big problem in high speed games like quake, but it would be a giant loving turnoff in a chill game like minecraft, to me.

and don't even get me started again on TAA, there's been zero implementations i've seen so far that didn't blow

You don't have to use it as an upscaler. If you can run the game smoothly at native resolution, then you can run it at native resolution with DLSS and for the same performance get best-in-class antialiasing, comparable to 32x SSAA.

Truga
May 4, 2014
Lipstick Apathy

Sagebrush posted:

You don't have to use it as an upscaler. If you can run the game smoothly at native resolution, then you can run it at native resolution with DLSS and just get best-in-class antialiasing, comparable to 32x SSAA.

oh yeah, that's fair. i forgot about that :v:

repiv
Aug 13, 2009

Truga posted:

i dunno man, even as good as dlss2.x is, i can notice the artifacts it produces on youtube's lovely compressed videos, never mind the real loving thing. not a big problem in high speed games like quake, but it would be a giant loving turnoff in a chill game like minecraft, to me.

Are you seeing those artifacts on Minecraft RTX videos? Because that has a ton of temporal feedback going on even with DLSS disabled, you might be noticing denoiser artifacts.

hobbesmaster
Jan 28, 2008

Truga posted:

vvv: counterpoint: it's the same loving card in 99% use cases today, but also works on linux

I wonder what percentage of “PC game GPU time” Minecraft and Fortnite are together.

Truga
May 4, 2014
Lipstick Apathy
no i was looking up control dlss 1.0/2.0/native comparisons. but yeah as sagebrush said, you can always not use AI upscaling and just use it for antialiasing, which should be pretty drat good

hobbesmaster posted:

I wonder what percentage of “PC game GPU time” Minecraft and Fortnite are together.
the minecraft version people play doesn't do dlss and 99% people playing fortnite will continue playing it on potato pcs :v:

Shipon
Nov 7, 2005

Sagebrush posted:

You don't have to use it as an upscaler. If you can run the game smoothly at native resolution, then you can run it at native resolution with DLSS and for the same performance get best-in-class antialiasing, comparable to 32x SSAA.

Do games that support DLSS allow for this? I would love to just run a game at 1440p and have it look incredibly nice for less cost than regular antialiasing.

CaptainSarcastic
Jul 6, 2013



Truga posted:

vvv: counterpoint: it's the same loving card in 99% use cases today, but also works on linux

I'm not sure I'm completely up to speed on this argument as I've been using Nvidia GPUs on Linux for years and years now. Yeah, it means running the proprietary driver and yeah, it means I can't run some specific things (like Wayland), but mostly it runs just fine and dandy.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
man I loving wonder if that Witcher 3 RTX patch will include DLSS support

maybe I could finally get a full 144 fps on it, like five years after the fact

right now it just hits a wall around the 100 fps (iirc) mark no matter what you do

shrike82
Jun 11, 2005

the latest nvidia linux drivers and cuda toolkit seem borked for a lot of people

SCheeseman
Apr 23, 2003

What have Nvidia got to lose from opening up their DLSS API and tools to everyone? Couldn't it be used with injectors like ReShade (or at least potentially so)?

Truga
May 4, 2014
Lipstick Apathy

CaptainSarcastic posted:

I'm not sure I'm completely up to speed on this argument as I've been using Nvidia GPUs on Linux for years and years now. Yeah, it means running the proprietary driver and yeah, it means I can't run some specific things (like Wayland), but mostly it runs just fine and dandy.

if you're running the latest kernel, a couple times a year the nvidia driver will not work/compile. running the latest kernel/vulkan/GL packages is often a decent 3D performance boost as the stack evolves, which it has been doing at a pretty rapid pace since about 2018, or whenever AMDGPU finally got included. if i'm gonna buy a $600 piece of hardware, a couple times a year is too much. :v:

also, maybe this is just an artifact of my card being 5 years old, but i lose between 30 and 50% performance in 2 games i play, whereas my friends with AMD gpus do not, compared to windows. :shrug:

iv46vi
Apr 2, 2010
GPU Megat[H]read - Demand RAGE MODE

CaptainSarcastic
Jul 6, 2013



shrike82 posted:

the latest nvidia linux drivers and cuda toolkit seem borked for a lot of people

Ah, I'm running openSUSE Tumbleweed on this machine and 15.2 on another, and they generally do pretty good about not pushing out updates until they work. I might be a little spoiled by that. The only time poo poo has broken for me is stuff like a kernel update on 15.2 from a 4.x kernel to a 5.x kernel, and that resolved after an update or two.

Truga posted:

if you're running the latest kernel, a couple times a year the nvidia driver will not work/compile. running the latest kernel/vulkan/GL packages is often a decent 3D performance boost as the stack evolves, which it has been doing at a pretty rapid pace since about 2018, or whenever AMDGPU finally got included. if i'm gonna buy a $600 piece of hardware, a couple times a year is too much. :v:

also, maybe this is just an artifact of my card being 5 years old, but i lose between 30 and 50% performance in 2 games i play, whereas my friends with AMD gpus do not, compared to windows. :shrug:

Okay, gotcha. Like I said above, I kinda rely on openSUSE to test updates before I install them. And I think my performance has been okay, but I still tend to game on Windows, so I don't have a huge amount of data to compare. The games I have tested seem to run about equivalent, but I haven't tested many at this point.

CaptainSarcastic fucked around with this message at 02:56 on Oct 29, 2020

shrike82
Jun 11, 2005

SCheeseman posted:

What have Nvidia got to lose from opening up their DLSS API and tools to everyone? Couldn't it be used with injectors like ReShade (or at least potentially so)?

selling point for their cards. could also be an indication that you don't need their hardware to run it well

hobbesmaster
Jan 28, 2008

Shipon posted:

Do games that support DLSS allow for this? I would love to just run a game at 1440p and have it look incredibly nice for less cost than regular antialiasing.

Control does iirc but it isn’t labeled too clearly.

Edit: I would like to emphasize that if you have an nvidia 2000 or 3000 series you need to play Control it’s both a great game and the greatest tech demo and ad for RT and DLSS.

hobbesmaster fucked around with this message at 03:00 on Oct 29, 2020

repiv
Aug 13, 2009

Shipon posted:

Do games that support DLSS allow for this? I would love to just run a game at 1440p and have it look incredibly nice for less cost than regular antialiasing.

Nvidia doesn't support using DLSS just for anti-aliasing, their stance is that Quality mode (2x upscaling) is already good enough that there's no point throwing more samples at it

They may well be right, even if there's artifacts in Quality mode that doesn't mean more samples would necessarily fix them (e.g. the cryptobiote bug in death stranding isn't from a lack of samples)

I think you can force Control to use DLSS as just AA by messing with the config files though

Paul MaudDib posted:

man I loving wonder if that Witcher 3 RTX patch will include DLSS support

Probably if they're backporting the CP2077 renderer?

Cygni
Nov 12, 2005

raring to post

SCheeseman posted:

What have Nvidia got to lose from opening up their DLSS API and tools to everyone? Couldn't it be used with injectors like ReShade (or at least potentially so)?

There is still the step where information is preloaded into the driver itself.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

shrike82 posted:

the latest nvidia linux drivers and cuda toolkit seem borked for a lot of people

"plays games in Linux" is probably one of the use-cases where AMD outweighs NVidia, if I'm not mistaken

shrike82
Jun 11, 2005

hot take: control looks good and is a showcase for DLSS/RTX but controls (haha) like crap

Shipon
Nov 7, 2005
i just want more frames in flight sim

SCheeseman
Apr 23, 2003

shrike82 posted:

selling point for their cards. could also be an indication that you don't need their hardware to run it well
I don't mean open up the tech to other manufacturers, just give access to the tools and APIs they use. It would still be locked to Nvidia hardware since it requires tensor cores anyway.

Cygni posted:

There is still the step where information is preloaded into the driver itself.

What information?

SCheeseman fucked around with this message at 03:14 on Oct 29, 2020

hobbesmaster
Jan 28, 2008

shrike82 posted:

hot take: control looks good and is a showcase for DLSS/RTX but controls (haha) like crap

Just press E

Enos Cabell
Nov 3, 2004


Fuzz posted:

So who has actually gotten a 3080 step up card, and when did you sign up?

One of my brothers got his confirmation email at 9:45est, and then step 2 came on the 13th and he got his 3080 on the 21st.

I got my confirmation email at 10:49est and haven't gotten to step 2 yet. Third brother got his in at 11:02, so hopefully we'll both get it pretty soon.

I'm guessing there is a huge backlog in the first few hours.

Cygni
Nov 12, 2005

raring to post

SCheeseman posted:

What information?

Ignore what I said, i am nowhere near smart enough to understand how DLSS 2.0 works and i really dont want to start DLSS chat back up like we had all summer.

Bleep blop magic box, electrons go in and bouncy real time anime titty and shootmans come out. The more you buy the more you save.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

This is the good poo poo, I remember chuckling at this ridiculous naming scheme and I was literally 12 at the time

BurritoJustice
Oct 9, 2012

Truga posted:

vvv: counterpoint: it's the same loving card in 99% use cases today, but also works on linux

I meant to include a Linux disclaimer when I made the post but it slipped my mind. If you have a specific use case then yeah, makes perfect sense. I'd say the same if it had better raster performance than a 3080

I'm not going to back down on $50 being a poor saving with the major industry push towards some form of RTX/DLSS

BurritoJustice
Oct 9, 2012

repiv posted:

Nvidia doesn't support using DLSS just for anti-aliasing, their stance is that Quality mode (2x upscaling) is already good enough that there's no point throwing more samples at it

They may well be right, even if there's artifacts in Quality mode that doesn't mean more samples would necessarily fix them (e.g. the cryptobiote bug in death stranding isn't from a lack of samples)

I think you can force Control to use DLSS as just AA by messing with the config files though


Probably if they're backporting the CP2077 renderer?

You can use DLSS as pseudo-AA by setting the target resolution higher. For example, targeting 8K when you have a 4K display so that it upsamples and then supersamples.

2kliksphillip did a test of it and it's pretty neat
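The resolution math behind that trick, assuming the commonly cited per-axis render scales for the DLSS 2.x modes (exact factors can vary by title):

```python
# Commonly cited per-axis render scales for DLSS 2.x quality modes
# (assumption; titles may use different factors).
DLSS_SCALE = {"quality": 1 / 1.5, "balanced": 1 / 1.72, "performance": 1 / 2}

def internal_resolution(target_w, target_h, mode):
    """Resolution the game actually renders at for a given DLSS
    output target and quality mode."""
    s = DLSS_SCALE[mode]
    return round(target_w * s), round(target_h * s)

# Target 8K in Performance mode on a 4K display: the game renders
# internally at native 4K, DLSS reconstructs 8K, and the output is
# downsampled back to 4K -- i.e. DLSS acting as supersampled AA.
print(internal_resolution(7680, 4320, "performance"))  # -> (3840, 2160)
```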

Truga
May 4, 2014
Lipstick Apathy
yeah sorry i'm just angry because it's the first time i'm playing minecraft in a long while and my 980Ti can't handle supersampled minecraft anymore because i have 4k now and only 6GB vram lmfao

repiv
Aug 13, 2009

on the subject of upscaling i wonder if the new variable rate shading feature in turing/ampere/rdna2 will provide a more nuanced alternative

the idea is you can split the screen up into 16x16 pixel tiles and select a different shading rate for each one (ranging from undersampling to supersampling). so instead of rendering at a flat 960p and upscaling to 1440p or whatever, they could maybe render at 1440p, but with only the tiles the TAA/TAAU/DLSS/whatever is having difficulty resolving shaded at full rate or supersampled, and everything else shaded at half or quarter rate
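That per-tile idea boils down to thresholding a per-tile "how well is the temporal reconstruction coping" estimate into a shading-rate image. A toy sketch (the thresholds and rate labels are made up for illustration, not from any API):

```python
import numpy as np

def choose_shading_rates(temporal_error, low=0.02, high=0.1):
    """Map a per-tile temporal-reconstruction-error estimate (one
    value per 16x16 screen tile) to a shading rate: tiles the
    TAA/DLSS history resolves well get coarse shading, struggling
    tiles get full rate or supersampling."""
    rates = np.full(temporal_error.shape, "1x1 (full)", dtype=object)
    rates[temporal_error < low] = "2x2 (quarter)"   # history converged
    rates[temporal_error > high] = "2x SSAA (super)"  # history failing
    return rates

# One 2x2 grid of tiles: converged, borderline, failing, borderline.
error = np.array([[0.01, 0.05], [0.20, 0.03]])
print(choose_shading_rates(error))
```

This is essentially what Tier 2 variable rate shading exposes: the app supplies a screen-space image of per-tile rates, and the estimate driving it could come from the temporal filter's own confidence.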

Truga
May 4, 2014
Lipstick Apathy
with eye tracking that would be loving amazing for vr performance at least

shrike82
Jun 11, 2005

i mean for VR, the dream is foveated rendering

the way our eyes upsample vision is loving nuts when you think about it

Sphyre
Jun 14, 2001

It feels like ASUS stopped making TUF cards entirely in favour of STRIX cards :negative:

pik_d
Feb 24, 2006

follow the white dove





TRP Post of the Month October 2021

Sphyre posted:

It feels like ASUS stopped making TUF cards entirely in favour of STRIX cards :negative:

That's just TUF luck

pik_d
Feb 24, 2006

follow the white dove





TRP Post of the Month October 2021

Sphyre posted:

It feels like ASUS stopped making TUF cards entirely in favour of STRIX cards :negative:

Actually this is in stock as a combo deal right now

https://www.newegg.com/Product/ComboDealDetails?ItemList=Combo.4191579

You save :5bux:


BurritoJustice
Oct 9, 2012

Has anyone seen a measurable performance difference when overclocking the 2x8pin cards like the TUF versus 3x8pin like the Strix? I know you can hit the 375W theoretical draw really easily, but I don't know how much you get for the extra power limit.
