repiv
Aug 13, 2009

Agreed posted:

:psyduck:

Hold up though, how do interpolated frames and a higher frame count actually improve the game's feel if the base framerate is still below vsync? Is it skipping frames and filling in the blanks it isn't rendering, while still advancing the engine to the next frame with interpolation, to somehow get the engine to perform as though it were running at a higher framerate?

no, the engine ticks at the "internal" framerate and the doubled framerate is purely visual

DLSS3 won't improve input responsiveness, it's purely to make the graphics smoother

the upshot of that is it works in CPU-limited scenarios, if your CPU chokes at 70fps then DLSS3 will still be able to churn out "~140fps"
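
to put toy numbers on it (this is just my own sketch of the idea, not anything nvidia has published):

```python
# toy model: in a CPU-limited game the engine still ticks at the CPU-bound
# rate, and frame generation inserts one synthesized frame between each
# pair of real frames, so the displayed rate doubles but input feel doesn't
def frame_generation_cpu_limited(cpu_fps):
    internal_fps = cpu_fps            # engine/input still run at this rate
    displayed_fps = 2 * internal_fps  # one interpolated frame per real frame
    return internal_fps, displayed_fps

print(frame_generation_cpu_limited(70))  # (70, 140): "~140fps" on screen, 70fps feel
```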

repiv fucked around with this message at 00:13 on Sep 22, 2022


Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
Do we know yet if the extra "fake" frames (there's probably a better word for this) that are generated will be effectively free (any artifacting notwithstanding)? As in, will there be just as many "real" frames with Frame Generation on as with it off, or will there be fewer of them with it on?

repiv
Aug 13, 2009

synthesizing the fake frames will chew up some amount of compute time so it's not quite free, in practice if you're GPU limited it will be something like 60fps with DLSS off, or 55fps->110fps with DLSS on
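
rough frametime math on that hypothetical 60 -> 55/110 example (made-up numbers to illustrate, not measurements):

```python
# hypothetical figures from above: 60fps with frame gen off,
# 55fps internal -> 110fps displayed with it on
base_fps, internal_fps = 60, 55
overhead_ms = 1000 / internal_fps - 1000 / base_fps
print(f"synthesis cost per real frame: {overhead_ms:.2f} ms")  # ~1.52 ms
print(f"displayed fps: {2 * internal_fps}")                    # 110
```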

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

repiv posted:

no, the engine ticks at the "internal" framerate and the doubled framerate is purely visual

DLSS3 won't improve input responsiveness, it's purely to make the graphics smoother

I do not see how this is a feature. Or how they're calling that "performance." Does that mean in situations where the engine is putting out, I dunno, 26 fps, it's just going to look smoother while still actually being 26fps as far as input and engine frames are concerned? Like, if the base performance is choking, how is this not going to be... just... smoother choking? I am trying to figure out how this improves gaming in practice.

If a game dips to sub-30fps right now, that clearly sucks and makes the whole thing slower and weird and jerky. If this just makes it so frames keep coming smoothly, but the engine is going all slow and weird, that's going to be bizarre. Like, does it go to super smooth looking slow-mo or something? Still lagging and slowing down for all intents and purposes but not jerky with obvious frame drops?

Once again I must :psyduck: This can't be as bad as I'm imagining it or they'd never have focused on this for this generation, right? I must be missing something.

Agreed fucked around with this message at 00:19 on Sep 22, 2022

Llamadeus
Dec 20, 2005
The demonstrations they're doing aren't like 25->50, more 60->120 and higher. Even if there aren't any latency benefits, you do at least get the graphical benefit of doubled fps (in theory)

repiv
Aug 13, 2009

yeah there's no doubt the experience is still going to suck if the internal framerate goes very low, but from what we've seen nvidia is positioning it as an enhancement on top of a game already running reasonably well (at least 60 real frames per second)

lih
May 15, 2013

Just a friendly reminder of what it looks like.

We'll do punctuation later.
well you get the actual performance uplift from the DLSS upscaling too, the interpolation is just on top of that

there are situations where the improved smoothness will be nice (really depends on how well it works, how much artefacting there is, etc.) but it's probably pretty marginal a lot of the time & certainly not as big of a deal as the introduction of upscaling

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

If it's anything like the motionPlus TVs that interpolate frames, I'm not even slightly interested, I think those make poo poo look weird. I don't see why it's good to have cards do anything more than actually get faster and better at rendering real frames per second, honestly, maybe I'm getting old but this seems like such a dud of a feature and then they use it in the figures to act like it's a big real framerate boost. Whaaaaat?

Forgive me friends I'm still an old head using 60FPS and thinking that looks pretty good, I'm sure there's someone out there really looking forward to fake 140fps if they can only run at 70fps or whatever

Dr. Video Games 0031
Jul 17, 2004

In that CP2077 video, they show the DLSS 2 to DLSS 3 jump going from 60fps to 90fps.

Low-fps gameplay is going to feel weird with it on, but I don't think it will appear in slow-motion or anything. You'll just have more input lag.

Dr. Video Games 0031
Jul 17, 2004

Agreed posted:

If it's anything like the motionPlus TVs that interpolate frames, I'm not even slightly interested, I think those make poo poo look weird. I don't see why it's good to have cards do anything more than actually get faster and better at rendering real frames per second, honestly, maybe I'm getting old but this seems like such a dud of a feature and then they use it in the figures to act like it's a big real framerate boost. Whaaaaat?

Forgive me friends I'm still an old head using 60FPS and thinking that looks pretty good, I'm sure there's someone out there really looking forward to fake 140fps if they can only run at 70fps or whatever

They are positioning it as giving much better results than TV motion smoothing. The idea is that they're using AI stuff and advanced optical flow engines to make the generated frames look as indistinguishable from real frames as possible. And the goal is to also insert them seamlessly. I think, outside of artifacting and input lag concerns, it should hopefully look pretty normal. But I also agree that it will be a pretty niche feature in practice and it's incredibly dumb that Nvidia is positioning it as real FPS that should be used as the basis of comparing this generation vs the last. It feels like a desperate attempt to gussy up the small and somewhat underwhelming GPUs they're using for the 80-tier cards.

edit: thank you repiv for editing your post to reflect my edited word choice. one of these days i'm going to learn how to make a good post without doing 30 edits after the fact

Dr. Video Games 0031 fucked around with this message at 00:42 on Sep 22, 2022

repiv
Aug 13, 2009

Dr. Video Games 0031 posted:

They are positioning it as giving much better results than TV motion smoothing. The idea is that they're using AI stuff and advanced optical flow engines to make the generated frames look as indistinguishable from real frames as possible.

DLSS also has the benefit of getting exact motion vectors and depth buffers from the engine, which can be combined with the optical flow to get a better idea of how the scene is changing

the video interpolator in your TV, or even high-end software like Twixtor, has nothing to work with besides optical flow, so it's much harder for them to get it right
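
for illustration, a toy sketch of the kind of backward warp you can do when the engine hands you exact per-pixel motion vectors (my own simplification, not how DLSS3 is actually implemented); a TV or Twixtor has to estimate that motion field from the video first, and that estimation step is where things go wrong:

```python
import numpy as np

def warp_previous_frame(prev_frame, motion_vectors):
    """toy backward warp: each output pixel samples the previous frame at
    the location its (dx, dy) motion vector points back to.
    nearest-neighbour, no occlusion handling, purely illustrative."""
    h, w = prev_frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    src_x = np.clip(xs - np.rint(motion_vectors[..., 0]).astype(int), 0, w - 1)
    src_y = np.clip(ys - np.rint(motion_vectors[..., 1]).astype(int), 0, h - 1)
    return prev_frame[src_y, src_x]
```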

repiv fucked around with this message at 00:38 on Sep 22, 2022

MarcusSA
Sep 23, 2007

Dr. Video Games 0031 posted:

You'll just have more input lag.

At a certain point we might as well stream the poo poo and not deal with any of the heat / power bullshit.

CaptainSarcastic
Jul 6, 2013



MarcusSA posted:

At a certain point we might as well stream the poo poo and not deal with any of the heat / power bullshit.

I've been using GeForce Now for the last several days and it is pretty impressive, but it is highly bandwidth-intensive and very vulnerable to network hiccups. When my ping is hanging around 19 it's remarkably smooth and competitive with native rendering, but when my ping gets close to 30 it lags and stutters pretty badly.

Thinking of burning a few extra dollars to try out the 3080 tier - I've been playing at the entry-level paid tier which maxes out at 1080p, but have a few more days left where I am with no bandwidth cap and a 1440p monitor.

repiv
Aug 13, 2009

Dr. Video Games 0031 posted:

In that CP2077 video, they show the DLSS 2 to DLSS 3 jump going from 60fps to 90fps.

that would imply the overhead of the frame synthesis is 5.55ms if my math is correct, though that's specific to whatever output resolution they're targeting (i guess 4K going by the video resolution?)

it'll be cheaper at lower resolutions, but that's on the 4090 so the lower SKUs will get hit harder. that's a pretty big chunk of frametime, far from free.
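
showing my working, on the assumption that 90fps with frame generation means 45 real frames per second doubled:

```python
# assumption: 90fps with frame gen = 45 real fps doubled, vs 60 real fps without
dlss2_fps = 60
dlss3_internal_fps = 90 / 2
overhead_ms = 1000 / dlss3_internal_fps - 1000 / dlss2_fps
print(f"{overhead_ms:.2f} ms")  # ~5.56 ms of frametime spent on synthesis
```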

repiv fucked around with this message at 01:09 on Sep 22, 2022

8-bit Miniboss
May 24, 2005

CORPO COPS CAME FOR MY :filez:
For laughs, I did some digital archaeology (mostly digital, anyway; the last card there was bought from some random computer shop on paper receipts) looking at all the graphics cards I've ever purchased. I was mostly in the affordable range of cards (not accounting for inflation):



The outliers are the 3080 FE, because I had the opportunity to get one and them stimmys came at the right time, and the 8800 GTX, because I was working a great job with absurd pay at the time and splurged a little. The 970 was probably one of the best price-for-performance cards of the bunch.

Fun times that I will never see again. :qq:

Edit: Also now that I think about it, all but the Geforce 2 MX have been used for World of Warcraft in some form.... :shepface:

8-bit Miniboss fucked around with this message at 01:34 on Sep 22, 2022

repiv
Aug 13, 2009

Dr. Video Games 0031 posted:

In that CP2077 video, they show the DLSS 2 to DLSS 3 jump going from 60fps to 90fps.

repiv posted:

that would imply the overhead of the frame synthesis is 5.55ms if my math is correct, though that's specific to whatever output resolution they're targeting (i guess 4K going by the video resolution?)

it'll be cheaper at lower resolutions, but that's on the 4090 so the lower SKUs will get hit harder. that's a pretty big chunk of frametime, far from free.

extrapolating from this doesn't paint a very flattering picture, if you're GPU limited at 100fps then enabling DLSS3 would drop your internal framerate to 64fps for a simulated 128fps output

at 180fps you hit the tipping point where enabling DLSS3 drops to 90fps internal for a simulated 180fps output, just a worse version of what you had in the first place. no free lunch for 360hz monitor havers.



again this is based on 4K performance i think, so the overhead would be less at lower resolutions, but if this is accurate then it's pretty rough
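
this is the little model behind those numbers, treating the synthesis cost as a fixed ~5.55ms added to every real frame (my own extrapolation from the example above; the real cost will vary by card and resolution):

```python
# toy extrapolation: fixed ~5.55ms synthesis cost per real frame
OVERHEAD_MS = 5.55

def with_frame_generation(native_fps):
    frame_ms = 1000 / native_fps + OVERHEAD_MS
    internal_fps = 1000 / frame_ms
    return internal_fps, 2 * internal_fps  # (real fps, displayed fps)

for fps in (60, 100, 180):
    internal, displayed = with_frame_generation(fps)
    print(f"{fps}fps native -> ~{internal:.0f}fps internal, ~{displayed:.0f}fps displayed")
# 60fps native  -> ~45fps internal, ~90fps displayed
# 100fps native -> ~64fps internal, ~129fps displayed
# 180fps native -> ~90fps internal, ~180fps displayed (the break-even point)
```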

repiv fucked around with this message at 01:51 on Sep 22, 2022

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

repiv posted:

that would imply the overhead of the frame synthesis is 5.55ms if my math is correct, though that's specific to whatever output resolution they're targeting (i guess 4K going by the video resolution?)

it'll be cheaper at lower resolutions, but that's on the 4090 so the lower SKUs will get hit harder. that's a pretty big chunk of frametime, far from free.

repiv posted:

extrapolating from this doesn't paint a very flattering picture, if you're GPU limited at 100fps then enabling DLSS3 would drop your internal framerate to 64fps for a simulated 128fps output

at 180fps you hit the tipping point where enabling DLSS3 drops to 90fps internal for a simulated 180fps output, just a worse version of what you had in the first place. no free lunch for 360hz monitor havers.



again this is based on 4K performance i think, so the overhead would be less at lower resolutions, but if this is accurate then it's pretty rough

Technically, the GPU isn't specified. We have no idea about the specifics of the settings, but whatever the card is, it's doing fairly poorly without DLSS.

MarcusSA
Sep 23, 2007

Rinkles posted:

Technically, the GPU isn't specified. We have no idea about the specifics of the settings, but whatever the card is, it's doing fairly poorly without DLSS.

I have to imagine it's the 4080 like the Dr said.

Like that's pretty drat rough if it's the 4090

repiv
Aug 13, 2009

oh you're right they didn't specify the card

well whatever card it is, the fixed overhead of running DLSS3 on it at 4K is significant

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.
I'm fairly certain that the CP2077 example that they showed was on a 4090, for a specific new settings option that's designed to run like poo poo.

MarcusSA
Sep 23, 2007

NGL if I got a 4090 and it couldn’t pull 30fps without DLSS I’d be kinda wtf

Dr. Video Games 0031
Jul 17, 2004

I don't know. It's running at 22fps in that video, which is what a 3090 Ti does at 4K Psycho RT in the current version. The new options are going to be heavier, but THAT much heavier?

repiv posted:

extrapolating from this doesn't paint a very flattering picture, if you're GPU limited at 100fps then enabling DLSS3 would drop your internal framerate to 64fps for a simulated 128fps output

at 180fps you hit the tipping point where enabling DLSS3 drops to 90fps internal for a simulated 180fps output, just a worse version of what you had in the first place. no free lunch for 360hz monitor havers.



again this is based on 4K performance i think, so the overhead would be less at lower resolutions, but if this is accurate then it's pretty rough

Yeah, that would be a pretty serious reduction to game responsiveness when turning it on at 60fps. It really has to be more seamless than this to be usable, in my opinion. For what it's worth, Digital Foundry's DLSS 3.0 preview also showed a 50% "visual fps" boost between 2.0 and 3.0.



(386.9 / 254 = 1.523)

Dr. Video Games 0031 fucked around with this message at 02:17 on Sep 22, 2022

tehinternet
Feb 14, 2005

Semantically, "you" is both singular and plural, though syntactically it is always plural. It always takes a verb form that originally marked the word as plural.

Also, there is no plural when the context is an argument with an individual rather than a group. Somfin shouldn't put words in my mouth.

Ulio posted:

Just gonna wait for a 4090 ti or something. Not gonna upgrade from 3090 to 4090.

Yeah, that’s how I felt about the 3090, having a 2080 Ti. If you’ve got a flagship card or whatever, upgrading every generation doesn’t make sense to me given the price increases.

Cygni
Nov 12, 2005

raring to post

I wouldn't hold my breath for price drops:

quote:

During the Q&A session, Jensen Huang was asked about GPU prices. His response was very telling.

“Moore’s Law is dead. […] A 12-inch wafer is a lot more expensive today. The idea that the chip is going to go down in price is a story of the past,” said Nvidia CEO Jensen Huang in a response to PC World’s Gordon Ung.

repiv
Aug 13, 2009

Dr. Video Games 0031 posted:

Yeah, that would be a pretty serious reduction to game responsiveness when turning it on at 60fps

it would explain why enabling DLSS3 automatically enables Reflex, if enabling frame synthesis reduces the internal FPS by a significant amount then they're probably relying on Reflex to claw back some of the lost responsiveness

Dr. Video Games 0031
Jul 17, 2004

We'll see about that. As a rule, I generally don't trust CEOs when they're justifying high prices with increased costs. That may be some part of this, but there is no way they aren't also increasing the prices way beyond what they have to.

MarcusSA
Sep 23, 2007

They saw what people were willing to pay in 2020/2021/ early 22 and were like hell yeah this train ain’t ever gonna end!

CatelynIsAZombie
Nov 16, 2006

I can't wait to bomb DO-DON-GOES!

repiv posted:

these blender benchmarks are just sad

https://www.phoronix.com/review/blender32-hip-cuda/2

even if you switch the nvidia cards to the old CUDA backend, which is pure compute not leveraging the RT cores at all, they still run circles around the AMD cards

Do you know a good place to find performance benchmarks for unreal 5 rendering?

mcbexx
Jul 4, 2004

British dentistry is
not on trial here!




Gotta love how they are making GBS threads on their own filthy last gen NVENC tech.
It's nowhere near as bad on the old cards as they are making it out to be here.

My 2070S encoded streams do not look that bad at 1080p60.

CatelynIsAZombie
Nov 16, 2006

I can't wait to bomb DO-DON-GOES!

mcbexx posted:

Gotta love how they are making GBS threads on their own filthy last gen NVENC tech.
It's nowhere near as bad on the old cards as they are making it out to be here.

My 2070S encoded streams do not look that bad at 1080p60.

YouTube at least suggests 1080p60 H.264 needs 12Mbps of bitrate to not look like that, though. Those samples are at 6Mbps, which is the recommended bitrate for 1080p30 (CBR 12000 vs CBR 6000 if we're going by a more relatable OBS setting). AV1 really is good at doubling quality or halving bitrate, something the Intel Arc GPUs and general AV1 encode testing have demonstrated (CPU encode gets the same uplift). So I'm at least semi-confident any card encoding H.264 at half the recommended bitrate for that codec would look like poo poo.
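
rough bits-per-pixel math on those numbers (just going off YouTube's recommendations, nothing rigorous):

```python
# bits-per-pixel comparison using YouTube's recommended H.264 bitrates
# (12 Mbps for 1080p60, and those samples ran at 6 Mbps) -- illustrative only
def bits_per_pixel(bitrate_bps, width, height, fps):
    return bitrate_bps / (width * height * fps)

print(f"{bits_per_pixel(12_000_000, 1920, 1080, 60):.3f} bpp")  # ~0.096, recommended
print(f"{bits_per_pixel(6_000_000, 1920, 1080, 60):.3f} bpp")   # ~0.048, the samples
# AV1 hitting similar quality at roughly half the bitrate is what makes
# ~6 Mbps streams viable again, hence the interest in hardware AV1 encoders
```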

KillHour
Oct 28, 2007


Subjunctive posted:

OK, fine, when do orders open up for the 4090? Gonna be that sucker for the second generation in a row.

Means a 1080 will fall off the end of the household upgrade flow for a friend’s kid, at least. Now just need Zen 4 so she can get a CPU too…

Quit rubbing it in my face that I'm still running a 1080 in my main PC :(

joe football
Dec 22, 2012

Cygni posted:

I wouldn't hold my breath for price drops:

While it's good for justifying the pricing and reassuring Wall Street today, and maybe true, this seems like a horrible thing for Nvidia beyond the next year or so, and not something I'd want to say a lot as CEO

lih
May 15, 2013

Just a friendly reminder of what it looks like.

We'll do punctuation later.
there's no way it's true to the extent they're pretending it is

MarcusSA
Sep 23, 2007

I really think it’s going to be tough for Nvidia not to take a bath this generation. They obviously aren’t going to go under but they are probably going to struggle.

Dr. Video Games 0031
Jul 17, 2004

If going up a node every gen is going to result in ever-skyrocketing GPU costs, then I guess it's time to get comfortable at 5/4nm for a while. 3nm can wait.

infraboy
Aug 15, 2002

Phungshwei!!!!!!1123
What happens when they go beyond 1 nanometer?

MarcusSA
Sep 23, 2007

infraboy posted:

What happens when they go beyond 1 nanometer?

By the time that happens you’ll have the steam brain link installed

Llamadeus
Dec 20, 2005

infraboy posted:

What happens when they go beyond 1 nanometer?
Intel are going with angstroms

Taitale
Feb 19, 2011

Cygni posted:

I wouldn't hold my breath for price drops:

How often do you see CEOs come out and say "yes will be dropping the price for this brand new product that isn't even on sale yet"?

Doing so would only hurt early sales as people who have no urgent need decide to hold off and wait for that lower price.


teagone
Jun 10, 2003

That was pretty intense, huh?

What would be considered the best-in-slot card that'll be fine under a 600W power supply and paired with a 65W CPU?
