|
Agreed posted:
no, the engine ticks at the "internal" framerate and the doubled framerate is purely visual

DLSS3 won't improve input responsiveness, it's purely to make the graphics smoother

the upshot of that is it works in CPU-limited scenarios, if your CPU chokes at 70fps then DLSS3 will still be able to churn out "~140fps"

repiv fucked around with this message at 00:13 on Sep 22, 2022 |
# ? Sep 22, 2022 00:11 |
|
Do we know yet if the extra "fake" frames (there's probably a better word for this) that are generated will be effectively free (any artifacting notwithstanding)? As in, will there be as many "real" frames with Frame Generation turned on as off, or will there be fewer of them with it on?
|
# ? Sep 22, 2022 00:12 |
|
synthesizing the fake frames will chew up some amount of compute time, so it's not quite free. in practice, if you're GPU limited it will be something like 60fps with DLSS off, or 55fps -> 110fps with DLSS on
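
a rough sketch of that arithmetic in python — the 1.5ms synthesis cost is an assumed number picked to reproduce the 55 -> 110 example, not a measured one:

code:
# toy model: frame generation doubles the displayed fps but adds a fixed
# synthesis cost to every real frame. synth_ms is an illustrative guess.
def framegen_fps(base_fps, synth_ms=1.5):
    real_frametime = 1000.0 / base_fps + synth_ms  # real frames get slower
    internal = 1000.0 / real_frametime
    return internal, internal * 2                  # one generated frame per real one

print(framegen_fps(60))  # ~55 internal -> ~110 displayed, as above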
|
# ? Sep 22, 2022 00:15 |
|
repiv posted:
no, the engine ticks at the "internal" framerate and the doubled framerate is purely visual

I do not see how this is a feature, or how they're calling that "performance." Does that mean in situations where the engine is putting out, I dunno, 26fps, it's just going to look smoother while still actually being 26fps as far as input and engine frames are concerned? Like, if the base performance is choking, how is this not going to be... just... smoother choking?

I am trying to figure out how this improves gaming in practice. If a game dips to sub-30fps right now, that clearly sucks and makes the whole thing slower and weird and jerky. If this just makes it so frames keep coming smoothly, but the engine is going all slow and weird, that's going to be bizarre. Like, does it go to super smooth looking slow-mo or something? Still lagging and slowing down for all intents and purposes, but not jerky with obvious frame drops?

Once again, this can't be as bad as I'm imagining it or they'd never have focused on this for this generation, right? I must be missing something.

Agreed fucked around with this message at 00:19 on Sep 22, 2022 |
# ? Sep 22, 2022 00:16 |
|
The demonstrations they're doing aren't like 25 -> 50, more like 60 -> 120 and higher. Even if there aren't any latency benefits, you do at least get the graphical benefit of double fps (in theory)
|
# ? Sep 22, 2022 00:20 |
|
yeah there's no doubt the experience is still going to suck if the internal framerate goes very low, but from what we've seen nvidia is positioning it as an enhancement on top of a game already running reasonably well (at least 60 real frames per second)
|
# ? Sep 22, 2022 00:21 |
|
well you get the actual performance uplift from the DLSS upscaling too, the interpolation is just on top of that

there are situations where the improved smoothness will be nice (really depends on how well it works, how much artefacting there is, etc.) but it's probably pretty marginal a lot of the time & certainly not as big of a deal as the introduction of upscaling
|
# ? Sep 22, 2022 00:22 |
|
If it's anything like the motionPlus TVs that interpolate frames, I'm not even slightly interested, I think those make poo poo look weird. I don't see why it's good to have cards do anything more than actually get faster and better at rendering real frames per second. Honestly, maybe I'm getting old, but this seems like such a dud of a feature, and then they use it in the figures to act like it's a big real framerate boost. Whaaaaat?

Forgive me, friends, I'm still an old head using 60FPS and thinking that looks pretty good. I'm sure there's someone out there really looking forward to fake 140fps if they can only run at 70fps or whatever
|
# ? Sep 22, 2022 00:25 |
|
In that CP2077 video, they show the DLSS 2 to 3 jump going from 60fps to 90fps. Low-fps gameplay is going to feel weird with it on, but I don't think it will appear in slow-motion or anything. You'll just have more input lag.
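
The added lag follows mechanically from how interpolation works: the in-between frame can't be displayed until the next real frame has been rendered, so the display runs roughly one real-frame interval behind the engine. A minimal sketch of that simplifying assumption, purely illustrative:

code:
# interpolation must hold frame N until frame N+1 exists before it can
# show the generated frame between them, so the display lags the engine
# by roughly one real-frame interval (a simplifying assumption)
def added_latency_ms(internal_fps):
    return 1000.0 / internal_fps

print(added_latency_ms(45))  # ~22ms extra at 45 real fps (90fps displayed)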
|
# ? Sep 22, 2022 00:27 |
|
Agreed posted:
If it's anything like the motionPlus TVs that interpolate frames, I'm not even slightly interested, I think those make poo poo look weird. I don't see why it's good to have cards do anything more than actually get faster and better at rendering real frames per second, honestly, maybe I'm getting old but this seems like such a dud of a feature and then they use it in the figures to act like it's a big real framerate boost. Whaaaaat?

They are positioning it as giving much better results than TV motion smoothing. The idea is that they're using AI stuff and advanced optical flow engines to make the generated frames look as indistinguishable from real frames as possible. And the goal is to also insert them seamlessly. I think, outside of artifacting and input lag concerns, it should hopefully look pretty normal.

But I also agree that it will be a pretty niche feature in practice and it's incredibly dumb that Nvidia is positioning it as real FPS that should be used as the basis of comparing this generation vs the last. It feels like a desperate attempt to gussy up the small and somewhat underwhelming GPUs they're using for the 80-tier cards.

edit: thank you repiv for editing your post to reflect my edited word choice. one of these days i'm going to learn how to make a good post without doing 30 edits after the fact

Dr. Video Games 0031 fucked around with this message at 00:42 on Sep 22, 2022 |
# ? Sep 22, 2022 00:29 |
|
Dr. Video Games 0031 posted:
They are positioning it as giving much better results than TV motion smoothing. The idea is that they're using AI stuff and advanced optical flow engines to make the generated frames look as indistinguishable from real frames as possible.

DLSS also has the benefit of getting exact motion vectors and depth buffers from the engine, which can be combined with the optical flow to get a better idea of how the scene is changing

the video interpolator in your TV, or even high-end software like twixtor, has nothing to work with besides optical flow, so it's much harder for them to get it right

repiv fucked around with this message at 00:38 on Sep 22, 2022 |
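
as a toy illustration of what per-pixel motion vectors buy you — nothing like the real DLSS pipeline, just the core reprojection idea, with made-up arrays:

code:
import numpy as np

# warp a frame by per-pixel motion vectors via backward sampling; with
# engine-exact vectors this is error-free, while estimated optical flow
# can get the vectors themselves wrong and smear the result
def reproject(frame, motion):  # motion[y, x] = (dy, dx) in pixels
    h, w = frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    sy = np.clip(ys - motion[..., 0], 0, h - 1).astype(int)
    sx = np.clip(xs - motion[..., 1], 0, w - 1).astype(int)
    return frame[sy, sx]

frame = np.arange(16.0).reshape(4, 4)
motion = np.zeros((4, 4, 2))
motion[..., 1] = 1               # everything moved one pixel right
print(reproject(frame, motion))  # contents shifted right by one column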
# ? Sep 22, 2022 00:29 |
|
Dr. Video Games 0031 posted:
You'll just have more input lag.

At a certain point we might as well stream the poo poo and not deal with any of the heat / power bullshit.
|
# ? Sep 22, 2022 00:53 |
|
MarcusSA posted:
At a certain point we might as well stream the poo poo and not deal with any of the heat / power bullshit.

I've been using GeForce Now for the last several days and it is pretty impressive, but it is highly bandwidth intensive and very vulnerable to network hiccups. When my ping is hanging around 19 it's remarkably smooth and competitive with native rendering, but when my ping gets close to 30 it lags and stutters pretty badly.

Thinking of burning a few extra dollars to try out the 3080 tier - I've been playing at the entry-level paid tier, which maxes out at 1080p, but I have a few more days left where I am with no bandwidth cap and a 1440p monitor.
|
# ? Sep 22, 2022 00:57 |
|
Dr. Video Games 0031 posted:
In that CP2077 video, they show the DLSS 2 to 3 jump going from 60fps to 90fps.

that would imply the overhead of the frame synthesis is 5.55ms if my math is correct, though that's specific to whatever output resolution they're targeting (i guess 4K going by the video resolution?)

it'll be cheaper at lower resolutions, but that's on the 4090 so the lower SKUs will get hit harder. that's a pretty big chunk of frametime, far from free.

repiv fucked around with this message at 01:09 on Sep 22, 2022 |
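
spelling that math out with the numbers from the video (90fps displayed means 45 real frames per second):

code:
# DLSS2: 60fps displayed           -> 16.67ms per real frame
# DLSS3: 90fps displayed, half real -> 45 real fps -> 22.22ms per real frame
overhead_ms = 1000.0 / (90 / 2) - 1000.0 / 60
print(overhead_ms)  # ~5.55ms of synthesis cost per real frame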
# ? Sep 22, 2022 01:03 |
|
For laughs, I did some digital archaeology (mostly; the last card there was bought from some random computer shop on paper receipts), looking at all the graphics cards I've ever purchased. I was mostly in the affordable range of cards (not accounting for inflation):

The outliers being the 3080 FE, because I had the opportunity to get one and them stimmys came at the right time, and the 8800 GTX, because I was working a great job with absurd pay at the time and splurged a little. The 970 was probably one of the best price-for-performance cards out of the bunch. Fun times that I will never see again.

Edit: Also now that I think about it, all but the Geforce 2 MX have been used for World of Warcraft in some form....

8-bit Miniboss fucked around with this message at 01:34 on Sep 22, 2022 |
# ? Sep 22, 2022 01:28 |
|
Dr. Video Games 0031 posted:
In that CP2077 video, they show the DLSS 2 to 3 jump going from 60fps to 90fps.

repiv posted:
that would imply the overhead of the frame synthesis is 5.55ms if my math is correct, though that's specific to whatever output resolution they're targeting (i guess 4K going by the video resolution?)

extrapolating from this doesn't paint a very flattering picture, if you're GPU limited at 100fps then enabling DLSS3 would drop your internal framerate to 64fps for a simulated 128fps output

at 180fps you hit the tipping point where enabling DLSS3 drops to 90fps internal for a simulated 180fps output, just a worse version of what you had in the first place. no free lunch for 360hz monitor havers.

again this is based on 4K performance i think, so the overhead would be less at lower resolutions, but if this is accurate then it's pretty rough

repiv fucked around with this message at 01:51 on Sep 22, 2022 |
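
the extrapolation in code, treating the ~5.55ms as a fixed cost regardless of base framerate — an assumption, since the real cost likely varies:

code:
SYNTH_MS = 1000.0 / 45 - 1000.0 / 60    # ~5.55ms, derived from the 60 -> 90 figures

def with_dlss3(base_fps):
    internal = 1000.0 / (1000.0 / base_fps + SYNTH_MS)
    return internal, internal * 2        # (real fps, displayed fps)

print(with_dlss3(100))    # ~64 internal -> ~128.6 displayed
print(with_dlss3(180))    # ~90 internal -> ~180 displayed, the tipping point
print(1000.0 / SYNTH_MS)  # break-even base fps works out to exactly 180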
# ? Sep 22, 2022 01:36 |
|
repiv posted:
that would imply the overhead of the frame synthesis is 5.55ms if my math is correct, though that's specific to whatever output resolution they're targeting (i guess 4K going by the video resolution?)

repiv posted:
extrapolating from this doesn't paint a very flattering picture, if you're GPU limited at 100fps then enabling DLSS3 would drop your internal framerate to 64fps for a simulated 128fps output

Technically, the GPU isn't specified. We have no idea about the specifics of the settings, but whatever the card is, it's doing fairly poorly without DLSS.
|
# ? Sep 22, 2022 01:41 |
|
Rinkles posted:
Technically, the GPU isn't specified. We have no idea about the specifics of the settings, but whatever the card is, it's doing fairly poorly without DLSS.

I have to imagine it's the 40

Like that's pretty drat rough if it's the 4090
|
# ? Sep 22, 2022 01:43 |
|
oh you're right, they didn't specify the card

well whatever card it is, the fixed overhead of running DLSS3 on it at 4K is significant
|
# ? Sep 22, 2022 01:45 |
|
I'm fairly certain that the CP2077 example that they showed was on a 4090, for a specific new settings option that's designed to run like poo poo.
|
# ? Sep 22, 2022 01:55 |
|
NGL if I got a 4090 and it couldn’t pull 30fps without DLSS I’d be kinda wtf
|
# ? Sep 22, 2022 01:57 |
|
I don't know. It's running at 22fps in that video, which is what a 3090 Ti does at 4K Psycho RT in the current version. The new options are going to be heavier, but THAT much heavier?

repiv posted:
extrapolating from this doesn't paint a very flattering picture, if you're GPU limited at 100fps then enabling DLSS3 would drop your internal framerate to 64fps for a simulated 128fps output

Yeah, that would be a pretty serious reduction to game responsiveness when turning it on at 60fps. It really has to be more seamless than this to be usable, in my opinion.

For what it's worth, Digital Foundry's DLSS 3.0 preview also showed a 50% "visual fps" boost between 2.0 and 3.0. (386.9 / 254 = 1.523)

Dr. Video Games 0031 fucked around with this message at 02:17 on Sep 22, 2022 |
# ? Sep 22, 2022 02:15 |
|
Ulio posted:
Just gonna wait for a 4090 ti or something. Not gonna upgrade from 3090 to 4090.

Yeah, that’s how I felt about the 3090 having a 2080Ti. If you’ve got a flagship card or whatever, upgrading every generation doesn’t make sense to me given the price increases.
|
# ? Sep 22, 2022 02:18 |
|
I wouldn't hold my breath for price drops:

quote:
During the Q&A session, Jensen Huang was asked about GPU prices. His response was very telling.
|
# ? Sep 22, 2022 02:21 |
|
Dr. Video Games 0031 posted:
Yeah, that would be a pretty serious reduction to game responsiveness when turning it on at 60fps

it would explain why enabling DLSS3 automatically enables Reflex, if enabling frame synthesis reduces the internal FPS by a significant amount then they're probably relying on Reflex to claw back some of the lost responsiveness
|
# ? Sep 22, 2022 02:24 |
|
We'll see about that. As a rule, I generally don't trust CEOs when they're justifying high prices with increased costs. That may be some part of this, but there is no way they aren't also increasing the prices way beyond what they have to.
|
# ? Sep 22, 2022 02:24 |
|
They saw what people were willing to pay in 2020/2021/early 2022 and were like hell yeah, this train ain’t ever gonna end!
|
# ? Sep 22, 2022 02:26 |
|
repiv posted:
these blender benchmarks are just sad

Do you know a good place to find performance benchmarks for unreal 5 rendering?
|
# ? Sep 22, 2022 02:30 |
|
repiv posted:
twitch please enable AV1 already

Gotta love how they are making GBS threads on their own filthy last gen NVENC tech. It's nowhere near as bad on the old cards as they are making it out to be here. My 2070S encoded streams do not look that bad at 1080p60.
|
# ? Sep 22, 2022 02:34 |
|
mcbexx posted:
Gotta love how they are making GBS threads on their own filthy last gen NVENC tech.

youtube at least suggests 1080p60fps h.264 requires 12mbps in bitrate to not look like that, though. Those samples are at 6mbps, which is the recommended bitrate for 1080p 30fps (cbr 12000 vs cbr 6000 if we're going by a more relatable obs setting). AV1 really is good at doubling quality or halving bitrate, something the Intel Arc gpus and general AV1 encode testing has demonstrated (cpu encode gets the same uplift). So I'm at least semi-confident any card encoding h.264 at half the recommended bitrate for that codec would look like poo poo.
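
For a sense of scale, the bits-per-pixel those bitrates work out to — simple division, nothing codec-specific:

code:
# bits available per pixel per frame at a given bitrate and resolution
def bits_per_pixel(bitrate_mbps, w=1920, h=1080, fps=60):
    return bitrate_mbps * 1_000_000 / (w * h * fps)

print(bits_per_pixel(12))  # ~0.096 bpp, youtube's suggested 1080p60 h.264 rate
print(bits_per_pixel(6))   # ~0.048 bpp, the bitrate those samples used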
|
# ? Sep 22, 2022 02:47 |
|
Subjunctive posted:
OK, fine, when do orders open up for the 4090? Gonna be that sucker for the second generation in a row.

Quit rubbing it in my face that I'm still running a 1080 in my main PC
|
# ? Sep 22, 2022 02:48 |
|
Cygni posted:
I wouldn't hold my breath for price drops:

While good for justifying the pricing and reassuring wall street today, and maybe true, this seems like a horrible thing for nvidia beyond the next year or so and not something I'd want to say a lot as CEO
|
# ? Sep 22, 2022 04:55 |
|
there's no way it's true to the extent they're pretending it is
|
# ? Sep 22, 2022 05:14 |
|
I really think it’s going to be tough for Nvidia not to take a bath this generation. They obviously aren’t going to go under but they are probably going to struggle.
|
# ? Sep 22, 2022 05:21 |
|
If going up a node every gen is going to result in ever-skyrocketing GPU costs, then I guess it's time to get comfortable at 5/4nm for a while. 3nm can wait.
|
# ? Sep 22, 2022 06:11 |
|
What happens when they go beyond 1 nanometer?
|
# ? Sep 22, 2022 06:54 |
|
infraboy posted:
What happens when they go beyond 1 nanometer?

By the time that happens you’ll have the steam brain link installed
|
# ? Sep 22, 2022 06:55 |
|
infraboy posted:
What happens when they go beyond 1 nanometer?
|
# ? Sep 22, 2022 06:56 |
|
Cygni posted:
I wouldn't hold my breath for price drops:

How often do you see CEOs come out and say "yes, we will be dropping the price for this brand new product that isn't even on sale yet"? Doing so would only hurt early sales, as people with no urgent need decide to hold off and wait for that lower price.
|
# ? Sep 22, 2022 07:12 |
|
What would be considered the best-in-slot card that'll be fine under a 600W power supply and paired with a 65W CPU?
|
# ? Sep 22, 2022 07:34 |