|
Cross-Section posted:Yeah, I was planning on taking a break until more patches dropped but then I stumbled upon that DLSS3 mod It didn't get rid of the stutter, it just hid it. The game is still hitching and locking up like complete garbage; your GPU is just masking it from you with a fake frame.
|
# ? May 5, 2023 02:58 |
|
Can someone explain what is going on for those of us who are less sophisticated (read: me)? The devs are inducing stalls by not loading assets to VRAM? I know that consoles have a different memory hierarchy, and I have a good idea how the hardware works, but the d3d calls are Greek to me.
|
# ? May 5, 2023 03:28 |
|
I’m confused, is that just frame gen or dlss upscaling too?
|
# ? May 5, 2023 03:30 |
|
Yudo posted:Can someone explain what is going on for those of us who are less sophisticated (read: me)? The devs are inducing stalls by not loading assets to VRAM? I have a good idea how the hardware works, but the d3d calls are Greek to me. This isn’t really anything d3d specific, it’s basic multithreading and real time task management. Basically there are a bunch of threads. One of them is the render thread. It is spending a lot of time either waiting for other threads to release a resource or doing other stuff that causes long waits (i.e. the kernel calls). The specific kernel call there is an allocation. This is something you never do with a real time task or “tight loop” because on a general purpose OS asking for a new resource means checking that you’re allowed the resource, recording the allocation, randomizing the layout and all sorts of other stuff. On a console I guess that works differently, though you’d think Xbox wouldn’t be that different. hobbesmaster fucked around with this message at 03:44 on May 5, 2023 |
# ? May 5, 2023 03:37 |
|
hobbesmaster posted:This isn’t really anything d3d specific, it’s basic multithreading and real time task management. I see, thank you. That is...more basic than I thought.
|
# ? May 5, 2023 03:43 |
|
Taima posted:I’m confused, is that just frame gen or dlss upscaling too? Yeah, it has DLSS2 upscaling as well. It supposedly works pretty well
|
# ? May 5, 2023 04:01 |
|
now I haven't programmed in over a decade, but from a process-oriented perspective, it seems to me like you would approach this problem from the end result backwards. if you want a game to run at 60 FPS, then you have 16.67ms to get everything done before a new frame needs to be rendered. that's your, let's call it, temporal budget, and so you'd look at what exactly "everything" breaks down into, and then parcel it out across the 16 milliseconds. you start with what absolutely needs to happen, and go on down the list - if anything is still taking up time that could be shoveled off into another thread, or doesn't need to be done at all, or can be made to run faster/shorter, you trim it. even after you've pared everything down to something that fits within your budget, you'd be doing it with a particular hardware profile in mind, and something like a 4-thread Pentium might not have enough threads to handle all the ones you'd spawn off the main render thread, or something like a Ryzen 1400 might not have enough clock speed/instructions-per-clock to complete your main render thread as fast as what you baseline against. that's where hardware requirements and recommendations come in, but if you were baselining against, say, a Ryzen 3600, then you'd know that anyone with an even faster CPU than that would be able to hit the 60 FPS mark (setting aside the GPU for now). I'm just shooting the poo poo and everything I've mentioned is probably way easier said than done, but does this make sense?
|
# ? May 5, 2023 04:13 |
|
gradenko_2000 posted:now I haven't programmed in over a decade, but from a process-oriented perspective, it seems to me like you would approach this problem from the end result backwards
|
# ? May 5, 2023 04:26 |
|
gradenko_2000 posted:now I haven't programmed in over a decade, but from a process-oriented perspective, it seems to me like you would approach this problem from the end result backwards Wiggly Wayne DDS posted:you've just described the FOX Engine as developed at Konami. the development costs for that were around ~$50m from memory, then it got abandoned as everyone gawked from the side at how it ran perfectly on any platform. it costs a lot to create variations of fast or accurate shaders, budget ai, interleave it all across threads, but the end result was insane. that was a ground-up redesign where other engines are bolting on dynamic resolution scaling and calling it a day Basically the devs planned on AI upscaling via DLSS and AI false frame generation saving the day, and were supremely lazy in prioritization of tasks in the render queue. It's gone from "must have been a bug" to "wow this is shocking incompetence and laziness, how in the gently caress did this pass QA", and yet EA will still make a fuckton of money because Star Wars is popular. I don't really know how you fix something that is broken at such a low level that your render pipeline has 900ms thread locks, like, they developed the whole game wrong, as a joke, or something.
|
# ? May 5, 2023 04:35 |
|
orange juche posted:EA will still make a fuckton of money because Star Wars is popular. Star Wars fans will eat up literally any slop that is fed to them, so long as it has that branding on it. They could sell bags of actual fecal matter in a paper sack, say it’s “Wookiee Poop” and stamp the Star Wars logo on the bag and people will buy it. Also it seems like the only way to actually “fix” these low level issues would be an entirely new from-scratch port. Though it’s doubtful EA will spend the money or man-hours required to do that, and they’ll just put out a few patches to try to mask the worst issues and then move on, because Star Wars fans ultimately don’t care.
|
# ? May 5, 2023 04:45 |
|
Wiggly Wayne DDS posted:you've just described the FOX Engine as developed at Konami. the development costs for that were around ~$50m from memory, then it got abandoned as everyone gawked from the side at how it ran perfectly on any platform. it costs a lot to create variations of fast or accurate shaders, budget ai, interleave it all across threads, but the end result was insane. that was a ground-up redesign where other engines are bolting on dynamic resolution scaling and calling it a day it really hurts because they didn't compromise on the game/level design for the 360/PS3, it's the full-fledged game just with awful graphics, and the game design suffers in some ways. Bullets/missiles don't work past 255 meters or something, and everyone freezes in time when you're not in a "base" cell right down to the time it takes them to run to machine guns. Timefreezing a soldier in place while you step out of the checkpoint to trigger a save and then he magically resumes when you walk back in breaks the immersion. There were so many unexploited possibilities for more advanced/intelligent responses at the base too. Compare the simple soldier agent loop to something like FEAR (got mentioned recently and lol imagine). More intelligent behavior around "is still in the area but unsighted" vs "has extracted and reinserted and is thus invisible". Etc. Some of the Infinite Heaven stuff is much cooler in terms of possibilities, because it makes the game much much harder vs the speedrun simulator it kinda is. But, I got it to run at minres/minsettings on a J5005 NUC at somewhat playable levels, 20-40 fps type stuff. I S-ranked Code Talker on it my first time through. It ran on the PS3's CPU and that's fairly impressive. Paul MaudDib fucked around with this message at 05:16 on May 5, 2023 |
# ? May 5, 2023 05:11 |
|
orange juche posted:It didn't get rid of the stutter it just hid it. The game is still hitching and locking up like complete garbage, your GPU is just masking it from you with a fake frame. … sweet!
|
# ? May 5, 2023 05:20 |
|
orange juche posted:Basically the devs planned on AI upscaling via DLSS and AI false frame generation saving the day, and were supremely lazy in prioritization of tasks in the render queue. It's gone from "must have been a bug" to "wow this is shocking incompetence and laziness, how in the gently caress did this pass QA", and yet EA will still make a fuckton of money because Star Wars is popular. they were not planning on relying on ai frame generation or dlss (just fsr) since they had a marketing deal with amd, which is why it doesn't have dlss. regular dlss/fsr (without frame generation which fsr doesn't have yet) do nothing about cpu bottlenecks which are the main problem anyway. it sounds like the terrible approach they chose is less of a problem on console (though still a bad idea) which is maybe how they ended up with it. but it still seems so obviously a bad idea it's hard to understand
|
# ? May 5, 2023 07:14 |
|
plus frame generation can't really hide long stutters, you need reasonable frame pacing in the first place
|
# ? May 5, 2023 10:55 |
|
Branch Nvidian posted:Star Wars fans will eat up literally any slop that is fed to them, so long as it has that branding on it. They could sell bags of actual fecal matter in a paper sack, say it’s “Wookie Poop” and stamp the Star Wars logo on the bag and people will buy it. The most confusing thing to me is that I thought they were using unreal engine, and I would've thought that unreal would handle basically all of that low level stuff. Like I get that you still have to optimize for your different console targets and stuff, it's not quite as simple as push the PC button for the PC release and the PS5 button for the PS5 afaik. But you'd think it would let you avoid all these general mistakes otherwise. Or did they specifically modify it and make it worse somehow? it's kinda baffling
|
# ? May 5, 2023 12:45 |
|
unreal gives all licensees full source code access, so if they wanted to poke around in the guts and make it worse they were certainly able to
|
# ? May 5, 2023 12:47 |
|
https://videocardz.com/newz/nvidia-neural-texture-compression-offers-4-times-higher-resolution-than-standard-compression-with-30-less-memory This is very cool. Nvidia engineers have developed a texture compression algorithm using tensor cores that achieves 16x texel count at a lower memory footprint and 'only' a 2x increase to decode time. Decoding happens in shaders and is thus hardware agnostic (this should benefit everyone). We'll have to see how this scales, but if you can achieve decode time parity while still having a large increase to fidelity and reduction to vram utilization, then it could be huge.
|
# ? May 5, 2023 12:50 |
|
we're overdue for some improvements to texture compression, the most advanced formats on current hardware (BC6/BC7) were defined by DX11 over a decade ago. traditionally decoding textures has always been offloaded to the TMUs but maybe it is time to explore doing it in software for more flexibility
|
# ? May 5, 2023 12:54 |
|
We call it, Mega Textures.... 2.
|
# ? May 5, 2023 12:59 |
|
sampler feedback streaming is more or less megatexture 2: now in hardware
|
# ? May 5, 2023 13:00 |
|
repiv posted:unreal gives all licensees full source code access, so if they wanted to poke around in the guts and make it worse they were certainly able to Oh I guess that would make sense. Sort of seems like the kind of thing to not touch unless you have a real good reason though! Or is it pretty common for big games to modify the engine a bunch?
|
# ? May 5, 2023 13:11 |
|
it's not unheard of, though i have no idea how much work respawn might have done on it. the coalition are the studio most well known for extensively customizing unreal, to the extent that they were shipping DX12 by default all the way back in 2016 and pretty much nailed the implementation on their first try, while upstream unreal still struggles repiv fucked around with this message at 13:22 on May 5, 2023 |
# ? May 5, 2023 13:18 |
|
the big thing is that you can take the time to properly extend unreal engine in a supported way to make it do what you want. this gives you some nice things like relatively predictable behavior, decent performance, and to a lesser extent, being able to upgrade engine versions with much less tinkering with your own code. but it also takes some time to do. or you can pile on a bunch of quick hacks and dump the game on steam, and we all know it's almost universally gonna be the 2nd one because gamedev, AAA gamedev especially, is very stupid
|
# ? May 5, 2023 13:18 |
|
Weird Pumpkin posted:Oh I guess that would make sense. Sort of seems like the kind of thing to not touch unless you have a real good reason though! Based off the previous tweets which mention how the memory addresses are fixed on console, my noob armchair guess is someone decided to target that, figuring "when everything is in order we'll go back and redo the implementation for PC." Except the launch date arrived before that.
|
# ? May 5, 2023 14:28 |
|
With the 4070 being actually available in dual-fan versions* I'm thinking of upgrading, but at the moment I have a Ryzen 5 3600 - is a slow CPU bottlenecking GPU performance still a problem to look out for, and if so, how do you check whether that will be a problem? *I thought mini-ITX was a good idea at the time
|
# ? May 5, 2023 19:03 |
|
overeager overeater posted:With the 4070 being actually available in dual-fan versions* I'm thinking of upgrading, but at the moment I have a Ryzen 5 3600 - is a slow CPU bottlenecking GPU performance still a problem to look out for, and if so, how do you check whether that will be a problem? What resolution and refresh rate/fps are you targeting? The newer CPUs will give you better 1% and 0.1% lows, which leads to a smoother-feeling experience, even if the max and average FPS aren't very different.
|
# ? May 5, 2023 19:07 |
|
Kibner posted:What resolution and refresh rate/fps are you targeting? The newer CPUs will give you better 1 and 0.1% lows, which leads to a smoother-feeling experience, even if the max and average FPS aren't very different. Oh, sorry, I should have specified - I have a 3440x1440 ultrawide monitor with a max refresh rate of 100 FPS.
|
# ? May 5, 2023 19:14 |
|
overeager overeater posted:Oh, sorry, I should have specified - I have a 3440x1440 ultrawide monitor with a max refresh rate of 100 FPS. Open palm slam a 5800X3D into the (AM4) slot along with the 4070. The CPU alone is going to result in a massive performance uplift. I’d expect a 3600 to hold a 4070 back at ultrawide 1440 (since this is the resolution I use too).
|
# ? May 5, 2023 19:39 |
|
overeager overeater posted:Oh, sorry, I should have specified - I have a 3440x1440 ultrawide monitor with a max refresh rate of 100 FPS. What main board do you have? I think the 5800X3D is capped at 105W, which is to say a decent VRM won't break a sweat. Fwiw, I went from a Ryzen 3700 to a 5900 (I don't need that many cores, but it was on sale). The uplift is noticeable in everyday tasks. For you, I imagine a relatively inexpensive upgrade will be quite worthwhile.
|
# ? May 5, 2023 20:12 |
|
Branch Nvidian posted:Open palm slam a 5800X3D into the (AM4) slot along with the 4070. The CPU alone is going to result in a massive performance uplift. I’d expect a 3600 to hold a 4070 back at ultrawide 1440 (since this is the resolution I use too). The 5800X3D looks solid, thanks for the tip! Is it fine to run with the stock cooler if I'm not planning on overclocking? Yudo posted:What main board do you have? I think the 5800x3d is capped at 105w, which is to say a decent vrm won't break a sweat. My main board is an ASUS B450-I - it seems like it should work?
|
# ? May 5, 2023 22:04 |
|
overeager overeater posted:The 5800X3D looks solid, thanks for the tip! Is it fine to run with the stock cooler if I'm not planning on overclocking? I’d spend the $55 on a better cooler.
|
# ? May 5, 2023 22:09 |
|
overeager overeater posted:The 5800X3D looks solid, thanks for the tip! Is it fine to run with the stock cooler if I'm not planning on overclocking? Technically, but it'd probably be smart to just pick up a Gamers Nexus approved Thermalright Peerless Assassin or similar: https://www.amazon.com/Thermalright-Peerless-SE-Aluminium-Technology/dp/B09LGY38L4. Just don't buy a Hyper 212 Evo in TYOOL2023.
|
# ? May 5, 2023 22:54 |
|
The 5800X3D is notoriously a nightmare to cool due to the stacked die topology (the cache die effectively acts as a thermal insulator). The stock cooler won't break it or anything, but getting a fancier one may appreciably improve its performance by letting it boost for longer.
|
# ? May 5, 2023 23:11 |
|
Either my Peerless Assassin or the default fans on my Fractal North are whining a bit. Just replacing them all with Noctua fans feels a bit silly but it might happen. IDK about longevity, but better cooling means higher clocks and such I think. Subjunctive fucked around with this message at 23:16 on May 5, 2023 |
# ? May 5, 2023 23:12 |
|
I posted a few days ago about poor performance with a new 4090 FE and 12900K in TimeSpy and some games, but good performance in very heavy benchmarks like TimeSpy Extreme and Cyberpunk. I "fixed" it. I went into the BIOS and disabled all the efficiency cores. There seems to be a problem with the Windows 11 scheduler and it's assigning the efficiency cores to TimeSpy and games like MW:DMZ, so I was getting horrible performance because I was CPU limited. Nothing to do but keep the big.LITTLE architecture feature disabled and hope that Microsoft will fix this. Animal fucked around with this message at 23:59 on May 5, 2023 |
# ? May 5, 2023 23:57 |
|
Process Lasso is a good software solution for that. You can customize which cores each app is allowed to use.
|
# ? May 6, 2023 02:12 |
|
Animal posted:I posted a few days ago about poor performance with a new 4090 FE and 12900k, in TimeSpy and some games but good performance in very heavy beanchmarks like TimeSpy Extreme and Cyberpunk. its so awesome we having these inane problems in 2023
|
# ? May 6, 2023 02:16 |
|
overeager overeater posted:
Yes, it supports the 5800X3D (I don't know if all B450 boards do). Be sure to update the BIOS, if you have not done so recently, before removing your current CPU.
|
# ? May 6, 2023 02:17 |
|
https://twitter.com/VideoCardz/status/1654461220831805440?s=20 going to go day 1 on the rog ally if they launch at $700-800
|
# ? May 6, 2023 02:17 |
|
Palladium posted:its so awesome we having these inane problems in 2023 I just installed the developer experimental preview build of Windows 11 and it seems to be fully fixed.
|
# ? May 6, 2023 02:41 |