Yaoi Gagarin
Feb 20, 2014

it's almost a whole rear end history lesson to explain how we got to this mess. but there used to be this idea called AZDO (approaching zero driver overhead). basically the cutting edge devs in graphics had realized that they were getting bottlenecked on the dx or opengl driver implementation and not able to keep the GPU fed. so they came up with all kinds of tricks to avoid making the driver do too much work. and over time little things started being added to the APIs themselves to help this process, but it just wasn't enough, because these are pretty high level APIs and they require the driver to keep track of a lot of state. so they wanted something that was a lower level abstraction over the GPU: let the game developer do more of the work, because he knows what's best for his specific game

it turns out, programming the GPU is loving hard. who knew!
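to make the tradeoff concrete, here's a toy python sketch (invented classes, not real driver code) of what moving state validation from every draw call to pipeline creation buys you:

```python
# toy model: a DX11-style driver has to revalidate the bound state on
# every draw, because any piece may have changed since the last one.
# a DX12/VK-style pipeline object pays that cost once, at creation
# (which is also when the shader compile happens, so the stutter moves
# to PSO creation time).

class DX11StyleDriver:
    def __init__(self):
        self.validations = 0

    def draw(self, state):
        self.validations += len(state)  # re-check every state item, every draw

class DX12StylePipeline:
    def __init__(self, state):
        self.validations = len(state)  # validated (and compiled) once up front

    def draw(self):
        pass  # nothing left for the driver to figure out

state = {"blend": "opaque", "depth": "less", "raster": "cull_back", "shaders": "pbr"}

old_api = DX11StyleDriver()
for _ in range(1000):
    old_api.draw(state)

pso = DX12StylePipeline(state)
for _ in range(1000):
    pso.draw()

print(old_api.validations, pso.validations)  # 4000 4
```

same number of draws, three orders of magnitude less per-frame driver work - which is the whole AZDO pitch, except the cost gets shoved to pipeline-creation time instead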


Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
I wonder if the DX12 alignment comes from the time in which it was created. Games have only become more difficult and time-consuming to make; the bar was lower 10 years ago, or whenever DX12 started coming together at Microsoft, right?

At that time it probably made more sense to provide the kind of power that developers saw as the future of gaming. But in the meantime, top-level productions have become almost unbearably complex in comparison (and in ways only tangential to raw rendering power). There are so many things to consider and bring together if you want to make a AAA game these days.

Anyways, that's just speculation, but it tracks with my internal view at least of how the market has developed since DX12 was planned and executed. It seemed like at the time developers were more on board with putting time into gaining a competitive edge visually - and sure, they still are, but the raw graphics of a game are so much less important now.

:shrug: is the answer to spend a million years compiling shaders? I would honestly be completely ok with that. I've come to appreciate it well enough in TLOU1; it's rough having to redo it every time you update drivers, of course, but the game has no stutters that I can perceive, on my machine at least.

e: What confuses me more than anything is why UE hasn't solved these issues. Is it because this type of rendering work is completely separate per title and therefore can't be initiated by the engine on its own?

repiv posted:

unreal has been a showcase for how not to use DX12 though, they're the poster child for shader compilation jank and they're too buried in technical debt and backwards compatibility guarantees to rework the fundamentals of the engine

Is that really it? It's just due to backwards compatibility, primarily? Is there anything on the horizon that would get us out of this mess or is it 100% a wait and see situation?

Even Yuzu has compilation stutter issues on W11. Which is strange imo because it spends like 9 seconds compiling shaders; could it not spend more time compiling and avoid the issues? A core misunderstanding I have is how much of this is solved by simply spending the time compiling and how much is due to other factors.

Overall it's really difficult to even speak on this issue as it's so complex, and I know first-hand that if a technical fix is hard to explain to the money people and the C Suite people, it can be exceptionally hard to convince anyone to do it in the first place. I really appreciate the attempts to explain the situation so far.

The first step to really mobilizing to fix this issue is for laymen like me and others in this thread to understand the situation, so that we know what is going on and what to ask for. Just saying "make the game not stutter" doesn't appear to be enough in this area.

If we can't form a united front in the gaming community on this issue - and again, in a more specific way than "no stutters please" - how are developers even supposed to pitch the level of work that it would require to the people in charge, when no easy explanation even exists?

You are managing a gaming division; developers pitch you for fewer stutters but they could be doing 9 other things that would help the end product. What do you do? Probably nothing or very little because the situation is complex, it's present in other games so the bar is low, and there are always tons of other priorities that are awaiting dev work. I totally see how it gets on the back burner, even if there is some kind of nominal commitment to fixing stutters as a broad mandate. On top of that consumers have no idea what the problem even is, broadly speaking, which makes it even easier to throw up your hands about it.

The other thing: do consumers actually care about these stutters? Does it actually affect sales? The first Jedi game, Fallen Order, had horrible stuttering, and people talked about it, but I don't know anyone who didn't buy it because of the stuttering. So that might be another tough reality of the situation. Of course when it gets really bad, like Jedi Survivor, it's an issue, but it feels like a ton of games get away with these stutters in general.

Taima fucked around with this message at 10:18 on Jun 20, 2023

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

VostokProgram posted:

its almost a whole rear end history lesson to explain how we got to this mess. but there used to be this idea called AZDO (approaching zero driver overhead). basically the cutting edge devs in graphics had realized that they were getting bottlenecked on the dx or opengl driver implementation and not able to keep the GPU fed. so they came up with all kinds of tricks to try to avoid making the driver do too much work. and over time little things started being added to the APIs themselves to help this process, but it just wasn't enough because these are pretty high level APIs and they require the driver to keep track of a lot of state. so they wanted something that was a lower level abstraction over the GPU. let the game developer do more of the work because he knows whats best for his specific game

it turns out, programming the GPU is loving hard. who knew!

yeah that was sort of why I asked the question that I did: what I heard from people talking about DX12 (and Vulkan) some years back was that it was a good thing because you could squeeze out lots more performance from it, almost implying that DX11 was "holding them back"

Truga
May 4, 2014
Lipstick Apathy
IMO, the only people it was "holding back" were insane wizards like DE's steve and id's carmack (fake edit: did carmack even do anything interesting over the last 10 years, actually?), and because they're wizards they did insane poo poo with just directx/opengl anyway

for most people it just lets them make mistakes higher level apis didn't

Geemer
Nov 4, 2010



I recently updated my motherboard bios to upgrade my cpu from a 3600 to a 5800x3d and noticed the new bios added ReBAR support.
I enabled it, but I wonder if my 1070 can even benefit from it and if I need to change any settings in Windows 10 to do so.

mobby_6kl
Aug 9, 2009

by Fluffdaddy

Truga posted:

IMO, then only people it was "holding back" were insane wizards like DE's steve and id's carmack (fake edit: did carmack even do anything interesting over the last 10 years, actually?), and because they're wizards they did insane poo poo with just directx/opengl anyway

for most people it just lets them make mistakes higher level apis didn't

Carmack I think pretty much quit games to gently caress with rockets and cars, and then eventually VR around Oculus and then facebook. I don't recall exactly what he's done, but a lot of work on reducing latency and improving performance etc.

kliras
Mar 27, 2021

Geemer posted:

I recently updated my motherboard bios to upgrade my cpu from a 3600 to a 5800x3d and noticed the new bios added ReBAR support.
I enabled it, but I wonder if my 1070 can even benefit from it and if I need to change any settings in Windows 10 to do so.
mine just doesn’t boot with it enabled. same cpu and gpu

lih
May 15, 2013

Just a friendly reminder of what it looks like.

We'll do punctuation later.

Taima posted:

e: What confuses me more than anything is why UE didn't solve these issues? Is it because this type of rendering ask is completely separate per title and therefore can't be initiated by the engine on its own?

Is that really it? It's just due to backwards compatibility, primarily? Is there anything on the horizon that would get us out of this mess or is it 100% a wait and see situation?
they've been working on it, but as mentioned their solutions so far are not particularly good, and they only landed in the latest version (5.2), which came out just last month. meanwhile, games made in ue4 are still coming out now, usually with major stuttering issues unless the devs have taken the time to manually address all the flaws in unreal's dx12 usage.

my impression is that ue4's pso caching requires a lot of manual effort from developers, and even then doesn't have perfect coverage - so even in the best case, where the devs have used the system exactly how ue4 wants them to, there will still be some shaders missing from the precompiled cache. additionally it just doesn't support raytracing shaders at all? and ue4 just compiles shaders immediately upon needing them for rendering, instead of trying to anticipate which missing shaders will be needed and compiling them ahead of time - setting up a system to do that is obviously a lot of work for the developers. the architecture of the engine was just not designed for dx12 and they've taken forever to start addressing the problems.

a separate issue is that the default way ue4 uses multithreading is apparently pretty poor as it will load assets in the same thread used for the core game logic, also often causing stuttering unless the devs have gone to the effort of manually tuning asset loading and making sure it happens on its own thread?
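for reference, the manual pso caching described above is driven by ue4's ShaderPipelineCache console variables, something along these lines (writing these from memory - treat the exact cvar names and values as assumptions to check against your engine version's docs):

```ini
; DefaultEngine.ini sketch - turn on UE4's bundled PSO cache
; (exact cvars/values vary by engine version; verify before use)
[SystemSettings]
r.ShaderPipelineCache.Enabled=1
; startup precompile behaviour (semantics differ by version - check docs)
r.ShaderPipelineCache.StartupMode=1
; how many PSOs to hand to the driver per batch while precompiling
r.ShaderPipelineCache.BatchSize=50
```

even with this on, the cache only contains PSOs that were recorded during playtesting runs, which is exactly where the "manual effort / imperfect coverage" part comes in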

repiv
Aug 13, 2009

Taima posted:

Even Yuzu has compilation stutter issues on W11. Which is strange imo because it spends like 9 seconds compiling shaders; could it not spend more time compiling and avoid the issues? A core misunderstanding I have is how much of this is solved by simply spending the time compiling and how much is due to other factors.

yuzu is in a similar position to DXVK: it's running other people's code, which is oblivious to the fact that it's running under Vulkan/DX12 and doesn't know or care that Vulkan/DX12 wants PSOs compiled ahead of time, so yuzu just has to deal with it just-in-time

yuzu does offer a similar workaround to what dunia and unreal are doing - skip drawing anything that needs a shader that hasn't been compiled yet - but it can cause glitches, since the games aren't expecting that

in TOTK for example, enabling async shader building messes up the weapon icons for your fused weapons, because the game renders them once and caches the result, and if yuzu skips drawing pieces of the weapon because the shaders aren't ready yet then it gets saved in an incomplete state
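the two behaviours look roughly like this (made-up python sketch, nothing to do with yuzu's actual code):

```python
# sketch: what an emulator can do when a draw needs a shader that
# hasn't been compiled yet. names and structure invented for illustration.

class ShaderCache:
    def __init__(self, async_build=False):
        self.async_build = async_build
        self.ready = set()     # shaders already compiled
        self.pending = set()   # shaders compiling in the background

    def draw(self, shader_id):
        if shader_id in self.ready:
            return "drawn"
        if not self.async_build:
            # just-in-time: stall the frame while compiling -> stutter
            self.ready.add(shader_id)
            return "drawn_after_stall"
        # async: kick off a background compile and skip this draw.
        # anything that caches this frame's output (like those fused
        # weapon icons) keeps the incomplete image forever.
        self.pending.add(shader_id)
        return "skipped"

    def pump(self):
        # pretend the background compiles finished
        self.ready |= self.pending
        self.pending.clear()

jit = ShaderCache()
print(jit.draw("water"))  # drawn_after_stall

async_cache = ShaderCache(async_build=True)
print(async_cache.draw("water"))  # skipped
async_cache.pump()
print(async_cache.draw("water"))  # drawn
```

either you eat the stall or you eat the glitch; the only clean way out is knowing the shaders ahead of time, which an emulator can't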

mobby_6kl posted:

Carmack I think pretty much quit games to gently caress with rockets and cars and then eventually VR around Oculus and then facebook. I don't recall exactly what he's done but a lot of work over reducing latency and improving performance etc.

nevertheless idtech still kept up its technical leadership even without carmack, they absolutely nailed the migration to vulkan and that work was all led by tiago sousa

idtech took a two-pronged approach to aligning with the vulkan model though: aside from code, they also built a new art pipeline which allowed them to create nearly all of the assets in the game using just a handful of shaders, greatly simplifying the engine's job. unreal gives artists an incredible amount of flexibility, and epic isn't in a position where they can walk that back and break existing UE4 assets, so the engine just has to deal with the avalanche of shader variants their users end up creating.
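the variant avalanche is just combinatorics - a sketch with invented numbers to show the shape of the problem:

```python
# every independent toggle in a material/pipeline state multiplies the
# number of PSOs that can exist. the numbers below are made up.

def variant_count(feature_counts):
    n = 1
    for c in feature_counts:
        n *= c
    return n

# idtech-style: a handful of hand-written uber-shaders, full stop
idtech_like = variant_count([4])

# artist-driven engine: every material graph, times every permutation
# of lighting path, skinning, fog, instancing, vertex format...
unreal_like = variant_count([500, 2, 2, 2, 2, 3])

print(idtech_like, unreal_like)  # 4 24000
```

and that multiplied-out set is exactly what either gets precompiled up front (long load) or compiled on first use (stutter)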

repiv fucked around with this message at 13:58 on Jun 20, 2023

Geemer
Nov 4, 2010



kliras posted:

mine just doesn’t boot with it enabled. same cpu and gpu

After updating the bios, and again after changing the cpu (which made the motherboard clear its settings), it did need a few loops to train itself to boot, with the first loop being stuck on vga for a good 10 seconds according to the diagnostic leds on the motherboard.

But your mileage may vary, I guess.
My board is an MSI Tomahawk max.

kliras
Mar 27, 2021
might be it's better this time; asus x470 bios support has been absolute rear end so far. i don't even think the latest update from this month (after a four-month update drought) comes with curve optimizer support

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
Not to mention Carmack is a maga chud nowadays, so the less we see of him the better. He hasn't contributed anything of note in years.

Arivia
Mar 17, 2011

Zedsdeadbaby posted:

Not to mention Carmack is a maga chud nowadays, so the less we see of him the better. He hasn't contributed anything of note in years.

I have to disagree. His keynote speech at meta's vr conference just before he quit - where he basically told Zuckerberg, his boss, "hey this company? This ain't gonna be poo poo, it's going to be someone younger and hungrier and better," publicly, in front of a huge audience - was excellent.

UHD
Nov 11, 2006


carmack's always been a libertarian, twitter just made it easier to recognize.

njsykora
Jan 23, 2012

Robots confuse squirrels.


He left Facebook to start an AI company which is hilarious.

Lord Stimperor
Jun 13, 2018

I'm a lovable meme.

mobby_6kl posted:

Carmack I think pretty much quit games to gently caress with rockets and cars and then eventually VR around Oculus and then facebook. I don't recall exactly what he's done but a lot of work over reducing latency and improving performance etc.

Afaik he was the first to come up with frame interpolation/prediction (timewarp), effectively doubling framerates for Oculus. Valve then came with a similar concept (asynchronous reprojection; Oculus's motion-extrapolating version was called Asynchronous Spacewarp/ASW). This massively reduced the hardware requirements for VR - it was like running modern DLSS at 4K.
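The core trick is easy to sketch. Toy 1-D version with made-up numbers (real timewarp reprojects in 3D from the full head pose, this just shows the idea):

```python
# rotational timewarp, 1-D toy: shift the last rendered "frame" by the
# head-yaw change since it was rendered, so the displayed image keeps
# tracking the head even when no fresh frame is ready.

def timewarp(frame, yaw_at_render, yaw_now, degrees_per_pixel):
    shift = round((yaw_now - yaw_at_render) / degrees_per_pixel)
    if shift == 0:
        return frame[:]
    if shift > 0:
        # head turned right: image slides left, edge has no data
        return frame[shift:] + [None] * shift
    return [None] * -shift + frame[:shift]

frame = list("ABCDEFGH")
# head turned 2 degrees right since this frame was rendered:
print(timewarp(frame, yaw_at_render=0.0, yaw_now=2.0, degrees_per_pixel=1.0))
# ['C', 'D', 'E', 'F', 'G', 'H', None, None]
```

The None gaps at the edge are why headsets render with some margin beyond the visible FOV; ASW went further and extrapolated motion inside the frame too.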

repiv
Aug 13, 2009

the paper oculus released on timewarp is credited to Jan Paul van Waveren (RIP), who also happened to be the guy who developed virtual texturing at id software (also often credited to carmack)

Lord Stimperor
Jun 13, 2018

I'm a lovable meme.

Oh gently caress that's nasty

kliras
Mar 27, 2021
speaking of, what was the main takeaway from the whole zenimax deal with carmack? did he actually make off with a bunch of stuff, or?

Yaoi Gagarin
Feb 20, 2014

Truga posted:

IMO, then only people it was "holding back" were insane wizards like DE's steve and id's carmack (fake edit: did carmack even do anything interesting over the last 10 years, actually?), and because they're wizards they did insane poo poo with just directx/opengl anyway

for most people it just lets them make mistakes higher level apis didn't

I wouldn't quite put it like that. It's probably true that it was holding the entire industry back. At least in opengl you had to jump through some serious hoops to even get multiple threads to be able to do stuff with the API. AFAIK (maybe someone who's in gamedev more directly can correct me) we have had a huge explosion in geometry complexity since DX12/VK and that wouldn't have been possible in the old APIs.

Basically the old way had to go, but it's starting to look like they maybe threw the baby out with the bathwater too.

Yudo
May 15, 2003

OpenGL can do multiple threads? I thought it couldn't multi-thread at all?

Scoss
Aug 17, 2015
I don't know anything about graphics APIs. How possible is it for whatever entities are responsible for stewarding these things to make changes that walk back some of the misguided judgments that have led to shader compilation problems? Are we talking about DX13 or some other massive version upgrade that won't happen any time soon, or could it happen more gradually, in the nearer future?

repiv
Aug 13, 2009

Scoss posted:

I don't know anything about graphics APIs. How possible is it for whatever entities are responsible for stewarding over these things to make changes that could "walk back" some of their misguided judgments that have lead to shader compilation problems? Are we talking about DX13 or some other massive version upgrade that won't happen any time soon, or could it happen more gradually in the nearer future?

as i mentioned above this is already happening with vulkan, they've added an extension which allows an engine to opt back into a compilation model closer to how DX11 did it

VostokProgram posted:

I wouldn't quite put it like that. It's probably true that it was holding the entire industry back. At least in opengl you had to jump through some serious hoops to even get multiple threads to be able to do stuff with the API. AFAIK (maybe someone who's in gamedev more directly can correct me) we have had a huge explosion in geometry complexity since DX12/VK and that wouldn't have been possible in the old APIs.

Basically the old way had to go, but it's starting to look like they maybe threw the baby out with the bathwater too.

apple is probably feeling pretty smug about the direction they went with metal, which is more of a "DX11 but with multithreading" abstraction by default, with opt-ins for advanced fine tuning like manual memory allocation if developers want to go there

repiv fucked around with this message at 21:29 on Jun 20, 2023

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

repiv posted:

nevertheless idtech still kept up its technical leadership even without carmack, they absolutely nailed the migration to vulkan and that work was all led by tiago sousa

idtech took a two-pronged approach to aligning into the vulkan model though, aside from code they also built a new art pipeline which allowed them to create nearly all of the assets in the game using just a handful of shaders, greatly simplifying the engines job. unreal gives artists an incredible amount of flexibility, and epic isn't in a position where they can walk that back and break existing UE4 assets, so the engine just has to deal with the avalanche of shader variants their users end up creating.

I don’t know how to put this without it coming off as too critical, but for as good as DOOM Eternal looked, never mind ran, there was something a little “plain” about its graphics. Idk if that’s related, or if this is just my dumb subjective response to their artistic choices.

Josh Lyman
May 24, 2009


Zedsdeadbaby posted:

Some devs are asked why PC ports in 2023 are garbage

Short answer: nobody does PSO caching (stutter) and asynchronous operations (poor CPU utilization) properly, if at all. UE4 has made it so that developers don't need to do either, so they end up skipping both almost all the time.
Do you mean that UE4 takes care of it for devs so they develop bad habits with other engines, or that UE4 does a lovely job?

repiv
Aug 13, 2009

Rinkles posted:

I don’t know how to put this without it coming out too critical, but for as good as DOOM Eternal looked, never mind ran, there was something a little “plain” about its graphics. Idk if that’s related, or if this is just my dumb subjective response to their artistic choices.

that's possible, having to filter all artistic choices through what's possible to express in a limited set of shader parameters would limit the breadth of what the art team could do

there's plenty of middle ground between idtech's one-shader-for-everything and unreal's eleventy-billion-shaders though

wolrah
May 8, 2006
what?

Former Human posted:

There were emulators for PS1, N64, and GBA while those systems were all current. Dolphin and Yuzu aren't that groundbreaking.
I don't think PS1 and N64 are great counterpoints though: they were emulated while current, but it was late in their generation. PS1 came out in 1994 and N64 in 1996; emulators for both really hit their "commercial games are playable" stride in 1999, just a few months before the Dreamcast kicked off the next generation.

Switch came out in 2017, and in less than a year there were two different emulators running homebrew and booting commercial titles, with multiple significant games playable a few months after that. Here we are five years later, and Nintendo realized they aren't competing with anyone else so the same hardware is still current - of course the emulation is good at this point.


GBA though, that is a good one. The console released in 2001 and was first emulated in...2000. There were multiple emulators capable of booting commercial games on launch day which beats Switch by a lot.

repiv
Aug 13, 2009

Josh Lyman posted:

Do you mean that UE4 takes care of it for devs so they develop bad habits with other engines, or that UE4 does a lovely job?

UE4 doesn't take care of it; the default behaviour of a UE4 game is to stutter, especially in DX12 mode. there is an optional process to mitigate the issue, but it's labor-intensive on the part of the developer and can't guarantee it will catch every possible stutter.

epic have added some more automatic mitigations to UE5, but those haven't been backported to UE4 and there's still a long tail of UE4 projects coming out

repiv
Aug 13, 2009

biblically accurate graphics card :prepop:

https://twitter.com/ServeTheHome/status/1671174099358437376

Yaoi Gagarin
Feb 20, 2014

Yudo posted:

Open GL can do multiple threads? I thought it couldn't multi-thread at all?

you can, it just sucks. you make a new rendering context for each thread, but only one can issue actual rendering commands, so the others just upload data to buffers and compile shaders (lol). because of these limitations i don't think anyone ever bothered to use it
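the pattern looked roughly like this (pure python stand-in, no actual GL): worker threads on shared contexts can prepare work, but every real rendering command still has to funnel through the one render thread

```python
# simulating the old shared-context GL setup: workers may upload
# buffers / compile shaders, but only one thread issues draw commands.

import queue
import threading

render_queue = queue.Queue()
issued_by = []  # which thread actually issued each draw

def worker(i):
    # allowed on a shared context: buffer uploads, shader compiles...
    # ...but the draw itself gets handed off to the render thread
    render_queue.put(f"draw uploaded-by-{i}")

threads = [threading.Thread(target=worker, args=(i,)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# the single thread allowed to render drains the queue:
while not render_queue.empty():
    render_queue.get()
    issued_by.append(threading.current_thread().name)

print(issued_by)  # ['MainThread', 'MainThread', 'MainThread', 'MainThread']
```

dx12/vulkan kill exactly this bottleneck by letting every thread record real command buffers, only serializing the final submit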

Wistful of Dollars
Aug 25, 2009

dx13 will fix it

Stanley Pain
Jun 16, 2001

by Fluffdaddy
The next version of DX should be 360. So we can all walk away from it.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.
I honestly can't believe that Blizzard was too lazy to port Diablo to Metal, especially because it runs pretty well on low end hardware.

What is the PS4/5 graphics API like? DX11 or 12? I know both are available on Xbox; I'd guess it's similar on PS, but I don't know if it's OpenGL-based or what.

repiv
Aug 13, 2009

sony makes their own APIs and there's pretty much zero public information about them beyond the broad strokes, sony keeps a tight lid on it

surprisingly the PS5 APIs are supposedly completely different to the PS4 APIs but nobody has dared go into any more detail than that on a public forum as far as i'm aware

Dr. Video Games 0031
Jul 17, 2004

The 4060 Ti has already fallen below $350: https://www.amazon.com/dp/B0C5B4XNWR

$379.99 with a $38 digital coupon, bringing it down to $341.99.

njsykora
Jan 23, 2012

Robots confuse squirrels.


Man they really ain't selling huh.

Yudo
May 15, 2003

VostokProgram posted:

you can, it just sucks. you make a new rendering context for each thread but only one can issue actual rendering commands so the others just upload data to buffers and compile shaders (lol). because of these limitations i dont think anyone ever bothered to use it

That's interesting. I didn't even think it was technically possible. Thanks for explaining.

Anime Schoolgirl
Nov 28, 2002

njsykora posted:

Man they really ain't selling huh.
the other end of the equation is that most people who wanted a gpu upgrade already got one that's "good enough" whether it be by paying out the rear end during the crypto craze or joining the feeding frenzy on price-cut new stock/former mining cards when that collapsed

there's no incredibly compelling reason to get a new card if you got an Ampere or RDNA 2 that isn't the absolute bottom tier. if you bought one of the many 8gb cards, you just wait for developers to turn texture detail down enough in a patch (edit: or in some cases do it yourself) to make 8gb stutter-proof; if you bought 10-16gb cards from last gen you're already beating console performance in most cases (sans Arc, but some people are just drawn to trainwrecks). the only reason to get a more expensive card is if you love burning money for a higher 3dmark run (this demographic also rarely actually plays games)

Anime Schoolgirl fucked around with this message at 05:21 on Jun 21, 2023

Yudo
May 15, 2003

There are still a lot of people on Pascal. I have a hard time believing that a <$400 4060 Ti with 12GB+ of RAM wouldn't sell. I think it was also a mistake to market the card as being for 1080p; a lot of older cards hold up well at 1080p with expensive settings turned down.


Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Twerk from Home posted:

I honestly can't believe that Blizzard was too lazy to port Diablo to Metal, especially because it runs pretty well on low end hardware.

Pretty sure it had nothing to do with laziness, and everything to do with the bottom line/maximizing profit.
