|
Suspicious Dish posted:back in the 90s, engines wanted their games to go fast so they started preloading all their textures at level load time. nvidia noticed that engines were preloading a texture to the gpu and then not actually using it right away, so they decided to "optimize their driver" by not actually loading the texture into the gpu until you actually draw with it. this caused frame drops mid-level when the texture loaded in but nvidia got to market their lovely dumb driver as "loads faster" and gaming magazines tested this and consumers bought the marketing hype.

This is really interesting; I knew about the big card manufacturers' "cheats" for benchmarks and the like (I remember reading that they could detect when 3dmark was running and change their behavior), but I never assumed the cheating would have adverse effects on actual games. So now nvidia gets to claim it "loads textures faster", but it has to explain why its frame rates drop randomly mid-game... or did they pass that blame off onto the game developers?

Do triple-A studios really not have access to nvidia/ati reps? I would have assumed they'd fall over themselves to give developers tips, or even change their driver behavior per title (I had thought the "game specific driver tweaks" were made IN CONJUNCTION with the game developers, not totally isolated from them). Why wouldn't either side want developers/the studio earnestly saying "our poo poo works better with <whatever> because they worked with us and made sure it did"?
|
# Sep 29, 2017 13:49 |
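The driver behavior described in the quote above can be sketched as a toy timing model: give every texture a fixed upload cost and compare *where* that cost lands. Eager preloading pays everything at level load (slow load screen, smooth frames); deferring the upload until first draw makes the load screen look fast but moves the same cost into mid-level frame spikes. All numbers and texture names here are invented for illustration; real costs depend on texture size, bus bandwidth, and driver internals.

```python
# Toy model of eager vs. lazy texture upload. Purely illustrative;
# the costs below are made-up constants, not real driver measurements.

UPLOAD_MS = 8.0  # hypothetical cost to push one texture to the GPU
DRAW_MS = 1.0    # hypothetical cost to draw one frame

def eager_load(textures):
    """Pay all upload costs up front, at level load time."""
    load_time = len(textures) * UPLOAD_MS
    frame_times = [DRAW_MS for _ in textures]  # every frame stays cheap
    return load_time, frame_times

def lazy_load(textures):
    """Defer each upload until the texture is first drawn."""
    load_time = 0.0  # the benchmark-friendly "loads faster!" number
    uploaded = set()
    frame_times = []
    for tex in textures:  # one frame per texture, worst case: all first uses
        cost = DRAW_MS
        if tex not in uploaded:   # first use: driver uploads *now*
            cost += UPLOAD_MS     # -> visible frame-time spike mid-level
            uploaded.add(tex)
        frame_times.append(cost)
    return load_time, frame_times

textures = ["wall", "floor", "monster"]
print(eager_load(textures))  # (24.0, [1.0, 1.0, 1.0])
print(lazy_load(textures))   # (0.0, [9.0, 9.0, 9.0])
```

The total work is identical in both cases; the lazy scheme only relocates it to the worst possible place. This is also why engines that ran into this started "pre-warming": issuing a dummy draw with each texture during the load screen to force the upload to actually happen then.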
|
|
Beast of Bourbon posted:the biggest of the big games might have direct support from amd/nvidia but when i was working on some AAA "big" games, but not like... super mega huge ones, we'd have an nvidia or amd rep come by the studio with a handful of new cards and be like "hey can you give these to the graphics guys to make sure they all work with your game, here's my business card, let me know if you guys want to do a marketing promo or something"

pro

Qwertycoatl posted:i used to work for a company making a now-defunct mobile gpu (i just did video stuff, not 3d, so don't blame me too hard please)

lol
|
# Sep 29, 2017 22:49 |
|
Seems like a consistent theme that people are developing on relatively old graphics cards, because if it works there, it'll work on basically anything. So where do you think we go from here? What's the next big thing that would actually make you use some of the new tech new cards supposedly have? Other than "poo poo should just work right" and "poo poo should be debuggable", what features would you want to see in some theoretical next generation of graphics cards?
|
# Sep 29, 2017 22:51 |