ADINSX
Sep 9, 2003

Wanna run with my crew huh? Rule cyberspace and crunch numbers like I do?

Suspicious Dish posted:

back in the 90s, engines wanted their games to go fast, so they started preloading all their textures at level load time. nvidia noticed that engines were preloading a texture to the gpu and then not using it right away, so they decided to "optimize their driver" by not actually loading the texture into the gpu until you draw with it. this caused frame drops mid-level when the texture finally loaded in, but nvidia got to market their lovely dumb driver as "loads faster", gaming magazines tested this, and consumers bought the marketing hype.

so then games, during load times, draw a 1x1 square off screen in the top left to convince nvidia to actually load the texture. nvidia started checking for 1x1 sized rect draws and skipping those too. so the typical advice nowadays is to draw "random" sized rects (the engine i work on has a pattern that alternates between 1x4 and 1x6) so nvidia's driver actually loads poo poo at the loading screen.

also, since nvidia did it and got fast review scores, every other vendor started doing it, but slightly differently, because nvidia didn't exactly tell people how the trick worked. so now everybody has to do this stupid rear end trick and has to figure out the "right" way to do it so it works on all gpus, and this is just tribal industry knowledge.
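For anyone curious what that warm-up hack might look like, here's a rough sketch in old-school fixed-function OpenGL (the function name, the alternating 4/6 sizes, and the quad math are all made up for illustration; real engines bury this somewhere in their streaming code):
code:
#include <GL/gl.h>

/* hypothetical loading-screen warm-up pass: touch every texture with a
 * tiny draw so the driver uploads it now instead of mid-level.
 * the rect size alternates so the driver can't pattern-match a fixed 1x1. */
void warm_up_textures(const GLuint *textures, int count)
{
    glEnable(GL_TEXTURE_2D);

    for (int i = 0; i < count; i++) {
        glBindTexture(GL_TEXTURE_2D, textures[i]);

        /* alternate "random" widths, roughly a 1x4 / 1x6 pattern */
        float w = (i % 2 == 0) ? 4.0f : 6.0f;
        float dw = w / 1000.0f;      /* a few pixels' worth of clip space */
        float dh = 1.0f / 1000.0f;

        /* tiny textured quad in the top-left corner; the loading screen
         * gets redrawn right after, so nobody ever sees it */
        glBegin(GL_QUADS);
        glTexCoord2f(0, 0); glVertex2f(-1.0f,      1.0f);
        glTexCoord2f(1, 0); glVertex2f(-1.0f + dw, 1.0f);
        glTexCoord2f(1, 1); glVertex2f(-1.0f + dw, 1.0f - dh);
        glTexCoord2f(0, 1); glVertex2f(-1.0f,      1.0f - dh);
        glEnd();
    }

    glFinish();   /* block until the gpu has actually chewed through it */
}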

This is really interesting; I knew about the big card manufacturers' "cheats" for benchmarks and stuff like that (I remember reading that they could detect when 3DMark was running and change their behavior), but I never assumed that cheating would have an adverse effect on games... So now nvidia can claim it "loads textures faster", but it has to explain why its frame rates drop randomly mid-game... or did they pass that blame off on the game developers?

Do triple-A studios really not have access to nvidia/ati reps? I would have assumed they'd fall over themselves to either give developers tips or even change their driver behavior per title (I had thought the "game specific driver tweaks" were made IN CONJUNCTION with the game developers, not totally isolated from them). Why wouldn't either side want developers/the studio earnestly saying "our poo poo works better with <whatever> because they worked with us and made sure it did"?


ADINSX
Sep 9, 2003

Wanna run with my crew huh? Rule cyberspace and crunch numbers like I do?

Beast of Bourbon posted:

the biggest of the big games might have direct support from amd/nvidia, but when i was working on some AAA "big" games, but not like... super mega huge, we'd have an nvidia or amd rep come by the studio with a handful of new cards and be like "hey can you give these to the graphics guys to make sure they all work with your game, here's my business card, let me know if you guys want to do a marketing promo or something"

the programmers would take those cards home and put them in their gaming rigs, and we'd just continue working on whatever 3 year old graphics cards were in the work computers.

pro

Qwertycoatl posted:

i used to work for a company making a now-defunct mobile gpu (i just did video stuff, not 3d, so don't blame me too hard please)

we ran into a problem with a game that didn't render properly because they had some code like
code:
switch (gpu_type) {
  case MALI:
    render_things_using_mali_workarounds();
    break;
  case POWERVR:
    render_things_using_powervr_workarounds();
    break;
  .....
  default:
    // don't bother rendering anything lol
}
and we weren't in the list

actually there's no point to this story but i've typed it now so here you go, my experience of gpu-related shittiness

lol
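The obvious takeaway, as a sketch rather than the actual game's code: make the default case the plain, no-workarounds path instead of a no-op, so hardware you've never heard of still gets a picture (render_things_generic here is a made-up stand-in for whatever the engine's vanilla path is):
code:
switch (gpu_type) {
  case MALI:
    render_things_using_mali_workarounds();
    break;
  case POWERVR:
    render_things_using_powervr_workarounds();
    break;
  /* ... other gpus you've actually tested on ... */
  default:
    /* unknown gpu: fall back to the generic, spec-compliant path
       instead of rendering nothing */
    render_things_generic();
    break;
}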

ADINSX
Sep 9, 2003

Wanna run with my crew huh? Rule cyberspace and crunch numbers like I do?

Seems like a consistent theme that people are developing on relatively old graphics cards, because if it works there, it'll work on basically anything.

So where do you think we go from here? What's the next big thing that would actually make you use some of the new tech new cards supposedly have? Other than "poo poo should just work right" and "poo poo should be debuggable", what features would you wanna see in some theoretical next generation of graphics cards?
