Arsten
Feb 18, 2003

PC LOAD LETTER posted:

Veblen goods. You see it with other products and markets (housing, cars, jewelry, etc.) too. Essentially it boils down to people being irrational and frequently making poor or sub-optimal decisions even when they're actually intelligent.

Veblen is the "bling" issue. You buy the more expensive item because you want to flaunt your wealth. You get a double-decker chrome-plated Lexus RX7 to show that you are the ultimate playa. :v: You basically pay the money for the esteem you think it'll grant you. While I'm sure this happens with all products, I think the "bling" aspect is relatively limited on things like video cards.

There is another term (that I can't remember well enough to hunt up in Google) for when the price becomes the indicator of quality because an individual doesn't understand the product or service. So if you don't know anything about, say, doctors and you need one for a quality-of-care reason (like "I have cancer" versus an "I have a sinus infection, give me some drat antibiotics" type visit) you will go "Doctor John charges $95 and Doctor Jim charges $195," and you prefer to go to Jim because he couldn't charge that much if he weren't quality goods.

I think this is what happens with video cards more than Veblen goods. A random guy walks into a Best Buy, sees an AMD Q9 for $150 and an nVidia PG3 for $170, and he gets the $170 one because he doesn't know the product and price became his indicator of quality.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Also RAM is overvalued as hell because it's a very easy thing to put a number on, similar to clock speed.

As far as the rumors go, it's possible that NV is mum about async because they have none but it's possible they have it and want to spike interest when they launch.

It's possible that Pascal is going to be the best of GCN and Maxwell together like peanut butter and jelly, but it's possible that it's got the drawbacks of both.

It's possible that AMD just warmed over GCN, and it's possible that some of the nifty power gating and unit boosting they've got patented is going to show up and they'll wring out unprecedented performance from their wattage targets.

There, I think I covered all the possibilities. Please hire me, wccftech.

PC LOAD LETTER
May 23, 2005
WTF?!
Well, I have seen people flaunt their GPUs online in-game. It does seem to come up a lot in CS while waiting to respawn, for instance. But that is anecdotal and so not definitive. I don't really know what other term could apply to what we're seeing currently in the GPU market though; 'Veblen goods' seems to fit well enough to be useful.

Food for thought: AMD didn't have much trouble selling their CPUs to consumers back in the K6, K7, and K8 days against Intel, despite Intel often having similar or better performance + decent brand recognition + large market influence and OEM support. AMD was often even able to sell at prices not far from Intel's, even on the high end, when they were performance competitive.

Now the GPU market is very different but still sells to much of the same crowd of buyers. So you'd think AMD would be able to get good-to-decent sales and market share by reducing prices on the 290/390/etc., since their GPU performance isn't THAT far behind nV's in general. Yet they still seem to be losing market share despite offering comparable if not better value.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!
AMD's market share really tanked after the 970 was released since it offered better performance/$ than Nvidia cards typically had in the past, plus AMD hasn't released an actual new card other than at the high end in a very long time.

mobby_6kl
Aug 9, 2009

by Fluffdaddy



The 970 was released around the end of Q3/14 so yeah, that kind of matches.

Even though I haven't bought a GPU since the 650Ti-B, I've mostly looked at Nvidia, as AMD's stuff just comes across as not being the best all-around performers: lower minimum FPS and more power use. This is probably not entirely accurate, but that's the impression from just casually following the market.

PC LOAD LETTER
May 23, 2005
WTF?!

MaxxBot posted:

AMD's market share really tanked after the 970 was released since it offered better performance/$ than Nvidia cards typically had in the past
When AMD was selling the 290 for ~$400 and the 290X for around $500, sure. But AMD did some pretty decent price drops and sales never seemed to recover.

edit: \/\/\/\/Price drops are an offering though, even if it's on an existing product. OK, looks like they did get some more marketshare in late 2015, but that seems due more to the 390/X, which is a rebrand of the 290/X with more VRAM, than to the market responding to price cuts on the 290/X. Which is dumb, but there you go. Digitimes is expecting them to lose some or all of it in Q1/2016 even though the product offerings from nV and AMD didn't really change much in that time period. Mixed signals at best, certainly not the definitive proof I thought it to be at first. :/

PC LOAD LETTER fucked around with this message at 15:31 on Apr 17, 2016

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

PC LOAD LETTER posted:

When AMD was selling the 290 for ~$400 and the 290X for around $500, sure. But AMD did some pretty decent price drops and sales never seemed to recover.

Yeah, back up to 22-24%, so it's partly AMD not really having anything compelling and partly superior marketing of Maxwell.

EoRaptor
Sep 13, 2003

by Fluffdaddy

PC LOAD LETTER posted:

When AMD was selling the 290 for ~$400 and the 290X for around $500, sure. But AMD did some pretty decent price drops and sales never seemed to recover.

This is because the price cuts reinforced the idea that the 970 was better value. By cutting the price, AMD admitted their card wasn't as good, and since price segments are set by income, not by the value of the item being purchased, the 970 ran away with its target market.

People might be predictable actors, but they are not rational actors. Anytime somebody talks about people being rational actors, punch them in the face, that poo poo shouldn't be tolerated.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

I'm still pretty sure that the 300 series refresh gave AMD some ground back, simply because the benches of the 290 were considerably lower than the actual performance you could get from it. What messes with me a lot more is just how well the 960 sold, especially when it was just the 2 GB version for around $200, the 280 was still very high performance, and it was literally better price/performance to buy a 290 and a decent power supply than a 960.

Naffer
Oct 26, 2004

Not a good chemist

EoRaptor posted:

This is because the price cuts reinforced the idea that the 970 was better value. By cutting the price, AMD admitted their card wasn't as good, and since price segments are set by income, not by the value of the item being purchased, the 970 ran away with its target market.

People might be predictable actors, but they are not rational actors. Anytime somebody talks about people being rational actors, punch them in the face, that poo poo shouldn't be tolerated.

I'm not quite getting this argument. The 390X costs ~$50-$75 more than the 970 right now on Newegg, and according to this comparison: http://www.anandtech.com/bench/product/1559?vs=1595 it's similar in performance but draws an extra 175W. Are you saying that the 390X would sell better if it were even more expensive?

edit: I guess we're talking about the 290X, which is slightly slower, but still, what fraction of people who buy $300+ GPUs aren't looking at benchmarks?

Naffer fucked around with this message at 17:13 on Apr 17, 2016

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

xthetenth posted:

I'm still pretty sure that the 300 series refresh gave AMD some ground back, simply because the benches of the 290 were considerably lower than the actual performance you could get from it. What messes with me a lot more is just how well the 960 sold, especially when it was just the 2 GB version for around $200, the 280 was still very high performance, and it was literally better price/performance to buy a 290 and a decent power supply than a 960.

Because humans don't think like that, and it's likely why AMD will be trying to deliver on the under-$300 segment. For uninformed consumers, power draw and heat are among the bigger things on their list of concerns. Budget is also up there, but core counts, clock speeds, RAM, performance, etc. are all very abstract, and I either deal with a person who can't put these abstract ideas into context or I don't meet the person who can.

Seriously, like 80% of people's eyes will glaze over merely trying to explain the advantages/disadvantages between AMD and Nvidia on any given generation. You need to condense everything into a sentence on why things matter, in babby speak.

"A 960 doesn't eat a lot of power and clocks really high for better performance, you should be okay with a Dell PSU"; Instant Buy. Arguments on why 4GB is needed and they'll turn glassy eyed because the 2GB is cheaper. This is also why companies get away with dumb ram amounts on smaller cards, not because consumers think more RAM is just better but they just don't grasp how it loving works and relates the the card period.
"A 280X will beat a 960 in most games but you'll need a new PSU, and you'll need good airflow"; I've never met a person who'll buy the 280X because the guy doing so understands power draw and case flow already and doesn't need me. Average people are already turned off because they need to consider factors beyond the card itself.

EmpyreanFlux fucked around with this message at 18:32 on Apr 17, 2016

Panty Saluter
Jan 17, 2004

Making learning fun!
I bought a 970 when I did because it was crushing everything at 1080p or thereabouts for a very reasonable price. Before that I'd had two AMD GPUs, one of which I regarded as one of the best purchases I'd made (crappy single-slot cooler and all). Before that it was a 6800. I say just go with whatever runs the games you want at the price you can afford.


v:v:v

Alchenar
Apr 9, 2008

As Carmack says:

"Worth remembering that AMD has often had stronger hardware than Nvidia, but Nvidia's software and ecosystem has been a critical edge."

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Alchenar posted:

As Carmack says:

"Worth remembering that AMD has often had stronger hardware than Nvidia, but Nvidia's software and ecosystem has been a critical edge."

This is impressive when you consider that AMD has consistently used smaller die sizes than Nvidia.

Krailor
Nov 2, 2001
I'm only pretending to care
Taco Defender

FaustianQ posted:

Because humans don't think like that, and it's likely why AMD will be trying to deliver on the under-$300 segment. For uninformed consumers, power draw and heat are among the bigger things on their list of concerns. Budget is also up there, but core counts, clock speeds, RAM, performance, etc. are all very abstract, and I either deal with a person who can't put these abstract ideas into context or I don't meet the person who can.

Seriously, like 80% of people's eyes will glaze over merely trying to explain the advantages/disadvantages between AMD and Nvidia on any given generation. You need to condense everything into a sentence on why things matter, in babby speak.

"A 960 doesn't eat a lot of power and clocks really high for better performance, you should be okay with a Dell PSU"; Instant Buy. Arguments on why 4GB is needed and they'll turn glassy eyed because the 2GB is cheaper. This is also why companies get away with dumb ram amounts on smaller cards, not because consumers think more RAM is just better but they just don't grasp how it loving works and relates the the card period.
"A 280X will beat a 960 in most games but you'll need a new PSU, and you'll need good airflow"; I've never met a person who'll buy the 280X because the guy doing so understands power draw and case flow already and doesn't need me. Average people are already turned off because they need to consider factors beyond the card itself.

I think it's even simpler than that for most consumers, especially on the low end.

Most people just ask 'Which is better, AMD or NVidia?'; not which specific card in my price range is better, or what's the better overall value, just which is better. Right now the best card on the market is the 980ti; therefore, to most people, that means NVidia is 'better', so they just buy the best NVidia card they can afford.

If the Fury had come out and beaten the pants off of the 980ti you'd suddenly see people buying a lot more 370s and 380s because then AMD would be 'better'.

Anime Schoolgirl
Nov 28, 2002

it's more hilarious now that top-of-the-line consumer cards are cut from 400W HPC monstrosities, hundreds to thousands of which get used to map bird genomes or run fluid dynamics simulations of millions of gallons

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Anime Schoolgirl posted:

it's more hilarious now that top-of-the-line consumer cards are cut from 400W HPC monstrosities, hundreds to thousands of which get used to map bird genomes or run fluid dynamics simulations of millions of gallons

There's usually a separate chip for compute users at the start of a generation. We didn't get the GTX Titan/780 Ti/780 until the compute market was saturated with Tesla K20/K20X/K40 cards, and we won't get a Pascal Titan until they're done selling Tesla P100 cards.

Still though, it's mostly a case of convergent evolution. Graphics and fluid-dynamics simulations both need to run a lot of FLOPs, and SIMD processing is one of relatively few ways to accomplish that. You either need to build a better processor or get more processors, and there are limitations/downsides to both approaches when you start scaling them up, so you kinda need to keep everything in some degree of balance (regardless of CPU vs GPU).
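
To make that concrete, here's a toy sketch (my own illustration, not anyone's actual solver): a diffusion-style fluid update where every cell is computed purely from the previous grid, so each iteration is independent. That "one lane per element" shape is exactly what a pixel shader has too, and it's what SIMD hardware is built to chew through:

code:
// Toy 2D diffusion step: dst is computed purely from src, so every
// cell update is independent of every other cell update.
#include <cstddef>
#include <vector>

void diffuse(const std::vector<float>& src, std::vector<float>& dst,
             std::size_t w, std::size_t h, float k) {
    for (std::size_t y = 1; y + 1 < h; ++y) {
        for (std::size_t x = 1; x + 1 < w; ++x) {
            const std::size_t i = y * w + x;
            // No iteration reads dst, so a compiler can vectorize this
            // and a GPU can assign one hardware thread per cell.
            dst[i] = src[i] + k * (src[i - 1] + src[i + 1] +
                                   src[i - w] + src[i + w] - 4.0f * src[i]);
        }
    }
}
Swap the loop body for "shade this pixel" and the structure is identical, which is why one SIMD design can serve both markets.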

PC LOAD LETTER
May 23, 2005
WTF?!

EoRaptor posted:

This is because the price cuts reinforced the idea that the 970 was better value.
They were able to keep sales and marketshare up in the past, even increasing it at times, by cutting prices on their CPUs though.

EoRaptor posted:

People might be predictable actors, but they are not rational actors. Anytime somebody talks about people being rational actors, punch them in the face, that poo poo shouldn't be tolerated.
Absolutely!

AMD's single-card Windows drivers have been pretty good for the last couple years though (e) and really weren't all that bad even in the VLIW days. They do lag nV in supporting the latest games at launch by a few days to a couple of weeks, but the difference doesn't really matter anymore. For CF I think they're very close to, if not just as good as, nV's SLI now. The problem is CF/SLI is hard to do, period, so neither AMD's nor nV's solution is all that recommendable to most people. Even for enthusiasts it's a tough sell.

It's on Linux that AMD's drivers are a real mess, though they do seem to have been trying to improve that greatly in the last year.

Krailor posted:

Most people just ask 'Which is better, AMD or NVidia?'; not which specific card in my price range is better, or what's the better overall value, just which is better. Right now the best card on the market is the 980ti; therefore, to most people, that means NVidia is 'better', so they just buy the best NVidia card they can afford.
I've seen this called the "halo card effect" by some of the industry guys over on B3D, and they do seem to talk as if it's a significant factor in sales. No one really gives numbers that I've seen though.

edit: \/\/\/\/You're right, but I was talking from the perspective of game support/driver quality. CF/SLI are both great when they work properly, though I'd say CF has a bit of an edge right now for the reasons you mention, but neither works properly most of the time due to poor game/driver support.

PC LOAD LETTER fucked around with this message at 03:43 on Apr 18, 2016

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

I'm pretty sure CF is outright better from a smoothness perspective, but SLI's okay enough, and both suck for compatibility.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

FaustianQ posted:

This is impressive when you consider that AMD has consistently used smaller die sizes than Nvidia.

For some reason I'm reminded of the exchange that took place between an Army Major and NVA Colonel at the Paris Peace Talks...

"You know, you never beat us on the battlefield die sizes."

"That may be so, but it is also irrelevant."

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

PC LOAD LETTER posted:

AMD's single-card Windows drivers have been pretty good for the last couple years though (e) and really weren't all that bad even in the VLIW days. They do lag nV in supporting the latest games at launch by a few days to a couple of weeks, but the difference doesn't really matter anymore. For CF I think they're very close to, if not just as good as, nV's SLI now. The problem is CF/SLI is hard to do, period, so neither AMD's nor nV's solution is all that recommendable to most people. Even for enthusiasts it's a tough sell.

XDMA CrossFire is absolutely bulletproof and significantly outperforms SLI in every way - frame pacing, scaling, and physical durability.

I wouldn't be surprised to see NVIDIA deploy a similar solution on Pascal or Volta or something. I haven't actually seen a PCIe carrier card yet, but it's really the obvious solution and NVIDIA have to know it too.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

I assumed that NVLink was their XDMA response.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Paul MaudDib posted:

XDMA CrossFire is absolutely bulletproof and significantly outperforms SLI in every way - frame pacing, scaling, and physical durability.
Do you have a link to a recent test showing this? All the tests I'm finding on old drivers show it beating old CrossFire by a huge margin but only achieving near-parity with SLI, though it makes a lot of sense that this would have improved in the Crimson drivers.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Alereon posted:

Do you have a link to a recent test showing this? All the tests I'm finding on old drivers show it beating old CrossFire by a huge margin but only achieving near-parity with SLI, though it makes a lot of sense that this would have improved in the Crimson drivers.

I think SweClockers still frapses multi-card configs as a matter of habit. I'd check, but I'd rather sleep; it's kind of important.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Subjunctive posted:

I assumed that NVLink was their XDMA response.

NVLink is their interface for binding cards together for dedicated compute.

Blackfyre
Jul 8, 2012

I want wings.
Here's an aside I remembered over the weekend.

Way back when I was using ATi, I remember noticing something that bothered me way more than it should have and was a big factor for me at the time to go back to Nvidia. It was KOTOR.

In KOTOR circa 2003/2004, I was playing it on my new PC and thought something was off. Couldn't put my finger on it. Watched my brother play the Xbox version and realised what it was - confirmed when I checked screenshots online. There was no glow on any of the lights or lightsabers; they looked like colored breadsticks.

Looking around forum posts, this was an ATi problem, so being 15 and positive I thought oh well, at least it's a known issue and will be solved soon with a driver update. Can't remember how long I waited before giving up and getting a GeForce, but it was several months. In fact I'm curious if this issue is still a thing today with KOTOR on new AMD cards. It kinda put me off going ATi again; as I said, with the extra effects etc. which are team-green exclusive, and the fact that NV have been pretty good in benchmarks, I didn't mind paying a bit more for them.

That said, the price/performance is getting even better for AMD, Crossfire is better (and more appealing because of this), and FreeSync screens are sooooo much cheaper than G-Sync ones, so I'm seriously considering making the switch should Polaris be good.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

SwissArmyDruid posted:

NVLink is their interface for binding cards together for dedicated compute.

Ah, OK. I wonder what their story is for higher resolutions.

Big Mackson
Sep 26, 2009
as a 290x veteran and freesync survivor i can say with certainty that i am happy with having this setup in 2016.

i am happy.

SwissArmyDruid
Feb 14, 2014

by sebmojo
Tangentially related: Android N DP2 includes Vulkan! http://www.pcper.com/news/Mobile/Android-N-Developer-Preview-2-Includes-Vulkan

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

I have high hopes for Vulkan. Not necessarily because I will be playing games with it, but because I think it should allow games that take advantage of stuff DX11 can't do to target multiple platforms.

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...
Carmack's comment is not just (or possibly at all) about driver stability; NV has always invested a lot more in building chips they can software-optimize more aggressively, and in putting in the work on app analysis to add those optimization strategies to the driver. If you consider how many millions both companies spend on gaining another 5% fps in a benchmark, it completely makes sense to dedicate a fraction of those resources to the other side of the PCI-E bus if there are some high-impact gains to be made. There are also cases where developers just do something "stupid" but which can be accounted for and fixed on the driver side.

This is a big part of the DX12 story - a ton of the advantage NV has in that area suddenly got moved to the other side of the API. Meanwhile, ATI/AMD/RTG has much less sophisticated driver optimization strategies and a much bigger problem with CPU-side driver perf, so pushing something lower-level like that allows them to offload some of the costs of that software to the ISVs and mitigate one of their big weaknesses in the process. It might be that NV just needs to refactor a lot of their work to get back some of the functionality they developed for DX11, but at the end of the day DX12 simply puts a lot more responsibility for making sure the GPU is humming along perfectly in the developers' hands and gives IHVs less room to compensate.

Yeah, it shouldn't HAVE to happen, but when someone ships a game with a bug that leaves 15% of total perf on the table and they just don't have the time to fix it in pre-launch crunch or the inclination to fix it post-launch, both the consumers and the IHVs have an incentive to see it fixed.

xthetenth posted:

I have high hopes for Vulkan. Not necessarily because I will be playing games with it, but because I think it should allow games that take advantage of stuff DX11 can't do to target multiple platforms.

Vulkan and DX12 are pretty equivalent for the most part. Microsoft will have to do a lot of legwork with debug tools and dev support to get DX12 to beat out Vulkan for mindshare (although controlling the XB1 will help). Vulkan definitely seems like the one to bet on though, given the tailwinds from people like Valve.
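
To show what "moved to the other side of the API" looks like in practice, here's a minimal sketch (mine, hypothetical names, assuming the usual Vulkan boilerplate already produced a command buffer and an image): an image layout transition that a DX11/GL driver would have inferred for you, but that Vulkan/DX12 make the app spell out.

code:
#include <vulkan/vulkan.h>

// Transition an image from "render target" to "shader readable".
// Under DX11/GL the driver tracked this hazard behind the API; here
// a missed or overly broad barrier is the developer's bug/perf loss.
void transitionToShaderRead(VkCommandBuffer cmd, VkImage image) {
    VkImageMemoryBarrier barrier{};
    barrier.sType = VK_STRUCTURE_TYPE_IMAGE_MEMORY_BARRIER;
    barrier.srcAccessMask = VK_ACCESS_COLOR_ATTACHMENT_WRITE_BIT;
    barrier.dstAccessMask = VK_ACCESS_SHADER_READ_BIT;
    barrier.oldLayout = VK_IMAGE_LAYOUT_COLOR_ATTACHMENT_OPTIMAL;
    barrier.newLayout = VK_IMAGE_LAYOUT_SHADER_READ_ONLY_OPTIMAL;
    barrier.srcQueueFamilyIndex = VK_QUEUE_FAMILY_IGNORED;
    barrier.dstQueueFamilyIndex = VK_QUEUE_FAMILY_IGNORED;
    barrier.image = image;
    barrier.subresourceRange = {VK_IMAGE_ASPECT_COLOR_BIT, 0, 1, 0, 1};

    vkCmdPipelineBarrier(cmd,
                         VK_PIPELINE_STAGE_COLOR_ATTACHMENT_OUTPUT_BIT,
                         VK_PIPELINE_STAGE_FRAGMENT_SHADER_BIT,
                         0,
                         0, nullptr,   // no global memory barriers
                         0, nullptr,   // no buffer barriers
                         1, &barrier); // one image barrier
}
Multiply that by every resource in a frame and you can see why the driver-side cleverness NV built up for DX11 has less room to operate.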

sauer kraut
Oct 2, 2004

Blackfyre posted:

Here's an aside I remembered over the weekend.

Way back when I was using ATi, I remember noticing something that bothered me way more than it should have and was a big factor for me at the time to go back to Nvidia. It was KOTOR.

In KOTOR circa 2003/2004, I was playing it on my new PC and thought something was off. Couldn't put my finger on it. Watched my brother play the Xbox version and realised what it was - confirmed when I checked screenshots online. There was no glow on any of the lights or lightsabers; they looked like colored breadsticks.

Looking around forum posts, this was an ATi problem, so being 15 and positive I thought oh well, at least it's a known issue and will be solved soon with a driver update. Can't remember how long I waited before giving up and getting a GeForce, but it was several months. In fact I'm curious if this issue is still a thing today with KOTOR on new AMD cards. It kinda put me off going ATi again; as I said, with the extra effects etc. which are team-green exclusive, and the fact that NV have been pretty good in benchmarks, I didn't mind paying a bit more for them.

That said, the price/performance is getting even better for AMD, Crossfire is better (and more appealing because of this), and FreeSync screens are sooooo much cheaper than G-Sync ones, so I'm seriously considering making the switch should Polaris be good.

AMD and the KotoR games have a troubled history. I tried playing one of them on my 6850 about a year ago, and it would crash unless you disabled all shader/particle effects.
I'd bet :tenbux: that they haven't bothered fixing their OpenGL driver to this day.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

sauer kraut posted:

AMD and the KotoR games have a troubled history. I tried playing one of them on my 6850 about a year ago, and it would crash unless you disabled all shader/particle effects.
I'd bet :tenbux: that they haven't bothered fixing their OpenGL driver to this day.

Guess I'll have to check kotor 2 when I get home.

penus penus penus
Nov 9, 2014

by piss__donald

xthetenth posted:

I'm pretty sure CF is outright better from a smoothness perspective, but SLI's okay enough, and both suck for compatibility.


Paul MaudDib posted:

XDMA CrossFire is absolutely bulletproof and significantly outperforms SLI in every way - frame pacing, scaling, and physical durability.

I wouldn't be surprised to see NVIDIA deploy a similar solution on Pascal or Volta or something. I haven't actually seen a PCIe carrier card yet, but it's really the obvious solution and NVIDIA have to know it too.

When did this happen? As far as I know XDMA fixed something that was nearly outright broken, but I never knew SLI paled in comparison to it.

Basing that on "old data" though, from back when it was news.

http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-AMD-Radeon-R9-290X-CrossFire-and-4K-Preview-Testing/Battlefield-

I know it's splitting hairs to say SLI is better here (I would consider these results practically equal), but to say that CrossFire significantly outperforms SLI in every way suggests something has changed since then.

Blackfyre
Jul 8, 2012

I want wings.

xthetenth posted:

Guess I'll have to check kotor 2 when I get home.

I'm very interested to see the results for this and KOTOR 1. I'll be shocked if it's still a hot mess on the shader front 12/13 years on!

Captain Hair
Dec 31, 2007

Of course, that can backfire... some men like their bitches crazy.
I had a KOTOR problem with my ATi card back in the day. Was a 9600 XT I think? Came with a free pre-order key for HL2 and possibly Orange Box.

Anyway, it had a problem where many textures had a multicolour oil-slick overlay; the game looked funky as hell. Ended up being a checkbox hidden deep within the driver settings that, when disabled, displayed textures properly. Also setting disablevertexbuffer=1 or whatever it was to make fps not take a dive whenever smoke or holograms were near, or whatever the problem was.

Card died after blasting the case through with air to clean it all out. The bastard.

Edit: also the game HATED dual monitors. Oh, the game bound the mouse fine and behaved exactly how it should, aside from the fps being irrationally all over the place. Standing in place for one minute you'd have fps from <5 right the way up to whatever the limit of my system was, and even changing settings to low and making the screen res the smallest didn't affect my min or max framerates. It just simply could not handle it.

Heck, even now the games seem to arbitrarily drop from 60fps to 30fps for certain parts of levels; best I can gather it's intentional for when you're near an area transition or cutscene, but I'm not certain.

Captain Hair fucked around with this message at 15:44 on Apr 18, 2016

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Blackfyre posted:

I'm very interested to see the results for this and KOTOR 1. I'll be shocked if it's still a hot mess on the shader front 12/13 years on!

2 got a patch pretty recently too.

Setzer Gabbiani
Oct 13, 2004

Blackfyre posted:

KOTOR and AMD woes

sauer kraut posted:

AMD and the KotoR games have a troubled history. I tried playing one of them on my 6850 about a year ago, and it would crash unless you disabled all shader/particle effects.
I'd bet :tenbux: that they haven't bothered fixing their OpenGL driver to this day.

It has to do with its ancient-rear end OpenGL implementation being incompatible with newer drivers due to some incredibly specific calls being removed, and as such you have to disable framebuffer effects and soft shadows in kotor.ini to force the game to use only the most basic shader functions, let alone start up without crashing. This happened around the time they overhauled their OGL drivers nearing Rage's release, or maybe earlier. Depending on your card, you can just stick an older atioglxx.dll in KOTOR's directory and still use all the fancy effects; the one from Catalyst 9.12 is what most people use for weird OGL games. This gets fucky for cards that didn't exist when those drivers did, though, and in my testing on a 390 and Fury the game WILL work, but it does something horrible to the drivers such that upon exiting, any GPU-intensive usage tends to give you a BSOD. This is still unfixed in both the GOG and Steam versions of the game, with no sign of Bioware hinting at an attempt to fix it, so this is as good as it gets for now.

At the end of the day, it's mostly just bloom effects and blurrier shadows you're missing, and playing the game with both disabled isn't an issue; I did just that a couple weeks ago. The stuff that ISN'T fixed (and probably won't be) is grass issues and alpha-channel fuckery. You can fix the grass with GLOverride by enabling ForceAlphaToCoverage in the .ini, but the textures are a driver quirk on newer hardware, so if you have anything even remotely resembling a modern AMD card, certain detail textures will be transparent. There aren't a lot of them, but some mods add them, so if any custom armor does that, just open the .tga in PS and remove the alpha channel; all it'll remove is shiny shading and detail.

KOTOR 2, on the other hand, got a huge update on Steam that fixed its OGL issues on modern hardware, along with Workshop support, so you can now run the game with all your favorite shader effects with no issue, once again showcasing its superiority to the first. The game ships with a typo in its kotor.ini that causes certain models (lightsabers mostly) to show as untextured though, so add Disable Vertex Buffer Objects=1 under the one that's all one word.
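
So the relevant chunk ends up looking something like this (assembled from the instructions above; I'm assuming the keys sit in the [Graphics Options] section of the KOTOR 2 ini, so check your own file):

code:
[Graphics Options]
; shipped typo, all one word -- leave it alone:
DisableVertexBufferObjects=1
; add this line under it to fix the untextured lightsabers:
Disable Vertex Buffer Objects=1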

Captain Hair posted:

I had a KOTOR problem with my ATi card back in the day. Was a 9600 XT I think? Came with a free pre-order key for HL2 and possibly Orange Box.

My 9800 XT was absolutely steamrolling KOTOR up until I went outside on Dantooine, where the sun decides to murder all performance in the name of the most unoptimized lighting known to man, although that was yet another VBO fix.

Captain Hair
Dec 31, 2007

Of course, that can backfire... some men like their bitches crazy.

Setzer Gabbiani posted:

certain detail textures will be transparent. There aren't a lot of them, but some mods add them, so if any custom armor does that, just open the .tga in PS and remove the alpha channel; all it'll remove is shiny shading and detail

That happened to me on an Nvidia 560 Ti. Had a mod that made certain weapons have unique 3D models, which was great till I realised I could see right through Bacca's sword.

Game certainly has its quirks. Speaking of, how about that Republic Commando bump mapping!

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
Okay, I'm getting a severe headache using MSI Afterburner for some reason. I'm on 16.3 Crimson, and when I try to adjust my clock speed in Afterburner (down to 960) it flat-out ignores it except on the desktop. In a game I'll get 1050MHz, and now it'll run 960MHz constantly on the desktop. It's also ignoring voltage and power controls, so instead of 0.975V it'll sit at 1.25V, even on the desktop.

Urrgghhh

EDIT: Oh cool, now even opening MSI Afterburner hard freezes/crashes the computer.

EmpyreanFlux fucked around with this message at 16:38 on Apr 18, 2016
