Fanelien
Nov 23, 2003

straw man posted:

Has anybody explored ways to turn off the GPU Boost in the 600 series cards, or the basic EVGA 670 in particular? As convinced as nVidia seems to be of the effectiveness of their algorithm, I'm watching my framerates drop into the 20s (in Minecraft) because the card doesn't think it needs to be running at its rated clock speed.

As I understand it, it's like Turbo Boost, but I've not heard of a way to turn it off. Maybe use EVGA Precision to set a fixed voltage so it can't downvolt and lower the clock speed?
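A monitoring sketch rather than a fix: as far as I know GPU Boost can't be switched off outright, but you can at least log when the card drops below its rated clock while it's actually under load. This assumes nvidia-smi is on the PATH and the driver supports --query-gpu; RATED_CLOCK_MHZ is a placeholder you'd swap for your card's spec-sheet clock.

```python
# Sketch: poll nvidia-smi and report when the GPU runs below its rated clock
# while busy, which is what GPU Boost downclocking looks like from outside.
import subprocess
import time

RATED_CLOCK_MHZ = 915  # placeholder value, replace with your card's rated clock

QUERY = ["nvidia-smi",
         "--query-gpu=clocks.gr,temperature.gpu,utilization.gpu,power.draw",
         "--format=csv,noheader,nounits"]

while True:
    # First output line = GPU 0; add "-i <index>" to QUERY to pick another card.
    line = subprocess.check_output(QUERY, text=True).splitlines()[0]
    clock, temp, util, power = [s.strip() for s in line.split(",")]
    if int(clock) < RATED_CLOCK_MHZ and int(util) > 50:
        print(f"downclocked: {clock} MHz, {temp} C, {util}% util, {power} W")
    time.sleep(1)
```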


Fanelien
Nov 23, 2003

Continuing the PhysX discussion with regard to Borderlands 2: I have a pair of 570s in SLI at the moment. Is this likely to handle high settings with GPU PhysX at 1080p, or would I be looking for another card to slot in for PhysX? If so, what should I be looking at for price-performance as a dedicated PhysX processor? I was considering a used 460 or similar.

Fanelien
Nov 23, 2003

Agreed posted:

For dedicated CUDA you do have to remove the bridge, but now that you mention it (and thinking about PhysX vs. CUDA, and that they're really not the same thing even if PhysX does run on the same cores...) it's possible that you could just handle it in the drivers. Anyone with an SLI setup mind testing this real quick? 570x570 gent, you could do us a favor and give a definite yes/no on that if you don't mind poking around the control panel :)



Did some testing with SLI/PhysX GPU/CPU in Batman: Arkham City:

GPU PhysX dedication works when you break the SLI pairing in the control panel.



All settings are at absolute maximum with FXAA:

2x 570, 1 render + 1 PhysX: min 49 fps, max 121 fps, avg 91 fps
2x 570, SLI, GPU PhysX forced: min 0 fps, max 151 fps, avg 119 fps
2x 570, SLI, CPU PhysX forced: min 2 fps, max 30 fps, avg 22 fps

Found the last test particularly surprising; a 3930K at 4GHz is no slouch, but it seems the PhysX workload just isn't designed for a CPU at all.
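For anyone wanting to reproduce numbers like these, the min/max/avg fps figures can be pulled out of a per-frame frametime log. A rough sketch, assuming a CSV with one frame duration in milliseconds per line and no header (some loggers record cumulative timestamps instead, in which case you'd take differences of consecutive values first); "frametimes.csv" is just a placeholder filename.

```python
# Sketch: derive min / max / avg fps from a per-frame frametime log.
# Assumes each line's last CSV column is a frame duration in milliseconds.
def fps_stats(frametimes_ms):
    inst = [1000.0 / ms for ms in frametimes_ms if ms > 0]   # instantaneous fps per frame
    avg = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)   # total frames / total time
    return min(inst), max(inst), avg

with open("frametimes.csv") as f:  # placeholder path
    times = [float(line.split(",")[-1]) for line in f if line.strip()]

lo, hi, avg = fps_stats(times)
print(f"min fps {lo:.0f}, max fps {hi:.0f}, avg fps {avg:.0f}")
```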

I just realised I have an 8800 Ultra 768MB sitting in a box around here somewhere. As I understand it, the 8 series cards are the first PhysX-capable generation. Might try it with that when I next clean the dust out of my rig.

Fanelien
Nov 23, 2003

So I have a two-card SLI 570 1280MB set at the moment. I just upgraded to a surround setup and I am noticing some absolutely insane temps in games like Euro Truck Simulator 2 (80C+) and World of Tanks (95C) when run at surround resolutions, even with AA off. It's frightened me off turning on surround in Borderlands 2 etc., because I didn't expect cards that basically never got above 75C to shoot for the moon on temps when exposed to a surround setup. I have a massive amount of air flowing through the case from two 180mm fans in the bottom, but I just can't keep the temps under control; the only good thing, I guess, is that the cards haven't started to throttle yet. Other than going to water, is there anything I can do to decrease the heat?
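A quick way to see which of the two cards is actually cooking in surround is to log both GPUs' temperature, fan speed, and clock while a game runs. A sketch assuming nvidia-smi and its --query-gpu fields are available; the 90C warning threshold is only illustrative, not the 570's actual throttle point.

```python
# Sketch: periodically log temperature, fan speed, and clock for every GPU
# nvidia-smi reports, and warn when a card crosses an illustrative threshold.
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=index,temperature.gpu,fan.speed,clocks.gr",
         "--format=csv,noheader,nounits"]

with open("gpu_temps.log", "a") as log:
    while True:
        for line in subprocess.check_output(QUERY, text=True).splitlines():
            idx, temp, fan, clock = [s.strip() for s in line.split(",")]
            stamp = time.strftime("%H:%M:%S")
            log.write(f"{stamp} gpu{idx}: {temp} C, fan {fan}%, {clock} MHz\n")
            if int(temp) >= 90:  # illustrative threshold, not the card's limit
                print(f"gpu{idx} at {temp} C")
        log.flush()
        time.sleep(5)
```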

Fanelien
Nov 23, 2003

Welp, looks like this finally spells the end for my ailing SLI 570 setup. Next week I buy new cards!

Fanelien
Nov 23, 2003

welp, went to the local retailer and ordered an EVGA 970. Goodbye money. Card arrives next week.

Fanelien
Nov 23, 2003

Got an EVGA 970 yesterday; no coil whine, and it's positively quiet compared to the SLI 570s it replaced. Definitely going to do SLI again for surround res though: when pushing the massive number of pixels required for surround it struggles a little bit.
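Back-of-the-envelope on why surround hurts so much, assuming a 3x1080p surround layout (5760x1080, ignoring bezel correction, which isn't stated above): the card pushes roughly three times the pixels of a single 1080p screen every frame.

```python
# Pixel-count comparison for an assumed 3 x 1920x1080 surround setup.
single = 1920 * 1080             # 2,073,600 pixels
surround = (3 * 1920) * 1080     # 6,220,800 pixels
print(surround / single)         # 3.0
```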


Fanelien
Nov 23, 2003

It's also possible that AMD, given significant losses on the GPU front, is working with Samsung on their mobile products in exchange for a discount on DRAM. I wouldn't be surprised to see a new Samsung phone soon with "AMD Graphics Technology" married to their current ARM core as a marketing tool to offset the GPU in the latest Apple phones. That said, unless you're a shareholder in AMD, you'll pretty much be shown the door when asking about stuff like this.
