The Slack Lagoon
Jun 17, 2008



Quote is not the same as edit I guess.


penus penus penus
Nov 9, 2014

by piss__donald
If I had to guess, around $140 to $160. You'd likely be selling to someone who already has another 4GB 760 (for SLI), which is going to limit your market but may improve your pricing.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
Next year the 12GB in the Titan X won't be enough

Foxhound
Sep 5, 2007
Anyone else having nvlddmkm driver crashes with the witcher 3 geforce drivers? Really weird.

Foxhound fucked around with this message at 23:08 on May 20, 2015

kojicolnair
Mar 18, 2009

Foxhound posted:

Anyone else having nvlddmkm driver crashes with the witcher 3 geforce drivers? Really weird.

I do sometimes, very randomly, but this started with the GTA 5 drivers for me. I've had a hard time troubleshooting as I can play for hours and hours without it happening and I don't have enough time to play for 6 hours to make it crash!

Cinara
Jul 15, 2007

Foxhound posted:

Anyone else having nvlddmkm driver crashes with the witcher 3 geforce drivers? Really weird.

Tons of people are having these issues, myself included. I have tried everything but rolling back to the previous driver version as that does not have SLI support for Witcher 3. There are lots of things you can try that have worked for some people though.

Including:

Uncapping FPS
Turning off VSYNC
Running in fullscreen
Lowering the plant draw distance
Changing the power setting for Witcher 3 to prefer maximum performance in the Nvidia control panel
Doing a clean driver install
etc

Results vary for everyone, and some people like myself are 100% unable to get the game working. My crashes are roughly 2-5 minutes into the game each time, and only in Witcher, nothing else has issues. Witcher doesn't even have time to stress the cards, they barely hit 60c before the crash usually.
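
If you want to confirm it's actually nvlddmkm resetting without babysitting the game, you can pull the recent events out of the Windows System log. Rough sketch (wevtutil ships with Windows; the "driver stopped responding" TDR entries sometimes show up under the Display source rather than nvlddmkm, so adjust the query if it comes back empty):

code:
# Pull the last few nvlddmkm events from the Windows System log so you
# can see exactly when the driver reset, instead of replaying for hours.
import subprocess

query = "*[System[Provider[@Name='nvlddmkm']]]"  # try 'Display' if this is empty
result = subprocess.run(
    ["wevtutil", "qe", "System", f"/q:{query}", "/f:text", "/c:10", "/rd:true"],
    capture_output=True, text=True, check=True,
)
print(result.stdout or "No nvlddmkm events logged.")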

Nill
Aug 24, 2003

I've seen a host of 'solutions' fiddling with settings that don't end up fixing anything. (vsync, driver power mode settings, etc...)

I finally ended up simply boosting the power limit and voltage a tad in Afterburner and (so far) I've been good.

kojicolnair
Mar 18, 2009

Nill posted:

I've seen a host of 'solutions' fiddling with settings that don't end up fixing anything. (vsync, driver power mode settings, etc...)

I finally ended up simply boosting the power limit and voltage a tad in Afterburner and (so far) I've been good.


I've seen people talking about increasing voltage but how much did you increase it by?

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Do the GeForce Experience recommended settings take into account SLI? It's recommending pretty low choices for my dual-970 setup. Maybe it's a ploy to lure me into upgrading. Maybe it'll work!

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Massasoit posted:

I am amazed at the energy efficiency gains recently in computer parts.

:science: Hey, we're using significantly less power... And it performs way better than before!

From my understanding, this is because all the Maxwell chips are basically underclocked and undervolted. Which explains the ridiculous overclocks you can get out of it - something like an extra 20-25% on small Maxwell and a full 50% increase on big Maxwell. Rather than really push the performance to the limit, they traded it for power efficiency. Once you push the cards closer to their real capabilities they start chugging power just as badly as Kepler or AMD. At which point they drastically outperform them, of course. Overclocked Titan X's are crazytown performance, if I won the lottery I'd have a couple of them in SLI clocked to hell with a custom water loop.

Looking at it in that sense, it's the same kind of gimmick as Turbo Clocks on processors. They get to advertise super low power usage and "wink-wink nudge-nudge they overclock like a boss," but since the chip only has to meet its published (under-specified) performance target, there are no guarantees that any particular unit will clock higher. And they get to sell chips that would fail quality control if they were specified at a higher clock rate.

Paul MaudDib fucked around with this message at 00:58 on May 21, 2015

Cinara
Jul 15, 2007

Subjunctive posted:

Do the GeForce Experience recommended settings take into account SLI? It's recommending pretty low choices for my dual-970 setup. Maybe it's a ploy to lure me into upgrading. Maybe it'll work!

Make sure SLI is actually enabled. It disables after every driver update.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Cinara posted:

Make sure SLI is actually enabled. It disables after every driver update.

:eyepop:

Thank you!

Edit: hmm, that didn't help. Still suggesting lower-than-High including disabled Hairworks and DOF, plus medium texture quality. Hmm hmm hmm.

Subjunctive fucked around with this message at 00:53 on May 21, 2015

The Slack Lagoon
Jun 17, 2008



Paul MaudDib posted:

From my understanding, this is because all the Maxwell chips are basically underclocked and undervolted. Which explains the ridiculous overclocks you can get out of it - something like an extra 20-25% on small Maxwell and a full 50% increase on big Maxwell. Rather than really push the performance to the limit, they traded it for power efficiency. Once you push the cards closer to their real capabilities they start chugging power just as badly as Kepler or AMD. At which point they drastically outperform them, of course. Overclocked Titan X's are crazytown performance, if I won the lottery I'd have a couple of them in SLI clocked to hell with a custom water loop.

Looking at it in that sense, it's the same kind of gimmick as Turbo Clocks on processors. They get to advertise super low power usage and "wink-wink nudge-nudge they overclock like a boss," but since the chip only has to meet its published (under-specified) performance target, there are no guarantees that any particular unit will clock higher. And they get to sell chips that would fail quality control if they were specified at a higher clock rate.

Even if they are underclocked, the performance per watt is still increasing, which is cool to see.

cat doter
Jul 27, 2006



gonna need more cheese...australia has a lot of crackers

Subjunctive posted:

:eyepop:

Thank you!

Edit: hmm, that didn't help. Still suggesting lower-than-High including disabled Hairworks and DOF, plus medium texture quality. Hmm hmm hmm.

Why would you ever use geforce experience to recommend settings with 2 970s? Throw everything on max and gently caress the police.

Unless you're running crazy high resolutions of course.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

cat doter posted:

Why would you ever use geforce experience to recommend settings with 2 970s? Throw everything on max and gently caress the police.

Unless you're running crazy high resolutions of course.

Running 1440, but hoping to keep Witcher 3 above 60fps and I thought maybe it could...no, you're right, I'm being stupid.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Massasoit posted:

Even if they are underclocked, the performance per watt is still increasing, which is cool to see.

That's why you under-clock+under-volt, though - to increase performance-per-watt. For example, compare the Intel i5-4690K with the i5-4690T. Same processor, but one is undervolted for the low-power market. The undervolted variant benchmarks about 10% slower but consumes half the power (binning is a factor here too; low-power units are usually the top bin, although the K should be in a decent bin as well). And the 4690K is not exactly pushed to the limit either; you can squeeze another 25% performance increase out of it if you're willing to drive power consumption up by another 80%.

What's nice is that performance is still good in absolute terms - even if they're not really pushing the silicon hard, they'll still play with high settings and all that jazz.

On the same note, I wonder if running the AMD cards a bit slower might curb their power usage. Based on the 980 and the 4690, it seems like a small reduction in performance target can yield a big increase in performance-per-watt.
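
Back-of-the-envelope with the numbers above (the TDPs are from Intel's spec sheets, the performance deltas are my assumptions, not measurements):

code:
# Rough perf-per-watt math for the 4690K vs 4690T comparison.
# Performance inputs are assumptions from the post, not benchmark results.
k_perf, k_power = 1.00, 88.0   # 4690K baseline, 88W TDP
t_perf, t_power = 0.90, 45.0   # 4690T: ~10% slower at roughly half the power

ppw_k = k_perf / k_power
ppw_t = t_perf / t_power
print("4690T perf/watt vs 4690K: %.2fx" % (ppw_t / ppw_k))   # ~1.76x

# The other end of the curve: +25% performance for +80% power on the K
oc_ppw = 1.25 / (88.0 * 1.80)
print("Overclocked 4690K perf/watt vs stock: %.2fx" % (oc_ppw / ppw_k))  # ~0.69x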

Paul MaudDib fucked around with this message at 01:38 on May 21, 2015

Nill
Aug 24, 2003

kojicolnair posted:

I've seen people talking about increasing voltage but how much did you increase it by?

Just +18mV to start. Also bumped the Power Limit to 110%, though it's rarely going much over 90% anyway.

Make sure you go into settings and tick all the compatibility options EXCEPT "force constant voltage" and see that "unlock voltage control" is set to extended.
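
If you want to see what the bump is actually doing under load, something like this will log power draw, temperature, and clocks once a second while you play (assumes nvidia-smi is on your PATH; Ctrl-C to stop):

code:
# Poll nvidia-smi once a second and print power draw, temp and core clock,
# so you can tell whether the card is actually running into its power limit.
import subprocess
import time

FIELDS = "power.draw,temperature.gpu,clocks.current.graphics"
while True:
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=" + FIELDS, "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    print(time.strftime("%H:%M:%S"), result.stdout.strip())
    time.sleep(1)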

penus penus penus
Nov 9, 2014

by piss__donald

Subjunctive posted:

Running 1440, but hoping to keep Witcher 3 above 60fps and I thought maybe it could...no, you're right, I'm being stupid.

Yeah, it's way better to max out and start reducing the big name settings to get where you want imo. I've never particularly been impressed with any auto configuration (not that I've used GeForce Experience much for that, though). Way back when I was sperging over every fps and every frame drop and had every game dialed in to an obsessive level, I looked at the GeForce Experience recommendations after the fact and it was sad. Way too much or way too little. It'd probably have been alright if I didn't care as much, which will probably cover most users one way or another, but regardless it just wasn't great any way I looked at it.

penus penus penus fucked around with this message at 03:53 on May 21, 2015

cat doter
Jul 27, 2006



gonna need more cheese...australia has a lot of crackers
I feel like for stuff like GeForce Experience to matter, it needs frame rate analysis tools for whatever settings it recommends. It could offer three targets (30, 60, and 120+ fps), and to accept settings for a target it would need to maintain that frame rate, let's say, 95% of the time. That way you know what you're getting and the recommended settings are better controlled.
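
Something like this is all I mean, roughly: log the frame times from a benchmark pass, then a preset only qualifies for a target if it holds the rate 95% of the time (made-up input, just illustrating the check):

code:
# A settings preset qualifies for a frame rate target only if at least
# 95% of frames come in under the target's frame time budget.
def meets_target(frame_times_ms, target_fps, fraction=0.95):
    budget_ms = 1000.0 / target_fps
    good = sum(1 for ft in frame_times_ms if ft <= budget_ms)
    return good / len(frame_times_ms) >= fraction

# Example: a run that dips under 60fps for 4% of frames still passes 60,
# but fails the 120fps+ bar.
times = [15.0] * 960 + [22.0] * 40   # frame times in ms from a benchmark pass
for target in (30, 60, 120):
    print(target, meets_target(times, target))   # True, True, False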

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

cat doter posted:

I feel like for stuff like GeForce Experience to matter, it needs frame rate analysis tools for whatever settings it recommends. It could offer three targets (30, 60, and 120+ fps), and to accept settings for a target it would need to maintain that frame rate, let's say, 95% of the time. That way you know what you're getting and the recommended settings are better controlled.

It's not about scientific metrics, man, it's about the GeForce Experience!

Dad, you're embarrassing me, god!

agreed

SlayVus
Jul 10, 2009
Grimey Drawer

Paul MaudDib posted:

It's not about scientific metrics, man, it's about the GeForce Experience!

Dad, you're embarrassing me, god!

agreed

If vee dub freak or agreed were your dad you wouldn't have to worry about things like average fps.

penus penus penus
Nov 9, 2014

by piss__donald
http://arstechnica.co.uk/gaming/2015/05/amd-says-nvidias-gameworks-completely-sabotaged-witcher-3-performance/

Inflammatory butthurt title! Then you read the actual content and you're like, huh?

Been noticing a lot of this the last few months

repiv
Aug 13, 2009

"Hairworks is spectacularly inefficient" says AMD, while continuing to never demonstrate TressFXes performance scaling over the wide LOD ranges required for furred enemies in an open world game.

Truga
May 4, 2014
Lipstick Apathy

THE DOG HOUSE posted:

http://arstechnica.co.uk/gaming/2015/05/amd-says-nvidias-gameworks-completely-sabotaged-witcher-3-performance/

Inflammatory butthurt title! Then you read the actual content and you're like, huh?

Been noticing a lot of this the last few months

I want to think this is just AMD crying out over nothing with dumb excuses, but considering the moves nvidia keeps pulling, I really can't tell if it's true or not. :shrug:

veedubfreak
Apr 2, 2005

by Smythe

Paul MaudDib posted:

From my understanding, this is because all the Maxwell chips are basically underclocked and undervolted. Which explains the ridiculous overclocks you can get out of it - something like an extra 20-25% on small Maxwell and a full 50% increase on big Maxwell. Rather than really push the performance to the limit, they traded it for power efficiency. Once you push the cards closer to their real capabilities they start chugging power just as badly as Kepler or AMD. At which point they drastically outperform them, of course. Overclocked Titan X's are crazytown performance, if I won the lottery I'd have a couple of them in SLI clocked to hell with a custom water loop.

Looking at it in that sense, it's the same kind of gimmick as Turbo Clocks on processors. They get to advertise super low power usage and "wink-wink nudge-nudge they overclock like a boss," but since the chip only has to meet its published (under-specified) performance target, there are no guarantees that any particular unit will clock higher. And they get to sell chips that would fail quality control if they were specified at a higher clock rate.

Heh, you only need 1 Titan X. I'm gonna mess around with mine again this weekend and see if I can get a proper stable bios back at 1400+. Started getting crashes and put the stock bios back on it. Even at normal clocks this card murders everything I throw at it even on triple 1440.

What's funny about geforce experience is that it keeps suggesting that I run my games at half my native resolution. What kind of silly crap is that.

sauer kraut
Oct 2, 2004

Truga posted:

I want to think this is just AMD crying out over nothing with dumb excuses, but considering the moves nvidia keeps pulling, I really can't tell if it's true or not. :shrug:

Hairworks tanks performance on pre-900 Nvidia cards almost as bad as on AMD cards.
Honestly I was shocked that it works on Radeons at all. They need to shut up and make completely new chips that don't suck at tessellation/other modern stuff.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
Apparently the EVGA hydro AIO works on the Titan X. So tempting.

Truga
May 4, 2014
Lipstick Apathy

sauer kraut posted:

They need to shut up and make completely new chips that don't suck at tessellation/other modern stuff.

True.

vvvv: TruForm was glorious when it worked right.

Truga fucked around with this message at 17:03 on May 21, 2015

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

sauer kraut posted:

Hairworks tanks performance on pre-900 Nvidia cards almost as bad as on AMD cards.
Honestly I was shocked that it works on Radeons at all. They need to shut up and make completely new chips that don't suck at tessellation/other modern stuff.

They did that with the Radeon 285: GCN 1.2 has great tessellation performance, but yeah, they need to roll out their newer chips.

It's funny to me, though, because ATI tried their hand at tessellation long before anyone cared, with TruForm.

But there can be no doubt that NVIDIA is always up for loving over the competition in ways that AMD doesn't. AMD even makes a point of showing how well TressFX runs on NVIDIA hardware.

HalloKitty fucked around with this message at 17:05 on May 21, 2015

wargames
Mar 16, 2008

official yospos cat censor

sauer kraut posted:

Hairworks tanks performance on pre-900 Nvidia cards almost as bad as on AMD cards.
Honestly I was shocked that it works on Radeons at all. They need to shut up and make completely new chips that don't suck at tessellation/other modern stuff.

But look at their new 300 series! oh.. oh wait..

repiv
Aug 13, 2009

HalloKitty posted:

It's funny to me, though, because ATI tried their hand at tessellation long before anyone cared, with TruForm.

They did it twice, in fact: they pushed another non-standard tessellation extension with R600 long after TruForm died. It was ~the future of graphics~ right up until nVidia was better at it.

penus penus penus
Nov 9, 2014

by piss__donald
I wasn't clear. My problem with the article is the title, "AMD says Nvidia’s GameWorks “completely sabotaged” Witcher 3 performance", which is really the part that is going to stick, when in reality it's basically just the hair in the game that's causing the problem. It's okay to gripe about that, mostly, but then the article goes on to say it's not like it'd have worked anyway with AMD's architecture even if they did "provide the source code".

Nvidia's hair technology didn't completely sabotage the entire game for you, AMD, ffs.

Kazinsal
Dec 13, 2011
s/says/claims/ fixes the big problem with the title: it stops presenting AMD's accusation as objective fact.

Tanreall
Apr 27, 2004

Did I mention I was gay for pirate ducks?

~SMcD
AMD GPUs can work around the GameWorks technology in Witcher 3.

http://www.guru3d.com/news-story/the-witcher-3-hairworks-on-amd-gpus-with-normal-performance.html

SwissArmyDruid
Feb 14, 2014

by sebmojo

quote:

AMD users have a way of enjoying The Witcher 3: Wild Hunt with the Nvidia HairWorks feature enabled and actually getting better performance than GeForce users not using a Maxwell architecture Nvidia GPU. It seems a bit ironic but exciting nonetheless.

Bullshit, I want to see benchmarks. Video games are a temporal medium; it doesn't mean anything if you're running 15 FPS but the hair looks good.

Tanreall
Apr 27, 2004

Did I mention I was gay for pirate ducks?

~SMcD
Here is a German article on it as well, with in-game footage and some fps numbers.

http://www.pcgameshardware.de/The-Witcher-3-PC-237266/News/Trick-Hairworks-fluessig-AMD-Radeon-Grafikkarten-1159466/

Hace
Feb 13, 2012

<<Mobius 1, Engage.>>
Not really surprising considering how unoptimized some nvidia tech seems sometimes

Ragingsheep
Nov 7, 2009

SwissArmyDruid posted:

Bullshit, I want to see benchmarks. Video games are a temporal medium; it doesn't mean anything if you're running 15 FPS but the hair looks good.

Hairworks in Witcher 3 runs at some silly 64x tessellation and 8x MSAA by default. The MSAA is adjustable through ini files but only AMD users have a way of forcing a lower level of tessellation through the drivers.
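
For what it's worth, the tweak people are passing around is just a one-line edit. Something like this, though the path and the exact key name are from memory, so check your install (and back the file up first):

code:
# Drop the Hairworks MSAA level from 8x to 4x in The Witcher 3's
# rendering.ini. Path and key name are assumptions - verify before running.
from pathlib import Path

ini = Path(r"C:\Games\The Witcher 3\bin\config\base\rendering.ini")
text = ini.read_text()
if "HairWorksAALevel=8" in text:
    ini.write_text(text.replace("HairWorksAALevel=8", "HairWorksAALevel=4"))
    print("Hairworks AA set to 4x")
else:
    print("Key not found - the name may differ in your game version")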

Kazinsal
Dec 13, 2011

Ragingsheep posted:

Hairworks in Witcher 3 runs at some silly 64x tessellation and 8x MSAA by default. The MSAA is adjustable through ini files but only AMD users have a way of forcing a lower level of tessellation through the drivers.

Yeah, and it looks pretty much the same at 16x tessellation (which actually runs fairly well on even a 7950 apparently).


Crab Battle
Jan 16, 2010

Haha! Yeah!

AVeryLargeRadish posted:

Hmmm, have you tried disabling or uninstalling the Intel integrated graphics? If you can do that then after a restart it should be forced to use the 750m instead.

kode54 posted:

No, see, he has Optimus. Which means the discrete graphics chip has no displays connected to it. It must framebuffer copy to the integrated graphics to function.

Thanks for replying to my questions, always appreciated.

I managed to track the issue down and solve it. As well as the Optimus setup with two graphics devices, I also recently picked up a mobile monitor that works over USB using DisplayLink. It turns out this doesn't play well with fullscreen games, leading to all the D3D-related errors I was seeing.

http://support.displaylink.com/knowledgebase/articles/543922-games-do-not-work-on-windows-with-displaylink-soft

Luckily DisplayLink provides a tool to disable their drivers, letting me get back to gaming. It requires a restart on each switch, but does seem to solve the problem.

Hope this helps, in case anyone is considering a mobile monitor setup.
