|
Quote is not the same as edit I guess.
|
# ? May 20, 2015 20:47 |
|
If I had to guess, around $140 to $160. You'd likely be selling to someone who wants a second 4GB 760, which is going to limit your market but may improve your pricing.
|
# ? May 20, 2015 21:21 |
|
Next year the 12GB in the Titan X won't be enough
|
# ? May 20, 2015 21:46 |
|
Anyone else having nvlddmkm driver crashes with the witcher 3 geforce drivers? Really weird.
Foxhound fucked around with this message at 23:08 on May 20, 2015 |
# ? May 20, 2015 23:05 |
|
Foxhound posted:Anyone else having nvlddmkm driver crashes with the witcher 3 geforce drivers? Really weird. I do sometimes, very randomly, but this started with the GTA 5 drivers for me. I've had a hard time troubleshooting as I can play for hours and hours without it happening and I don't have enough time to play for 6 hours to make it crash!
|
# ? May 20, 2015 23:29 |
|
Foxhound posted:Anyone else having nvlddmkm driver crashes with the witcher 3 geforce drivers? Really weird. Tons of people are having these issues, myself included. I have tried everything but rolling back to the previous driver version, as that one does not have SLI support for Witcher 3. There are lots of things you can try that have worked for some people, though, including:
- Uncapping the FPS
- Turning off VSYNC
- Running in fullscreen
- Lowering plant draw distance
- Changing the power setting in the Nvidia control panel to performance for Witcher 3
- Doing a clean driver install
Results vary for everyone, and some people like myself are 100% unable to get the game working. My crashes are roughly 2-5 minutes into the game each time, and only in Witcher; nothing else has issues. Witcher doesn't even have time to stress the cards, they barely hit 60C before the crash usually.
|
# ? May 20, 2015 23:35 |
|
I've seen a host of 'solutions' fiddling with settings that don't end up fixing anything. (vsync, driver power mode settings, etc...) I finally ended up simply boosting the power limit and voltage a tad in Afterburner and (so far) I've been good.
|
# ? May 20, 2015 23:37 |
|
Nill posted:I've seen a host of 'solutions' fiddling with settings that don't end up fixing anything. (vsync, driver power mode settings, etc...) I've seen people talking about increasing voltage but how much did you increase it by?
|
# ? May 21, 2015 00:05 |
|
Do the GeForce Experience recommended settings take into account SLI? It's recommending pretty low choices for my dual-970 setup. Maybe it's a ploy to lure me into upgrading. Maybe it'll work!
|
# ? May 21, 2015 00:27 |
|
Massasoit posted:I am amazed at the energy efficiency gains recently in computer parts. From my understanding, this is because all the Maxwell chips are basically underclocked and undervolted. Which explains the ridiculous overclocks you can get out of it - something like an extra 20-25% on small Maxwell and a full 50% increase on big Maxwell. Rather than really push the performance to the limit, they traded it for power efficiency. Once you push the cards closer to their real capabilities they start chugging power just as badly as Kepler or AMD. At which point they drastically outperform them, of course. Overclocked Titan X's are crazytown performance, if I won the lottery I'd have a couple of them in SLI clocked to hell with a custom water loop. Looking at it in that sense, it's the same kind of gimmick as Turbo Clocks on processors. They get to advertise super low power usage and "wink-wink nudge-nudge they overclock like a boss," but since the chip meets its published (under-specified) performance target there's no guarantees of any particular unit actually hitting a specific performance target. And they get to sell chips that would fail to pass quality-control if they were specified at a higher clock rate. Paul MaudDib fucked around with this message at 00:58 on May 21, 2015 |
# ? May 21, 2015 00:41 |
|
Subjunctive posted:Do the GeForce Experience recommended settings take into account SLI? It's recommending pretty low choices for my dual-970 setup. Maybe it's a ploy to lure me into upgrading. Maybe it'll work! Make sure SLI is actually enabled. It disables after every driver update.
|
# ? May 21, 2015 00:49 |
|
Cinara posted:Make sure SLI is actually enabled. It disables after every driver update. Thank you! Edit: hmm, that didn't help. Still suggesting lower-than-High including disabled Hairworks and DOF, plus medium texture quality. Hmm hmm hmm. Subjunctive fucked around with this message at 00:53 on May 21, 2015 |
# ? May 21, 2015 00:50 |
|
Paul MaudDib posted:From my understanding, this is because all the Maxwell chips are basically underclocked and undervolted. Which explains the ridiculous overclocks you can get out of it - something like an extra 20-25% on GM100 and a full 50% increase on GM200. Rather than really push the performance to the limit, they traded it for power efficiency. Once you push the cards closer to their real capabilities they start chugging power just as badly as Kepler or AMD. At which point they drastically outperform them, of course. Overclocked Titan X's are crazytown performance, if I won the lottery I'd have a couple of them in SLI clocked to hell with a custom water loop. Even if they are under clocked, the performance per watt is still increasing, which is cool to see.
|
# ? May 21, 2015 00:54 |
|
Subjunctive posted:
Why would you ever use geforce experience to recommend settings with 2 970s? Throw everything on max and gently caress the police. Unless you're running crazy high resolutions of course.
|
# ? May 21, 2015 01:10 |
|
cat doter posted:Why would you ever use geforce experience to recommend settings with 2 970s? Throw everything on max and gently caress the police. Running 1440, but hoping to keep Witcher 3 above 60fps and I thought maybe it could...no, you're right, I'm being stupid.
|
# ? May 21, 2015 01:11 |
|
Massasoit posted:Even if they are under clocked, the performance per watt is still increasing, which is cool to see. That's why you under-clock+under-volt, though - to increase performance-per-watt. For example, compare the Intel i5-4690K with the i5-4690T. Same processor, one is undervolted for the server market. The undervolted variant benchmarks about 10% slower but consumes half the power (binning is also a factor here too, low-power units are usually the top bin, although the K should be in a decent bin too). And the 4690K is not exactly pushed to the limit either, you can squeeze another 25% performance increase out of it if you're willing to drive power consumption up by another 80%. What's nice is that performance is still good in absolute terms - even if they're not really pushing the silicon hard, they'll still play with high settings and all that jazz. On the same note, I wonder if running the AMD cards a bit slower might curb their power usage. Based on the 980 and the 4690, it seems like a small reduction in performance target can yield a big increase in performance-per-watt. Paul MaudDib fucked around with this message at 01:38 on May 21, 2015 |
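Taking the post's numbers at face value (they're ballpark figures from the discussion, not measured benchmarks), the perf-per-watt trade-off works out like this:

```python
# Rough performance-per-watt arithmetic using the ballpark figures
# from the post above (illustrative only, not measured benchmarks).

def perf_per_watt(relative_perf, relative_power):
    """Efficiency relative to a baseline part at 1.0 perf / 1.0 power."""
    return relative_perf / relative_power

# i5-4690T vs i5-4690K: ~10% slower at ~half the power draw
t_vs_k = perf_per_watt(0.90, 0.50)

# Overclocked 4690K vs stock: ~25% faster for ~80% more power
oc_vs_stock = perf_per_watt(1.25, 1.80)

print(f"4690T vs 4690K:    {t_vs_k:.2f}x perf/watt")   # 1.80x
print(f"OC 4690K vs stock: {oc_vs_stock:.2f}x perf/watt")  # 0.69x
```

So a 10% performance haircut nearly doubles efficiency, while the overclock costs you roughly a third of it, which is the asymmetry the post is describing.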
# ? May 21, 2015 01:12 |
|
kojicolnair posted:I've seen people talking about increasing voltage but how much did you increase it by? Make sure you go into settings and tick all the compatibility options EXCEPT "force constant voltage" and see that "unlock voltage control" is set to extended.
|
# ? May 21, 2015 01:32 |
|
Subjunctive posted:Running 1440, but hoping to keep Witcher 3 above 60fps and I thought maybe it could...no, you're right, I'm being stupid. Yeah, it's way better to max everything out and then start reducing the big-name settings until you get where you want, imo. I've never been particularly impressed with any auto configuration (not that I've used GeForce Experience much for that, though). Way back when I was sperging over every fps and every frame drop and had every game dialed in to an obsessive level, I looked at the GeForce Experience recommendations after the fact and it was sad. Way too much or way too little. It'd probably have been alright if I didn't care as much, which probably describes most users in some way or another, but regardless it just wasn't great any way I looked at it. penus penus penus fucked around with this message at 03:53 on May 21, 2015 |
# ? May 21, 2015 03:50 |
|
I feel like for stuff like GeForce Experience to matter, it needs frame rate analysis tools for whatever settings it recommends. It could offer three targets - 30, 60 and 120+ fps - and to accept settings for a given target it would need to maintain that frame rate, let's say, 95% of the time. That way you know what you're getting and the recommended settings are better controlled.
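The "holds the target 95% of the time" rule described above is easy to state precisely: capture frame times over a benchmark run and check what fraction of frames landed under the target frame budget. A minimal sketch (the frame-time data is made up for illustration; this isn't anything GeForce Experience actually exposes):

```python
# Check whether a run of captured frame times (in milliseconds)
# holds a target FPS at least 95% of the time.
# Illustrative sketch only; the data below is hypothetical.

def meets_target(frame_times_ms, target_fps, fraction=0.95):
    budget_ms = 1000.0 / target_fps          # e.g. ~16.67 ms for 60 fps
    good = sum(1 for t in frame_times_ms if t <= budget_ms)
    return good / len(frame_times_ms) >= fraction

# A run that is mostly smooth with a few spikes:
run = [15.0] * 97 + [40.0] * 3               # 97% of frames under 16.67 ms

print(meets_target(run, 60))                  # True:  97% >= 95%
print(meets_target(run, 120))                 # False: budget is ~8.33 ms
```

The nice property is that a settings preset either passes or fails against a stated budget, instead of being an opaque recommendation.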
|
# ? May 21, 2015 06:12 |
|
cat doter posted:I feel like for stuff like geforce experience to matter it needs frame rate analysis tools for whatever settings that are recommended. It could offer 3 options 30, 60 and 120fps+ and for it to accept settings for those frame rate targets it needs to maintain those frame rates let's say...95% of the time. That way you know what you're getting and the settings recommended are better controlled. It's not about scientific metrics, man, it's about the GeForce Experience! Dad, you're embarrassing me, god! agreed
|
# ? May 21, 2015 08:04 |
|
Paul MaudDib posted:It's not about scientific metrics, man, it's about the GeForce Experience! If vee dub freak or agreed were your dad you wouldn't have to worry about things like average fps.
|
# ? May 21, 2015 13:19 |
|
http://arstechnica.co.uk/gaming/2015/05/amd-says-nvidias-gameworks-completely-sabotaged-witcher-3-performance/ Inflammatory butthurt title! Then you read the actual content and you're like, huh? I've been noticing a lot of this the last few months.
|
# ? May 21, 2015 15:53 |
|
"Hairworks is spectacularly inefficient" says AMD, while continuing to never demonstrate TressFXes performance scaling over the wide LOD ranges required for furred enemies in an open world game.
|
# ? May 21, 2015 16:04 |
|
THE DOG HOUSE posted:http://arstechnica.co.uk/gaming/2015/05/amd-says-nvidias-gameworks-completely-sabotaged-witcher-3-performance/ I want to think this is just AMD crying out over nothing with dumb excuses, but considering the moves nvidia keeps pulling, I really can't tell if it's true or not.
|
# ? May 21, 2015 16:10 |
|
Paul MaudDib posted:From my understanding, this is because all the Maxwell chips are basically underclocked and undervolted. Which explains the ridiculous overclocks you can get out of it - something like an extra 20-25% on small Maxwell and a full 50% increase on big Maxwell. Rather than really push the performance to the limit, they traded it for power efficiency. Once you push the cards closer to their real capabilities they start chugging power just as badly as Kepler or AMD. At which point they drastically outperform them, of course. Overclocked Titan X's are crazytown performance, if I won the lottery I'd have a couple of them in SLI clocked to hell with a custom water loop. Heh, you only need 1 Titan X. I'm gonna mess around with mine again this weekend and see if I can get a proper stable bios back at 1400+. Started getting crashes and put the stock bios back on it. Even at normal clocks this card murders everything I throw at it even on triple 1440. What's funny about geforce experience is that it keeps suggesting that I run my games at half my native resolution. What kind of silly crap is that.
|
# ? May 21, 2015 16:11 |
|
Truga posted:I want to think this is just AMD crying out over nothing with dumb excuses, but considering the moves nvidia keeps pulling, I really can't tell if it's true or not. Hairworks tanks performance on pre-900 Nvidia cards almost as bad as on AMD cards. Honestly I was shocked that it works on Radeons at all. They need to shut up and make completely new chips that don't suck at tessellation/other modern stuff.
|
# ? May 21, 2015 16:38 |
|
Apparently the EVGA hydro AIO works on the titan X. So tempting.
|
# ? May 21, 2015 16:42 |
|
sauer kraut posted:They need to shut up and make completely new chips that don't suck at tessellation/other modern stuff. True. vvvv: TruForm was glorious when it worked right. Truga fucked around with this message at 17:03 on May 21, 2015 |
# ? May 21, 2015 16:52 |
|
sauer kraut posted:Hairworks tanks performance on pre-900 Nvidia cards almost as bad as on AMD cards. They did that with the Radeon 285: GCN 1.2 has great tessellation performance, but yeah, they need to roll out their newer chips. It's funny to me, though, because ATI tried their hand at tessellation long before anyone cared, with TruForm. But there can be no doubt that NVIDIA is always up for loving over the competition in ways that AMD doesn't. AMD makes a point of showing how well TressFX runs on NVIDIA hardware, even. HalloKitty fucked around with this message at 17:05 on May 21, 2015 |
# ? May 21, 2015 16:59 |
|
sauer kraut posted:Hairworks tanks performance on pre-900 Nvidia cards almost as bad as on AMD cards. But look at their new 300 series! oh.. oh wait..
|
# ? May 21, 2015 17:03 |
|
HalloKitty posted:It's funny to me, though, because ATI tried their hand at tessellation long before anyone cared, with TruForm. They did it twice in fact, they pushed another non-standard tessellation extension with R600 long after TruForm died. It was ~the future of graphics~ right up until nVidia was better at it.
|
# ? May 21, 2015 17:08 |
|
I wasn't clear, my problem with the article is the title "AMD says Nvidia’s GameWorks “completely sabotaged” Witcher 3 performance", which is really the part that's going to stick, when in reality it's basically the hair in the game that's causing the problem. It's okay to gripe about that, mostly, but then it goes on to say it's not like it would have worked anyway with AMD's architecture even if they did "provide the source code". Nvidia's hair technology didn't completely sabotage the entire game for you, AMD, ffs.
|
# ? May 21, 2015 18:00 |
|
s/says/claims/ fixes the big problem with the title. Removes some objectivity.
|
# ? May 21, 2015 18:05 |
|
AMD GPUs can work around the GameWorks technology in Witcher 3. http://www.guru3d.com/news-story/the-witcher-3-hairworks-on-amd-gpus-with-normal-performance.html
|
# ? May 21, 2015 20:29 |
|
quote:AMD users have a way of enjoying The Witcher 3: Wild Hunt with the Nvidia HairWorks feature enabled and actually getting better performance than GeForce users not using a Maxwell architecture Nvidia GPU. It seems a bit ironic but exciting nonetheless. Bullshit, I want to see benchmarks. Video games are a temporal medium; it doesn't mean anything if the hair looks good but you're running at 15 FPS.
|
# ? May 21, 2015 20:41 |
|
Here is a German article on it as well with in game footage and some fps numbers. http://www.pcgameshardware.de/The-Witcher-3-PC-237266/News/Trick-Hairworks-fluessig-AMD-Radeon-Grafikkarten-1159466/
|
# ? May 21, 2015 21:01 |
|
Not really surprising, considering how unoptimized some Nvidia tech seems sometimes.
|
# ? May 21, 2015 23:49 |
|
SwissArmyDruid posted:Bullshit, I want to see benchmarks. Video games are temporal medium, it doesn't mean anything if you're running 15 FPS but the hair looks good. Hairworks in Witcher 3 runs at some silly 64x tessellation and 8x MSAA by default. The MSAA is adjustable through ini files but only AMD users have a way of forcing a lower level of tessellation through the drivers.
|
# ? May 22, 2015 00:00 |
|
Ragingsheep posted:Hairworks in Witcher 3 runs at some silly 64x tessellation and 8x MSAA by default. The MSAA is adjustable through ini files but only AMD users have a way of forcing a lower level of tessellation through the drivers. Yeah, and it looks pretty much the same at 16x tessellation (which actually runs fairly well on even a 7950 apparently).
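For reference, the tweak being discussed looks roughly like this. The file path and key name are taken from community guides of the time, not official documentation, so treat them as assumptions and verify against your own install:

```ini
; The Witcher 3\bin\config\base\rendering.ini
; Lower the HairWorks MSAA level from its default of 8
; (key name per community guides; verify locally before relying on it)
HairWorksAALevel=4
```

AMD users would additionally cap tessellation from the driver side (the Catalyst Control Center tessellation override, set to something like 16x instead of letting the application request 64x), which is the control Nvidia's own drivers don't expose.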
|
# ? May 22, 2015 01:58 |
|
AVeryLargeRadish posted:Hmmm, have you tried disabling or uninstalling the Intel integrated graphics? If you can do that then after a restart it should be forced to use the 750m instead. kode54 posted:No, see, he has Optimus. Which means the discrete graphics chip has no displays connected to it. It must framebuffer copy to the integrated graphics to function. Thanks for replying to my questions, always appreciated. I managed to track the issue down and solve it. As well as the Optimus setup with two graphics devices, I also recently picked up a mobile monitor that works over USB using DisplayLink. It turns out, this doesn't play well with fullscreen games, leading to all the D3D-related errors I was seeing. http://support.displaylink.com/knowledgebase/articles/543922-games-do-not-work-on-windows-with-displaylink-soft Luckily DisplayLink provide a tool to disable their drivers, letting me get back to gaming. Requires a restart on each switch, but does seem to solve the problem. Hope this helps, in case anyone is considering a mobile monitor setup.
|
# ? May 22, 2015 01:59 |