|
exquisite tea posted:On the subject of the GTX 970, is it still the best price:performance investment for current-gen if all I wanna do is play at max settings @ 1080 with minimal overclock finagling? The 390 is arguably the better card for the near future, but the 970 is also a solid choice if you prefer NVIDIA or have a weaker CPU (AMD's dx11 driver overhead being a possible concern).
|
# ? Mar 7, 2016 13:15 |
|
|
I use RivaTuner for frame limiting, since it's already installed for me to monitor things via Afterburner. Works great; can't say it's ever failed me
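For anyone curious what a limiter like that is actually doing under the hood, here's a minimal sketch of the usual approach, a fixed-deadline sleep loop. This is just an illustration of the technique, not RivaTuner's actual implementation:

```python
import time

class FrameLimiter:
    """Cap a render loop at a fixed FPS by sleeping off leftover frame time."""

    def __init__(self, fps, clock=time.monotonic, sleep=time.sleep):
        self.frame_time = 1.0 / fps
        self.clock = clock   # injectable so the logic can be tested without real sleeps
        self.sleep = sleep
        self.next_deadline = self.clock() + self.frame_time

    def wait(self):
        """Block until the next frame deadline; returns how long was slept."""
        remaining = self.next_deadline - self.clock()
        if remaining > 0:
            self.sleep(remaining)
        # Advance from the deadline, not from "now", so short frames
        # don't let the loop drift ahead of the target rate.
        self.next_deadline += self.frame_time
        return max(remaining, 0.0)
```

Advancing from the deadline rather than from the current time is the detail that keeps the average rate pinned at the target instead of accumulating drift.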
|
# ? Mar 7, 2016 13:31 |
|
Is there any reason to uninstall drivers completely and reinstall from scratch? I've been doing express install with Nvidia for years
|
# ? Mar 7, 2016 13:51 |
|
HalloKitty posted:The 390 is arguably the better card for the near future, but the 970 is also a solid choice if you prefer NVIDIA or have a weaker CPU (AMD's dx11 driver overhead being a possible concern). I'm on an i5-4570 and probably will be until I put together an entirely new system. I've favored nvidia in the past just for the convenience and shadowplay features, although I don't have some crazy allegiance. How does the 980ti compare at the top end, if I wanted to pay a little extra?
|
# ? Mar 7, 2016 14:23 |
|
980Ti is 970 SLI in both price and performance, but without all the SLI bullshit that comes with it. What I'm saying is, it's a really good deal. e: And unlike some other nvidia products with just enough RAM to run current games, it might actually stay viable at max details a bit longer than most thanks to its 6GB. Truga fucked around with this message at 14:54 on Mar 7, 2016 |
# ? Mar 7, 2016 14:41 |
|
Blacktoll posted:Is there any reason to uninstall drivers completely and reinstall from scratch? I've been doing express install with Nvidia for years I've read reports of increased performance following a complete cleanup in Safe Mode using something like DDU. It hasn't happened to me personally but it can't hurt, especially after years' worth of express installs.
|
# ? Mar 7, 2016 14:44 |
|
sauer kraut posted:A 390 is really close if you have a 600W+ PSU to run it, that 8GB is looking better every month. Wait how is a like I feel like a puppet but there is endless proof of this for years now https://www.youtube.com/watch?v=t4uUgIkFa8o&t=309s The 8GB 390 and 390X were an in-your-face marketing insult that only took one year to finally trickle through as a "good thing". Guys, it's not. It's not a selling point, it never will be, it never can be, please don't tell people to buy a 390 because it has 8GB of RAM. penus penus penus fucked around with this message at 16:40 on Mar 7, 2016 |
# ? Mar 7, 2016 16:20 |
|
THE DOG HOUSE posted:Wait how is a That sweet sweet 100% scaling in Tomb Raider? Seriously though, the 8 GB is nice because it's bigger than 4. We know that Rise of the Tomb Raider uses more than 3.5 GB on highest textures because the 970 gets stuttering issues. We know it's less than 4 because the 290 is fine (and I'm pretty sure the 980 is too, but those are rarer and I haven't seen any evidence either way), yet NV's already saying to avoid that setting in that game with a 4 GB card. It's like the 4 GB 960 and 380X: they can't use all that memory, but the 2 GB they'd have otherwise isn't enough.
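If you want to check where your own games land, nvidia-smi (which ships with the driver) can report per-GPU memory use. Below is a small parsing sketch with the threshold set at the 970's 3.5 GB fast segment; the helper names are mine, and the parsing is shown against canned output so it runs even without a GPU present:

```python
import csv
import io

# Live use (needs an NVIDIA driver installed), run while the game is up:
#   nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader,nounits
# The functions below just parse that CSV text, so they work anywhere.

def parse_vram_csv(text):
    """Turn nvidia-smi CSV output into (used_mib, total_mib) pairs, one per GPU."""
    pairs = []
    for row in csv.reader(io.StringIO(text)):
        used, total = (int(col.strip()) for col in row)
        pairs.append((used, total))
    return pairs

def over_threshold(pairs, threshold_mib=3584):
    """Flag GPUs past a threshold; 3584 MiB is the 970's fast memory segment."""
    return [used > threshold_mib for used, _ in pairs]
```

Feed it the command's output and any `True` in the result means that GPU has crossed into the territory where a 970 starts stuttering.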
|
# ? Mar 7, 2016 16:41 |
|
xthetenth posted:That sweet sweet 100% scaling in Tomb Raider? If you click the video, I linked it to the relevant spot. Specifically for the 290 and 390: if you use over 4 GB it doesn't matter, because that's not where it bottlenecks. It's very likely that the settings you're talking about use more than 4 GB
|
# ? Mar 7, 2016 17:01 |
|
snuff posted:Personally I would rate the R9 390 a 8/8 and the GTX 970 a 3.5/4. I wasn't expecting to laugh in this thread. Well played.
|
# ? Mar 7, 2016 17:34 |
|
THE DOG HOUSE posted:If you click the video I linked it to the relevant spot. Specifically for the 290 and 390 if you use over 4 gb it doesnt matter because thats not where it bottlenecks. Its very likely that the settings youre talking about use more than 4gb Bottlenecks can change, but are you saying that the 290 somehow does better with its RAM than the 980? That'd be pretty weird honestly, although with the Fury using a weird rear end caching scheme (VRAM backed by main memory for overflow), that may be the case. Also bandicam is cool and good and does exactly what I wanted as a frame rate limiter. xthetenth fucked around with this message at 18:11 on Mar 7, 2016 |
# ? Mar 7, 2016 17:59 |
|
And for more VRAM data, and perhaps some interesting cases of when it is a good idea to get double the VRAM (for current lineup offerings): http://www.techspot.com/review/1114-vram-comparison-test/ But in the case of a 390, 8GB is not usable. quote:The 390 and 390X are really graphics cards we never wanted. At the time of their release the Radeon R9 290 and 290X were exceptional buys. The 290X cost just $330, while today the 390X costs around $100 more for no additional performance and it is no different with the 290 and 390. xthetenth posted:Bottlenecks can change, but are you saying that the 290 somehow does better with its ram than the 980? That'd be pretty weird honestly, although with the Fury using a weird rear end caching in VRAM backed by main memory for overflow scheme, that may be the case. Bottlenecks can change, but I'm not going to bet on this one. I have no idea if the 290 does better with its RAM than a 980. 290s certainly were king of memory-intensive tasks when they were up against Kepler cards, but the actual amount of VRAM starts taking a back seat in that comparison versus the architecture differences - which I don't know enough about to even pretend to understand. penus penus penus fucked around with this message at 18:06 on Mar 7, 2016 |
# ? Mar 7, 2016 18:01 |
|
The 290X cost $530 on release, though. But still, 4GB of GDDR5 isn't worth that much of a price hike for the same card. The only saving grace the 390s have is that the 290s are out of production. Anime Schoolgirl fucked around with this message at 18:06 on Mar 7, 2016 |
# ? Mar 7, 2016 18:04 |
|
The 390s are really priced pretty much where the cards belong; it's just that the 290's reputation was set by that godawful blower cooler. I'm pretty sure that post-390 launch their price went up too.
|
# ? Mar 7, 2016 18:20 |
|
Anime Schoolgirl posted:The 290x cost 530 dollars on release though Taken out of context that quote reads strangely, but it means the price at the time of the 390's release, rather than the MSRP of the 290, which was definitely higher.
|
# ? Mar 7, 2016 19:31 |
|
I have a 3760k with a 970 and I've been getting driver crashes in just about every game, from the graphically simple to the complex. It's not heat; the card runs sub-70°C even under heavy load. Latest drivers installed, tried uninstalling GeForce Experience, no difference. Anything I can try? The drivers always recover in-game, no CTD.
|
# ? Mar 7, 2016 20:42 |
|
Win10?
|
# ? Mar 7, 2016 21:10 |
|
Truga posted:980Ti is 970 SLI in both price and performance, but without all the SLI bullshit that comes with it. I ante'd up and bought the 980ti because a) I'm rich, why not and b) this is pretty much gonna be my last upgrade on this system before I build a new one anyway, might as well go all in.
|
# ? Mar 7, 2016 21:44 |
|
Hey, kids! Is your PC some crappy officeputer with a tiny power supply and no 6pin PCIe auxiliary power connectors? Well, now you can put a GTX 950 in it!
|
# ? Mar 7, 2016 22:04 |
|
exquisite tea posted:I ante'd up and bought the 980ti because a) I'm rich, why not and b) this is pretty much gonna be my last upgrade on this system before I build a new one anyway, might as well go all in. this is mostly a fake brag post but I quoted you because I felt the same way (part B not as much A though). I put together a brand new tower last night:
screen: acer predator XB271HU bmiprz
mobo: asus rog hero alpha
cpu: i5-6600K
cooler: h100i v2
ram: corsair dominator ddr4 16gb (2x8)
psu: corsair ax860i
case: corsair 450d
ssd: 950 pro 256gb
but I kept the old graphics card - a r280x - instead of getting a 980 ti because I feel like the next gen is right around the corner.
|
# ? Mar 7, 2016 22:16 |
|
xthetenth posted:I'm pretty sure that Cray's made a statement that their second-half revenues are going to be tied to Pascal availability. So that very strongly argues against a first-half big Pascal (and consumer big Pascal in general). Second half starts July 1, and Cray doesn't realize revenue on a system until it has been installed and gone through acceptance testing, which can take weeks/months. If you put padding in for that, and then add in lead times for assembly/manufacturing - you could be anywhere from next week to late October as a window for GPU delivery. That said, the previous comment made about compute consumers getting first dibs on GP100 until stocks stabilize is dead on. At least it better be, or some of the large-account Nvidia sales guys are going to need to go into hiding.
|
# ? Mar 7, 2016 23:45 |
|
Durinia posted:Second half starts July 1, and Cray doesn't realize revenue on a system until it has been installed and gone through acceptance testing, which can take weeks/months. If you put padding in for that, and then add in lead times for assembly/manufacturing - you could be anywhere from next week to late October as a window for GPU delivery. Yeah, it's true that means the chips could be out for compute customers earlier; as a consumer I'm much more concerned with the implication that they're going to be buying as many as they can. afkmacro posted:this is mostly a fake brag post but I quoted you because I felt the same way (part B not as much A though). Top of the line 1440p on a 280X? Man after my own heart.
|
# ? Mar 7, 2016 23:53 |
|
Durinia posted:Second half starts July 1, and Cray doesn't realize revenue on a system until it has been installed and gone through acceptance testing, which can take weeks/months. If you put padding in for that, and then add in lead times for assembly/manufacturing - you could be anywhere from next week to late October as a window for GPU delivery. In your opinion, how much market share in the compute sector would AMD need for stabilization, and for comedy purposes, for heads to roll at Nvidia? Time frame and length of time for AMD to take advantage of any opening?
|
# ? Mar 8, 2016 00:03 |
|
Potato Salad posted:Win10? Windows 7 ultimate
|
# ? Mar 8, 2016 00:21 |
|
xthetenth posted:Top of the line 1440p on a 280X? Man after my own heart. Not just top of the line, but a G-Sync monitor with a 280x. What's the chance of a monitor ever coming out with both G-Sync and Freesync supported on the same model? Would NVidia even allow that?
|
# ? Mar 8, 2016 00:59 |
|
Maybe in a few years. Dual scalers don't make much sense from a cost perspective, but Nvidia can't refuse to move off of DP 1.2/1.3 forever. 1.4 just got published last week. But since the adaptive refresh bits are an industry standard, and indeed, Nvidia is already using the VBLANK bits for mobile G-Sync (the variable-VBLANK bits come from eDP and were the basis upon which the VESA-standardized Adaptive-Sync was built), I bet you that Nvidia will cook up some "G-Sync, now on ALL monitors!" marketing campaign and claim that they were on board all along. I don't expect it to happen before Intel start shipping variable refresh-enabled iGPUs, though. (They have already committed to this, in fact.) SwissArmyDruid fucked around with this message at 01:15 on Mar 8, 2016 |
# ? Mar 8, 2016 01:06 |
|
Just finished a system restore after nVidia hosed up trying to install new drivers for my 770. Have the latest drivers been weird for anyone else? I thought I'd check to see if the drivers were the problem before I tried updating again.
|
# ? Mar 8, 2016 01:09 |
|
Spiritus Nox posted:Just finished a system restore after nVidia hosed up trying to install new drivers for my 770. Have the latest drivers been weird for anyone else? I thought I'd check to see if the drivers were the problem before I tried updating again. I've heard bad things about them. Zero VGS posted:Not just top of the line, but a G-Sync monitor with a 280x. Not sure. I think it's more likely that NV supports Freesync than a monitor doing both. That way G-Sync remains a lock-in for them. All I know is I want the bloody XR341CK to stop being so ridiculously priced compared to November and December.
|
# ? Mar 8, 2016 01:39 |
|
xthetenth posted:I've heard bad things about them. Where do I go to hear about stuff like this? For the record I have the latest drivers installed.
|
# ? Mar 8, 2016 01:44 |
|
Spiritus Nox posted:Just finished a system restore after nVidia hosed up trying to install new drivers for my 770. Have the latest drivers been weird for anyone else? I thought I'd check to see if the drivers were the problem before I tried updating again. The new installer can freak out if you have multiple monitors for some reason. NV are saying to disconnect all but one monitor until it's installed, then you're fine to plug others back in.
|
# ? Mar 8, 2016 01:48 |
|
Spiritus Nox posted:Just finished a system restore after nVidia hosed up trying to install new drivers for my 770. Have the latest drivers been weird for anyone else? I thought I'd check to see if the drivers were the problem before I tried updating again. Yep. Walked out of the room during my update and came back to a disaster. Had to roll them back, since on boot the monitor would turn off and then the PC would too.
|
# ? Mar 8, 2016 03:28 |
|
repiv posted:The new installer can freak out if you have multiple monitors for some reason. NV are saying to disconnect all but one monitor until it's installed, then you're fine to plug others back in. This seemed to work. We'll see if I end up regretting this upgrade.
|
# ? Mar 8, 2016 03:34 |
|
Zero VGS posted:Not just top of the line, but a G-Sync monitor with a 280x. You know the monitor works with ATI cards, right? I'm just waiting for the new Nvidia cards. Couldn't rationalize buying something just for a few months.
|
# ? Mar 8, 2016 04:43 |
|
I kind of made the mistake of jumping on board with a Strix GTX 960, which is a little weaker than I'd like, at least if I ever go in for VR equipment. I'm currently listing my Rift DK2, which I won't link here because that would be advertising, but I may try listing it in SA Mart instead. Anyway, I'm wondering if I should get a Strix 970, or go all the way for a 980 instead, and keep the 960 as a PhysX card. I should also note that a lot of my gaming ends up being streamed using Steam In-Home Streaming over a GigE link to my iMac, which shares the desk with my secondary monitor, which also happens to have the PC connected to two of its inputs. I also wonder if it's even worth keeping the machine's i7-3770 integrated graphics enabled, since some things I've tried to stream inexplicably end up streaming through QuickSync instead of NVEnc, such as Mercury's Timeless demo.
|
# ? Mar 8, 2016 08:53 |
|
Mea culpa on anything I've said about AMD taking this next gen by the horns; we probably won't be seeing Big Polaris anytime soon. Apparently SK Hynix won't even be making HBM2 until Q3? (I don't know the reliability of golem.de, so please monitor your sodium intake accordingly.) Since Samsung already started production on HBM2 back in January (https://news.samsung.com/global/samsung-begins-mass-producing-worlds-fastest-dram-based-on-newest-high-bandwidth-memory-hbm-interface), Nvidia might actually be first to market with HBM2 parts. http://www.golem.de/news/high-bandwidth-memory-sk-hynix-produziert-4-gbyte-stapel-ab-dem-dritten-quartal-1603-119580.html SwissArmyDruid fucked around with this message at 10:58 on Mar 8, 2016 |
# ? Mar 8, 2016 10:56 |
|
that assumes that samsung doesn't also have a contract for HBM with AMD, which i'd figure was part of that fab contract
|
# ? Mar 8, 2016 14:17 |
|
Maybe I'm just being too doom and gloom, because of course Nvidia getting an HBM2 part out before AMD would be par for the course. I can see a production offload agreement between GloFo, Samsung, and AMD having been inked, but suspect that the HBM2 might need to be sourced from Hynix. Like, another one of those "we will acquire a minimum amount X of HBM2 from you guys" kinds of agreements that had AMD's balls stapled to TSMC... last year? The year before? I am entirely happy to be wrong, though. Someone either in this thread or the AMD thread (Sorry, I forgot your name!) made the argument that Samsung is ultimately the best to handle HBM-enabled part manufacture, as they are so vertically integrated and can handle production of the CPU/GPU, the HBM, the interposers, the TSVs, and then put them all together into a completed product to ship out the door to Sapphire or whatever. SwissArmyDruid fucked around with this message at 15:04 on Mar 8, 2016 |
# ? Mar 8, 2016 14:29 |
|
kuroshi posted:I kind of made the mistake of jumping on board with a Strix GTX 960, which is a little weaker than I'd like, at least if I ever go on board with VR equipment. I'm currently listing my Rift DK2, which I won't link here because that would be advertising, but I may try listing it in SA Mart instead. Don't buy a current gen (2013-14 tech) card for VR, new die shrinks are coming out in a few months. Not sure if PhysX cards are even a thing anymore. The last I heard of it was the early Batman Arkham games, where accelerated PhysX was so crashy that you were better off turning it off, and Mafia II I guess? Most games I played (XCOM: EU and Styx, off the top of my head) use PhysX in CPU-only mode. sauer kraut fucked around with this message at 14:46 on Mar 8, 2016 |
# ? Mar 8, 2016 14:40 |
|
SwissArmyDruid posted:Maybe I'm just being too doom and gloom, because of course Nvidia getting an HBM2 part out before AMD would be par for the course. I can see a production offload agreement between GloFo, Samsung, and AMD having been inked, but suspect that the HBM2 might need to be sourced from Hynix. Like, another one of those "we will acquire a minimum amount X of HBM2 from you guys" kinds of agreements that had stapled their balls to TSMC... last year? The year before? That looks like it matches with NV going in on a big chip about as early as possible, and AMD waiting till later because they can't rely on corporate budgets to give them enough money to make eating those awful yields worthwhile. Or maybe they've got Samsung RAM ready to blow our minds (it's almost definitely the former; they're racing to have a big chip before NV has big chips they can't sell for compute). Incidentally, what in the everloving hell is going on with The Division. The loving 380X@1020 MHz is beating a 970@1330 MHz by an eye-watering 20% at 1440p, 9% at 3440x1440, and losing by 13% at 4K. Thing is, it's losing at 1080p by 16%. The 970 gets dropped out of a plane with concrete overshoes at anything over 1080p. Good god, my 290 is beating the 970 I had by 30% at my resolution.
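For what those percentages mean concretely, a trivial sketch (the fps numbers below are made up for illustration, not the actual Division benchmark figures):

```python
def relative_gap(a_fps, b_fps):
    """Percent by which card A leads card B (negative means A trails)."""
    return (a_fps / b_fps - 1.0) * 100.0

# Illustrative only: a 380X at 48 fps vs a 970 at 40 fps is a 20% lead,
# while 34.8 fps vs 40 fps is a 13% deficit.
```

So "beating by 20%" and "losing by 13%" are both relative to the card being compared against, which is why the gaps don't mirror each other exactly.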
|
# ? Mar 8, 2016 15:57 |
|
|
I thought the 970 was generally an awesome 1080p card, but not for anything much higher resolution anyway.
|
# ? Mar 8, 2016 16:07 |