|
Pierson posted:Is there ever a point at which it ISN'T a good idea to upgrade my GPU and instead upgrade a different component? I'm using an i5-2500k system with 8GB of RAM and my GPU is a GTX 770 that is getting kind of cranky; one of the fans is making a frankly annoying noise. A friend is offering me a cheap-ish 980 and I really want to take it off his hands. This is a 90% dumb question but I need to be sure because this is a big chunk of change; there isn't going to be some kind of weird thing where my other components will be too old/outdated to run the thing properly, right? You can have components (like CPU and motherboard and RAM) that need to be upgraded with an eye to compatibility, but in your case you are good to go.
|
# ? Jun 5, 2015 18:49 |
|
|
|
I'd say for now, and for the last what, 5-6, maybe 7 years, upgrading the GPU and going from HDD to SSD have been the real performance picks; most everything else hasn't mattered a ton. Maybe 2016/17 will make a big difference with the switch to Cannonlake and Zen+ along with DDR4, but for now, if you're running on an SSD then you should be okay with just upgrading the GPU.
|
# ? Jun 5, 2015 18:50 |
|
Truga posted:If minimap wouldn't exist, I'd say yes. As it is, it's minor at best, dota players will keep screaming at you to look at your minimap at all times, why does my zoom matter then? I agree that it isn't the best use of a super kickass ultrawide and I wish they could change it. Not to go crazy off-topic, but the minimap isn't going to show that the offlaners are pushing through the creep wave to dive you or that a stun is being cast, etc. You would just see the icons generally around the creep wave. If your camera is off checking the rune or whatever jungle vision you have, that's an opportunity for things to happen in your lane that you don't know about. It just isn't fair imo for some players to have that vulnerability and others not to, even though it's down to chance since the enemy won't know where you are looking. However, I still take advantage by playing windowed in the center in 1080p. For new heroes I keep a build guide up on the left side of the game window and my music playlist on the right. Easy access to information that makes a real difference when learning new dudes!
|
# ? Jun 5, 2015 18:56 |
|
Subjunctive posted:You can have components (like CPU and motherboard and RAM) that need to be upgraded with an eye to compatibility, but in your case you are good to go. Looking forward to the upgrade, I gotta say. The old 770 is really starting to show the cracks running Witcher 3; it'll be nice to be future-proof for a couple more years.
|
# ? Jun 5, 2015 19:01 |
|
I don't mind upgrade questions, but there IS a very friendly and helpful PC upgrading thread; myself and plenty of others would love to help with upgrade questions there! It is highly trafficked and monitored by lots of friendly goons.
|
# ? Jun 5, 2015 19:05 |
Pierson posted:Aight, cool, thanks man. Only thing I'd look out for is what you have your 2500k clocked at, it can make a ridiculously large difference depending on that.
|
|
# ? Jun 5, 2015 19:10 |
|
AVeryLargeRadish posted:Only thing I'd look out for is what you have your 2500k clocked at, it can make a ridiculously large difference depending on that.
|
# ? Jun 5, 2015 19:29 |
|
Probably. A 2500K is begging for an OC if you start feeling a performance bottleneck, those things can fly at over 4GHz to the point a good OC makes them competitive with anything without a -E suffix. That thing is one of the best CPUs in recent memory and has plenty of legs yet.
|
# ? Jun 5, 2015 19:51 |
Pierson posted:I've never touched the inside of my PC except to dust and replace an even older GPU, everything is clocked/tuned as standard. I imagine overclocking is a different thread though. Well, the thing is that I ran my 2600k at stock for a while, and when I OCed it (as easy as bumping the clock up in BIOS and running a stress test to make sure the OC was stable) it doubled my frame rates in many of my games. So yeah, it can be a huge bottleneck for something like a 970 if you leave it at stock speeds.
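To put rough numbers on why the clock matters: in a CPU-bound scene, frame rate scales roughly with clock speed. A quick sketch, assuming a typical 3.3 GHz stock 2500K pushed to a common 4.5 GHz overclock — the frame rate here is a made-up illustration, not a benchmark:

```python
# Rough illustration of why a 2500K overclock matters in CPU-bound games.
# The stock/OC clocks are typical for that chip; the fps figure is hypothetical.
stock_ghz = 3.3   # i5-2500K base clock
oc_ghz = 4.5      # a common overclock for that chip on air

speedup = oc_ghz / stock_ghz
print(f"clock speedup: {speedup:.2f}x")  # ~1.36x

# In a fully CPU-bound scene, frame rate scales roughly with clock:
cpu_bound_fps = 45                       # hypothetical stock frame rate
print(f"estimated OC fps: {cpu_bound_fps * speedup:.0f}")
```

Real games are rarely 100% CPU-bound, so the actual gain lands somewhere between nothing and this ceiling, which is why results vary so much per game.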
|
|
# ? Jun 5, 2015 19:56 |
|
The reference PNY has been in stock at Newegg all day, which at least does suggest somewhat better availability. I mean, I realize nobody really wants to buy a PNY card unless they get a deal on it, but I am sure it works fine. It is the stock PCB with the stock cooler, after all. It does seem that MSI is coming out with a Lightning-like variant, though all they've shown is a box. Rumor is a couple months for that one. The Twin Frozr 6G version is supposed to be out by the end of the month. Not much news from Asus. Definitely interested in their three-fan cooler though. Hopefully out by the end of the month, but I can't find anything. MaximumPC has some SLI scaling numbers. Works better with some games than others. Personally I've only ever stuck with a single card, but I could definitely see the appeal if you have a 1440p 144 Hz monitor, or are planning to get one. A little rich for me right now. http://www.maximumpc.com/nvidia-gtx-980-ti-2-way-sli-crushing-performance/
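For reference, the "scaling" those writeups talk about is just how much of the second card's potential you actually get. A quick sketch of the arithmetic — the frame rates below are hypothetical, not from the linked article:

```python
# SLI scaling: how close the two-card result gets to doubling one card.
# The fps inputs here are hypothetical, just to show the arithmetic.
def sli_scaling(fps_single: float, fps_sli: float) -> float:
    """Return scaling efficiency: 1.0 = perfect doubling, 0.0 = no gain."""
    return fps_sli / fps_single - 1.0

print(sli_scaling(60, 110))  # ~0.83, i.e. ~83% scaling, a good result
print(sli_scaling(60, 65))   # ~0.08, barely worth the second card
```

Games with good engine support land near the top of that range; others (or CPU-bound cases) barely move, which matches the "works better with some games than others" takeaway.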
|
# ? Jun 5, 2015 21:37 |
|
Probably a long shot, but does anyone know if you can connect a splitter to the Hybrid cooler to run a push/pull config? I've found exactly one person asking this on the 980 version (I have the Titan X variant), EVGA won't tell him because they don't support it, and nobody else gave him an answer. I do have an fan controller I can use to run the extra fan, but I'll have to set its speed manually rather than letting the card control it as it does with the fan it came with.
KakerMix fucked around with this message at 21:41 on Jun 5, 2015 |
# ? Jun 5, 2015 21:38 |
|
KakerMix posted:Probably a long shot, but does anyone know if you can connect a splitter to the Hybrid cooler to run a push/pull config? I've found exactly one person asking this on the 980 version (I have the Titan X variant), EVGA won't tell him because they don't support it, and nobody else gave him an answer. I do have an fan controller I can use to run the extra fan, but I'll have to set its speed manually rather than letting the card control it as it does with the fan it came with. I will be installing the titan x version tonight but i think its only 3 pin and i use 4 pin noctua fans. Im just gonna run em off the motherboard and probably set them to go up with the cpu temp since the cpu temp will be rising with the gpu temp when playing games anyway.
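The "make the fans go up with the CPU temp" idea is just a clamped linear fan curve, the same thing motherboard fan control does. A minimal sketch — the temperature endpoints and duty cycles below are arbitrary illustration values, not anything EVGA or Noctua specifies:

```python
# A clamped linear fan curve: map a temperature reading to a PWM duty cycle.
# The endpoints are arbitrary illustration values, not vendor defaults.
def fan_duty(temp_c: float,
             t_min: float = 40.0, t_max: float = 75.0,
             duty_min: float = 30.0, duty_max: float = 100.0) -> float:
    """Return fan duty cycle (%) for a given temperature (degrees C)."""
    if temp_c <= t_min:
        return duty_min          # idle floor: quiet minimum speed
    if temp_c >= t_max:
        return duty_max          # pegged at max above the hot threshold
    frac = (temp_c - t_min) / (t_max - t_min)
    return duty_min + frac * (duty_max - duty_min)

print(fan_duty(30))    # 30.0  (below the curve: idle floor)
print(fan_duty(57.5))  # 65.0  (halfway up the curve)
print(fan_duty(90))    # 100.0 (pegged at max)
```

Tying the curve to CPU temp instead of GPU temp, as suggested above, works because both rise together under gaming load; the curve is just less precise about the radiator's actual heat.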
|
# ? Jun 5, 2015 21:48 |
|
Don Lapre posted:I will be installing the titan x version tonight but i think its only 3 pin and i use 4 pin noctua fans. Im just gonna run em off the motherboard and probably set them to go up with the cpu temp since the cpu temp will be rising with the gpu temp when playing games anyway. I have the exact same stuff as you, down to 3 pin Noctuas. I went ahead and split off the lead coming from the Hybrid, fired up the system for a moment and both fans were spinning fine. Going to leave that and see how it does tonight since later I'll have to rip open the computer again anyway. KakerMix fucked around with this message at 22:10 on Jun 5, 2015 |
# ? Jun 5, 2015 22:03 |
|
AVeryLargeRadish posted:Well, the thing is that I ran my 2600k at stock for a while and when I OCed it, and it was as easy as bumping the clock up in bios and running a stress test to make sure the OC was stable, it doubled my frame rates in many of my games. So yeah it can be a huge bottleneck for something like a 970 of you leave it at stock speeds. I know, wrong thread and all, but I want to add my 2 cents and encourage an overclock of that CPU if possible to improve game performance. CPU benchmarks rarely to never include online games (while online) and that is always where the biggest improvements are, but in this case you'd likely see improvements offline as well. There were noticeable fps gains (and fewer frame drops) even on the Haswell architecture, even though you'll hear otherwise quite often. GPU is king, but there are a lot of games that benefit from CPU clock speed. Some don't, though.
|
# ? Jun 5, 2015 22:41 |
|
KakerMix posted:I have the exact same stuff as you, down to 3 pin Noctuas. I went ahead and split off the lead coming from the Hybrid, fired up the system for a moment and both fans were spinning fine. Going to leave that and see how it does tonight since later I'll have to rip open the computer again anyway. Can't you hook 4 pin fans to 3 pin headers?
|
# ? Jun 6, 2015 00:56 |
|
xthetenth posted:Can't you hook 4 pin fans to 3 pin headers? Temps are way down though; I'm at 27C at idle.
|
# ? Jun 6, 2015 00:57 |
|
xthetenth posted:Can't you hook 4 pin fans to 3 pin headers? You can try but YMMV, the PWM controller might freak out and misbehave if not fed a constant 12V. Noctuas are fairly tolerant since they include voltage dropping adapters in the package.
|
# ? Jun 6, 2015 01:01 |
|
The gigabyte nonreference is up for preorder on newegg now. http://www.newegg.com/Product/Produ...ID=6202798&SID= The windforce is one of the most effective coolers, though it has not been one of the quietest in the past. Sounds like they've finally added a mode which stops the fans at low utilization though. Also a hands on with the Asus nonreference. No numbers, just some fondling. Still, looks nice. https://www.youtube.com/watch?v=LxCAySMpFg8
|
# ? Jun 6, 2015 02:40 |
|
KakerMix posted:AFAIK not without an adapter. I have these but I might just go back to manual control through my controller like I used to do. Because I have my Hybrid coolers installed but the included fans run at 100% all the time when connected to the lead coming from the kit's block on the GPU. I'm curious to see what your experience is with it Don Lapre, as everything I can find talking about it says the fan is ~intelligently controlled~ which seems like that isn't true when used in the default configuration. I can always plug the rad fans into my fan controller and control it that way but I just don't know if I'm missing something, you know? Man, mine's working awesome. ~27-30C idle. Running 1430MHz right now @ 60C with my Noctua fans around 800-1000rpm. Basically inaudible. I keep banging on the power limit though. Gonna flash a new BIOS. I wrapped the fan cable around the block inside the shroud and I'm running my 2 Noctua fans off one header that adjusts based on CPU temp.
|
# ? Jun 6, 2015 03:24 |
|
What's the lowdown on all the new AMD stuff that's being announced in a week or so? Is 20nm going to walk all over the Nvidia cards?
|
# ? Jun 6, 2015 04:10 |
|
orly posted:What's the lowdown on all the new AMD stuff that's being announced in a week or so?
|
# ? Jun 6, 2015 04:11 |
|
orly posted:What's the lowdown on all the new AMD stuff that's being announced in a week or so? Is 20nm going to walk all over the Nvidia cards? That is not mathematically impossible, but it's close.
|
# ? Jun 6, 2015 04:19 |
|
orly posted:What's the lowdown on all the new AMD stuff that's being announced in a week or so? Is 20nm going to walk all over the Nvidia cards? Take a 290X, rename it 390X. That's the new AMD stuff.
|
# ? Jun 6, 2015 04:36 |
Don Lapre posted:Take a 290x, rename it 390x. Thats the new amd stuff Also, if rumors are correct, increase the MSRP! Except for the ultra-top-of-the-line HBM cards! Which are rumored to be hotter, slower and more expensive than their Nvidia equivalents. So yeah, coming up all roses for AMD!
|
|
# ? Jun 6, 2015 06:08 |
|
What happened to AMD? Did they just go all in on HBM and smaller production nodes being ready for these cards and had to backpedal on specs and release date just to get something out the door?
|
# ? Jun 6, 2015 06:35 |
|
I love it when new tech is in the air.
|
# ? Jun 6, 2015 07:02 |
Blorange posted:What happened to AMD? Did they just go all in on HBM and smaller production nodes being ready for these cards and had to backpedal on specs and release date just to get something out the door? We don't really know the whys and hows at this point and it's all just rumors, but the silence from AMD is not exactly reassuring. I think Nvidia just has the lead on them tech-wise and they are doing what they can, but it's not enough. Which sucks, because while I find this all sort of morbidly funny, a video card industry dominated by one player will just lead to higher prices and tech stagnation eventually.
|
|
# ? Jun 6, 2015 07:06 |
|
EVGA GeForce GTX 980 Ti Superclocked+ ACX 2.0+ currently in stock. Just snagged one!
|
# ? Jun 6, 2015 07:11 |
|
Are you talking about the rumor that Fiji was slower than the 980Ti? That rumor didn't even last 24 hours. http://wccftech.com/amd-radeon-r9-fury-fiji-xt-gpu-slower-gtx-980-ti/
|
# ? Jun 6, 2015 07:22 |
|
AVeryLargeRadish posted:Also, if rumors are correct, increase the MSRP! That's the worst case prediction. I'm praying that their claims that Fiji wouldn't use more power than a 290X (which isn't much more than a new Titan; it's power/performance they do badly at) and the likely number of cores from a doubled Tonga make it a good competitor, and that the 200 series gets a respin like NV's from the 400 to the 500 series. But maybe that's dogged optimism and a refusal to believe they would raise prices without raising value. Not sure how likely a respin is, but I think a strong showing from Fiji is likely. The last time AMD went to new memory tech before NV, those cards were beasts, and just the sheer increase in die size makes it very likely it's going to be a huge performer. But basically AMD seems to have been betting on 20nm, and that didn't happen. So now they have a new high end card and old cards that aged like champions. The 290X started worse than the 780 Ti and has pulled a bit ahead, and the 7970 GHz that became the 280X makes the 680 look like a joke these days. But they're competing against up to date cards, and GPUs get old very fast. xthetenth fucked around with this message at 07:36 on Jun 6, 2015 |
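For what it's worth, the bandwidth case for HBM is simple arithmetic: peak bandwidth = effective data rate per pin × bus width / 8. Using the 980 Ti's published GDDR5 figures and the widely reported first-gen HBM configuration (4096-bit at 1 Gbps effective):

```python
# Peak memory bandwidth = data rate (Gbps per pin) * bus width (bits) / 8.
def bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Return peak memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

# GTX 980 Ti: 7 Gbps GDDR5 on a 384-bit bus (published spec).
print(bandwidth_gbs(7.0, 384))   # 336.0 GB/s

# First-gen HBM as reported for Fiji: 1 Gbps effective on a 4096-bit bus.
print(bandwidth_gbs(1.0, 4096))  # 512.0 GB/s
```

So HBM wins on raw bandwidth (and power per bit) despite the much lower per-pin clock, purely because the bus is so wide — that's the whole bet.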
# ? Jun 6, 2015 07:33 |
|
Perhaps they dug themselves into a little corner by cutting prices so steeply on the 200 series stuff..
|
# ? Jun 6, 2015 15:26 |
|
The Hybrid cooled TitanXs and 980tis are bitchin', I can't seem to push mine north of 40c under load.
|
# ? Jun 6, 2015 16:07 |
|
What kind of % overclock can those hybrid 980ti's get? Do they get voltage control?
|
# ? Jun 6, 2015 17:39 |
|
Is Radeon R9 290x the best value card right now? I want to be able to play most games on high but not max settings for the next 2-3 years. Should I wait until the end of the month for Radeon's new release to see if prices drop?
|
# ? Jun 6, 2015 18:05 |
|
Best price/perf is 970 right now.
|
# ? Jun 6, 2015 18:31 |
|
THE DOG HOUSE posted:Perhaps they dug themselves into a little corner by cutting prices so steep on the 200 stuff.. As far as I know, they made two huge mistakes. The first is their mobile lineup. What mobile lineup, you ask? Exactly. The second is they launched the 290 at a reasonable price, the buttcoin craze hit them, and they sold like hotcakes at a higher price. They didn't seem to notice it was a bubble, so they ramped up production in a huge way and have been stuck selling those cards till pretty much now. Howard Phillips posted:Is Radeon R9 290x the best value card right now? If you're comfortable dropping some settings and are looking at that price range, it's a very solid card and, other than power draw, very competitive with the 970 in performance, but right now it's looking like the 290 may be an even better value. Here's a comparison between the 290X running uber mode so it doesn't throttle (aftermarket cooled ones run at that performance level but cooler and quieter) and a 970: http://anandtech.com/bench/product/1059?vs=1355 Here's the summary page from a French site that actually tests with aftermarket coolers, showing where a 290X Tri-X OC fits up against a Gigabyte GTX 970 G1 Gaming as well as the stock parts: http://www.hardware.fr/articles/928-20/recapitulatif-performances.html The frequencies the cards hit during their tests in various games are in this table, and it looks like the Tri-X OC was at a steady 1000 MHz: http://www.hardware.fr/articles/928-9/protocole-test.html Results will change if you go for a big OC. On Newegg right now, there's a Sapphire Tri-X OC 290 for $270 before $20 rebate, a Gigabyte 290X with a 1000 MHz core clock same as the Tri-X for $295 with a $20 rebate card, and an EVGA superclocked 970 that's pretty close to the G1 Gaming for $324 with a $15 rebate.
Unless you're going to OC or are really worried about power consumption and can't replace a few incandescent light bulbs with LEDs, I'd say the 290/290X is the better pick for value, with the 290 probably winning if you don't really need those last few frames. If you're comfortable lowering a setting or two, that might be your best bet. xthetenth fucked around with this message at 18:53 on Jun 6, 2015 |
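The value comparison above boils down to dollars per frame. A quick sketch using the after-rebate Newegg prices from the post — the fps numbers are hypothetical stand-ins to show the method, not benchmark results:

```python
# Dollars per average frame: lower is better value.
# Prices are the after-rebate Newegg figures quoted above;
# the fps numbers are hypothetical stand-ins, not benchmarks.
cards = {
    "290 Tri-X OC":  {"price": 270 - 20, "fps": 55},
    "290X Gigabyte": {"price": 295 - 20, "fps": 58},
    "970 EVGA SC":   {"price": 324 - 15, "fps": 60},
}

for name, c in cards.items():
    print(f"{name}: ${c['price'] / c['fps']:.2f} per fps")
```

With numbers anywhere near these, the 290 comes out cheapest per frame, which is the whole "290 probably winning on value" argument; plug in real benchmark averages for the games you play to make it concrete.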
# ? Jun 6, 2015 18:50 |
|
xthetenth posted:
Thanks for the great help. I think I'm gonna go with the EVGA 970.
|
# ? Jun 6, 2015 18:58 |
|
.
Somebody fucked around with this message at 20:13 on Jun 6, 2015 |
# ? Jun 6, 2015 19:31 |
|
I feel like it was just yesterday when I was arguing, and generally failing, to make the case that some users could make use of a 780/290 for 1080p. Seeing all this 980ti love warms my heart
|
# ? Jun 6, 2015 19:44 |
|
|
|
Howard Phillips posted:Thanks for great help. I think I'm gonna go with the EVGA 970. Cool, glad to help. 970's a drat good card, my G1's been pushing 1440p like a champ on top settings, so 3 years should be easily doable if you don't mind turning a few settings down in future AAA games. Isn't two months within the timeframe for step-up? THE DOG HOUSE posted:I feel like it was just yesterday when I was arguing and generally failing to make the case the for some users could make use of a 780/290 for 1080p. Seeing all this 980ti love warms my heart DSR/VSR helps the case so you can forcefeed games pixels if you want more to do, but games keep marching on as well. Personally I tend to think going with midrange cards/old flagships you think will age gracefully and replacing them more often is probably the better course unless you're going to be buying every other generation anyway. It used to be the GTX x60 and AMD equivalent, but they've moved that type of chip up to the x70 now. Somebody fucked around with this message at 20:13 on Jun 6, 2015 |
# ? Jun 6, 2015 19:45 |