|
Kind of regret getting a reference 760 after looking at graphs. Worthwhile to pick up aftermarket cooling?
|
# ? Jul 13, 2013 01:40 |
|
The cost of a good aftermarket DIY cooling solution would be halfway to another 760. So no, in my opinion, anyway.
|
# ? Jul 13, 2013 01:42 |
|
GK104s don't really benefit much from cooling anyways. They're more limited by the power delivery capabilities of the card. Don't worry about it.
|
# ? Jul 13, 2013 02:56 |
|
craig588 posted:GK104s don't really benefit much from cooling anyways. They're more limited by the power delivery capabilities of the card. Don't worry about it.

I dunno, I think there's a good case that your ears benefit if you pick up any of the usual suspects in non-reference coolers. But for performance, not a big deal unless there's an airflow problem. Generally speaking, a sufficiently aggressive fan profile can keep a card cool enough to prevent it from throttling due to temperature on the 600-series cards if that's a concern (though, really, one 13MHz turbo bin at a time is not gonna kill your performance), and of course we have much more overt control over boost behavior with any Boost 2.0-equipped card.
|
# ? Jul 13, 2013 03:13 |
|
Agreed posted:I dunno, I think there's a good case that your ears benefit

Yeah, I am back to running the H100 passively with mid-50s temperatures. I guess it comes down to whether silence is worth the ~$75 for one of the coolers you can attach to a video card. Actually, the universal Antec 620 is only $60 on Amazon right now: http://www.amazon.com/electronics/dp/B004LWYE4Q The radiator's only half the size of the H100 radiator, so you might need a slow fan, but it'll still work better than any of the $100+ designed-for-one-model video card coolers.
|
# ? Jul 13, 2013 03:31 |
|
As far as partners go, are there any nvidia card makers that are best avoided? I'm looking at 770s, specifically a Gigabyte one, but I've heard bad things about their motherboards as of late, and I was wondering whether this continues with their video cards.
|
# ? Jul 13, 2013 05:07 |
|
Gonkish posted:As far as partners go, is there any nvidia card makers that are best avoided? I'm looking at 770s, and specifically a Gigabyte one, but I have heard bad things about their motherboards as of late, and I was wondering if this continues with their video cards.

Thanks to the Greenlight program (which is similar to Intel's standardization of notebooks, and not at all related to Steam Greenlight), any card from any vendor has to match or beat the reference model in terms of noise, performance, and power delivery, among other more fussy stuff. The worst you should get, if it works, is maybe some less good chips in cheaper brands (that is, less overclockable).

Gigabyte has historically been kind of a mediocre GPU maker because they have bean-counter cut parts in the "standard" models, especially in power delivery, which reduces longevity and compromises overclockability, but if Greenlight works as intended they shouldn't be able to do that anymore and still get chips from nVidia. And they were never XFX levels of awful in that regard (though GT-200 was the last time XFX made nVidia cards, they became notorious and remain so for making mediocre-to-outright-bad AMD cards now). Screw XFX.

That said, Gigabyte has also done some pretty cool stuff. The Windforce 3 is a really nice cooler, competitive with EVGA's ACX at the GTX 780 level and outright better on the GTX 760, and they do some neat overbuilt stuff for high-end overclocking, too. I think they're a pretty safe bet now that there are limits placed on what you can and can't do to a card if you expect to receive chips from nVidia and remain a partner/vendor.

Agreed fucked around with this message at 05:22 on Jul 13, 2013 |
# ? Jul 13, 2013 05:18 |
|
I still have no complaints about my Gigabyte 670. I've been too squeamish to use that BIOS editor and void my warranty with a TDP target increase yet, so I'm stuck topping out at 1241MHz boost bracket, but being able to comfortably run my GDDR5 at 7468MHz all this time without error has been skippy. And the card's still holding up without issue since that 100C incident. And the fact that the Windforce3 can hold temps comfortably below 60C at top performance without ever yelping louder than a spun-up CPU fan that I don't hear through my headphones. That said, I'd still hop to a 780 if I could justify the price. Or 2 of them 760s if it's really such a good deal, haha.
|
# ? Jul 13, 2013 06:28 |
|
I'm not sure if my 650w PSU can handle dual 760s, plus I'd want to shoot for the 4GB ones, and they're just ridiculously overpriced right now (seriously, $100 for 2GB of loving VRAM? Really?). At least with the 770, I can grab this Gigabyte one with Windforce 3 (or whatever they're calling their cooling solution) for $450 and have good performance.
|
# ? Jul 13, 2013 06:31 |
|
That makes sense if you're upgrading from a 5xx for sure, but I couldn't justify a 770 over a 670 that OCs well. I need to go bigger!!
|
# ? Jul 13, 2013 06:34 |
|
Yeah, I'm upgrading from a 560 Ti, so it's a no-brainer.
|
# ? Jul 13, 2013 06:39 |
|
Gonkish posted:Yeah, I'm upgrading from a 560 Ti, so it's a no-brainer.

Haha, you're going to be over the loving moon, seriously; you're using a card that's a performance analog of my PhysX co-processor. The 770 is a bad mother, but as TRR said (also, nice god damned memory overclock, holy hell), it's not such a big jump for anyone with a 670 or 680. In fact, it's basically a side-grade for anyone with a solidly overclocked 680; it just takes the memory overclocking part off your hands. 1241MHz is still where the "good ones" tend to top out. It's the same chip, just process improvements making harvesting them a more streamlined and effective process (hence every PCI-e 700-series card being either one or the other chip, no third or fourth this go-around).
|
# ? Jul 13, 2013 15:30 |
|
Yeah, I figured as much. I grabbed this card 2 years ago because I needed a moderately priced upgrade for my old box after my previous card died. It's served me well, but I think it's destined to be a PhysX card now.
|
# ? Jul 14, 2013 04:40 |
|
If I were going to pick up a second 760 for SLI, is it okay to mix brands? I'd get one with an aftermarket cooler since it runs cooler, although my first one is a reference model.
|
# ? Jul 14, 2013 16:26 |
|
Mixing brands is fine. You want:

1) Same GPU version, die harvesting, and revision
2) Same amount of video RAM

So two GeForce 770 2GB, for example? Works fine. 770 + 680? No, different revision of GK104. 770 2GB + 770 4GB? No, different VRAM amounts. 770 2GB + 760 2GB? No, different die harvestings of the GPU (i.e. the shader count and memory controller bitwidth are different).

Overclocks do not matter, but you'll want to synchronize the cards yourself. Cooling doesn't matter.

AMD is more forgiving about the differences between GPUs for CrossFire; you can mix and match a lot, though for best performance at the least wasted cost, identical cards are still best.
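Those matching rules are simple enough to sketch in code. A hypothetical illustration only, the `Card` type and the revision strings are assumptions for the example (GK104-425 for the 770, GK104-400 for the 680), not any actual NVIDIA API:

```python
# Sketch of the SLI matching rules above: two cards pair only if the GPU
# die/revision and the VRAM amount both match. Brand, cooler, and factory
# clocks are ignored, as described in the post. Illustrative names only.
from dataclasses import dataclass


@dataclass(frozen=True)
class Card:
    model: str    # marketing name, e.g. "GTX 770"
    gpu: str      # die + revision string, e.g. "GK104-425"
    vram_gb: int  # video memory in GB


def sli_compatible(a: Card, b: Card) -> bool:
    # Same silicon configuration and same memory size; nothing else matters.
    return a.gpu == b.gpu and a.vram_gb == b.vram_gb


zotac_770 = Card("GTX 770", "GK104-425", 2)
evga_770 = Card("GTX 770", "GK104-425", 2)
gtx_680 = Card("GTX 680", "GK104-400", 2)   # same die, different revision
gtx_770_4g = Card("GTX 770", "GK104-425", 4)

print(sli_compatible(zotac_770, evga_770))   # True: brands differ, specs match
print(sli_compatible(zotac_770, gtx_680))    # False: revision differs
print(sli_compatible(zotac_770, gtx_770_4g)) # False: VRAM differs
```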
|
# ? Jul 14, 2013 16:50 |
|
Factory Factory posted:Mixing brands is fine. You want:

How do I find out if it's the same die harvesting and revision? I have this: http://ncix.com/products/?sku=85811&vpn=ZT-70401-10P&manufacture=Zotac and was hoping to grab this: http://ncix.com/products/?sku=85650&vpn=02G-P4-2763-KR&manufacture=eVGA for the quieter fans.
|
# ? Jul 14, 2013 17:16 |
|
All GeForce "XYZ" cards that support SLI have the same harvesting and revision as any other "XYZ" cards. It's the harvesting and revision that makes the difference between one card and another. Meaningless exception: in laptop and OEM revisions that cannot use SLI, Nvidia will mix and match different GPUs and configurations and clock speeds, in which case (if you were interested for some reason) you would find out the revision number, number of shaders, and memory bitwidth using GPU-Z or some other system information program. Factory Factory fucked around with this message at 17:25 on Jul 14, 2013 |
# ? Jul 14, 2013 17:22 |
|
Qualifier: some cards do ship with more aggressive BIOSes, and that might cause some instability since they would display different boost and voltage behavior. I would be hesitant, for example, to mix an EVGA 780 ACX with a Gigabyte 780 Windforce 3, because their BIOS voltage ranges differ, as do their default turbo bins. Might not cause issues, might cause issues? I honestly don't know, as I haven't ever been in the situation of brand mixing, and the VAST majority of the time, people running dual card setups just order 2x up front or plan on buying the same card down the road (bad plan; buy what gets you the performance you want now, "future-proof" is an oxymoron here - more like "sensible spending habits proof").
|
# ? Jul 14, 2013 17:41 |
|
Agreed posted:Qualifier, some cards do ship with more aggressive BIOSes and that might cause some instability since they would display different boost and voltage behavior. I would be hesitant, for example, to mix an EVGA 780 ACX with a Gigabyte 780 Windforce 3, because their BIOS voltage ranges differ, as well as their default turbo bins.

Yeah, I meant to get two 760s in SLI from the start and bought the Zotac because it was much cheaper. Didn't realize I could price match newegg.ca and get the non-reference cooler for the same price. Should I just get another Zotac to ensure maximum compatibility?
|
# ? Jul 14, 2013 18:22 |
|
Phiberoptik posted:Yeah, I meant to get two 760 sli's from the start and bought the Zotac because it was much cheaper. Didn't realize I could price match newegg.ca and get the non-reference cooler for the same price.

For SLI, often you want the reference cooler. It's much better at working with less airflow. Non-reference coolers, even the rather impressively done EVGA ACX one that actually is a 2-slot-exactly card as opposed to the usual 2.X (where the .X means it might as well be a 3-slotter), still exhaust air inside the case rather than outside. Reference blowers take air from inside and exhaust it to the rear and outside of the case, making them usually a better choice for SLI unless you have exceptional airflow. And for absolutely certain SLI compatibility, yeah, there's no better way to be sure than getting a copy of the same card.
|
# ? Jul 14, 2013 19:00 |
|
Does Catalyst not have a way to detect driver and software updates? The version we have at work in Dell Optiplexes can, but on my home PC it doesn't seem to be in there anywhere.

On a separate note, I tried to install an HP utility called Desktop Assistant that uses DDC/CI (I think I got that acronym right) with my monitor. I saw some reviews online about the monitor, and they mentioned the software as being pretty useful. The app would only say something about No API detected, though. Could this be video card related, or drivers, or OS or...what? I'm using Win8 Pro 64 with a Radeon 7950, and the monitor is connected by a DVI to HDMI cable if any of those could be a factor.
|
# ? Jul 15, 2013 04:38 |
|
Aphrodite posted:Does Catalyst not have a way to detect for driver and software updates? I haven't used a Radeon in a while, but I know that Steam can check to see if there's an update for Catalyst drivers, or at least it did.
|
# ? Jul 15, 2013 04:45 |
|
Aphrodite posted:Does Catalyst not have a way to detect for driver and software updates?

It does auto-detect when a new version is out, and there is a place in the Catalyst control panel to manually check, but there is often a really long delay of several weeks between a new driver version coming out on the AMD website and the control panel showing an update available, even if you check manually.

Gonkish posted:I haven't used a Radeon in a while, but I know that Steam can check to see if there's an update for Catalyst drivers, or at least it did.

I seem to recall people having massive issues with Catalyst drivers installed via Steam, so I'd avoid installing drivers through Steam.
|
# ? Jul 15, 2013 10:05 |
|
So I have a GTX 570, and recently I've been noticing that, only in in-game cutscenes, the top half of the game is cut off and black. The cinematic plays totally normally on the bottom half of the monitor, but the top is sharply cut off at the midpoint of the monitor horizontally. The monitors are two U2312HM monitors. Any idea what might be causing this?
|
# ? Jul 16, 2013 07:24 |
|
Ramadu posted:So I have a gtx 570 and recently I've been noticing that only in game cutscenes that the top half of the game is cut off and is black. The cinematic plays totally normally on the bottom half of the monitor but the top is sharply cut in half at the midpoint of the monitor horizontally. The monitors are 2 u2312hm monitors. Any idea what might be causing this?

Are you using Windows 7? Have a look here if you are; it seems to be a recent update breaking stuff for people: http://steamcommunity.com/app/236090/discussions/0/864973032857761296/

Specifically this one: http://support.microsoft.com/kb/2803821
|
# ? Jul 16, 2013 08:52 |
|
lovely Treat posted:Are you using windows 7?? I am indeed. That looks like it will fix this issue exactly. Thanks a ton!
|
# ? Jul 16, 2013 09:03 |
|
I'm going to run with a budget GPU in my next build for the time being. Which would perform better in Skyrim at 1080p, a 2GB 7750, or a 1GB 7770 Ghz Edition? I know in general the 7770 is faster, but Skyrim loves it some VRAM. The rest of the specs would be an i5-4570, 8GB DDR3-1600 and a Sandisk Extreme 120GB SSD.
|
# ? Jul 18, 2013 03:08 |
|
MondayHotDog posted:I'm going to run with a budget GPU in my next build for the time being. Which would perform better in Skyrim at 1080p, a 2GB 7750, or a 1GB 7770 Ghz Edition? I know in general the 7770 is faster, but Skyrim loves it some VRAM.

I wouldn't consider either of those cards to be adequate for 1080p gaming, and if your budget is that tight you should be ditching the SSD and using the money to buy a better GPU. I'd consider a Radeon 7850 to be the absolute bare minimum for acceptable performance, and personally I would be making whatever sacrifices are needed elsewhere in your budget to afford an NVIDIA GTX 760.
|
# ? Jul 18, 2013 05:57 |
|
Yeah, if you can sneak a 760 in there, even the 2GB version will make Skyrim flow like butter.
|
# ? Jul 18, 2013 08:31 |
|
Gonkish posted:Yeah, if you can sneak a 760 in there, even the 2GB version will make Skyrim flow like butter. Literally anything will run Skyrim well, though. Kinda amazing what two years worth of driver updates and patches will do. A 760 will be fantastic for any new games coming out in the future, but 1080p Skyrim isn't exactly Crysis 3.
|
# ? Jul 18, 2013 10:32 |
|
FallenGod posted:Literally anything will run Skyrim well, though. Literally anything? poo poo, I should break out some old ISA slot graphics cards, why did I even upgrade?
|
# ? Jul 18, 2013 11:10 |
|
HalloKitty posted:Literally anything? poo poo, I should break out some old ISA slot graphics cards, why did I even upgrade?

Given a choice between two ~$100 GPUs that can max out the game he was asking about in his post, replying "no, buy this $250 GPU instead" is a bit silly. I guess he needs 200 fps in Skyrim because of the word "literally", though, so good job with that.
|
# ? Jul 18, 2013 11:29 |
|
FallenGod posted:Given a choice between two ~$100 GPUs that can max out the game he was asking about in his post, replying "no, buy this $250 GPU instead" is a bit silly.

His choices were based on faulty decision making. Both choices were bad. You cannot game adequately at 1080p with a $100 GPU, so building a gaming PC with one in it is a pointless waste of money, especially when he was including a solid state drive in his budget - the savings from ditching the SSD could have comfortably funded the better card, and he would have had a substantially better PC for it.
|
# ? Jul 18, 2013 11:44 |
|
You really shouldn't be ditching a SSD unless all you do is play games.
|
# ? Jul 18, 2013 12:04 |
|
Skyrim runs fine on a Radeon 6850 as long as you don't install a high-res texture mod, and a 7770 GHz Edition is pretty much equivalent, probably a smidge faster with driver optimizations now. And the 7770 GHz Edition is the better choice over the 7750 in pretty much all cases, the exception being if your power supply can't hack the beefier card.
|
# ? Jul 18, 2013 12:06 |
|
The Lord Bude posted:You cannot game adequately at 1080p with a $100 gpu, so building a gaming pc with one in it is a pointless waste of money,

edit VVVV Only if the sole purpose for his computer is playing games. Given how he's speccing the computer, the dude probably isn't playing games all the time.

Klyith fucked around with this message at 15:11 on Jul 18, 2013 |
# ? Jul 18, 2013 12:27 |
|
Klyith posted:You can't game with a 7770, but not everyone demands Ultra Super Quality and minimum 60fps to play a loving computer game. And not everyone can spend $500 a year on videocards like some of the people in this thread. Have a sense of perspective.

I'm not suggesting anyone spend $500 a year on video cards; one video card is perfectly fine for a normal 3-year PC lifespan. My point was, he wanted to spend $x on a computer. For the same amount of money, allocated differently, he could have a computer better suited to its intended purpose.
|
# ? Jul 18, 2013 12:51 |
|
New 326 series Nvidia beta driver out, combining the 8.1 preview drivers with the rest of the family. Looks like some good improvements in some games.
|
# ? Jul 19, 2013 00:34 |
|
Agreed - I totally did get your PM, but got carried away fixing my DNS and hosting, which I discovered was broken when the OP lacked any images whatsoever. Updated OP and thread title; sorry for the delay.

movax fucked around with this message at 01:44 on Jul 19, 2013 |
# ? Jul 19, 2013 01:40 |
|
The Lord Bude posted:His choices were based on faulty decision making. Both choices were bad. You cannot game adequately at 1080p with a $100 gpu, so building a gaming pc with one in it is a pointless waste of money, especially when he was including a solid state drive in his budget - the saving from ditching the SSD could have comfortably funded the better card and he would have had a substantially better PC for it.

I should have given more details. Gaming is only a very small part of what the machine will be used for. Its main purpose is just web browsing and digital painting. But it's going to be my only computer, I spend a lot of time on it, and people say the most significant upgrade you can buy is an SSD. If anything, I was considering keeping the SSD and going with an i3 instead of the i5.

About the video card: there are only 3 modern games I'm concerned with; Skyrim, Deus Ex: HR, and Civ 5. All three can run at 30 FPS at decent settings with a $100 card. There are no other new games or upcoming releases that interest me. I mostly play older games, and I'm not going to spend $150 more than I have to. Now when Fallout 4 comes out, I'll be upgrading to a proper card, but that probably won't be for a while. That's why I said "for the time being".
|
# ? Jul 19, 2013 02:20 |