|
Agreed posted: Go with the 7750 if those are your options, also this here utility is in the OP please read the OP

Ah, thanks. I was looking for that on AT but didn't find it in the GPU section.

movax posted: I'll answer your question with another question: is it possible for you to swap cases/motherboards to use dual-slot cards and a larger PSU? I'm just curious as to what's keeping you constrained to that selection range (i.e. you need a super quiet mini-ITX box for the living room or something).

Sorry I didn't elaborate. This is why: http://support.lenovo.com/en_US/detail.page?LegacyDocID=MIGR-61232

As for why bother? For shits.
|
# ? Aug 1, 2012 17:09 |
|
|
Factory Factory posted: Did we actually put that in the OP of this thread? Because it should be.

Nope! So, apologies, guy asking about the comparison, you wouldn't have found it if you looked.

movax posted: Update to...660 Ti and 7950, if I recall correctly? Or did the GHz edition launch?

660 Ti and 7950, but really just the 660 Ti for performance and value. This generation, nVidia wins price:performance in a complete sweep, unless ATI has some mega trick up their sleeve (they don't, they'd have played it by now rather than price dropping twice and losing even more executives - to goddamned nVidia, even). ATI wins absolute performance, but it doesn't matter if they're not moving units, because their performance-parity tiers come in about $100 higher than nVidia's.

Suggest memory overclocking, as it is almost certain to gain substantial performance - the more bandwidth you can feed the hungry hungry 670 sitting behind the 192-bit bus, the better. ~144GB/sec is bottlenecking the poo poo out of the card; raise that as much as possible and I'd imagine FPS will improve pretty dramatically.

Agreed fucked around with this message at 17:15 on Aug 1, 2012 |
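Quick sanity check on that ~144GB/sec figure, if you want to see where it comes from (assuming the stock 6008 MT/s effective GDDR5 rate; partner cards may clock differently):

```python
# Memory bandwidth = (bus width in bytes) x (effective transfer rate).
# Assumes the 660 Ti's reference 6008 MT/s GDDR5 clock - check your own card.
bus_width_bits = 192
effective_rate_mts = 6008  # mega-transfers per second (effective)

bandwidth_gbs = bus_width_bits / 8 * effective_rate_mts / 1000
print(f"{bandwidth_gbs:.1f} GB/s")  # 144.2 GB/s - the ~144GB/sec above

# Bandwidth scales linearly with a VRAM overclock, so +10% on the
# memory clock buys +10% throughput:
print(f"{bandwidth_gbs * 1.10:.1f} GB/s")
```

Which is why a memory overclock on a 192-bit card pays off so directly.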
# ? Aug 1, 2012 17:13 |
|
That's what I thought. I updated it to this:

quote: Now, as of 2012-07-31:

e: in-flight wifi is the coolest goddamn thing
|
# ? Aug 1, 2012 17:21 |
|
The 7970 GHz Edition is out. It's a ballsack-crumbling $650+ with the first wave, since the ultra-high-end, super-binned, single-slot-with-water-block SKUs have come first. But it actually ties or arguably exceeds the 680 at stock, depending on game choice and number of games sampled. Of course, it uses a lot more power to pull this off, but still; and it's a relative GPGPU powerhouse.

Also, the GeForce 680M is faster than the 7970M, being essentially a lower-clocked 660 Ti vs. a 7870. The 7970M is superior in price/performance, though, and still a screamer.

Factory Factory fucked around with this message at 17:39 on Aug 1, 2012 |
# ? Aug 1, 2012 17:36 |
|
Factory Factory posted: The 7970 GHz Edition is out. It's a ballsack-crumbling $650+ with the first wave, since the ultra-high-end, super-binned, single-slot-with-water-block SKUs have come first. But it actually ties/arguably exceeds the 680 at stock, depending on game choice and number of games sampled. Of course, it uses a lot more power to pull this off, but still; and it's a relative GPGPU powerhouse.

It is really bizarre, but just as AMD pulled a Netburst with Bulldozer, ATI pulled a Fermi with the Southern Islands cards. nVidia went lean and segmented their products rather than their market, allowing them to budget specifically for gaming performance and, as a result, end up with superb chips they can afford to sell extremely competitively and still rake in the cash. ATI's "GPGPU powerhouse" is basically their Fermi: a too-general part that brings excellent brute performance but requires them to overdo virtually every aspect of their consumer lineup in really spendy ways, and they've got transistors running hotter than they need to and more logic going on than is necessary for games. Bitcoin miners sure are in luck, at least.

I certainly didn't expect this to happen; it's almost the exact opposite of the last generation (except that ATI had really efficient GPGPU then, too, and probably didn't see nVidia switching lanes like this - if nVidia hadn't basically pared down what made Fermi great for gaming and amplified the poo poo out of that, they'd have big noisy hot-running cards, too, but surprise, different chips for different markets!).

It's gotta suck having the most powerful cards and it basically not mattering because you can't make any money off of them. The 660 Ti being a 670-lite is just hideous for ATI; there isn't poo poo they can do unless I'm WAAAAY off in my cost estimates for their SKUs in the same performance bracket. And they already suffer a perception bias against them because "ATI has bad drivers." God drat. 
Agreed fucked around with this message at 17:55 on Aug 1, 2012 |
# ? Aug 1, 2012 17:53 |
|
Just thought of something: is this the first generation since the GeForce FX 5000 series where nVidia's price:performance card performs as well as or better than the last generation's top-end GPU? Or was there another one in there, maybe between the 8800/9800 cards and the GT200-chip generation? Not sure how "historic" this is, but it's certainly something for people building gaming PCs. 1080p is getting absolutely royal treatment in the bang-for-your-buck category - a buncha folks who normally accept turning settings down are getting to crank everything in all but the most extreme, boundary-defining games and enjoy >60FPS minimums across the board.
|
# ? Aug 1, 2012 20:44 |
|
I've got a question: with my GTX 460, is it worth keeping it in my machine as a PhysX card, or is it not powerful enough to keep up with the 660 Ti? Or am I wrong in my assumptions?
|
# ? Aug 1, 2012 21:32 |
|
Probably fast enough if it's the 1GB version with a 256-bit memory bus. Overclock the VRAM to bring it closer to the throughput of the 660 Ti. GF104 (the chip that the 460 uses) is limited in its GPGPU, but it does PhysX fine if that's all it has to do. PhysX is a highly optimized CUDA workload, and GF104 should have no issues as a dedicated PhysX processor taking the workload off your graphics rendering card.

Also, reconsider the whole concept. I've got the "stupid expensive version" of the same setup and the 580 mainly just sits there. I haven't given up on future GPGPU/CUDA developments that might justify it further professionally, but as far as videogames go, you have to go out of your way to find ones that support GPU-accelerated PhysX. They're super cool when you do, but it's pretty damned gimmicky and not at all necessary.

Agreed fucked around with this message at 22:21 on Aug 1, 2012 |
# ? Aug 1, 2012 22:18 |
|
Does the 660 Ti's low memory bandwidth make it a bad choice for 2560x1440 systems vs something like a 580?
|
# ? Aug 1, 2012 22:33 |
|
Two-parter, going to have to split it up a bit. Sorry.

Josh Lyman posted: Does the 660 Ti's low memory bandwidth make it a bad choice for 2560x1440 systems

Yes.

Josh Lyman posted: vs something like a 580?

No, the 580 eats it in the total VRAM department when it comes to modern games at such a high resolution. It'd be vs. something like a 670 proper or a 7950, and ATI cards' performance scaling at higher resolutions is superior to nVidia's this generation, sometimes making the difference between minimums above or below 30 fps in very demanding games. But all the cards under consideration will run the majority of games out there at high settings even at high resolutions. This is a high-resolution-friendly generation; even the relatively lower performance of nVidia's hardware past 1080p has to be taken in context as very much relatively lower - it still runs like crazy.
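For context on why the jump to 2560x1440 leans so hard on memory bandwidth, the raw pixel counts tell the story (plain arithmetic, nothing card-specific):

```python
# Pixels per frame at each resolution - every one of them has to be
# shaded and written out through the memory bus.
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_1440p = 2560 * 1440   # 3,686,400

print(pixels_1440p / pixels_1080p)  # ~1.78 - roughly 78% more pixels per frame
```

Same frame rate at 1440p means pushing nearly 1.8x the data every frame, which is exactly where a narrow bus starts to hurt.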
|
# ? Aug 2, 2012 04:40 |
|
Agreed posted: Probably fast enough if it's the 1GB version with a 256-bit memory bus. Overclock the VRAM to bring it closer to the throughput of the 660Ti. GF104 (the chip that the 460 uses) is limited in its GPGPU, but it does PhysX fine if that's all it has to do. PhysX is a highly optimized CUDA workload and GF104 should have no issues as a dedicated PhysX processor to take the workload off your graphics rendering card.

Yeah, I got the 256-bit memory bus version. I guess I'll keep it then, since it's still good. I have the Batman game on Steam but haven't ever played it, so that may be a good test. Same with Mirror's Edge. If not, I can just take it out after I play those games and keep it around.
|
# ? Aug 3, 2012 04:36 |
|
TweakTown did some overclocking on their 660 Ti sample: http://www.tweaktown.com/articles/4873/nvidia_geforce_gtx_660_ti_2gb_reference_video_card_overclocked/index.html

It doesn't overclock better than a fully enabled GK104, but they haven't been able to say anything about the quality of the factory cooler yet. They also don't have any voltage control right now, so it's possible the 660 Ti is slightly undervolted for better power efficiency. I'm so excited for this card; even without overclocking it, I was ready to buy it as soon as it launched. Hoping a manufacturer puts one out with a huge, quiet three-slot cooler.
|
# ? Aug 3, 2012 11:31 |
|
Agreed posted:No, the 580 eats it in the total VRAM department But, the super-affordable 3GB 580s! God, 580s were such a bad decision before, are they suddenly like $250 or something? That's the only way someone should be thinking about buying one.
|
# ? Aug 3, 2012 14:54 |
|
Dogen posted: But, the super-affordable 3GB 580s!

Right now they occupy a neat space in the used market, because people are letting them go pretty cheap to move up to a 670-based setup. The 580 is really good at CUDA (comparable performance to the mid-range Fermi Quadro cards, and quite a bit better than the entry-level one, which still costs $700), and it's on Adobe's short-list, making it a great choice for video editing. The Mercury Engine is pretty VRAM-limited, and CUDA in general is, too: if a given operation can't be performed within the space of the on-board VRAM, it shunts it back to the CPU. That means if you're editing something in which tasks can occupy >1.25GB but less than 1.5GB of memory, the 580's your card. And the stupid 3GB ones actually look a little less dumb. The high-end Quadros and Teslas have 6GB to work with. Of course, none of the consumer cards have ECC VRAM or any of the really fancy crap - you wouldn't want to use one for incredibly high-precision simulations - but a kneecapped GF110 is still a massive performer compared to a totally unleashed GF114 (entry-level Quadro).

Also, reminder, jerkface: I have a GTX 580 and GTX 680 in my system. I mean, that's just leaving money on the table. Why the hell am I giving out advice on video cards? I'm that guy. aaaahhhhh ahhhhh
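The "fits in VRAM or falls back to the CPU" rule is easy to picture as a toy sketch (the function name and sizes here are purely illustrative - this isn't Adobe's actual API, just the dispatch logic described above):

```python
# Toy model of the dispatch rule: if the task's working set fits in the
# card's free VRAM, run it on the GPU; otherwise shunt it back to the CPU.
def choose_device(task_bytes: int, free_vram_bytes: int) -> str:
    return "GPU" if task_bytes <= free_vram_bytes else "CPU"

GIB = 1024 ** 3

# A ~1.4GB working set fits on a 1.5GB GTX 580...
print(choose_device(int(1.4 * GIB), int(1.5 * GIB)))   # GPU

# ...but not on a 1.25GB card, so the whole operation drops to the CPU.
print(choose_device(int(1.4 * GIB), int(1.25 * GIB)))  # CPU
```

That cliff-edge behavior is why the extra VRAM on the 3GB 580s matters more for CUDA work than for games.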
|
# ? Aug 3, 2012 15:02 |
|
I know, yes. I mean, I have one too, so I am speaking from a place of understanding
|
# ? Aug 3, 2012 15:08 |
|
This is a really silly question, but where can I find some ATI/AMD merchandise? Like shirts, case badges, whatever.
|
# ? Aug 4, 2012 06:34 |
|
Endymion FRS MK1 posted:This is a really silly question, but where I can I find some ATI/AMD merchandise? Like shirts, case badges, whatever. ebay?
|
# ? Aug 4, 2012 06:44 |
|
Shaocaholica posted:ebay? I didn't think of that, thanks. Was hoping for an actual store or something I suppose.
|
# ? Aug 4, 2012 07:10 |
|
The 660 Ti is looking really, really nice, but TweakTown is obnoxiously blunt about their dislike for Nvidia, and they bring it up in every single review, multiple times, over and over. Apparently, they're upset because they aren't being sent information about products or review units, and they have the gall to call Nvidia unprofessional while they sit and whine like children. Ugh. I wish it were Anandtech; their reviews are so much better.
|
# ? Aug 4, 2012 07:44 |
|
InstantInfidel posted:The 660Ti is looking really, really nice, but TweakTown is obnoxiously blunt about their dislike for Nvidia, and they bring it up in every single review, multiple times, over and over. Apparently, they're upset because they aren't being sent information about products or review units, and they have the gall to call Nvidia unprofessional while they sit and whine like children. Ugh. I wish it were Anandtech, their reviews are so much better It doesn't help that their game suite is laughable. HAWX2 and no BF3? Really?
|
# ? Aug 4, 2012 08:00 |
|
I hadn't even looked at that. The only intensive games they have are Metro 2033, Just Cause 2, and DiRT 3. I mean, jeez, gaming benchmarks aren't any better than synthetic benchmarks when 2/3 of your lineup came out two generations ago. Furthermore, it seems this spat with Nvidia has been going on for years, and they have continued to make snide comments and work in multiple jabs per page the entire time. I'm a little skeptical of their review now; it seems almost as many words are spent badmouthing the producer as are spent reviewing the actual product.
|
# ? Aug 4, 2012 08:05 |
|
Endymion FRS MK1 posted:This is a really silly question, but where I can I find some ATI/AMD merchandise? Like shirts, case badges, whatever. http://www.frozencpu.com/search.html?mv_profile=keyword_search&mv_session_id=IKHvn9Dp&searchspec=case+badge&go.x=0&go.y=0
|
# ? Aug 4, 2012 18:04 |
|
grumperfish posted:FrozenCPU has a bunch of ATI/AMD case badges: I'm sure those AMD Athlon Thunderbird badges are flying off the virtual shelves!
|
# ? Aug 4, 2012 21:38 |
|
I need one of those "P4 Killer" badges for my phone.
|
# ? Aug 4, 2012 22:13 |
|
Looking for input on which GPU to upgrade to. I posted in the system-building thread, but figured I'd see if anyone here would chime in. I've listed 5 GPUs that are significantly marked down currently: http://forums.somethingawful.com/showthread.php?threadid=3458091&pagenumber=289&perpage=40#post406232511
|
# ? Aug 5, 2012 00:04 |
|
Factory Factory posted:I need one of those "P4 Killer" badges for my phone.
|
# ? Aug 5, 2012 00:26 |
|
Corte posted: Looking for input on which GPU to upgrade to, posted in the system-building thread but figured I'd see if anyone here would chime in, I've listed 5 GPUs that are significantly marked down currently: http://forums.somethingawful.com/showthread.php?threadid=3458091&pagenumber=289&perpage=40#post406232511

I'd personally say the 7850 is the best option, given there's no real price difference. It's the newer, lower-power, and slightly faster card, and the 2GB of VRAM isn't to be passed up. Instead of screwing around with DisplayPort (it brings its own fun issues), I'd recommend just getting a dirt-cheap HDMI-to-DVI cable, like so: http://www.amazon.ca/Gold-Plated-High-Speed-HDMI/dp/B000O5TFLQ/ref=sr_1_1?ie=UTF8&qid=1344123517&sr=8-1

HalloKitty fucked around with this message at 00:41 on Aug 5, 2012 |
# ? Aug 5, 2012 00:35 |
|
grumperfish posted:FrozenCPU has a bunch of ATI/AMD case badges: Yes! My inner ATI fanboy is squealing in joy right now!
|
# ? Aug 5, 2012 01:22 |
|
HalloKitty posted: Instead of screwing around with DisplayPort (it brings its own fun issues), I'd recommend just getting a dirt cheap HDMI to DVI cable, like so: http://www.amazon.ca/Gold-Plated-High-Speed-HDMI/dp/B000O5TFLQ/ref=sr_1_1?ie=UTF8&qid=1344123517&sr=8-1

Out of curiosity, what fun issues are you referring to? So far I've yet to have any issues using DisplayPort in quite a few use cases: laptop with dock (single and multi-monitor), laptop without dock (single monitor), desktop with multiple monitors and with single monitors, all multi-monitor setups with combinations of DVI/HDMI/DP; HP and Dell monitors, HP and Apple laptops, AMD and NVidia on the desktop graphics side.

Fatal fucked around with this message at 01:57 on Aug 5, 2012 |
# ? Aug 5, 2012 01:55 |
|
Hey guys, quick question about Nvidia and multiple monitors. Is it just the newer models that use more power when running two screens of different resolutions, or did it apply to older models too? Reason I ask is that I run a 1920x1200 monitor and a 1080p TV off a GTX 275 at the moment, which I gather isn't the most efficient card at the best of times - if it's guzzling power just idling, I'll replace it with something newer and more efficient.
|
# ? Aug 5, 2012 02:01 |
|
Yes, it's a problem with yours, too. Even more so, because there are fewer power states available on a 275 - just "idle" and "full on." Heck, older Nvidia cards can't even do identical monitors on idle clocks. It's a problem for AMD cards, too, but not as much, because 1) they tend to use less power in general, and 2) there are more power states between idle and full available.

Fatal posted: Out of curiosity, what fun issues are you referring to? So far I've yet to have any issues using DisplayPort in quite a few use cases: laptop with dock single & multi monitor, laptop without dock single monitor, desktop with multiple monitors and single monitors, all multi monitor setups with combinations of DVI/HDMI/DP. HP and Dell monitors, HP & Apple laptops, AMD and NVidia on the desktop graphics side.

Well, the automatic "power-off = disconnect" behavior can be a real pain, especially if you have hardware, or earlier driver revisions for it, that sometimes drops the EDID data when you turn the monitor back on. I finally switched from DP to DVI for my main connection, because every time I accidentally hit the power button on my monitor, the system would come back with all my windows resized for a 640x480 VGA display it assumed had to be connected, since it couldn't detect anything else. And if I did that when the system was asleep, it would come back as an actual 640x480 image and I had to reboot to get my monitor detected right again. So DP's behavior of saying the monitor is disconnected just because it has turned off is a real pain in the shitter.

Factory Factory fucked around with this message at 02:28 on Aug 5, 2012 |
# ? Aug 5, 2012 02:13 |
|
I just upgraded the sorry cheap video card I had in my computer to a 550 Ti, and I didn't realize just how bad my graphics looked/ran until I put that fucker in this afternoon. It may not be a new top-of-the-line card, but it works great for my needs. Wish I had done this sooner.
|
# ? Aug 5, 2012 02:44 |
|
dissss posted: Hey guys, quick question about Nvidia and multiple monitors

Yes, it's a problem with yours, too. It's probably even worse, since older Nvidia cards can't even run two identical monitors without clocking up. It's even a problem with AMD cards, but with AMD it's significantly less of a problem than with Nvidia cards, because 1) there are more intermediate power states to take advantage of, and 2) the cards use less power in general.
|
# ? Aug 5, 2012 03:50 |
|
And I can't use an ATI case badge, because my CM690 II has no space for badges. Maybe I'll just duct-tape my Rage to the top of the case or something...
|
# ? Aug 5, 2012 03:54 |
|
Fatal posted:Out of curiosity what fun issues are you referring too? So far I've yet to have any issues using display port in quite a few use cases: Laptop with dock single & multi monitor, laptop without dock single monitor, Desktop with multiple monitors and single monitors, all multi monitor setups with combinations of DVI/HDMI/DP. HP and Dell monitors, HP & Apple laptops, AMD and NVidia on the desktop graphics side. The fact it disconnects the monitor when you turn the monitor off, in Windows. That's all.
|
# ? Aug 5, 2012 15:46 |
|
HalloKitty posted:The fact it disconnects the monitor when you turn the monitor off, in Windows. That's all. So all your icons will get jacked up, only to fix themselves when you go back to the DisplayPort input.
|
# ? Aug 5, 2012 17:15 |
|
real_scud posted: Which is super loving annoying if you happen to use one of your monitors for, say, displaying your 360 - every time you switch to the 360 input it'll act like the monitor disappeared from the computer.

I've never had this issue with my DP monitor, and I switch between my DP computer input and DVI PS3 input all the time; nor does it lose connection to the monitor when it's powered off. Weird. I've heard of out-of-spec DP cables causing other strange issues; maybe this is the problem?
|
# ? Aug 5, 2012 17:37 |
|
Fish Cake posted:I've never had this issue with my DP monitor and I switch between my DP computer input and DVI PS3 input all the time, nor does it lose connection to the monitor when it's powered off. Weird. I've heard of out-of-spec DP cables causing other strange issues, maybe this is the problem?
|
# ? Aug 5, 2012 19:27 |
|
Hey, I am in the market to upgrade my GPU. I'm currently running the following setup:

Intel i7-2600K @ 3.4 GHz
16 GB RAM
Nvidia GTX 470

I've been looking at the current 6XX series of GTX cards, and I think the best in terms of price vs. performance is the GTX 670. I also looked at the 560 Ti, but it looks like the 670 blows it out of the water in the comparison reviews I read between the two. Does anyone have any suggestions, or is the 670 a good decision? Money's not too important a factor, but I don't want to drop $1000 on a GPU. I'm going to be doing some fairly high-level gaming, and I generally just want to keep my gaming PC as current as possible. I typically upgrade every two to three years, and it's about that time again.
|
# ? Aug 5, 2012 20:08 |
|
|
Yeah I think a GTX 670 would be a good choice, though you may want to consider whether you're really going to see a difference in your experience that will justify the upgrade. Here's a direct comparison between the GTX 470 and GTX 670 from Anandtech.
|
# ? Aug 5, 2012 20:14 |