|
spouse posted:So, I bought that Sapphire 7950 (http://www.newegg.com/Product/Product.aspx?Item=N82E16814202030) reference a few weeks ago, and it's got a blower. It runs hotter than I'd like it to (and is loud as gently caress at high speed), so I'm wondering if anyone has experience with the Accelero Xtreme, or any of its iterations? I don't really think I have the money for watercooling, so this looks like a decent alternative to the blower. I have a Gelid Icy Vision on my 560 Ti 448 and it's amazing. Cooling performance is great, and with Noctua fans it doesn't make any noise. I think any cooler mentioned favorably here: http://www.silentpcreview.com/article1301-page5.html will be fine.
|
# ? May 9, 2013 23:09 |
|
Hi. Cross-posting from the general PC gaming thread; people there suggested I try here. I've recently upgraded my graphics card (7770HD to a 660 Ti) and I'm wondering if the new one is underperforming a bit. Any views on whether it is, and/or what I can do to tell for sure/fix it? My PC is a G840 (dual-core 2.8GHz) with 8GB RAM. I'm aware that the CPU is likely to be a bottleneck, but...

- 3DMark 11 (in the standard, free mode) comes up with scores of P5543 (total), 8145 (graphics), 2859 (physics). The graphics tests run at 38.8, 38.1, 50.5 and 24.3 fps respectively. Looking on the 3DMark site there are 7 results for single 660 Tis, all using Core i7s. The lowest one (actually using a hex-core i7 970) has scores of P15265 (total), 17921 (graphics), 11356 (physics). Fair enough on the physics scores, but the graphics one is more than 2x mine.
- Far Cry 2 runs at 30-40 fps, in 1080p or 720p (all max settings, basically). From looking on the internet it doesn't seem to be too bothered about having lots of cores, and given its age I'd have thought it'd be fine with my CPU; the internet suggests it should be going at 60+ comfortably.
- Need for Speed: Hot Pursuit has regular slowdowns running at 1080p (top settings). At 720p it does better but still has some issues. I'd have thought the card would have no problems running it at 1080p, and the fact that 720p works better suggests it's the card that's the limiting factor, not the CPU.

Some things work fine(ish): Sleeping Dogs runs at close to 60fps at 1080p, top settings, so maybe it's just individual games. Using the latest drivers for everything as far as I know. Feedback from the PC gaming thread was that it probably is the CPU holding things back. If that's the case, can anyone explain why? I just can't see what would be taxing it that much. Thanks
|
# ? May 10, 2013 18:08 |
|
A 2GB 660 Ti should be able to handle 1080p with settings maxed or close to it (excepting some insane anti-aliasing/other special effects), so if you're not seeing the performance you expect, it's possible your G840 is bottlenecking, but that's a Sandy Bridge-based Pentium, so it's practically brand new as far as the current crop of games is concerned. I wouldn't blame the 660 though; do you have a ton of background tasks running or something?
|
# ? May 10, 2013 19:07 |
|
Yeah, that's basically what I think: the 660 Ti should have no problems in the games mentioned, and the CPU shouldn't either, so I don't know what is! There's very little else running. I alt-tabbed out of NFS and CPU load was around 50-60%, with basically nothing else using the CPU.
|
# ? May 10, 2013 19:22 |
|
So performance in Far Cry 2 is about the same regardless of whether you're running 720p or 1080p? That would suggest CPU. Could be that 2.8GHz isn't cutting it at higher settings... as you say, I don't think FC2 benefits from more than 2 cores.
|
# ? May 10, 2013 20:56 |
|
Yes, the FC2 result does suggest CPU, but I think it should be fast enough. If 2.8GHz could only manage 40-ish then no PC in the world could do 90, but people say theirs can. The other results don't suggest the CPU is holding anything back.
|
# ? May 11, 2013 11:48 |
|
all_purpose_cat_boy posted:Yes, the FC2 result does suggest CPU, but I think it should be fast enough. If 2.8GHz could only manage 40-ish then no PC in the world could do 90, but people say theirs can. The other results don't suggest the CPU is holding anything back. Things like that don't necessarily scale linearly, though. It could be that once the CPU is fast enough to not be bottlenecking anymore, the FPS shoots up.
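To make the non-linear bit concrete: frame time is roughly whichever of the CPU or GPU takes longer per frame, so FPS climbs with clock speed until the GPU becomes the limit, then flatlines. A toy Python model (every number here is invented for illustration, not a real benchmark):

```python
def fps(cpu_ghz, gpu_fps=90, cpu_work_per_frame=0.07):
    """Toy bottleneck model. cpu_work_per_frame is giga-cycles of
    CPU work per frame (made up); frame time is the slower of the two."""
    cpu_frame_s = cpu_work_per_frame / cpu_ghz  # seconds of CPU time per frame
    gpu_frame_s = 1 / gpu_fps                   # seconds of GPU time per frame
    return 1 / max(cpu_frame_s, gpu_frame_s)

for ghz in (2.8, 4.2, 6.3, 7.0):
    print(f"{ghz} GHz -> {fps(ghz):.0f} fps")
```

With these made-up constants a 2.8GHz chip lands at 40 fps while anything past ~6.3GHz sits pinned at the GPU's 90: the jump from "CPU-limited" to "GPU-limited" is a cliff, not a slope.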
|
# ? May 11, 2013 12:38 |
|
all_purpose_cat_boy posted:Yes, the FC2 result does suggest CPU, but I think it should be fast enough. If 2.8GHz could only manage 40-ish then no PC in the world could do 90, but people say theirs can. The other results don't suggest the CPU is holding anything back. What (exact model) power supply do you have?
|
# ? May 11, 2013 18:10 |
|
No, I just uninstalled them. I've got a SilverStone 450W power supply.
|
# ? May 11, 2013 22:22 |
|
all_purpose_cat_boy posted:No, I just uninstalled them.
|
# ? May 12, 2013 00:15 |
|
I have a 7970 that gets as hot as 96C when playing MWO on a triple-monitor setup. Most every other game stays below 80C. Is that safe? The fan only ramps up to 66% automatically, when I wouldn't mind it going to 100%. The noise is a non-issue most of the day, but I can't just leave it at 100% all the time. Can I set a more aggressive fan profile?
Fauxtool fucked around with this message at 02:16 on May 12, 2013 |
# ? May 12, 2013 02:12 |
|
That's just above safe temperatures: it's not game-endingly hot, but it's too hot. You can indeed set a more aggressive fan profile using MSI Afterburner or somesuch (note: the software has to be running while you play).
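If you're curious what Afterburner is doing under the hood: a custom profile is just a list of (temperature, fan speed) points that it interpolates between. A quick Python sketch of that interpolation (the curve points are made up for illustration, not Afterburner's actual defaults):

```python
# Hypothetical fan curve points: (temperature in C, fan speed in %).
CURVE = [(40, 30), (60, 45), (75, 70), (85, 100)]

def fan_speed(temp_c):
    """Linearly interpolate fan % between curve points;
    clamp to the endpoints outside the curve."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, s0), (t1, s1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(96))  # past the last point -> pinned at 100
```

The point of a more aggressive curve is just moving those points left/down so the fan reacts as the card heats up rather than after it's already at 96C.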
|
# ? May 12, 2013 03:10 |
|
Fauxtool posted:I have a 7970 that gets as hot as 96C when playing MWO on a triple-monitor setup. Most every other game stays below 80C. Is that safe? The fan only ramps up to 66% automatically, when I wouldn't mind it going to 100%. The noise is a non-issue most of the day, but I can't just leave it at 100% all the time. Can I set a more aggressive fan profile? As for whether you need it: temps in the 90s are hot but within acceptable limits for the silicon; they design GPUs to run hot. It might reduce the statistical life of the card a bit, but it's not like you're running it 24/7 mining bitcoins, so I wouldn't worry about it too much personally. It is pretty bizarre that the card isn't running the fans at 100% at that temperature, though. Some 7900s have really slack fan profiles that let the chip get pretty hot, especially ones with blower coolers because they're so loud, but not running full speed in the 90s is pretty unusual.
|
# ? May 12, 2013 03:16 |
|
Thanks for the help. I was running the Catalyst Control Center that came with the drivers, and I was able to change the fan profile to use 100% instead of 66%, but it still ramps up after it gets hot instead of as it gets hot. I'll give MSI a try, and I can also open up some of the soundproofed holes on the top of my case. How can I turn off CCC's fan profiles when I still need it running for Eyefinity? Do I just have them both running and hope MSI's overrides it? Thank god I'm not a buttcoin miner. Fauxtool fucked around with this message at 05:08 on May 12, 2013 |
# ? May 12, 2013 05:04 |
|
LCD Deathpanel posted:Try completely removing both the AMD & Nvidia drivers with Driver Fusion and then reinstall the Nvidia drivers. Might help, might not, but it's worth a try. I tried to remove the AMD ones, but it said it needed the paid version to get rid of the last bits. Is there a free alternative that would do it?
|
# ? May 12, 2013 08:12 |
|
The free version of Driver Fusion removes all the important bits. I'm not aware of any free alternatives that support current drivers; that's why we tend to recommend Driver Fusion.
|
# ? May 12, 2013 14:10 |
|
On my recently built gaming/HTPC I have a GTX 660. I've tried, but I can't get 5.1 surround sound to work with it, so I've been using my mobo's optical audio out for now. This is a pain in the rear end though, because it has to be connected every time I switch to the PC, since I only have one digital audio input and it's in use. Any ideas for getting 5.1 over HDMI working? I'm running Win7. e: It currently only reports two channels when plugged into the TV, and it only reported 2 when I had an HDMI cable plugged right into my receiver. booshi fucked around with this message at 17:09 on May 12, 2013 |
# ? May 12, 2013 17:06 |
|
booshi posted:On my recently built gaming/HTPC I have a GTX 660. I've tried, but I can't get 5.1 surround sound to work with it, so I've been using my mobo's optical audio out for now. This is a pain in the rear end though, because it has to be connected every time I switch to the PC, since I only have one digital audio input and it's in use. Any ideas for getting 5.1 over HDMI working? I'm running Win7. Nvidia claims their drivers work by reading supported-feature status from the EDID information provided by whatever the HDMI cable is plugged into. Because your receiver passes it through to the TV, it's likely getting the TV's stereo-only settings. See if there's a way to stop your amp from doing this. I don't know if there's any way to force the Nvidia drivers into a specific mode.
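For reference, the channel count the drivers see lives in the Short Audio Descriptors of the EDID's CEA-861 extension block: each descriptor is 3 bytes, with the format code in bits 6-3 of the first byte and (channels - 1) in bits 2-0. A rough Python sketch of how that decodes (the example descriptor bytes below are invented, not dumped from real hardware):

```python
def max_lpcm_channels(sad_bytes):
    """Scan CEA-861 Short Audio Descriptors (3 bytes each) and
    return the largest LPCM channel count advertised."""
    best = 0
    for i in range(0, len(sad_bytes) - 2, 3):
        fmt = (sad_bytes[i] >> 3) & 0x0F        # audio format code; 1 == LPCM
        channels = (sad_bytes[i] & 0x07) + 1    # bits 2-0 store channels - 1
        if fmt == 1:
            best = max(best, channels)
    return best

tv = [0x09, 0x07, 0x07]   # invented: stereo-only LPCM descriptor (a TV)
avr = [0x0D, 0x07, 0x07]  # invented: 6-channel (5.1) LPCM descriptor (a receiver)
print(max_lpcm_channels(tv), max_lpcm_channels(avr))  # 2 6
```

So if the TV's EDID is what reaches the card, a 2 is all the driver ever sees, regardless of what the receiver can actually do.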
|
# ? May 12, 2013 17:48 |
|
EoRaptor posted:Nvidia claims their drivers work by reading supported-feature status from the EDID information provided by whatever the HDMI cable is plugged into. I've tried the receiver, which doesn't pass through the TV (the way my system is set up is that all devices connect to the TV via HDMI, then audio goes out via digital audio from the TV to the receiver), and tried with only the receiver connected to the computer. If I could find a way to allow both of my digital audio cables to be plugged in and work, that would be fine. The way it is now, I have a splitter that I run in reverse, with one input from the receiver and the other from the computer. I can mute my computer and everything else (TV, Xbox, etc.) works fine, but I can't mute anything else to stop a signal from going when the PC is also going, so no signals come through, since you can't just mix digital optical audio signals. So my current solution has been to just switch what is plugged into the splitter. I'll play around with it some more, as I had a feeling that what Nvidia said was the case: using EDID info. If anyone else has any tips feel free, and if I get it working I'll update with my fix in case anyone else runs into the problem.
|
# ? May 12, 2013 17:53 |
|
booshi posted:I've tried the receiver, which doesn't pass through the TV (the way my system is set up is that all devices connect to the TV via HDMI, then audio goes out via digital audio from the TV to the receiver), and tried with only the receiver connected to the computer. If I could find a way to allow both of my digital audio cables to be plugged in and work, that would be fine. The way it is now, I have a splitter that I run in reverse, with one input from the receiver and the other from the computer. I can mute my computer and everything else (TV, Xbox, etc.) works fine, but I can't mute anything else to stop a signal from going when the PC is also going, so no signals come through, since you can't just mix digital optical audio signals. So my current solution has been to just switch what is plugged into the splitter. What receiver do you have? Upgrading the receiver may be the best option.
|
# ? May 12, 2013 20:12 |
|
Is my video card dying? It's a 7970 that I've had for a while, the temps are pretty low since I put the aftermarket cooler on it, and I'm running the 13.5 beta drivers. What's even weirder is that it's only Firefox that exhibits this tendency so far, and I haven't had any glitches or weird things happen while playing games.
|
# ? May 12, 2013 23:20 |
|
real_scud posted:Is my video card dying? It's a 7970 that I've had for a while, the temps are pretty low since I put the aftermarket cooler on it, and I'm running the 13.5 beta drivers. I've had this happen too, and have assumed it to be a driver thing since it's seemed to come and go with driver installs.
|
# ? May 12, 2013 23:32 |
|
Killer robot posted:I've had this happen too, and have assumed it to be a driver thing since it's seemed to come and go with driver installs.
|
# ? May 12, 2013 23:57 |
|
Killer robot posted:I've had this happen too, and have assumed it to be a driver thing since it's seemed to come and go with driver installs. Klyith posted:Same here, though mine only makes glitch squares in the tab & location bar area. I suspect turning off hardware acceleration would fix it, but so does 1 alt-tab. I'm pretty agnostic over who is to blame, AMD or firefox.
|
# ? May 13, 2013 00:12 |
|
KillHour posted:What receiver do you have? Upgrading the receiver may be the best option. It's part of a Sony HTIB, a year old: 2 HDMI in, 1 out, a few of the other standard inputs, plus a digital optical audio in, 5.1 surround, and a Blu-ray player; one of those types. I may actually be in luck, because my dad has been complaining about issues between his Sony receiver (a real one, not like my current one) and his new Samsung TV. I think he has bought, or is about to buy, a new one and is going to give me his, since I have a Sony TV, and their receivers and TVs work really well together with the Bravia Sync stuff. The only issue is that I only visit my family by plane, so either they ship it to me or I take a bigass receiver back on the plane somehow.
|
# ? May 14, 2013 02:56 |
|
Discovered something cool about the Kepler architecture. I thought something was wrong with the fan and temperature sensors, because the fans were not spinning up after hours of playing Half-Life. I ran EVGA Precision, and it turned out the card wasn't even ramping up past 0.9V and 1GHz since the load was so low. The temperature stayed around 40C, so the fans had no reason to speed up. I didn't need to change a single setting from the same configuration that ran at 1.2V and 1.3GHz space-heater mode while playing BioShock Infinite. Modern power management technology is so cool.
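If you want a picture of what the boost logic is doing, it's roughly "pick the lowest voltage/clock state that covers the demand." A toy Python sketch: the two endpoint states mirror the numbers above, the middle one and the selection logic are invented for illustration and are nothing like the real boost tables:

```python
# Invented (voltage V, clock MHz) performance states, lowest-power first.
P_STATES = [(0.9, 1000), (1.05, 1150), (1.2, 1300)]

def pick_state(required_mhz):
    """Return the lowest-power state whose clock covers the demand."""
    for volts, mhz in P_STATES:
        if mhz >= required_mhz:
            return (volts, mhz)
    return P_STATES[-1]  # fully loaded: max boost, space-heater mode

print(pick_state(600))   # old game, light load -> stays at (0.9, 1000)
print(pick_state(1250))  # heavy load -> (1.2, 1300)
```

Since power scales with voltage squared times clock, sitting in that low state is why an old game barely warms the card at all.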
|
# ? May 14, 2013 03:19 |
|
booshi posted:It's part of a Sony HTIB, a year old: 2 HDMI in, 1 out, a few of the other standard inputs, plus a digital optical audio in, 5.1 surround, and a Blu-ray player; one of those types. I may actually be in luck, because my dad has been complaining about issues between his Sony receiver (a real one, not like my current one) and his new Samsung TV. I think he has bought, or is about to buy, a new one and is going to give me his, since I have a Sony TV, and their receivers and TVs work really well together with the Bravia Sync stuff. The only issue is that I only visit my family by plane, so either they ship it to me or I take a bigass receiver back on the plane somehow. You're doing it wrong. You need to hook up your receiver like this: As for the Bravia Sync stuff, don't listen to Sony's marketing. Any receiver worth jack (not an HTIB) will support controlling devices over HDMI. I can control my PS3 with the remote for my Onkyo, for instance (and with the remote for the TV... and vice versa: the PS3 turns off the receiver and TV when I shut it down). I've tested it with both LG and Toshiba TVs, and everything works together just fine.
|
# ? May 14, 2013 08:10 |
|
KillHour posted:You're doing it wrong. You need to hook up your receiver like this: I know my way is not the best way to do it, but my receiver only has two inputs, and I have 4 devices I use. I'm going to try to get the new receiver soon to have it all properly set up. e: I'm also not really worried about controlling multiple devices with the receiver because I have a Logitech Harmony One.
|
# ? May 14, 2013 08:52 |
|
booshi posted:I know my way is not the best way to do it, but my receiver only has two inputs, and I have 4 devices I use. I'm going to try to get the new receiver soon to have it all properly set up. Get one of these in the short term then: http://www.monoprice.com/products/product.asp?c_id=101&cp_id=10110&cs_id=1011002&p_id=6259&seq=1&format=2 I used one of those with my Harmony to accommodate my receiver's lack of inputs until I got a newer one.
|
# ? May 14, 2013 14:01 |
|
If you still don't have enough ports: http://www.monoprice.com/products/product.asp?c_id=101&cp_id=10110&cs_id=1011002&p_id=4088&seq=1&format=2 Edit: Goddamnit, beaten.
|
# ? May 14, 2013 14:01 |
|
mayodreams posted:Get one of these in the short term then: Ah, I'm so out of touch with what's available I didn't know they had HDMI switches like that. Thanks for the link. I just talked to my dad, though: he's buying his new receiver this week and is going to ship me his old one using his company FedEx account, so I'll just hold off for that.
|
# ? May 14, 2013 16:22 |
|
I was browsing LGA 2011 motherboards, and while most of them have PCIe 3.0, I haven't found any processors in that socket that support 3.0. Is it just that I haven't found a processor that supports 3.0, does it not matter except for high-end cards, or is it all marketing on the part of the motherboard manufacturers?
Urzza fucked around with this message at 07:38 on May 18, 2013 |
# ? May 18, 2013 07:27 |
|
Urzza posted:I was browsing LGA 2011 motherboards, and while most of them have PCIe 3.0, I haven't found any processors in that socket that support 3.0. Is it just that I haven't found a processor that supports 3.0, does it not matter except for high-end cards, or is it all marketing on the part of the motherboard manufacturers? Intel delayed (or maybe cancelled at this point) Ivy Bridge chips for LGA 2011. Probably they decided to just keep selling Sandy-based Xeons for the server market, since the performance gains of Ivy in that area were modest. Without the server chip to remark, the foolish enthusiast market isn't enough on its own. But PCIe 3.0 doesn't matter for any video card, even the high end; it might just barely be measurable in an SLI setup. Video cards are not starved for bandwidth across the PCIe bus. It's been a consistent thing, all the way back to the AGP era, that you can run a card from Generation X on the bus from Generation X-2 and see less than a 5% impact on performance. Buying a desktop LGA 2011 system right now would be a massive waste of money, what with Haswell right around the corner, unless you really need a 6-core chip for some reason.
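The back-of-envelope numbers make the point. A quick Python calc using the per-lane transfer rates and encoding overheads from the PCIe specs (8b/10b for gens 1-2, 128b/130b for gen 3):

```python
# (transfer rate in GT/s, encoding efficiency) per PCIe generation
GENS = {1: (2.5, 8 / 10), 2: (5.0, 8 / 10), 3: (8.0, 128 / 130)}

def x16_bandwidth_gbs(gen):
    """Usable one-direction bandwidth of an x16 slot in GB/s."""
    gts, eff = GENS[gen]
    return gts * eff * 16 / 8  # GT/s -> GB/s per lane, times 16 lanes

for g in sorted(GENS):
    print(f"PCIe {g}.0 x16: {x16_bandwidth_gbs(g):.1f} GB/s")
```

That works out to roughly 4, 8, and 15.8 GB/s: a 2.0 x16 slot already moves more data than a single card ever asks for, which is why dropping back a generation or two barely shows up in benchmarks.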
|
# ? May 18, 2013 08:11 |
|
With Nvidia cards there's a way to force PCIe 3.0 on X79. It works for my GTX 670 with an i7 3930K and a Gigabyte X79-UD5. Seconding the "it's a waste" statement; wait for Haswell.
|
# ? May 18, 2013 23:51 |
|
So my 660 Ti isn't putting out HDMI. It turns on and the fans kick up, the power LEDs all look fine, but there's just no HDMI signal. What can I do?
|
# ? May 19, 2013 00:44 |
|
Klyith posted:Intel delayed (or maybe cancelled at this point) Ivy Bridge chips for LGA 2011. Probably they decided to just keep selling Sandy-based Xeons for the server market, since the performance gains of Ivy in that area were modest. Without the server chip to remark, the foolish enthusiast market isn't enough on its own. Ivy Bridge E is due out in September. Neither it nor the current Sandy Bridge E is worth it unless you have a need for six cores or quad-channel memory.
|
# ? May 19, 2013 17:02 |
|
Rigged Death Trap posted:So my 660 Ti isn't putting out HDMI. It turns on and the fans kick up, the power LEDs all look fine, but there's just no HDMI signal.

Is it a standard HDMI out on the card? I ask because some of Nvidia's cards with HDMI have a regular port, while others (like my old GTX 465) use a mini-HDMI to standard HDMI cable. If it's regular HDMI, go the simplest route first and try another cable to rule that out.

Other than that, did it work initially and then stop, or did it never work at all? If it has never worked, I'd test one of the other video-out options like DVI to see if any of the outputs have signal. If it happened right after a driver update, you might be able to roll back and use a different driver version after a standard remove/clean with Driver Sweeper + CCleaner, a reboot, and a fresh reload.

Also make sure the power cables for the card are plugged in securely. I had a similar issue when I was cleaning out my PC and didn't have a 6-pin connector fully seated: the card fan spun up but I got nothing on the display. Thankfully it didn't hurt the card; once both 6-pin connectors were secure, everything came up normally.
|
# ? May 19, 2013 19:09 |
|
So I've been working on this thing, and I wanted to share it: https://www.youtube.com/watch?v=KajdB-eTAYU Right now it runs on a nightmare amalgam of Java, FRAPS, and a RAMdisk, but it works as a proof of concept. What do you think?
|
# ? May 20, 2013 21:12 |
|
I seem to have occasional tessellation issues with my AMD 7950, and I'm trying to figure out whether it's a defect in my card or an AMD-wide issue. If tessellation is enabled, I get a sort of triangular flickering, or artifacts around the light source. This is noticeable in Crysis 3 and Metro: Last Light, which I think are my only tessellated games. It only happens occasionally, on the same lights every time. I was using 13.5, but I'm scrubbing the drivers and re-installing 13.4 just to see what happens. e: After running some tessellation benchmarks, I'm just gonna assume these games are buggy. ethanol fucked around with this message at 05:03 on May 21, 2013 |
# ? May 21, 2013 00:08 |
|
|
Factory Factory posted:So I've been working on this thing, and I wanted to share it: The effect seems a bit understated. I've seen something like this before, and I recall latency and CPU usage being big issues.
|
# ? May 21, 2013 00:38 |