|
BurritoJustice posted:Afterburner has a simple, fast interface and inbuilt profiling. It works well, and I've never heard of anyone complaining about lag. It certainly isn't garbage. It is nice that it also comes with comprehensive hardware monitoring, inbuilt stress testing, and the best drat customisable on-screen display around. I literally don't see the advantage of using Inspector as an overclocking utility over Afterburner. Inspector is useful in other areas, sure, but for someone looking for an overclocking utility Afterburner is king.

I loved Afterburner, but I found that recently the RivaTuner Statistics Server was causing horrendous lag in certain games, and noticeable lag to a much lesser extent in others. The only way to stop it was to turn app detection off, which defeats the point of running it. When I say horrendous, I mean bizarre poo poo - when playing Orcs Must Die! 2, for example, the framerate was locked at 60 FPS at all times with the Xbox 360 controller, but as soon as I used the mouse the framerate started tanking, with massive weird stutters. Close RivaTuner Statistics Server, back to a locked 60 FPS. V-Sync on or off made no difference, nor did a different graphics driver, nor did removing or updating my Logitech mouse driver. Took me a while to figure that one out.
|
# ? Feb 8, 2015 10:28 |
|
Darkpriest667 posted:Let's be honest to the folks here, the "telemetry" that they're going to be collecting isn't anything super personal or special. It's nothing secret that they wouldn't be able to get anyway. It's just more convenient.
|
# ? Feb 8, 2015 15:39 |
|
necrobobsledder posted:I really suspect the primary point is to provide massive amounts of proof of the sheer volume of piracy in Asia or developing / emerging markets, to the point that the data cannot be ignored and government officials are pushed into a corner to act. I believe this is a fundamental control point that Microsoft wants, rather than trying to herd the billions of cats out there that will likely break their copy protection.

Nope. That already exists with Windows Update/WGA. This is a straight metrics and analysis platform to allow for in-field validation and early detection of issues. It could in theory be used to detect pirated copies, but there's already a mechanism for that.
|
# ? Feb 8, 2015 15:55 |
|
I think a ball bearing in one of the fans on my 7870 (this one specifically) is loose or something - the fan often rattles, and sometimes stops/starts spinning intermittently while the other fan spins continuously. Am I completely off base here? And would this be compatible with my card, and fix the problem, if I used it as a replacement? edit: also, would I need special tools/care to replace a fan, or is it fairly straightforward?
|
# ? Feb 8, 2015 22:58 |
|
That indicates a bad fan so replacing it should fix it. Most semi-stock GPU fans just screw into the heatsink, so you'd need to remove the top cover and replace it. You can oil the fan but it's only a temporary fix. I'd probably go with this fan assembly instead since the fans daisy-chain power cables and no idea if the one you linked is the exact type required (although theoretically any matching PWM connector should work): http://www.ebay.com/itm/75mm-Dual-X...549332234&rt=nc
|
# ? Feb 8, 2015 23:31 |
|
Somebody is going to lend me an MSI R9 290 to see if I get the same texture flickering issues that I have with my 970. Do you reckon my PSU will be able to cope with it? I have the 550W version of this, with a 4670K OC'd to 4.4GHz.
|
# ? Feb 9, 2015 14:02 |
|
track day bro! posted:Somebody is going to lend me an MSI R9 290 to see if I get the same texture flickering issues that I have with my 970. Do you reckon my PSU will be able to cope with it? I have the 550W version of this, with a 4670K OC'd to 4.4GHz.

http://pcpartpicker.com/p/mpj9cf
|
# ? Feb 9, 2015 14:27 |
|
This is my entire build with the 290, if that helps. How accurate is that wattage estimate? http://uk.pcpartpicker.com/p/JNhFqs
|
# ? Feb 9, 2015 14:43 |
|
track day bro! posted:This is my entire build with the 290, if that helps. How accurate is that wattage estimate?

Hopefully pretty accurate; it puts me at 350W with a 450W power supply.
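For what it's worth, estimators like that mostly just sum nameplate TDPs plus an allowance for the rest of the system. A rough sketch of the same arithmetic (the TDP figures here are my own ballpark assumptions, not PCPartPicker's actual data) for the R9 290 build being discussed:

```python
# Back-of-the-envelope PSU check. All wattage figures below are assumed
# nameplate/typical numbers for illustration, not measured draw.
def estimated_load(parts_w):
    """Sum component power figures, the way wattage estimators roughly do."""
    return sum(parts_w.values())

build = {
    "R9 290 (typical board power)": 275,
    "i5-4670K (stock TDP; an overclock pulls more)": 84,
    "motherboard/RAM/drives/fans (allowance)": 60,
}
load = estimated_load(build)
for psu in (450, 550):
    print(f"{load} W load on a {psu} W PSU: {load / psu:.0%}")
```

Even with pessimistic numbers, a 550 W unit has headroom for a single 290; the margin on a 450 W unit is much thinner once the overclock is factored in.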
|
# ? Feb 9, 2015 15:49 |
|
track day bro! posted:Somebody is going to lend me an MSI R9 290 to see if I get the same texture flickering issues that I have with my 970. Do you reckon my PSU will be able to cope with it? I have the 550W version of this, with a 4670K OC'd to 4.4GHz.
|
# ? Feb 9, 2015 16:12 |
|
HalloKitty posted:I loved Afterburner, but I found that recently the RivaTuner Statistics Server was causing horrendous lag in certain games, and noticeable lag to a much lesser extent in other games. The only way to stop it was to turn app detection off, which takes away the point of running it.

RivaTuner used to cause problems for me too. I remember also having a hilariously hard time uninstalling it, but that was a while ago. I'm back to using it, however, because it's the simplest form of FPS locking... still. I've actually had real stability problems with Nvidia Inspector as well. I still had to use it for some reason - I believe for some SLI BIOS-modded janky poo poo that nothing else would detect correctly. But in normal conditions it seemed fairly inconsistent at holding OC values. Afterburner has been rock solid despite the rice skin.
|
# ? Feb 9, 2015 16:34 |
|
Rebrands off the port bow! http://techreport.com/news/27792/radeon-300-series-rumor-repository-suggests-looming-rebrands Also, it looks like HBM will not come to the 380/380X, just the 390/390X/395X2. As expected, there's a 1024-bit memory bus on the HBM parts. TL;DR: New silicon slots in at the 90 and 70 positions; the 80 and 60 are last series' 90 and 70 silicon, bumped and tweaked. SwissArmyDruid fucked around with this message at 22:34 on Feb 9, 2015 |
# ? Feb 9, 2015 22:27 |
|
With rebrands, do the old boards ever get sent from the retailer back to the OEMs to get a BIOS flash and new packaging, or are they always brand new?
|
# ? Feb 9, 2015 22:41 |
|
They're always brand new. Factories aren't set up to take old boards back and rework them.
|
# ? Feb 9, 2015 23:10 |
|
Is it just me or does this seem kinda misleading?
|
# ? Feb 9, 2015 23:14 |
|
Everyone does rebrands, and has been doing them for quite a while now.
|
# ? Feb 9, 2015 23:16 |
|
Afterburner is regularly updated also. Most of the others are not. As for rebrands, the onus is on the consumer to actually research a product they are going to spend money on.
|
# ? Feb 9, 2015 23:25 |
|
^^^ What they said. I mean, that's why you're here, right? Because you take an active interest in being a well-informed consumer, and all that jazz? Or are you one of those people that just comes in to bag on whichever company does not produce the GPU for your video card? =| EDIT: Holy poo poo, the OP needs a fuckin' update. "AMD has the single most powerful video card that doesn’t require a connector from the power supply, the Radeon HD 7750, making it the best graphics upgrade to a mass-market desktop that doesn’t require replacing the power supply as well."? Jesus, that's outdated. That's the GTX 750 Ti now. What's the protocol on submitting OP updates? SwissArmyDruid fucked around with this message at 00:14 on Feb 10, 2015 |
# ? Feb 10, 2015 00:10 |
|
Sounds like you volunteered, go write a new OP
|
# ? Feb 10, 2015 00:28 |
|
Zero VGS posted:Sounds like you volunteered, go write a new OP *Shrug*. Should keep me amused.
|
# ? Feb 10, 2015 00:32 |
|
So what specs would it take to run the Star Swarm stress test at a consistent 60 FPS? Edit: just technical curiosity
|
# ? Feb 10, 2015 01:44 |
|
Got a Gigabyte 970 and I am so pleased. Every game is 60 now, even Titanfall and the new CoD that I struggled to run before. No complaints here.
|
# ? Feb 10, 2015 02:00 |
|
RBX posted:Got a Gigabyte 970 and I am so pleased. Every game is 60 now, even Titanfall and the new CoD that I struggled to run before. No complaints here.

But it's gonna stutter, fizzle, and die because of da memoriez!!
|
# ? Feb 10, 2015 03:00 |
|
ZergFluid posted:So what specs would it take to run the Star Swarm stress test at a consistent 60 FPS?

Not DX11, for the first part. Star Swarm is a synthetic benchmark designed to destroy high-level APIs by generating more draw calls than, say, DX11 could ever hope to address. Hence, DX12 and Mantle show disproportionately higher performance gains. It's just not a fair competition for DX11. Which is why the tests that Anandtech did with Star Swarm and DX12 (article here: http://www.anandtech.com/show/8962/the-directx-12-performance-preview-amd-nvidia-star-swarm ) don't draw any hard conclusions about "which is best". Between DX12 not being finalized yet, and driver support for DX12/WDDM not quite being there yet, there are only two definitive conclusions you can really draw from it:

* A GTX 980 running Star Swarm will bottleneck on two cores.
* DX11 can't help you. WHERE IS YOUR GOD NOW?!
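To put rough numbers on the draw-call argument, here's a toy model (the per-call costs are invented for illustration, not measured from any real driver): if CPU frame time is dominated by per-draw-call submission overhead, then a lower-overhead API raises the FPS ceiling roughly in proportion, no matter how fast the GPU is.

```python
# Toy model of CPU-side draw-call overhead. The per-call costs below are
# made-up illustrative values, not benchmarks of DX11 or DX12.
def cpu_frame_ms(draw_calls, per_call_us):
    """CPU milliseconds spent per frame just submitting draw calls."""
    return draw_calls * per_call_us / 1000.0

calls = 100_000                                   # Star Swarm-scale workload
high = cpu_frame_ms(calls, per_call_us=0.5)       # assumed high-overhead API
low = cpu_frame_ms(calls, per_call_us=0.05)       # assumed ~10x cheaper path
print(f"high-overhead: {high:.1f} ms/frame (~{1000 / high:.0f} FPS ceiling)")
print(f"low-overhead:  {low:.1f} ms/frame (~{1000 / low:.0f} FPS ceiling)")
```

Under these made-up costs the high-overhead path caps out around 20 FPS while the cheap path allows ~200 FPS, which is the shape of the disparity the benchmark is built to expose.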
|
# ? Feb 10, 2015 03:09 |
|
quote:while the R9 380 series will feature Grenada, a re-brand of the Hawaii chip from the R9 290 series

So what they're saying is that AMD is gonna sell a warmed-over 290X (with GCN 1.2 hacked in) for the amazing price of $250? That's what we've been waiting for? sauer kraut fucked around with this message at 03:14 on Feb 10, 2015 |
# ? Feb 10, 2015 03:12 |
sauer kraut posted:
No, you've been waiting for the 1024-bit card that will be absurdly expensive.
|
|
# ? Feb 10, 2015 03:48 |
|
Sorry if this isn't really the place to ask, but is there a general consensus on the "best" method of enabling VSync in games while minimizing input latency? Still forcing triple buffering through D3DOverrider if it's not built into the game? Do I still need to test out every game with triple-buffered VSync enabled and see how the controls respond on a 59 fps cap versus 60 fps cap and all of that poo poo? How well does Nvidia's "Adaptive VSync" work? I think the last time I tried it I still noticed heavy tearing if the frame rate dropped below the refresh (which I guess is what you'd expect since it disables VSync at that point). I'm just trying to finally get back to my huge backlog and don't want to be annoyed by tearing or unresponsive controls.
|
# ? Feb 10, 2015 07:38 |
|
The "best" way is with a G-Sync/FreeSync screen, but that's more than just V-sync. Second best is V-sync enabled on a 120 Hz or 144 Hz screen, triple buffered. The fast refresh minimizes latency and also gives you additional V-sync'd frame rates between full refresh and 30 FPS (72, 48, and 36 for 144 Hz; 60, 40, and 30 for 120 Hz).

Beyond that, it all comes down to trade-offs. Triple buffering is strictly superior to double buffering except for the extra VRAM use, which shouldn't matter most of the time and especially not in backlogged games. Adaptive V-Sync is just a setting that turns off V-sync if the frame rate drops below the monitor refresh rate. It's mostly only useful on 60 Hz screens in games that are *usually* solid-60, where on the occasional dip you'd rather see tearing than 30 FPS judder.

59 FPS vs. 60 FPS caps on a 60 Hz display... well. With V-sync at 60 Hz, a 59 FPS cap displays at 30 FPS. If you can do a solid 60, then half your refresh rate is wasted, and that's just bad. If your display can do 59 Hz and is set to do that, that's cool and all but doesn't change anything - typically that's actually 59.94 Hz, meant to match playback of NTSC/ATSC media from discs or TV tuners, and a 59 FPS cap is still below refresh, so it will give you a locked 29.97 FPS experience.

If you want to get esoteric, and your game can reliably render faster than monitor refresh, and your system has outputs for the Intel IGP (or AMD IGP, I guess), you can set up Lucidlogix MVP in i-mode and enable Virtual V-Sync and HyperFormance. Together these cost a decent chunk of average framerate, but the faster than refresh the GPU can render, the more display lag is reduced (since the final displayed frame is taken from nearer the end of the frame interval than a normal V-sync frame, which is taken from the beginning of the frame interval).
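The "frame rates between full refresh and 30" point falls out of how classic V-sync works: a frame is held on screen for a whole number of refresh intervals, so the sustainable rates are the integer divisors of the refresh rate. A quick sketch (the function name is mine):

```python
# With classic V-sync, each frame is displayed for an integer number of
# refresh intervals, so steady frame rates are divisors of the refresh rate.
def vsync_rates(refresh_hz, floor=30):
    """Steady frame rates reachable under V-sync on a refresh_hz display."""
    return [refresh_hz // n for n in range(1, refresh_hz // floor + 1)
            if refresh_hz % n == 0]

print(vsync_rates(144))  # [144, 72, 48, 36]
print(vsync_rates(120))  # [120, 60, 40, 30]
print(vsync_rates(60))   # [60, 30]
```

Which is exactly why a 60 Hz screen falls straight from 60 to 30, while the fast-refresh panels give you several softer landings in between.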
|
# ? Feb 10, 2015 08:09 |
|
Yeah, I'm just dealing with a standard 60 Hz IPS display. It's probably one of the worst options as far as overall latency is concerned, but it's what I have to work with.

As for the frame cap thing, I only mentioned that because I've tried it in combination with triple-buffered V-Sync for certain games in the past, and it seems to reduce the input lag (at least slightly, but still perceptibly) compared to just using triple buffering and V-Sync. It seems to be really dependent on the specific game in question. Some games, it causes occasional stutter, while other games don't have much noticeable issue.

I've seen other people mention using 2 frames below the refresh rate for their cap instead of 1, or putting the limit a frame above the refresh rate and setting the global "Maximum pre-rendered frames" setting to 1, as possible solutions to reduce input latency, but I haven't really looked into the "pre-rendered frames" setting further or tried anything with it. Dropping the frame cap further below 59 didn't seem to make any huge change on my setup, from what I remember, and I haven't tried increasing it beyond the refresh rate. I have no idea if putting a cap above the refresh rate actually does anything with V-Sync enabled, though.

I apparently started making a list back in September of all of the settings that work best in the various games I've played... I wish I were less obsessive about this stuff. I need sleep.
|
# ? Feb 10, 2015 09:23 |
|
Why would you use V-Sync and a framerate limiter at the same time, especially on a 60 Hz display? That seems like a surefire recipe for rock-stable 30 FPS gaming. Hopefully AMD gets their limiter going in the next CCC and it doesn't suck.
|
# ? Feb 10, 2015 10:04 |
|
sauer kraut posted:Why would you use V-Sync and a framerate limiter at the same time, especially on a 60 Hz display?

Sometimes - and I've confirmed it with a wall watt meter - you can set a game to 60 Hz V-Sync but the graphics card will still run at max TDP and max utilization, as if it's rendering as many frames as it can and tossing the rest out after 60 FPS. In those cases, running only a frame limit of 59, 60, or 61 FPS instead can still introduce tearing. So I'd usually run both V-Sync and a 61 FPS limit as a catch-all.
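A toy simulation of that behaviour (all timings are invented for illustration): without a cap, a fast GPU's render loop produces far more frames than a 60 Hz display will ever show, and a limiter just sleeps out the difference, which is where the power saving comes from.

```python
# Count frames produced by a render loop over a fixed duration, with an
# optional framerate limiter. Integer microseconds keep the math exact.
def frames_rendered(render_us, duration_us, cap_fps=None):
    """Frames a loop taking render_us per frame produces in duration_us."""
    step_us = render_us
    if cap_fps is not None:
        # The limiter sleeps so each iteration takes at least 1/cap_fps.
        step_us = max(step_us, round(1_000_000 / cap_fps))
    return duration_us // step_us

# A GPU that renders a frame in 5 ms, over one second of play:
print(frames_rendered(5_000, 1_000_000))              # 200 frames rendered
print(frames_rendered(5_000, 1_000_000, cap_fps=61))  # 61 frames rendered
```

Uncapped, 200 frames get rendered but a 60 Hz display shows only 60 of them, so 140 frames' worth of power is burned for nothing; the 61 FPS cap keeps the card just ahead of refresh with almost no waste, and V-Sync on top handles the tearing.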
|
# ? Feb 10, 2015 16:33 |
|
Zero VGS posted:Sometimes - and I've confirmed it with a wall watt meter - you can set a game to 60 Hz V-Sync but the graphics card will still run at max TDP and max utilization, as if it's rendering as many frames as it can and tossing the rest out after 60 FPS.

Correct me if I'm wrong, but isn't that pretty much intended behaviour when using multiple buffers (triple buffering being the best quality) + V-Sync? The card should be rendering as many frames as possible in the background and displaying the one that syncs up with the refresh, as opposed to introducing input lag by waiting for a screen refresh to update. Maybe I'm wrong and have gone nuts. HalloKitty fucked around with this message at 17:58 on Feb 10, 2015
# ? Feb 10, 2015 17:53 |
|
sauer kraut posted:Why would you use V-Sync and a framerate limiter at the same time, especially on a 60 Hz display?

edit: Capping below the refresh rate, I mean. Like Zero VGS and other people elsewhere have said, capping above the refresh rate can provide certain advantages, I guess. some dillweed fucked around with this message at 23:17 on Feb 10, 2015
# ? Feb 10, 2015 18:03 |
|
Is a 980 overkill for someone who doesn't game at 4k? I want to upgrade my 770, but all that drama about 3.5 GB memory in the 970 has me worried.
|
# ? Feb 12, 2015 05:09 |
|
Considering that the 3.5 GB thing doesn't seem to come up outside of 4K (and lovely synthetic benchmarks), you're probably in the clear to get a 970. The 980 is not worth another 50% cost for a 10% performance improvement.
|
# ? Feb 12, 2015 05:11 |
Fat_Cow posted:Is a 980 overkill for someone who doesn't game at 4k? I want to upgrade my 770, but all that drama about 3.5 GB memory in the 970 has me worried.

The 970 is fine; there are very few situations where the 3.5GB thing comes up, and most of them are forced by synthetic benchmarks. Just think of it as buying a 3.5GB card and you have it about right, and 3.5GB is plenty for 1080p. You might see some gains if you game at 1440p or higher, but that's about it.
|
|
# ? Feb 12, 2015 05:19 |
|
Fat_Cow posted:Is a 980 overkill for someone who doesn't game at 4k? I want to upgrade my 770, but all that drama about 3.5 GB memory in the 970 has me worried.

You should wait another gen, IMO. 770 to 970 is a definite upgrade, but do you really need it for 1080p? I'm sure there will be new cards by the end of the year.
|
# ? Feb 12, 2015 06:37 |
|
I'd argue that it depends on whether the extra performance is worth the cost. I switched from a 280X to a 290 for 1920x1200, for about the same price, since the 280X didn't quite allow me to simply set everything to max and ignore other settings. It was fine for the most part, but I definitely prefer having everything both pretty and fast versus compromising on either. Assuming the 770 could be sold to make up some of the difference, it might be doable to upgrade now.
|
# ? Feb 12, 2015 08:53 |
|
I'd say upgrade unless you're really anal about all the extra poo poo. If you just want to game at 1080/60 with no worries and be able to turn the settings up a 970 is the best bet.
|
# ? Feb 12, 2015 09:00 |
|
I'm pondering upgrading my old GTX 560 (non-Ti, even!). The reviews I've seen suggest that the 960 is a really good midrange card, with good framerates, minimal stutter, low power consumption, and low noise. Yet I see almost no one recommending it in the parts-picking thread. Is there some reason I've overlooked not to buy one, if I'm after a midrange card? (I have a single 1080p monitor to drive, a decently beefy CPU, and 8 gigs of RAM, so the card is definitely the bottleneck for the moment.)
|
# ? Feb 12, 2015 10:21 |