HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

BurritoJustice posted:

Afterburner has a simple, fast interface and inbuilt profiling. It works well, and I've never heard of anyone complaining about lag. It certainly isn't garbage. It's nice that it also comes with comprehensive hardware monitoring, inbuilt stress testing, and the best drat customisable on-screen display around. I literally don't see the advantage of using Inspector as an overclocking utility over Afterburner. Inspector is useful in other areas, sure, but for someone looking for an overclocking utility, Afterburner is king.

I loved Afterburner, but I found that recently the Rivatuner Statistics Server was causing horrendous lag in certain games, and noticeable lag to a much lesser extent in other games. The only way to stop it was to turn app detection off, which takes away the point of running it.

When I say horrendous, I mean bizarre poo poo - when playing Orcs Must Die! 2, for example, the framerate was 60FPS locked at all times when playing with the Xbox 360 controller, but as soon as I used the mouse, the framerate started tanking, with massive weird stutters. Close Rivatuner Statistics Server, back to 60FPS locked. V-Sync on or off made no difference, nor did having a different graphics driver. Nor did removing my Logitech mouse driver or updating it. Took me a while to figure that one out.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost

Darkpriest667 posted:

Let's be honest with the folks here: the "telemetry" they're going to be collecting isn't anything super personal or special. It's nothing they couldn't get anyway; it's just more convenient.
I really suspect the primary point is to gather overwhelming proof of the sheer volume of piracy in Asia and other developing/emerging markets - enough that the data cannot be ignored and government officials are pushed into a corner and forced to act. I believe this is a fundamental control point that Microsoft wants, rather than trying to herd the billions of cats out there that will likely break their copy protection.

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

necrobobsledder posted:

I really suspect the primary point is to gather overwhelming proof of the sheer volume of piracy in Asia and other developing/emerging markets - enough that the data cannot be ignored and government officials are pushed into a corner and forced to act. I believe this is a fundamental control point that Microsoft wants, rather than trying to herd the billions of cats out there that will likely break their copy protection.

Nope. That already exists via Windows Update/WGA. This is a straight metrics-and-analysis platform that allows for in-field validation and early detection of issues.

Now, it could in theory be used to detect pirated copies, but there's already a mechanism for that.

Koramei
Nov 11, 2011

I have three regrets
The first is to be born in Joseon.
I think a ball bearing on one of the fans on my 7870 (this one specifically) is loose or something - the fan often rattles, and sometimes stops and starts spinning intermittently while the other fan spins continuously. Am I completely off base here? And would this be compatible with my card and fix the problem if I used it as a replacement?

edit: also, would I need special tools/care to replace a fan, or is it fairly straightforward?

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva
That indicates a bad fan, so replacing it should fix it. Most semi-stock GPU fans just screw into the heatsink, so you'd need to remove the top cover and swap it out. You can oil the fan, but that's only a temporary fix.

I'd probably go with this fan assembly instead, since its fans daisy-chain their power cables and I have no idea whether the one you linked is the exact type required (although in theory any fan with a matching PWM connector should work):
http://www.ebay.com/itm/75mm-Dual-X...549332234&rt=nc

track day bro!
Feb 17, 2005

#essereFerrari
Grimey Drawer
Somebody is going to lend me an MSI R9 290 to see if I have the same texture-flickering issues that I get with my 970. Do you reckon my PSU will be able to cope with it? I have the 550W version of this with a 4670K OC'd to 4.4GHz.

SwissArmyDruid
Feb 14, 2014

by sebmojo

track day bro! posted:

Somebody is going to lend me an MSI R9 290 to see if I have the same texture-flickering issues that I get with my 970. Do you reckon my PSU will be able to cope with it? I have the 550W version of this with a 4670K OC'd to 4.4GHz.

http://pcpartpicker.com/p/mpj9cf :shrug:

track day bro!
Feb 17, 2005

#essereFerrari
Grimey Drawer

This is my entire build with the 290, if that helps. How accurate is that wattage estimate?
http://uk.pcpartpicker.com/p/JNhFqs

Panty Saluter
Jan 17, 2004

Making learning fun!

track day bro! posted:

This is my entire build with the 290, if that helps. How accurate is that wattage estimate?
http://uk.pcpartpicker.com/p/JNhFqs

Hopefully pretty accurate; it puts me at 350W with a 450W power supply :v:

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva

track day bro! posted:

Somebody is going to lend me an MSI R9 290 to see if I have the same texture-flickering issues that I get with my 970. Do you reckon my PSU will be able to cope with it? I have the 550W version of this with a 4670K OC'd to 4.4GHz.
It should be fine. That PSU is fairly over-engineered and you're not running a ton of hardware. With a 290 and an overclocked 2600K I pull around 350-400W from the wall, so I'd be surprised if you really drew much more than that.
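
If you want a back-of-the-envelope check to go with that, here's a quick sketch - the wattages are assumed ballpark board-power/TDP figures, not measurements:

code:
# Rough worst-case power budget for a 550 W unit.  The figures below are
# assumed ballpark numbers, not measured values; real-world wall draw is
# usually well under the sum of TDPs.
components_w = {
    "R9 290 (board power, with some OC headroom)": 300,
    "i5-4670K @ 4.4 GHz (overclocked)": 110,
    "motherboard + RAM + drives + fans": 60,
}

total_w = sum(components_w.values())
psu_w = 550

print(f"Estimated worst-case draw: {total_w} W")             # ~470 W
print(f"Headroom on a {psu_w} W unit: {psu_w - total_w} W")  # ~80 W

Even with pessimistic numbers you land under the PSU's rating, which lines up with the 350-400W I actually see at the wall.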

1gnoirents
Jun 28, 2014

hello :)

HalloKitty posted:

I loved Afterburner, but I found that recently the Rivatuner Statistics Server was causing horrendous lag in certain games, and noticeable lag to a much lesser extent in other games. The only way to stop it was to turn app detection off, which takes away the point of running it.

When I say horrendous, I mean bizarre poo poo - when playing Orcs Must Die! 2, for example, the framerate was 60FPS locked at all times when playing with the Xbox 360 controller, but as soon as I used the mouse, the framerate started tanking, with massive weird stutters. Close Rivatuner Statistics Server, back to 60FPS locked. V-Sync on or off made no difference, nor did having a different graphics driver. Nor did removing my Logitech mouse driver or updating it. Took me a while to figure that one out.

Rivatuner used to cause problems for me too. I remember also having a hilariously hard time uninstalling it but that was a while ago.

I'm back to using it, however, because it's still the simplest form of FPS locking.

I've actually had real stability problems with Nvidia Inspector as well. I still had to use it for one thing - I believe some SLI BIOS-modded janky poo poo that nothing else would detect correctly - but under normal conditions it seemed fairly inconsistent at holding OC values. Afterburner has been rock solid despite the rice skin.

SwissArmyDruid
Feb 14, 2014

by sebmojo
Rebrands off the port bow!

http://techreport.com/news/27792/radeon-300-series-rumor-repository-suggests-looming-rebrands

Also, looks like HBM will not come to the 380/380X, just the 390/390X/395X2.

As expected, 1024-bit memory bus on the HBM parts.

TL;DR: New silicon slots in at the 90 and 70 positions; the 80 and 60 are last series's 90 and 70 silicon, bumped and tweaked.
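
For context on why the bus width matters, here's the rough bandwidth arithmetic - the clocks below are assumed, illustrative figures, not confirmed specs for any 300-series part:

code:
# Memory bandwidth = (bus width in bytes) x (effective transfer rate).
# Clock figures are assumed/illustrative, not confirmed 300-series specs.
def bandwidth_gbs(bus_width_bits, effective_mhz):
    return bus_width_bits / 8 * effective_mhz * 1e6 / 1e9

# A typical high-end GDDR5 setup: 384-bit bus at ~6 GHz effective.
print(round(bandwidth_gbs(384, 6000)))    # ~288 GB/s
# One HBM stack: 1024 bits wide but only ~1 GHz effective.
print(round(bandwidth_gbs(1024, 1000)))   # ~128 GB/s per stack, times the stack count

Wide-and-slow is the whole point of HBM: the bus does the work so the clocks (and the power) don't have to.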

SwissArmyDruid fucked around with this message at 22:34 on Feb 9, 2015

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
With rebrands, do the old boards ever get sent from the retailer back to the OEMs to get a BIOS flash and new packaging, or are they always brand new?

craig588
Nov 19, 2005

by Nyc_Tattoo
They're always brand new. Factories aren't set up to take old boards and rework them.

Ragingsheep
Nov 7, 2009
Is it just me or does this seem kinda misleading?

craig588
Nov 19, 2005

by Nyc_Tattoo
Everyone does rebrands, and has been for quite a while now.

veedubfreak
Apr 2, 2005

by Smythe
Afterburner is regularly updated also. Most of the others are not.

As for rebrands, the onus is on the consumer to actually research a product they are going to spend money on.

SwissArmyDruid
Feb 14, 2014

by sebmojo
^^^ What they said.

I mean, that's why you're here, right? Because you take an active interest in being a well-informed consumer, and all that jazz?

Or are you one of those people that just comes in to bag on whichever company does not produce the GPU for your video card? =|

EDIT: Holy poo poo, the OP needs a fuckin' update. "AMD has the single most powerful video card that doesn’t require a connector from the power supply, the Radeon HD 7750, making it the best graphics upgrade to a mass-market desktop that doesn’t require replacing the power supply as well."? Jesus, that's outdated. That's the GTX 750 Ti now.

What's the protocol on submitting OP updates?

SwissArmyDruid fucked around with this message at 00:14 on Feb 10, 2015

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
Sounds like you volunteered, go write a new OP

SwissArmyDruid
Feb 14, 2014

by sebmojo

Zero VGS posted:

Sounds like you volunteered, go write a new OP

*Shrug*. Should keep me amused.

ZergFluid
Feb 20, 2014

by XyloJW
So what specifications would it take to run the Star Swarm stress test at a consistent 60 frames per second?

Edit: just technical curiosity

RBX
Jan 2, 2011

Got a Gigabyte 970 and I am so pleased. Every game is at 60 now, even Titanfall and the new CoD, which I struggled to run before. No complaints here.

Bleh Maestro
Aug 30, 2003

RBX posted:

Got a Gigabyte 970 and I am so pleased. Every game is at 60 now, even Titanfall and the new CoD, which I struggled to run before. No complaints here.

But it's gonna stutter fizzle and die because of da memoriez!!

SwissArmyDruid
Feb 14, 2014

by sebmojo

ZergFluid posted:

So what specifications would it take to run the Star Swarm stress test at a consistent 60 frames per second?

Edit: just technical curiosity

Not DX11, for a start. Star Swarm is a synthetic benchmark designed to destroy high-level APIs by generating more draw calls than, say, DX11 could ever hope to handle. Hence DX12 and Mantle show disproportionately large gains on it. It's just not a fair competition for DX11.

Which is why the tests Anandtech did with Star Swarm and DX12 (article here: http://www.anandtech.com/show/8962/the-directx-12-performance-preview-amd-nvidia-star-swarm ) don't draw any hard conclusions about "which is best". Between DX12 not being finalized yet and driver support for DX12/WDDM not quite being there yet, there are only two definitive conclusions you can really draw from it:

* A GTX 980 running Star Swarm will bottleneck on two cores.
* DX11 can't help you. WHERE IS YOUR GOD NOW?!
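
If you want a feel for the scale of the problem, here's some purely illustrative arithmetic - the per-call costs are made up for the sake of the example, not measured driver numbers:

code:
# Illustrative only: how draw-call submission cost eats the CPU's frame budget.
FRAME_BUDGET_US = 1_000_000 / 60      # ~16,667 microseconds per 60 FPS frame

def calls_per_frame(submit_cost_us):
    """Draw calls that fit in one frame if each one costs this much CPU time."""
    return int(FRAME_BUDGET_US / submit_cost_us)

print(calls_per_frame(2.0))   # fat API path (assumed 2 us/call):    ~8,333 calls/frame
print(calls_per_frame(0.2))   # thin API path (assumed 0.2 us/call): ~83,333 calls/frame

A benchmark built to spray out far more draws per frame than the fat path can swallow will always make the thin path look like a miracle; that's by design.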

sauer kraut
Oct 2, 2004

quote:

while the R9 380 series will feature Grenada, a re-brand of the Hawaii chip from the R9 290 series
:flaccid:

So what they're saying is that AMD is gonna sell a warmed-over 290X (with GCN 1.2 hacked in) for the amazing price of $250? That's what we've been waiting for?

sauer kraut fucked around with this message at 03:14 on Feb 10, 2015

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

sauer kraut posted:

:flaccid:

So what they're saying is that AMD is gonna sell a warmed-over 290X (with GCN 1.2 hacked in) for the amazing price of $250? That's what we've been waiting for?

No, you've been waiting for the 1024-bit card that will be absurdly expensive.

some dillweed
Mar 31, 2007

Sorry if this isn't really the place to ask, but is there a general consensus on the "best" method of enabling VSync in games while minimizing input latency? Still forcing triple buffering through D3DOverrider if it's not built into the game? Do I still need to test out every game with triple-buffered VSync enabled and see how the controls respond on a 59 fps cap versus 60 fps cap and all of that poo poo? How well does Nvidia's "Adaptive VSync" work? I think the last time I tried it I still noticed heavy tearing if the frame rate dropped below the refresh (which I guess is what you'd expect since it disables VSync at that point).

I'm just trying to finally get back to my huge backlog and don't want to be annoyed by tearing or unresponsive controls.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
The "best" way is with a Gsync/FreeSync screen, but that's more than just V-sync. Second best is V-sync enabled on a 120 Hz or 144 Hz screen, triple buffered. The fast refresh minimizes latency and also means that you have additional V-sync'd refresh rates between 60 Hz and 30 Hz (36, 48, 72 for 144 Hz and 30, 40, 60 for 120 Hz). Beyond that, it all comes down to trade-offs. Triple buffering is strictly superior to double buffering except for the extra VRAM use, which shouldn't matter most of the time and especially never in backlogged games.

Adaptive V-Sync is just a setting that turns off V-Sync if the frame rate drops below the monitor's refresh rate. It's mostly useful on 60 Hz screens with games that usually hold a solid 60, where on the occasional dip you'd rather see tearing than 30 FPS judder.

59 FPS vs. 60 FPS caps on a 60 Hz display... well... With plain double-buffered V-Sync at 60 Hz, a 59 FPS cap displays at 30 FPS. If you can do a solid 60, then half your refresh rate is being wasted, and that's just bad. If your display can do 59 Hz and is set to do that, that's cool and all, but it doesn't change anything - typically that's actually 59.94 Hz, meant to match playback of NTSC/ATSC media from discs or TV tuners, and a 59 FPS cap is still below refresh, so it will give you a locked 29.97 FPS experience.

If you want to get esoteric, and your game can reliably render faster than monitor refresh, and your system has outputs for the Intel IGP (or AMD IGP, I guess), you can set up Lucidlogix MVP in i-mode and enable Virtual V-Sync and Hyperformance. Together these cost a decent chunk of average framerate, but the further past refresh the GPU can render, the more display lag is reduced (since the final displayed frame is taken from nearer the end of the frame interval than a normal V-sync frame, which is taken from the beginning of the frame interval).
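
And since the 59-cap thing trips people up, here's a deliberately simplified model of plain double-buffered V-sync - it assumes a finished frame can only be shown at a vblank and the next frame can't start until that flip happens:

code:
import math

# Simplified model of double-buffered V-sync on a 60 Hz display (assumes the
# renderer stalls until each finished frame is flipped at a vblank).
REFRESH = 1 / 60   # seconds per vblank

def displayed_fps(render_time_s):
    # The frame is shown at the first vblank after it finishes, and the next
    # frame starts at that same vblank.
    vblanks_per_frame = math.ceil(render_time_s / REFRESH - 1e-9)
    return 1 / (vblanks_per_frame * REFRESH)

print(round(displayed_fps(1 / 60)))   # renders in exactly 16.7 ms -> 60 FPS
print(round(displayed_fps(1 / 59)))   # capped at 59 FPS           -> 30 FPS
print(round(displayed_fps(1 / 45)))   # renders at 45 FPS          -> 30 FPS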

some dillweed
Mar 31, 2007

Yeah, I'm just dealing with a standard 60 Hz IPS display. It's probably one of the worst options as far as overall latency is concerned, but it's what I have to work with.

As for the frame cap thing, I only mentioned that because I've tried it in combination with triple-buffered VSync for certain games in the past, and it seems to reduce the input lag (at least slightly, but still perceptibly) compared to just using triple buffering and VSync. It seems to be really dependent on the specific game in question: some games get occasional stutter from it, while others don't have much noticeable issue.

I've seen other people mention capping 2 frames below the refresh rate instead of 1, or putting the limit a frame above the refresh rate and setting the global "Maximum pre-rendered frames" setting to 1, as possible ways to reduce input latency, but I haven't really looked into the "pre-rendered frames" setting further or tried anything with it. Dropping the frame cap further below 59 didn't seem to make any huge change on my setup, from what I remember, and I haven't tried increasing it beyond the refresh rate. I have no idea if putting a cap above the refresh rate actually does anything with VSync enabled, though.

I apparently started making a list back in September of all of the settings that work best in the various games I've played... I wish I were less obsessive about this stuff. I need sleep.

sauer kraut
Oct 2, 2004
Why would you use Vsync and a framerate limiter at the same time, on a 60Hz display especially?
That seems like a surefire recipe for rock stable 30fps gaming :confused:

Hopefully AMD gets their limiter going in the next CCC and it doesn't suck.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

sauer kraut posted:

Why would you use Vsync and a framerate limiter at the same time, on a 60Hz display especially?
That seems like a surefire recipe for rock stable 30fps gaming :confused:

Hopefully AMD gets their limiter going in the next CCC and it doesn't suck.

Sometimes - and I've confirmed this with a wall watt meter - you can set a game to 60 Hz V-Sync but the graphics card will still run at max TDP and max utilization, as if it's rendering as many frames as it can and tossing out everything past 60 FPS.

In those cases, running just a frame limit instead - 59, 60, or 61 FPS - can still introduce tearing, so I'd usually run both V-Sync and a 61 FPS limit as a catch-all.
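
For anyone curious, a frame limiter at its core is just a sleep against a frame deadline - this is a generic sketch of the idea, not how RTSS actually implements it:

code:
import time

TARGET_FPS = 61                 # a hair above refresh, per the above
FRAME_TIME = 1.0 / TARGET_FPS

def render_and_present():
    pass                        # stand-in for the game's real frame work

deadline = time.perf_counter()
for _ in range(600):            # ~10 seconds' worth of frames
    render_and_present()
    deadline += FRAME_TIME
    remaining = deadline - time.perf_counter()
    if remaining > 0:
        # Sleep off the leftover budget instead of letting the GPU spin
        # flat-out rendering frames that would just be thrown away.
        time.sleep(remaining)
    else:
        deadline = time.perf_counter()   # fell behind; don't try to catch up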

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Zero VGS posted:

Sometimes - and I've confirmed this with a wall watt meter - you can set a game to 60 Hz V-Sync but the graphics card will still run at max TDP and max utilization, as if it's rendering as many frames as it can and tossing out everything past 60 FPS.

Correct me if I'm wrong, but isn't that pretty much intended behaviour when using multiple buffers (triple buffering being the best quality) + V-Sync? The card should be rendering as many frames as possible in the background and displaying the one that syncs up with the refresh, as opposed to introducing input lag by waiting for a screen refresh to update.

Maybe I'm wrong and have gone nuts.

HalloKitty fucked around with this message at 17:58 on Feb 10, 2015

some dillweed
Mar 31, 2007

sauer kraut posted:

Why would you use Vsync and a framerate limiter at the same time, on a 60Hz display especially?
That seems like a surefire recipe for rock stable 30fps gaming :confused:
Triple buffering with VSync doesn't drop the frame rate to the various divisions of the refresh rate (60, 30, 20, 15, etc.) like standard double buffering does. If you're using standard double-buffered VSync, then yeah, I'd think you'd want to avoid capping the frame rate because that would likely worsen your experience significantly.
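
Here's a toy model of that difference - it's idealized (it ignores how many buffers you actually have) and just assumes the renderer never stalls and each vblank shows the newest completed frame:

code:
# Toy model of triple-buffered V-sync on a 60 Hz display: count how many
# unique frames actually reach the screen per second when the renderer is
# free to keep working instead of stalling for the flip.
REFRESH = 1 / 60

def shown_fps(render_fps, seconds=10):
    frame_done = [i / render_fps for i in range(1, int(render_fps * seconds) + 1)]
    shown, last = 0, None
    for k in range(1, round(seconds / REFRESH) + 1):
        vblank = k * REFRESH
        completed = [i for i, t in enumerate(frame_done) if t <= vblank]
        newest = completed[-1] if completed else None
        if newest is not None and newest != last:
            shown, last = shown + 1, newest
    return shown / seconds

print(round(shown_fps(45)))   # -> 45 unique frames/s shown, not 30
print(round(shown_fps(59)))   # -> 59, not 30
print(round(shown_fps(90)))   # -> 60, capped by the refresh rate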

edit: Capping below the refresh rate, I mean. Like Zero VGS and other people elsewhere have said, capping above the refresh rate can provide certain advantages, I guess.

some dillweed fucked around with this message at 23:17 on Feb 10, 2015

Fat_Cow
Dec 12, 2009

Every time I yank a jawbone from a skull and ram it into an eyesocket, I know I'm building a better future.

Is a 980 overkill for someone who doesn't game at 4k? I want to upgrade my 770, but all that drama about 3.5 GB memory in the 970 has me worried.

Kazinsal
Dec 13, 2011
Considering the 3.5 GB thing doesn't seem to come up outside of 4K (and lovely synthetic benchmarks), you're probably in the clear to get a 970. The 980 isn't worth 50% more cost for a 10% performance improvement.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Fat_Cow posted:

Is a 980 overkill for someone who doesn't game at 4k? I want to upgrade my 770, but all that drama about 3.5 GB memory in the 970 has me worried.

The 970 is fine; there are very few situations where the 3.5GB thing comes up, and most of them are forced by synthetic benchmarks. Just think of it as buying a 3.5GB card and you have it about right - and 3.5GB is plenty for 1080p. You might see some gains from a 980 if you game at 1440p or higher, but that's about it.

Bleh Maestro
Aug 30, 2003

Fat_Cow posted:

Is a 980 overkill for someone who doesn't game at 4k? I want to upgrade my 770, but all that drama about 3.5 GB memory in the 970 has me worried.

You should wait another gen, IMO. 770 to 970 is a definite upgrade, but do you really need it for 1080p? I'm sure there will be new cards by the end of the year.

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva
I'd argue that it depends on whether the extra performance is worth the cost. I switched from a 280X to a 290 for 1920x1200, for about the same price, since the 280X didn't quite let me simply set everything to max and forget about it. It was fine for the most part, but I definitely prefer having everything both pretty and fast vs. compromising on either.

Assuming the 770 could be sold to make up some of the difference, it might be doable to upgrade now.

RBX
Jan 2, 2011

I'd say upgrade, unless you're really anal about all the extra poo poo. If you just want to game at 1080p/60 with no worries and be able to turn the settings up, a 970 is the best bet.

Zephro
Nov 23, 2000

I suppose I could part with one and still be feared...
I'm pondering upgrading my old GTX 560 (non-Ti, even!). The reviews I've seen suggest the 960 is a really good midrange card, with good framerates, minimal stutter, low power consumption, and low noise. Yet I see almost no one recommending it in the parts-picking thread. Is there some reason I've overlooked not to buy one if I'm after a midrange card? (I have a single 1080p monitor to drive, a decently beefy CPU, and 8 gigs of RAM, so the card is definitely the bottleneck for the moment.)
