Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
Yeah I was reading over my phone and didn't see the whole WWC article.

"secret press conference" Yeah okay man. Also that guy needs an editor. "Adoptive Sync" "San Fransico"

Rastor
Jun 2, 2001

The original article is here but is even less English.

If AMD and Intel both drum up support for Adaptive Sync then nVidia is likely to fall in line, and I haven't seen anyone link to definitive evidence that the new Maxwell cards lack the necessary DisplayPort spec.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
Well, speaking of Freesync... buy my R9 290xes guys, they'll keep you warm in the winter: http://forums.somethingawful.com/showthread.php?threadid=3667123

Ralith
Jan 12, 2011

I see a ship in the harbor
I can and shall obey
But if it wasn't for your misfortune
I'd be a heavenly person today
Do we have any clues as to when AMD's competitor to the 980 will show up? I'm planning a new system soon, and it'd be a shame to be locked out of freesync, but they seem to be under total radio silence.

viewtyjoe
Jan 5, 2009

Ralith posted:

Do we have any clues as to when AMD's competitor to the 980 will show up? I'm planning a new system soon, and it'd be a shame to be locked out of freesync, but they seem to be under total radio silence.

AMD is supposedly announcing something on the 25th. Their tweet about it seems to indicate it's GPU related.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Here's hoping it's a strong counter-punch, because they need that and I don't want to see 'em out of the market (it'd take more than one GPU generation, but not many more, if it were a total sweep this generation). Good luck AMD, hang in there :ohdear:

tijag
Aug 6, 2002

Agreed posted:

Here's hoping it's a strong counter-punch, because they need that and I don't want to see 'em out of the market (it'd take more than one GPU generation, but not many more, if it were a total sweep this generation). Good luck AMD, hang in there :ohdear:

If it was the 'full' Tonga, that still wouldn't compete with the 980. Might compete favorably with the 970 though? At least on a pure performance metric?

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

tijag posted:

If it was the 'full' Tonga, that still wouldn't compete with the 980. Might compete favorably with the 970 though? At least on a pure performance metric?

If the 285 had some clock headroom and some modules disabled, a 285X might be able to get in the same ballpark as the GTX 970. However, this would leave the R9-290 really high and dry, as such a card would have to be within spitting distance of the R9-290 while also costing $300 or less.

Edit: I just looked at a lot of GTX 970 review roundups. The GTX 970 is somehow faster than the R9-290X in about half of games, despite lower synthetic benchmarks and theoretical performance. By "faster" I mean "higher average FPS and higher minimum FPS". I don't see what AMD's going to do, given that the 290X is officially still a $500 card that now seems comparable to a $330 one.

Twerk from Home fucked around with this message at 21:16 on Sep 23, 2014

1gnoirents
Jun 28, 2014

hello :)
I too am seriously hoping AMD comes back strong. Or at least adequate... I have this feeling CPUs are one thing, because the world runs on CPUs and there is always opportunity and motive to make them better, even if it's awfully one-sided right now. But GPUs, at least the ones we're talking about, seem far less immune to stagnating badly without real competition.

Lowen SoDium
Jun 5, 2003

Highen Fiber
Clapping Larry
Maybe we can hope that there will be monitors that support both G-Sync and FreeSync.

everythingWasBees
Jan 9, 2013




MSI 970 Gaming is back in stock on Newegg.
Ordered my first dedicated desktop graphics card in like, seven years. :toot:

NickBlasta
May 16, 2003

Clearly their proficiency at shooting is supernatural, not practical, in origin.
I've had my 970 in for a day now (EVGA SC, blower cooler) and it's great. I did get some coil whine playing an older game but it's quieter than the coil whine on my 560ti. :v: I can barely hear it and I have only a couple low-rpm case fans so I don't really see how it's an issue.

1gnoirents
Jun 28, 2014

hello :)
I'm sure it's just relative.

Rastor
Jun 2, 2001

Lowen SoDium posted:

Maybe we can hope that there will be monitors that support both G-Sync and FreeSync.
There will be monitors that support Adaptive Sync, there will be monitors that support G-Sync, and there will be monitors that support both Adaptive Sync and G-Sync. The ones that support G-Sync will cost more.

I think we all know where that road leads.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
If I'm installing 970s in SLI with no PCI slot gaps, then I definitely need a blower model, right? Does the EVGA blower version have the same cooler issues as their custom cooler?

Also, has anyone found a review where they directly compare decibels for MSI and Gigabyte versions?

1gnoirents
Jun 28, 2014

hello :)

Zero VGS posted:

If I'm installing 970s in SLI with no PCI slot gaps, then I definitely need a blower model, right? Does the EVGA blower version have the same cooler issues as their custom cooler?

Also, has anyone found a review where they directly compare decibels for MSI and Gigabyte versions?

The worst part about 770 SLI for me was the heat on the top card, and that was in a mid tower with an ATX board. They were open-air coolers and obviously more wattage, but the bottom card would hit 70 while the top card would easily push over 90 degrees. That said, it was all within spec; it was just annoying. I wasn't sure if it was due to the heat off the back of the bottom card, though, or whether blower coolers would have helped much beyond overall ambient temperature. I'm going to have to make this decision soon as well.

GreatGreen
Jul 3, 2007
That's not what gaslighting means you hyperbolic dipshit.
With the GTX series, can you go into the GeForce control panel and set a hardware limit on the FPS the card will render?

Anti-Hero
Feb 26, 2004
FWIW I have a 580 SLI setup using reference EVGAs with blowers, and I have not noticed that large of a temperature differential between the two cards. However, my case orients the cards vertically rather than horizontally. Hard to say whether that or the fan construction has the larger impact on temperature equalization.

1gnoirents
Jun 28, 2014

hello :)

GreatGreen posted:

With the GTX series, can you go into the GeForce control panel and set a hardware limit on the FPS the card will render?

No, but the overclocking software you (should) use has that capability: either EVGA Precision, if that's available again now, or another flavor like MSI Afterburner via its RivaTuner section. EVGA is easier because it's right there, but last time I tried to download it, it had been pulled because people complained they stole RivaTuner's code. The Afterburner way is like 5 more clicks, but once that section is enabled it's easy to change and sits in your task bar.

GreatGreen
Jul 3, 2007
That's not what gaslighting means you hyperbolic dipshit.

1gnoirents posted:

No, but the overclocking software you (should) use has that capability: either EVGA Precision, if that's available again now, or another flavor like MSI Afterburner via its RivaTuner section. EVGA is easier because it's right there, but last time I tried to download it, it had been pulled because people complained they stole RivaTuner's code. The Afterburner way is like 5 more clicks, but once that section is enabled it's easy to change and sits in your task bar.

Is this overclocking software something you should use alongside the GeForce software or instead of it?

Lowen SoDium
Jun 5, 2003

Highen Fiber
Clapping Larry

Rastor posted:

There will be monitors that support Adaptive Sync, there will be monitors that support G-Sync, and there will be monitors that support both Adaptive Sync and G-Sync. The ones that support G-Sync will cost more.

I think we all know where that road leads.

Is there any advantage to one over the other?

Rastor
Jun 2, 2001

Lowen SoDium posted:

Is there any advantage to one over the other?
Adaptive Sync doesn't cost as much to implement. Because Adaptive Sync products don't exist yet, and in fact G-Sync only barely exists, we don't have reviews to compare performance. nVidia obviously is claiming to have the better technology, but I am personally dubious that they will be able to demonstrate a sufficient performance premium to charge a price premium.

Wistful of Dollars
Aug 25, 2009

Nvidia accepted Adaptive sync much faster than I ever expected. I wonder what they know...

GrizzlyCow
May 30, 2011

El Scotch posted:

Nvidia accepted Adaptive sync much faster than I ever expected. I wonder what they know...

They didn't. The WccfTech article basically assumed that they would. NVIDIA has already said decisively that they won't support FreeSync.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
The day nvidia accepts a competitors technology is the day apple makes a big phone

1gnoirents
Jun 28, 2014

hello :)

GreatGreen posted:

Is this overclocking software something you should use alongside the GeForce software or instead of it?

The GeForce software, assuming you mean GeForce Experience, is just a driver updater more than anything, plus a way to click "ShadowPlay = on". The overclocking utilities are separate and start minimized to the system tray the same way GeForce Experience does. But yes, run them both; they are fairly unrelated. The overclocking utility is also how you overclock your GPU, which is easy but :effort:, but at the very least use it for frame limiting, which I can't recommend highly enough.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
Both companies really, really need to build frame limiting into their first party software. At the very least, default the GPU to not rendering more frames than the monitor's current refresh rate. Would that be so hard?
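For what it's worth, a frame limiter is conceptually tiny. Here's a rough Python sketch of the sleep-until-deadline loop that third-party limiters roughly implement (`render_frame` and the numbers are made up for illustration, not any vendor's API):

```python
import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS  # 16.7 ms budget per frame

def render_frame():
    pass  # stand-in for the real draw/present call

def run(frames):
    deadline = time.perf_counter()
    for _ in range(frames):
        render_frame()
        deadline += FRAME_TIME
        pause = deadline - time.perf_counter()
        if pause > 0:
            # the GPU idles here instead of drawing frames nobody sees
            time.sleep(pause)

start = time.perf_counter()
run(30)
print(time.perf_counter() - start)  # ~0.5 s: 30 frames paced at 60 fps
```

Advancing a fixed deadline (rather than sleeping a fixed amount after each frame) means a slow frame eats into the next frame's sleep instead of dragging the whole schedule late.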

veedubfreak
Apr 2, 2005

by Smythe

Zero VGS posted:

Both companies really, really need to build frame limiting into their first party software. At the very least, default the GPU to not rendering more frames than the monitor's current refresh rate. Would that be so hard?

Apparently yes.

1gnoirents
Jun 28, 2014

hello :)
I was shocked when I first tried it. I couldn't believe it either. But then I realized it basically negated the need for vsync, which I had spent like 2 days prior to that loving around with and generally being unhappy with the various downsides in different modes. Once I started frame limiting, it was like I got 90% of the benefit of vsync with none of the lag or stepping down to 30 fps, AND the card always ran cooler and the output was seriously smoother. So maybe whoever put all the time into vsync doesn't want all that work to be for nothing :shrug:

Hace
Feb 13, 2012

<<Mobius 1, Engage.>>
I get just as much frame tearing with or without frame limiting.

1gnoirents
Jun 28, 2014

hello :)

Hace posted:

I get just as much frame tearing with or without frame limiting.

I know this isn't always an option, but if you can, try setting a custom resolution a tiny bit over 60 Hz, like 66 Hz. If that works, frame limit to 59 or whatever and see if your tearing goes away. I don't know why, but at 60 Hz, if I frame limit to 59-61 I get all the other benefits (heat, smoothness) but sometimes it still tears just as bad (in some games, but not all :confused:). But if I raise the Hz, then it simply goes away, everything else being the same. What I haven't tried is whether I get the same effect if I limit to something like 53 fps at 60 Hz, and whether that affects tearing or not. I'm not sure why it would behave this way, but I've done it like 5 times now just to make sure I wasn't going crazy.

The only monitor I've overclocked wasn't supposed to be overclockable at all, but it could still do 10% over; with earlier monitors I wasn't even aware of the concept.

edit: the one and only exception, and I have no idea why, is TF2.

1gnoirents fucked around with this message at 23:45 on Sep 23, 2014

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Hace posted:

I get just as much frame tearing with or without frame limiting.

Yeah, I get mixed results, but on the other hand limiting the frames will stop the GPU from going MAXIMUM POWER and pulling its full TDP from the wall like it was mining bitcoins. Sometimes enabling V-Sync won't even get a card to dial it back a bit, but frame limiting always works for me, unless of course the third-party software has a stupid limitation, like how some of them (I think the EVGA overclocker) can't set a limit *above* 120 fps, which isn't so good if you have a 144 Hz monitor.

GreatGreen
Jul 3, 2007
That's not what gaslighting means you hyperbolic dipshit.
So I'm going to reveal that I don't really know much about how these cards work at all, but can limiting your max FPS affect your minimum frame rate?

For example if your card is chugging along and outputting something like 100 FPS and experiences a really graphically intense scene or something, it might get pulled down to let's say 80 FPS. If you limit your frames per second to something like 60 FPS and play through the same section of whatever video game that was, will the sudden graphical intensity that happened last time pull the frame rate even lower than 60 FPS because the card wasn't "ready for it" by already operating at the higher power it normally uses to draw 100 FPS?

I guess what I'm asking is whether there is any elasticity to the way these cards draw power on a moment-to-moment basis, or is electrical current drawn per the requirement of each frame, totally independently of the frame before it?

GreatGreen fucked around with this message at 00:12 on Sep 24, 2014

Hace
Feb 13, 2012

<<Mobius 1, Engage.>>

Zero VGS posted:

Yeah, I get mixed results, but on the other hand limiting the frames will stop the GPU from going MAXIMUM POWER and pulling its full TDP from the wall like it was mining bitcoins. Sometimes enabling V-Sync won't even get a card to dial it back a bit, but frame limiting always works for me, unless of course the third-party software has a stupid limitation, like how some of them (I think the EVGA overclocker) can't set a limit *above* 120 fps, which isn't so good if you have a 144 Hz monitor.

No I totally agree, Bionic Commando drawing like 1000FPS and maxing out my 770 by default is really lovely, but I was just saying that there's almost never a case where limiting to 60FPS has resolved screen tearing.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Hace posted:

No I totally agree, Bionic Commando drawing like 1000FPS and maxing out my 770 by default is really lovely, but I was just saying that there's almost never a case where limiting to 60FPS has resolved screen tearing.

I thought that non-vsync framerate limiting tended to make tearing worse? I know that I've seen games hard-coded to be locked at 60fps and if I run them without VSync, I'll get a tear that hangs out in one place on the screen and is way more annoying than an intermittent / rolling tear.
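That stationary-tear behavior matches the arithmetic: without vsync, the tear line sits wherever the scanout happens to be at each buffer flip, so the distance it moves per frame is just the refresh-to-framerate ratio mod 1. A quick back-of-the-envelope check (pure arithmetic, nothing vendor-specific; the function name is mine):

```python
def tear_drift(fps, hz):
    """Fraction of screen height the tear line moves per presented frame,
    assuming flips every 1/fps seconds against a scanout running at hz."""
    return (hz / fps) % 1.0

print(tear_drift(60, 60))            # 0.0   -> tear parked in one spot, maximally visible
print(round(tear_drift(59, 60), 3))  # 0.017 -> slow crawl up the screen
print(round(tear_drift(59, 66), 3))  # 0.119 -> moves fast enough to blur out
```

This is also consistent with the 66 Hz trick mentioned earlier in the thread: bumping the refresh a little off the frame cap makes the tear sweep quickly instead of parking in one place.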

craig588
Nov 19, 2005

by Nyc_Tattoo

GreatGreen posted:

So I'm going to reveal that I don't really know much about how these cards work at all, but can limiting your max FPS affect your minimum frame rate?

For example if your card is chugging along and outputting something like 100 FPS and experiences a really graphically intense scene or something, it might get pulled down to let's say 80 FPS. If you limit your frames per second to something like 60 FPS and play through the same section of whatever video game that was, will the sudden graphical intensity that happened last time pull the frame rate even lower than 60 FPS because the card wasn't "ready for it" by already operating at the higher power draw it normally uses to draw 100 FPS?

The power state switch takes less than a frame to happen; you might get a slightly slower speed for 8 ms. Modern throttling, for both GPUs and CPUs, is really good. Before Sandy Bridge and Kepler I used to always let everything run full speed all the time because the transitions sucked. Now it's all so fast and accurate that I let everything throttle if it's capable of it.

Hace
Feb 13, 2012

<<Mobius 1, Engage.>>
I find that for emulators I still force my GPU to run at 100%, because otherwise I get intermittent pauses all over the place.

1gnoirents
Jun 28, 2014

hello :)

GreatGreen posted:

So I'm going to reveal that I don't really know much about how these cards work at all, but can limiting your max FPS affect your minimum frame rate?

For example if your card is chugging along and outputting something like 100 FPS and experiences a really graphically intense scene or something, it might get pulled down to let's say 80 FPS. If you limit your frames per second to something like 60 FPS and play through the same section of whatever video game that was, will the sudden graphical intensity that happened last time pull the frame rate even lower than 60 FPS because the card wasn't "ready for it" by already operating at the higher power draw it normally uses to draw 100 FPS?

The opposite happens, in general. But you're probably thinking of it in a different way. Say you're at 100 fps and you get a momentary drop to "80 fps" because the GPU is overwhelmed: you see a stutter and it's back up. It's not because it's running 20 fps slower; it's because you're getting a few frames going very slowly, very briefly. Whether it actually reports something like 20 fps, or 80, or 100 depends on when the counter polls. That same thing is pretty much going to happen frame limited or not, although it's probably arguable it's less of an effect when limited.

However, if you truly are just averaging 80 fps, down from 100 because the scene is more difficult to process over time, then a 60 fps frame cap will show no difference at all; it will be butter smooth. The thing is, when the fps is constantly going up and down, even if it's all above your refresh rate, it introduces more opportunities for those stutters and hangs. Plus it's just working the card harder constantly for no benefit. You aren't buying yourself "overhead" to soak up more difficult scenes, I guess is what I'm trying to say. That margin is a good indication of how well the card will hold a frame cap, though, in my experience: say you average 75 fps, and that's really swinging from 60 to 90 fps constantly, and then you frame limit to 60. I've found you can treat that kind of margin as GPU overhead that will "power through" rougher scenes.

I know I'm explaining this poorly, but I've never seen a benefit to letting fps soar way above your refresh rate, except in games where it doesn't matter either way (some less graphically intense games).
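To put numbers on the "a few slow frames, not a slower card" point: one 100 ms hitch buried in a second of otherwise smooth 10 ms frames barely moves the average fps counter, even though that single frame played like 10 fps. (A toy calculation, not measured data:)

```python
# 99 smooth 10 ms frames plus one 100 ms hitch
frame_times = [0.010] * 99 + [0.100]

avg_fps = len(frame_times) / sum(frame_times)  # what an fps counter averages out to
worst_instant = 1.0 / max(frame_times)         # what the hitch frame "ran at"

print(round(avg_fps, 1))        # 91.7 - the counter still looks fine
print(round(worst_instant, 1))  # 10.0 - but you felt the stutter
```

This is why a momentary "drop to 80 fps" on the counter and a steady 80 fps average are completely different experiences.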

GreatGreen
Jul 3, 2007
That's not what gaslighting means you hyperbolic dipshit.
Awesome answers guys, thanks.


Skwee
Apr 29, 2010

たべる つくる
つくる たべる
たべる つくる
ふたり ドゥビドゥバ

everythingWasBees posted:

MSI 970 Gaming is back in stock on Newegg.
Ordered my first dedicated desktop graphics card in like, seven years. :toot:

Really makes me want to RMA my EVGA 970 SC ACX... but that would cost me $65 ($54 restocking and $11 shipping), and I'm not sure it's worth essentially paying $410 + shipping for an MSI card instead of an EVGA card.

e:
Eh, screw it, I will keep the EVGA 970 SC ACX 1.0 and let you all know whether or not it annoys the hell out of me. Not going to pay that ridiculous restocking fee.

Skwee fucked around with this message at 00:29 on Sep 24, 2014
