future ghost
Dec 5, 2005

:byetankie:
Gun Saliva

Twerk from Home posted:

Would I be insane if I sold the MSI GTX 970 to a friend for what I paid for it and replaced it with a $250 R9-290? They look pretty comparable in performance, and I'm having a bit of buyer's remorse about the 970 because my replay of Crysis: Warhead @ 1920x1200 still can't hold a VSynced 60 fps.
Warhead seems to favor AMD cards, but that's an awful lot of effort for just one game. If you're wanting to switch mainly to save money there's nothing wrong with that.


Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

cisco privilege posted:

Warhead seems to favor AMD cards, but that's an awful lot of effort for just one game. If you're wanting to switch mainly to save money there's nothing wrong with that.

It just so happens that the main GPU-heavy game I'm playing right now is Company of Heroes 2, which also seems to prefer the R9-290. The other stuff I play, like CS:GO, League of Legends, and Dota at 2x DSR, doesn't look so incredible that I can't bear to give it up.

It's mostly a question of whether the $350 after sales tax was worth it for the 970, when I could pocket $100 and see equivalent performance. The main reason I got the 970 is that I didn't think AMD could cut prices down to competitiveness, but 290s around $250 are certainly competitive.

Edit: Just realized that I'd need to budget for a new PSU as well, an R9-290 on a 5 year old 520W PSU doesn't seem like a good idea. This idea might be right out.

Twerk from Home fucked around with this message at 19:41 on Oct 23, 2014

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Swartz posted:

So how do I know for sure if my GPU is throttling (I know this has been asked, I'm hoping for a long, detailed explanation though)?

I've been keeping my MSI GTX 970 Gaming at +135 core, +500 memory, all at 110% power limit, and it's been doing great. No artifacts whatsoever.

However, I decided to push it without increasing the power limit or adding voltage. I'm now up to +175 on the core with no artifacting in Unigine Heaven or half an hour of Stalker: COP maxed out at 2x DSR.

If my GPU were throttling at this speed, wouldn't I be getting stuttering or periodic slowdowns? That hasn't happened so far.

Obviously it looks like I got lucky and got one hell of a good overclocker, as I haven't heard of anything like this without increasing voltage. Maybe it will artifact in more demanding games; I need to test that.

I might be able to help - if the right set of indicators is on for your GPU's power monitoring, it's trying to tell you that it's pretty much giving all it can. Unless nVidia walked it backwards for no apparent reason it should still be there for Maxwell GPUs, though you'll need to check your monitoring suite to know for sure since I didn't do the needful yet (oh, how terrible, not to have a current-gen GPU for a year :qq:).

If these flags are maxed, so is your card. If they aren't showing, and it doesn't crash, it's probably good ol' hardware TDP enforcement come to ruin any further OCing. They should be showing for hardware enforcement, that's basically all they are to begin with - but like I said, no first hand experience with Maxwell and I don't wish to mislead.
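
If you want to check the flags programmatically instead of eyeballing a monitoring graph, NVML exposes the throttle reasons as a bitmask. Rough sketch via the pynvml bindings - I haven't run this against a Maxwell card myself, so treat the constant names as assumptions from the NVML docs:

code:

# Poll NVML for why the GPU is (or isn't) pulling its clocks back.
# Assumes the pynvml package and an NVIDIA driver with NVML support.
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetCurrentClocksThrottleReasons, nvmlDeviceGetPowerUsage,
    nvmlClocksThrottleReasonSwPowerCap, nvmlClocksThrottleReasonHwSlowdown,
)

nvmlInit()
try:
    gpu = nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
    reasons = nvmlDeviceGetCurrentClocksThrottleReasons(gpu)  # bitmask
    watts = nvmlDeviceGetPowerUsage(gpu) / 1000.0  # NVML reports milliwatts

    print("Board power: %.1f W" % watts)
    if reasons & nvmlClocksThrottleReasonSwPowerCap:
        print("Throttling: software power cap (your power-limit slider).")
    if reasons & nvmlClocksThrottleReasonHwSlowdown:
        print("Throttling: hardware slowdown (TDP/thermal enforcement).")
    if not reasons:
        print("No throttle reasons reported - clocks are free to boost.")
finally:
    nvmlShutdown()

Run that in a loop while a benchmark hammers the card and you'll see which limiter kicks in first.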

You're spot on about CS and CoP looking fantastic with DSR, by the way - it's nice to see an old game presented so well. I ought to ask the thread on the series if there's a UI mod for >1440p resolutions, because it becomes nearly illegible for me after that.

GeDoSaTo is still DX9 only, isn't it? Haven't really paid attention to it, though I probably should have. Never got into Dark Souls II.

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva

Twerk from Home posted:

Edit: Just realized that I'd need to budget for a new PSU as well, an R9-290 on a 5 year old 520W PSU doesn't seem like a good idea. This idea might be right out.
A 520W PSU from that era? Sounds like an OCZ, maybe, since they loved units in that size. Probably worth replacing anyway, regardless of what GPU you're using now.

DarthBlingBling
Apr 19, 2004

These were also dark times for gamers as we were shunned by others for being geeky or nerdy and computer games were seen as Childs play things, during these dark ages the whispers began circulating about a 3D space combat game called Elite

- CMDR Bald Man In A Box
Just realised today that I'm on a 7-year-old PSU with the 8th year fast approaching. These Corsairs are workhorses. Will probably upgrade at the next mobo upgrade, as I fancy some SLI action now I've got a bit more money to burn.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

cisco privilege posted:

A 520W PSU from that era? Sounds like an OCZ, maybe, since they loved units in that size. Probably worth replacing anyway, regardless of what GPU you're using now.

I thought that the Corsair 520HX was considered a good unit. It was under warranty until a couple months ago, so I haven't thought of it as a time bomb.

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva

Twerk from Home posted:

I thought that the Corsair 520HX was considered a good unit. It was under warranty until a couple months ago, so I haven't thought of it as a time bomb.
Oh no, the HX series is fine. You could probably even get another 3-4 years or so out of it before it presumably-gracefully retires itself. There were just a lot of terrible 520W designs back then from different OEMs, so without knowing the model number I guessed OCZ.

The 5-year lifespan for PSU replacement is generally a good guideline, but I still use a Silverstone Olympia 650W as a testing PSU and it's, what, 8 years old now? A good-quality, overspecced PSU can last a long time if it hasn't been fully loaded for the majority of its life.

future ghost fucked around with this message at 21:02 on Oct 23, 2014

Swartz
Jul 28, 2005

by FactsAreUseless

Agreed posted:

I might be able to help - if the right set of indicators is on for your GPU's power monitoring, it's trying to tell you that it's pretty much giving all it can. Unless nVidia walked it backwards for no apparent reason it should still be there for Maxwell GPUs, though you'll need to check your monitoring suite to know for sure since I didn't do the needful yet (oh, how terrible, not to have a current-gen GPU for a year :qq:).

If these flags are maxed, so is your card. If they aren't showing, and it doesn't crash, it's probably good ol' hardware TDP enforcement come to ruin any further OCing. They should be showing for hardware enforcement, that's basically all they are to begin with - but like I said, no first hand experience with Maxwell and I don't wish to mislead.

You're spot on about CS and CoP looking fantastic with DSR, by the way - it's nice to see an old game presented so well. I ought to ask the thread on the series if there's a UI mod for >1440p resolutions, because it becomes nearly illegible for me after that.

GeDoSaTo is still DX9 only, isn't it? Haven't really paid attention to it, though I probably should have. Never got into Dark Souls II.

Thanks, that helps.

I ran Firestrike, then alt-tabbed out to see the usage at my +175 clock.
The highest the power % got to was 108 (meaning 2% headroom still); GPU temp was fine the whole time, as I set my fan to 100% when playing games or running benchmarks; and GPU usage maxed out at 99%.
On that last one, I've never seen it go to 100%. In fact, I think GPU usage was at 99% before, back when I had it at +135.

So if what I'm thinking is correct and my GPU is throttling a little, should increasing my voltage do the trick?

As for Stalker, yeah, it looks great with DSR.
It looks even better with some GPU-stressing additions I've made to the code (for COP). If you want compiled binaries and new shaders of the graphics-related stuff I've done, leave your email and I'll send them to you.

Swartz fucked around with this message at 21:35 on Oct 23, 2014

bobby2times
Jan 9, 2010

Twerk from Home posted:

Would I be insane if I sold the MSI GTX 970 to a friend for what I paid for it and replaced it with a $250 R9-290? They look pretty comparable in performance, and I'm having a bit of buyer's remorse about the 970 because my replay of Crysis: Warhead @ 1920x1200 still can't hold a VSynced 60 fps.

I upgraded from a Sapphire R9 290 to an MSI 970 and haven't regretted it. Crysis is one of those games that will never run great unless you invest in SLI.

bobby2times fucked around with this message at 22:28 on Oct 23, 2014

pmchem
Jan 22, 2010


ASUS GTX 970 and 980 cards in stock:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814121899
http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=9515557

Looks like they may be next to stabilize supply after Zotac? Or just randomness.

Schiavona
Oct 8, 2008

pmchem posted:

ASUS GTX 970 and 980 cards in stock:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814121899
http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=9515557

Looks like they may be next to stabilize supply after Zotac? Or just randomness.

Ugh, TigerDirect, get 970s in; I want my discount.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Civ: Beyond Earth is starting to get reviews. Sadly it's kinda underwhelming, but we may see some really interesting technical stuff come out of it. Tech Report is saying that the game is CPU-bound and looking to use Mantle to reduce that overhead, and also that it has built-in frame-timing tools and uses SFR rather than AFR to better support asymmetric GPUs like Hybrid CrossFire in APUs and mobile. Sounds neat!

http://techreport.com/blog/27258/civ-beyond-earth-with-mantle-aims-to-end-multi-gpu-microstuttering
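
To make the AFR/SFR distinction concrete (a toy sketch only - nothing to do with Firaxis's actual renderer): AFR hands each GPU whole frames in turn, so one slow frame shows up as microstutter, while SFR has every GPU cooperate on slices of the same frame.

code:

# Toy illustration of AFR vs. SFR work assignment (not real renderer code).

def afr_assign(frame_index, gpu_count):
    """Alternate-frame rendering: each GPU owns whole frames in turn."""
    return frame_index % gpu_count

def sfr_assign(frame_height, gpu_count):
    """Split-frame rendering: each frame is sliced into horizontal bands,
    one per GPU, so every GPU works on the same frame at once."""
    band = frame_height // gpu_count
    slices = []
    for i in range(gpu_count):
        top = i * band
        bottom = frame_height if i == gpu_count - 1 else (i + 1) * band
        slices.append((top, bottom))
    return slices

print([afr_assign(f, 2) for f in range(6)])  # [0, 1, 0, 1, 0, 1]
print(sfr_assign(1080, 2))                   # [(0, 540), (540, 1080)]

The bands don't have to be equal, either, which is presumably how an asymmetric pair like an APU plus a discrete card can both contribute under Hybrid CrossFire.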

Star War Sex Parrot
Oct 2, 2003

Amazon has the MSI Gaming GTX 980 in stock. :ohdear:

Starkk
Dec 31, 2008



Gah, if this was a 970 I would jump on it. I still might :ohdear:

Tacier
Jul 22, 2003

Factory Factory posted:

Civ: Beyond Earth is starting to get reviews. Sadly it's kinda underwhelming, but we may see some really interesting technical stuff come out of it. Tech Report is saying that the game is CPU-bound and looking to use Mantle to reduce that overhead, and also that it has built-in frame-timing tools and uses SFR rather than AFR to better support asymmetric GPUs like Hybrid CrossFire in APUs and mobile. Sounds neat!

http://techreport.com/blog/27258/civ-beyond-earth-with-mantle-aims-to-end-multi-gpu-microstuttering

So it sounds like this new approach to multi-GPU setups is prioritizing frame latency improvements over raw FPS increases? Looks pretty cool. I wonder how much of a noticeable difference it will make.

Eddain
May 6, 2007

It's being sold through a merchant using Amazon fulfillment, not directly by Amazon.

Their physical location is like 6 miles from my house, though, and I picked up an MSI GTX 980 Gaming directly from them. Legit sealed box and everything. They have some iffy reviews, but those are mostly for their custom laptops. If you really want the MSI 980 Gaming, I'd trust these guys.

Mr.PayDay
Jan 2, 2004
life is short - play hard
Total Biscuit showed the texture quality settings in his "WTF is Shadow of Mordor" video. The "Ultra" texture pack needs 6 GB of VRAM.
If I buy a 970 or 980 SLI system, does the VRAM add up? Am I still limited to 4 GB of VRAM, or do both nVidias combine for 8 GB?
Sorry if this has been asked before; the FAQ in the OP does not answer that.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
VRAM does not add up, because each card must keep its own copy of all the data required for rendering. You'll have an effective 4 GB, same as a single card.

I would not stress about the ultra textures. It's such a tiny difference that you really have to examine stills to notice them vs. regular high textures.
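
If you ever want to see it for yourself, every card reports its own separate pool - quick sketch with the pynvml bindings (assuming an NVIDIA driver with NVML support; nothing SLI-specific about the calls):

code:

# List each GPU's own memory pool: SLI mirrors data, it doesn't sum VRAM.
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetCount,
    nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo,
)

nvmlInit()
try:
    for i in range(nvmlDeviceGetCount()):
        gpu = nvmlDeviceGetHandleByIndex(i)
        mem = nvmlDeviceGetMemoryInfo(gpu)
        # Each device has its own total; rendering data is mirrored across
        # cards, not split between them.
        print("GPU %d: %.1f GiB total, %.1f GiB used"
              % (i, mem.total / 2.0**30, mem.used / 2.0**30))
finally:
    nvmlShutdown()

Two 970s show up as two independent 4 GB pools, both holding the same textures.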

Yaoi Gagarin
Feb 20, 2014

Mr.PayDay posted:

Total Biscuit showed the texture quality settings in his "WTF is Shadow of Mordor" video. The "Ultra" texture pack needs 6 GB of VRAM.
If I buy a 970 or 980 SLI system, does the VRAM add up? Am I still limited to 4 GB of VRAM, or do both nVidias combine for 8 GB?
Sorry if this has been asked before; the FAQ in the OP does not answer that.

No, VRAM is not additive in SLI or Crossfire.

edit: drat, beaten by a hair.

Mr.PayDay
Jan 2, 2004
life is short - play hard
Alright, thank you for the fast answer, lads.

Eddain
May 6, 2007

Mr.PayDay posted:

Total Biscuit showed the texture quality settings in his "WTF is Shadow of Mordor" video. The "Ultra" texture pack needs 6 GB of VRAM.
If I buy a 970 or 980 SLI system, does the VRAM add up? Am I still limited to 4 GB of VRAM, or do both nVidias combine for 8 GB?
Sorry if this has been asked before; the FAQ in the OP does not answer that.

I'm playing Shadow of Mordor right now with my MSI GTX 980 Gaming and Ultra Textures enabled. I don't think it's a constant 60 fps but it's easily playable.

Lord Windy
Mar 26, 2010
Hey guys, I'm looking at a project that needs a lot of GPGPU and I'm weighing my options. It's going to be a drone that flies around my garden looking for bugs. Just something fun to do, and most of the technology is already there.

I'm basically only looking at NVIDIA at this point, thanks to OpenCV being CUDA-optimized (OpenCL at this point is slower than just using a CPU). So my options are two Jetson TK1s, one to look for bugs and the other to control the flight path and avoid obstacles, or a server plus an Arduino board.

I don't know which way to lean here. The Jetsons together would be about $400 and wouldn't need to talk to a server, which would be slow and interruptible, but they're nowhere near as powerful as an Intel G3258 + GTX whatever ($500+). That's not even counting the weight of the boards and their power requirements. So I was wondering if you guys had any thoughts.
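
For scale, the per-frame work I have in mind is something like this - a rough sketch with OpenCV's CUDA module (assumes a build compiled with CUDA support; the thresholds are made up and the pipeline is just illustrative):

code:

# GPU-side preprocessing for spotting bug-sized blobs in a camera frame.
# Assumes an OpenCV build with the CUDA modules enabled.
import cv2

def find_bug_candidates(frame):
    gpu_frame = cv2.cuda_GpuMat()
    gpu_frame.upload(frame)                       # host -> device copy

    gpu_gray = cv2.cuda.cvtColor(gpu_frame, cv2.COLOR_BGR2GRAY)

    # Blur, then edge-detect on the GPU; small dark blobs against foliage
    # tend to survive as closed contours.
    blur = cv2.cuda.createGaussianFilter(cv2.CV_8UC1, cv2.CV_8UC1, (5, 5), 0)
    gpu_blurred = blur.apply(gpu_gray)
    canny = cv2.cuda.createCannyEdgeDetector(50, 150)
    gpu_edges = canny.detect(gpu_blurred)

    edges = gpu_edges.download()                  # device -> host copy
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [c for c in contours if cv2.contourArea(c) > 20]

Whether the TK1 can run that at a useful frame rate while also flying the thing is exactly what I'm unsure about.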

Volguus
Mar 3, 2009

Lord Windy posted:

Hey guys, I'm looking at a project that needs a lot of GPGPU and I'm weighing my options. It's going to be a drone that flies around my garden looking for bugs. Just something fun to do, and most of the technology is already there.

I'm basically only looking at NVIDIA at this point, thanks to OpenCV being CUDA-optimized (OpenCL at this point is slower than just using a CPU). So my options are two Jetson TK1s, one to look for bugs and the other to control the flight path and avoid obstacles, or a server plus an Arduino board.

I don't know which way to lean here. The Jetsons together would be about $400 and wouldn't need to talk to a server, which would be slow and interruptible, but they're nowhere near as powerful as an Intel G3258 + GTX whatever ($500+). That's not even counting the weight of the boards and their power requirements. So I was wondering if you guys had any thoughts.

Jetson is a big board. I'd go with an M4-class board (or Arduino), optimize the poo poo out of my program, and for whatever I can't do there in a reasonable time, talk to a server. The K1 is a powerful chip, but it runs Linux (an entire OS). An M4 board would be a much weaker CPU, but with no OS you'd have all the resources at your disposal. And neither of them stands a chance against a desktop machine.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.
Is Ubisoft known for patently insane system reqs that have nothing to do with reality? The listed absolute minimum requirements for Assassin's Creed: Unity are a GTX 680 or Radeon 7970 and a 2500K or Phenom II @ 3.0GHz.

Shouldn't a Core 2 Quad @ 3GHz or more be fine? I remember that the Phenom II series wasn't quite as good clock-for-clock as a Core 2. Also, "680 or higher" probably means a 760 won't run it at all, right?

r0ck0
Sep 12, 2004
r0ck0s p0zt m0d3rn lyf

Twerk from Home posted:

Is Ubisoft known for patently insane system reqs that have nothing to do with reality? The listed absolute minimum requirements for Assassin's Creed: Unity are a GTX 680 or Radeon 7970 and a 2500K or Phenom II @ 3.0GHz.

Shouldn't a Core 2 Quad @ 3GHz or more be fine? I remember that the Phenom II series wasn't quite as good clock-for-clock as a Core 2. Also, "680 or higher" probably means a 760 won't run it at all, right?

I read that on the console side they were CPU-limited because of the AI. I doubt the PC version is going to be limited at all, even on a Core 2 Duo. They only bump up the specs to cover up their lovely port, so you might need the extra CPU to chug through their slop. But don't worry, it's locked at 30 fps.

quote:

"Technically we're CPU-bound," he said. "The GPUs are really powerful, obviously the graphics look pretty good, but it's the CPU [that] has to process the AI, the number of NPCs we have on screen, all these systems running in parallel We were quickly bottlenecked by that and it was a bit frustrating, because we thought that this was going to be a tenfold improvement over everything AI-wise, and we realized it was going to be pretty hard.

http://techreport.com/news/27168/assassin-creed-unity-is-too-much-for-console-cpus

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

Twerk from Home posted:

Is Ubisoft known for patently insane system reqs that have nothing to do with reality? The listed absolute minimum requirements for Assassin's Creed: Unity are a GTX 680 or Radeon 7970 and a 2500K or Phenom II @ 3.0GHz.

Shouldn't a Core 2 Quad @ 3GHz or more be fine? I remember that the Phenom II series wasn't quite as good clock-for-clock as a Core 2. Also, "680 or higher" probably means a 760 won't run it at all, right?

Game requirements seldom have anything more than the most tenuous relationship with reality.

Blorange
Jan 31, 2007

A wizard did it

Factory Factory posted:

Civ: Beyond Earth is starting to get reviews. Sadly it's kinda underwhelming, but we may see some really interesting technical stuff come out of it. Tech Report is saying that the game is CPU-bound and looking to use Mantle to reduce that overhead, and also that it has built-in frame-timing tools and uses SFR rather than AFR to better support asymmetric GPUs like Hybrid CrossFire in APUs and mobile. Sounds neat!

http://techreport.com/blog/27258/civ-beyond-earth-with-mantle-aims-to-end-multi-gpu-microstuttering

Can anyone explain why all of these reviews are benchmarking at an absurd 8x AA @ 4K? I wonder if the performance benefits are less distinct at lower levels of AA. Does Mantle include a more efficient anti-aliasing algorithm than DX11 at higher sample counts?

r0ck0
Sep 12, 2004
r0ck0s p0zt m0d3rn lyf

Blorange posted:

Can anyone explain why all of these reviews are benchmarking at an absurd 8x AA @ 4K? I wonder if the performance benefits are less distinct at lower levels of AA. Does Mantle include a more efficient anti-aliasing algorithm than DX11 at higher sample counts?

If the game is getting 45 fps at 4K with 8x AA, I think you'll get a fairly decent framerate at 1080p with no AA. I would imagine they're benching at that resolution so they can start to see differences between the cards; at lower res, all the cards get 300 fps.

r0ck0
Sep 12, 2004
r0ck0s p0zt m0d3rn lyf
Does anyone have a problem using DSR when cloning displays? If I have my second display off, I can use DSR on my primary monitor. If both monitors are on, I don't see the higher resolutions in games on either monitor. If only my second display is on, I still don't see the DSR resolutions. My second monitor is my TV connected via HDMI and my primary is a DVI monitor, on a Gigabyte GTX 970 G1.

Also, GRID Autosport doesn't output to both displays when using clone mode. It used to work fine with my GTX 560. It's the only game that doesn't work; the second monitor just goes black and says "signal lost." I can't find anything because all search results return stuff about the game's second-display feature that shows lap information, not what I need.

r0ck0 fucked around with this message at 22:17 on Oct 24, 2014

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.
My brand new 970 is artifacting red lines on the screen at stock clocks. Do I want to bother sending it to MSI, or should I just swap it out at the store for another while it's in the 15-day return period?

r0ck0
Sep 12, 2004
r0ck0s p0zt m0d3rn lyf

Twerk from Home posted:

My brand new 970 is artifacting red lines on the screen at stock clocks. Do I want to bother sending it to MSI, or should I just swap it out at the store for another while it's in the 15-day return period?

Uh, wouldn't swapping be the much faster, easier option?

Knifegrab
Jul 30, 2014

Gadzooks! I'm terrified of this little child who is going to stab me with a knife. I must wrest the knife away from his control and therefore gain the upperhand.
I keep going back and forth on whether or not I want to run SLI. I really feel like there's a party on each side of the fence, and they're each yelling equally loudly. I just don't want to feel like I've made a poor choice if down the road support is once again forgone, but it also seems like the best solution for VR.

If I am building a VR rig, is it in my best interest to shoot for SLI (two 970s)?

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Knifegrab posted:

I keep going back and forth on whether or not I want to run SLI. I really feel like there's a party on each side of the fence, and they're each yelling equally loudly. I just don't want to feel like I've made a poor choice if down the road support is once again forgone, but it also seems like the best solution for VR.

If I am building a VR rig, is it in my best interest to shoot for SLI (two 970s)?

Buy one 970 for now and wait and see. The consumer version of the Oculus Rift is really far away, and nVidia has only talked about the new VR SLI, not delivered anything yet. If it's great and you buy a second 970, it'll be much cheaper in a year.

Wistful of Dollars
Aug 25, 2009

GTX 970 showed up today. I pulled the cooler off and lookie what we have here:

[image: the 970's bare PCB]

GTX 980 Reference PCB:

[image: reference GTX 980 PCB]

:sun:

beejay
Apr 7, 2002

What's the advantage?

Deuce
Jun 18, 2004
Mile High Club

Explain for dummies.

Wistful of Dollars
Aug 25, 2009

Should fit existing 980 waterblocks.

(Other than that, probably nothing)

veedubfreak
Apr 2, 2005

by Smythe

El Scotch posted:

GTX 970 showed up today. I pulled the cooler off and lookie what we have here:

[image: the 970's bare PCB]

GTX 980 Reference PCB:

[image: reference GTX 980 PCB]

:sun:

Which brand? model?

Also apparently the 670 DCII waterblocks fit the new Asus 970 Strix.

Wistful of Dollars
Aug 25, 2009

veedubfreak posted:

Which brand? model?

Also apparently the 670 DCII waterblocks fit the new Asus 970 Strix.

MSI GTX 970 OC (blower).


Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

El Scotch posted:

Should fit existing 980 waterblocks.

(Other than that, probably nothing)

Good find. I'm looking forward to seeing how much a Corsair HG10 will do for it. It's supposed to be designed for reference boards, even reusing the stock cooler's blower to push air across the VRMs and everything else not covered by the bracket + AIO block.

Whether that's the nVidia blower or any stock blower remains to be seen.
