|
Twerk from Home posted:Would I be insane if I sold the MSI GTX 970 to a friend for what I paid for it to replace it with a $250 R9-290? They look pretty comparable in performance, and I'm having a bit of buyers remorse about the 970 because my replay of Crysis: Warhead @ 1920x1200 still can't VSync 60fps all the time.
|
# ? Oct 23, 2014 19:24 |
|
|
cisco privilege posted:Warhead seems to favor AMD cards, but that's an awful lot of effort for just one game. If you're wanting to switch mainly to save money there's nothing wrong with that. It just so happens that the main GPU-heavy game I'm playing right now is Company of Heroes 2, which also seems to prefer the R9-290. The other stuff I play, like CS: GO, League of Legends and Dota at 2x DSR, doesn't look so incredible that I can't bear to give it up. It's mostly a question of whether the $350 after sales tax was worth it for the 970, when I could pocket $100 and see equivalent performance. The main reason I got the 970 is I didn't think that AMD could cut prices down to competitiveness, but 290s around $250 are certainly competitive. Edit: Just realized that I'd need to budget for a new PSU as well, an R9-290 on a 5 year old 520W PSU doesn't seem like a good idea. This idea might be right out. Twerk from Home fucked around with this message at 19:41 on Oct 23, 2014 |
# ? Oct 23, 2014 19:28 |
|
Swartz posted:So how do I know for sure if my GPU is throttling (I know this has been asked, I'm hoping for a long, detailed explanation though)? I might be able to help - if the right set of indicators is on for your GPU's power monitoring, it's trying to tell you that it's pretty much giving all it can. Unless nVidia walked it backwards for no apparent reason it should still be there for Maxwell GPUs, though you'll need to check your monitoring suite to know for sure since I didn't do the needful yet (oh, how terrible, not to have a current-gen GPU for a year ). If these flags are maxed, so is your card. If they aren't showing, and it doesn't crash, it's probably good ol' hardware TDP enforcement come to ruin any further OCing. They should be showing for hardware enforcement, that's basically all they are to begin with - but like I said, no first hand experience with Maxwell and I don't wish to mislead. You're spot on about CS and CoP looking fantastic with DSR, by the way - it's nice to see an old game presented so well. I ought to ask the thread on the series if there's a UI mod for >1440p resolutions, because it becomes nearly illegible for me after that. GeDoSaTo is still DX9 only, isn't it? Haven't really paid attention to it, though I probably should have. Never got into Dark Souls II.
|
# ? Oct 23, 2014 20:09 |
|
Twerk from Home posted:Edit: Just realized that I'd need to budget for a new PSU as well, an R9-290 on a 5 year old 520W PSU doesn't seem like a good idea. This idea might be right out.
|
# ? Oct 23, 2014 20:31 |
|
Just realised today that I'm on a 7 year old PSU with the 8th year fast approaching. These Corsairs are workhorses. Will probably upgrade at the next mobo upgrade as I fancy some SLI action now that I've got a bit more money to burn
|
# ? Oct 23, 2014 20:34 |
|
cisco privilege posted:A 520W PSU from that era? Sounds like an OCZ maybe since they loved units in that size. Probably worth replacing anyways regardless of whatever GPU you're using now. I thought that the Corsair 520HX was considered a good unit. It was under warranty until a couple months ago, so I haven't thought of it as a time bomb.
|
# ? Oct 23, 2014 20:43 |
|
Twerk from Home posted:I thought that the Corsair 520HX was considered a good unit. It was under warranty until a couple months ago, so I haven't thought of it as a time bomb. The 5-year lifespan for PSU replacement is generally a good guideline, but I still use a Silverstone Olympia 650W as a testing PSU and it's like, what, 8 years old now? A good-quality overspecced PSU can last a long time if it hasn't been fully loaded for the majority of its life. future ghost fucked around with this message at 21:02 on Oct 23, 2014 |
# ? Oct 23, 2014 20:57 |
|
Agreed posted:I might be able to help - if the right set of indicators is on for your GPU's power monitoring, it's trying to tell you that it's pretty much giving all it can. Unless nVidia walked it backwards for no apparent reason it should still be there for Maxwell GPUs, though you'll need to check your monitoring suite to know for sure since I didn't do the needful yet (oh, how terrible, not to have a current-gen GPU for a year ). Thanks, that helps. I ran Firestrike, then alt-tabbed out to see the usage at my +175 clock. The highest the power % got to was 108 (meaning 2% headroom still); GPU temp was fine the whole time as I set my fan to 100% when playing games or running benchmarks; and GPU usage maxed out at 99%. With that last one, I've never seen it go to 100%. In fact, I think GPU usage has been at 99% before when I had it at +135. So if what I'm thinking is correct and my GPU is throttling a little, should increasing my voltage do the trick? As for Stalker, yeah it looks great with DSR. It looks even better with some GPU-stressing additions I've made to the code (for COP). If you want some compiled binaries and new shaders of the graphics-related stuff I've done, leave your email and I'll send it to you. Swartz fucked around with this message at 21:35 on Oct 23, 2014 |
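(A quick illustrative way to reason about the readings above. This is my own toy arithmetic, not tied to any real monitoring API; it assumes the board's power slider has been raised to a 110% power target, as on many GTX 970/980 cards with the limit maxed.)

```python
# Toy sketch: deciding whether a card is power-throttling from logged
# power-percent samples. Assumption (hypothetical, not from any vendor
# tool): the raised power limit sits at 110% of TDP.

POWER_LIMIT_PCT = 110  # assumed max power target with the slider maxed

def throttle_headroom(samples_pct):
    """Return remaining power headroom (in % of TDP) given peak samples.

    If the peak reading sits at the limit, the card is being held back
    by TDP enforcement rather than by temperature or stability."""
    peak = max(samples_pct)
    return POWER_LIMIT_PCT - peak

# Swartz's Firestrike run peaked at 108% power with GPU usage at 99%:
readings = [96, 102, 108, 105, 99]
print(throttle_headroom(readings))  # 2 -> nearly at the power limit
```

With only 2% of headroom left, raising voltage alone mostly eats into the same power budget, which is why a power-limited card tends not to gain from a voltage bump.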
# ? Oct 23, 2014 21:33 |
|
Twerk from Home posted:Would I be insane if I sold the MSI GTX 970 to a friend for what I paid for it to replace it with a $250 R9-290? They look pretty comparable in performance, and I'm having a bit of buyers remorse about the 970 because my replay of Crysis: Warhead @ 1920x1200 still can't VSync 60fps all the time. I upgraded from a Sapphire R9 290 to an MSI 970 and haven't regretted it. Crysis is one of those games that will never run great unless you invest in SLI. bobby2times fucked around with this message at 22:28 on Oct 23, 2014 |
# ? Oct 23, 2014 22:25 |
|
ASUS GTX 970 and 980 cards in stock: http://www.newegg.com/Product/Product.aspx?Item=N82E16814121899 http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=9515557 Looks like they may be next to stabilize supply after Zotac? Or just randomness.
|
# ? Oct 24, 2014 00:50 |
|
pmchem posted:ASUS GTX 970 and 980 cards in stock: Ugh TigerDirect, get 970's, I want my discount.
|
# ? Oct 24, 2014 01:09 |
|
Civ: Beyond Earth is starting to get reviews. Sadly it's kinda underwhelming, but we may see some really interesting technical stuff coming out. Tech Report is saying that the game is CPU-bound and looking to use Mantle to reduce that overhead, and also that it has built-in frame timing tools and uses SFR rather than AFR to better support asymmetric GPUs like Hybrid CrossFire in APUs and mobile. Sounds neat! http://techreport.com/blog/27258/civ-beyond-earth-with-mantle-aims-to-end-multi-gpu-microstuttering
|
# ? Oct 24, 2014 04:13 |
|
Amazon has the MSI Gaming GTX 980 in stock.
|
# ? Oct 24, 2014 06:19 |
|
Gah if this was a 970 I would jump on it, I still might
|
# ? Oct 24, 2014 07:10 |
|
Factory Factory posted:Civ: Beyond Earth is starting to get reviews. Sadly it's kinda underwhelming, but we may see some really interesting technical stuff coming out. Tech Report is saying that the game is CPU-bound and looking to use Mantle to reduce that overhead, and also that it has built-in frame timing tools and uses SFR rather than AFR to better support asymmetric GPUs like Hybrid CrossFire in APUs and mobile. Sounds neat! So it sounds like this new approach to multi-GPU setups is prioritizing frame latency improvements over raw FPS increases? Looks pretty cool. I wonder how much of a noticeable difference it will make.
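(The latency-vs-throughput tradeoff can be sketched with a toy model. This is my own simplification, not Firaxis's actual scheduler: assume each GPU renders a full frame in 40 ms and splitting work across GPUs scales perfectly.)

```python
# Toy model of AFR vs SFR multi-GPU rendering, all times in milliseconds.
# Assumption (hypothetical): one GPU takes 40 ms per frame and work
# splits across GPUs with no overhead.

FRAME_MS = 40.0

def afr(n_gpus):
    """Alternate-frame rendering: GPUs work on different frames in flight."""
    latency = FRAME_MS                  # each frame still takes one GPU's full time
    frame_interval = FRAME_MS / n_gpus  # but frames complete n times as often
    return latency, frame_interval

def sfr(n_gpus):
    """Split-frame rendering: GPUs share one frame, finishing it sooner."""
    latency = FRAME_MS / n_gpus         # the current frame is done n times faster
    frame_interval = FRAME_MS / n_gpus
    return latency, frame_interval

print(afr(2))  # (40.0, 20.0) -> same latency, doubled frame rate
print(sfr(2))  # (20.0, 20.0) -> halved latency at the same frame rate
```

In this idealized model both modes report the same FPS, but SFR delivers each frame sooner, which is the latency benefit the Tech Report piece describes (real scaling is of course messier than perfect splitting).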
|
# ? Oct 24, 2014 08:03 |
|
It's being sold through a merchant using Amazon fulfillment, not directly by Amazon. Their physical location is like 6 miles from my house though and I picked up a MSI GTX 980 Gaming directly from them. Legit sealed box and everything. They have some iffy reviews but it's mostly for their custom laptops. If you really want the MSI 980 Gaming I'd trust these guys.
|
# ? Oct 24, 2014 09:05 |
|
Total Biscuit showed in his "WTF is Shadow of Mordor" the Texture Quality Settings. The "Ultra" Texture Pack needs 6 GB of VRAM. When I buy a 970 or 980 SLI system, does the VRAM add up? Am I still limited to 4 GB VRAM or do both nvidias add VRAM up to 8 GB VRAM? Sorry if this has been asked before, the FAQ in the OP does not answer that.
|
# ? Oct 24, 2014 09:48 |
|
VRAM does not add, because each card must have its own copy of all data required for rendering. You'll have an effective 4 GB, same as a single card. I would not stress about the ultra textures. It's such a tiny difference that you really have to examine stills to notice them vs. regular high textures.
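(A minimal sketch of the point above, under the assumption that both AFR and SFR mirror all textures and buffers on every card, so usable memory is the smallest single card, not the sum.)

```python
# Why SLI/CrossFire VRAM doesn't add: with mirrored rendering data,
# usable VRAM is bounded by the smallest card, not the total.
# Hypothetical helper for illustration only.

def effective_vram(cards_gb):
    """Usable VRAM (GB) for a mirrored multi-GPU setup (AFR/SFR)."""
    return min(cards_gb)

print(effective_vram([4, 4]))  # 4 -> two GTX 970s still behave like 4 GB
print(sum([4, 4]))             # 8 -> the naive (wrong) expectation
```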
|
# ? Oct 24, 2014 10:08 |
|
Mr.PayDay posted:Total Biscuit showed in his "WTF is Shadow of Mordor" the Texture Quality Settings. The "Ultra" Texture Packs needs 6 GB of VRAM. No, VRAM is not additive in SLI or Crossfire. edit: drat, beaten by a hair.
|
# ? Oct 24, 2014 10:09 |
|
Alright, thank you for the fast answer, lads
|
# ? Oct 24, 2014 10:13 |
|
Mr.PayDay posted:Total Biscuit showed in his "WTF is Shadow of Mordor" the Texture Quality Settings. The "Ultra" Texture Packs needs 6 GB of VRAM. I'm playing Shadow of Mordor right now with my MSI GTX 980 Gaming and Ultra Textures enabled. I don't think it's a constant 60 fps but it's easily playable.
|
# ? Oct 24, 2014 10:27 |
|
Hey guys, I'm looking at a project involving the need for a lot of GPGPU and I was looking at my options. It's going to be a drone that flies around my garden looking for bugs. Just something fun to do and most of the technology is already there. I'm basically only looking at NVIDIA at this point thanks to OpenCV being CUDA optimized (OpenCL at this point is slower than just using a CPU). So my options are two Jetson TK1s, one to look for bugs and the other to control the flightpath/avoid obstacles, or I can use a server with an Arduino board. I don't know which way to lean here. The Jetsons together would be about $400 and need no communication over a server link that is slow and interruptible, but they're nowhere near as powerful as an Intel G3258 + GTX whatever ($500+). Not to mention the weight of the boards and their power requirements. So I was wondering if you guys had any thoughts.
|
# ? Oct 24, 2014 11:16 |
|
Lord Windy posted:Hey guys, I'm looking at a project involving the need for a lot of GPGPU and I was looking at my options. It's going to be a drone that flies around my garden looking for bugs. Just something fun to do and most of the technology is already there. Jetson is a big board. I'd go with an M4 kind of board (or Arduino), optimize the poo poo out of my program, and for what I can't do there in a reasonable time, talk with a server. The K1 is a powerful chip, but it runs Linux (has an entire OS running). An M4 board would be a much weaker CPU, but with no OS you'd have all the resources at your disposal. And neither of them stands a chance against a desktop machine.
|
# ? Oct 24, 2014 13:27 |
|
Is Ubisoft known for patently insane system reqs that have nothing to do with reality? The listed absolute minimum requirements for Assassin's Creed: Unity are a GTX 680 or Radeon 7970 and a 2500K or Phenom II @ 3.0GHz. Shouldn't a Core 2 Quad @ 3GHz or more be fine? I remember that the Phenom II series wasn't quite as good clock for clock as a Core 2. Also, 680 or higher probably means that a 760 won't run it at all, right?
|
# ? Oct 24, 2014 15:28 |
|
Twerk from Home posted:Is Ubisoft known for patently insane system reqs that have nothing to do with reality? The listed absolute minimum requirements for Assassin's Creed: Unity are a GTX 680 or Radeon 7970 and a 2500K or Phenom II @ 3.0GHz. I read that on the console side they were CPU limited because of the AI. I doubt that the PC is going to be limited at all even on a Core 2 Duo. They only bump up the specs to cover up their lovely port so you might need the extra CPU to chug through their slop. But don't worry, it's locked at 30fps. quote:"Technically we're CPU-bound," he said. "The GPUs are really powerful, obviously the graphics look pretty good, but it's the CPU [that] has to process the AI, the number of NPCs we have on screen, all these systems running in parallel. We were quickly bottlenecked by that and it was a bit frustrating, because we thought that this was going to be a tenfold improvement over everything AI-wise, and we realized it was going to be pretty hard." http://techreport.com/news/27168/assassin-creed-unity-is-too-much-for-console-cpus
|
# ? Oct 24, 2014 15:36 |
|
Twerk from Home posted:Is Ubisoft known for patently insane system reqs that have nothing to do with reality? The listed absolute minimum requirements for Assassin's Creed: Unity are a GTX 680 or Radeon 7970 and a 2500K or Phenom II @ 3.0GHz. Game requirements seldom have anything more than the most tenuous relationship with reality.
|
# ? Oct 24, 2014 15:39 |
|
Factory Factory posted:Civ: Beyond Earth is starting to get reviews. Sadly it's kinda underwhelming, but we may see some really interesting technical stuff coming out. Tech Report is saying that the game is CPU-bound and looking to use Mantle to reduce that overhead, and also that it has built-in frame timing tools and uses SFR rather than AFR to better support asymmetric GPUs like Hybrid CrossFire in APUs and mobile. Sounds neat! Can anyone explain why all of these reviews are benchmarking at an absurd 8x AA @ 4K? I wonder if the performance benefits are less distinct at lower levels of AA. Does Mantle include a more efficient anti-aliasing algorithm than DX11 at higher sample counts?
|
# ? Oct 24, 2014 16:22 |
|
Blorange posted:Can anyone explain why all of these reviews are benchmarking at an absurd 8x AA @ 4K? I wonder if the performance benefits are less distinct at lower levels of AA. Does Mantle include a more efficient anti-aliasing algorithm than DX11 at higher sample counts? If the game is getting 45fps at 4K 8xAA, I think you will be getting fairly decent performance at 1080p with no AA. I would imagine they are benching at that res so they can start to see differences between the cards. At lower res all the cards get 300fps.
|
# ? Oct 24, 2014 16:44 |
|
Does anyone have a problem using DSR when cloning displays? If I have my second display off I can use DSR on my primary monitor. If I have both monitors on I do not see the higher resolutions in the games on either monitor. If I have only my second display on I still do not see DSR resolutions. My second monitor is my TV connected via HDMI and my primary is a DVI monitor, using the Gigabyte GTX 970 G1. Also Grid Autosport doesn't output to both displays when using clone mode. It used to work fine with my GTX 560. It's the only game that doesn't work; the second monitor just goes black and says signal lost. I can't find anything because all search results return stuff about the second display feature in the game that displays lap information, not what I need. r0ck0 fucked around with this message at 22:17 on Oct 24, 2014 |
# ? Oct 24, 2014 16:51 |
|
My brand new 970 is artifacting red lines on the screen at stock clocks. Do I want to bother sending it to MSI, or just swap it out at the store for another while it's in the 15-day return period?
|
# ? Oct 24, 2014 16:58 |
|
Twerk from Home posted:My brand new 970 is artifacting red lines on the screen at stock clocks. Do I want to bother sending it to MSI, or just swap it out at the store for another while it's in the 15-day return period? uh, wouldn't swapping be the much faster, easier option?
|
# ? Oct 24, 2014 16:59 |
|
I keep going back and forth on whether or not I want to run SLI. It really feels like there's a camp on each side of the fence, and they're each yelling equally loud. I just don't want to feel like I've made a poor choice if down the road support is once again forgone; however, it also seems like the best solution for VR. If I am building a VR rig, is it in my best interest to shoot for SLI (two 970s)?
|
# ? Oct 24, 2014 22:02 |
|
Knifegrab posted:I keep going back and forth on whether or not I want to run SLI. I really feel like there are two parties on each side of the fence, and they are each yelling equally loud. I just don't want to feel like I've made a poor choice if down the road support is once again forgone, however it also seems like the best solution for VR. Buy one 970 for now, wait and see. The Oculus Rift consumer version is really far away, and nVidia has only talked about the new VR SLI and not delivered anything yet. If it's great and you buy a 2nd 970, it will be much cheaper in a year.
|
# ? Oct 24, 2014 23:06 |
|
GTX 970 showed up today. I pulled the cooler off and lookie what we have here: GTX 980 Reference PCB:
|
# ? Oct 25, 2014 00:42 |
|
What's the advantage?
|
# ? Oct 25, 2014 00:55 |
|
Explain for dummies.
|
# ? Oct 25, 2014 00:56 |
|
Should fit existing 980 waterblocks. (Other than that, probably nothing)
|
# ? Oct 25, 2014 00:56 |
|
El Scotch posted:GTX 970 showed up today. I pulled the cooler off and lookie what we have here: Which brand? model? Also apparently the 670 DCII waterblocks fit the new Asus 970 Strix.
|
# ? Oct 25, 2014 03:21 |
|
veedubfreak posted:Which brand? model? MSI GTX 970 OC (blower).
|
# ? Oct 25, 2014 03:31 |
|
|
El Scotch posted:Should fit existing 980 waterblocks. Good find. I'm looking forward to seeing how much a Corsair HG10 will do for it. It's supposed to be designed for reference boards, even taking the stock cooler's blower to put air across the VRMs and everything else not covered by the bracket+AIO block. Whether that's the nVidia blower or any stock blower remains to be seen.
|
# ? Oct 25, 2014 03:33 |