|
Jan posted:Remind me, what's plaguing Crossfire this time? I haven't really noticed any issues with my 7850s in Bioshock Infinite. I'd assume it's micro stutter.
|
# ¿ Jul 26, 2013 14:33 |
|
If I can find 'em, Daddy's gonna have two 290s for some Crossfire action tonight. I'm VERY impressed with the numbers of the 290 vs the 290X. Should be a nice upgrade from my GTX 680.
|
# ¿ Nov 5, 2013 14:24 |
|
real_scud posted:Apparently you don't even need to wait since the card pcb's are still the same from the 7970 and people have been using the Arctic Accelero III to get a much quieter solution. Are those 2 or 3 slot coolers? I'm going to assume that, like other 3-fan designs, they suck at moving the hot air out the back the way blower cards do? I was thinking of getting that cooler, but for Crossfire I might actually want the blower-type cooling.
|
# ¿ Nov 5, 2013 14:45 |
|
Arzachel posted:I call bullshit on this. The vast majority of launch day review oc tests are hilariously awful. "The slider won't go any higher in catalyst, so I guess that's the best it can do" awful. People are running 290X at 1100 with stock volts on custom air cooling and 1200+ slightly overvolted, why would you think the 290 would do significantly worse? Because he just got a 780
|
# ¿ Nov 5, 2013 16:32 |
|
Rahu X posted:Argh. Fine. That video states it really well at the end: it boils down to what YOU value most, the $100 or the extra heat and noise from your video card.
|
# ¿ Nov 5, 2013 17:07 |
|
I'm in love with my r9 290. 1440p BF4 Ultra settings, 75FPS pretty much constant.
|
# ¿ Nov 9, 2013 01:28 |
|
veedubfreak posted:How is Bf4? I went from being low settings and 40ish fps in MWO to high with the same framerate at 7880x1440. BF4 plays brilliantly. Amazingly smooth on Ultra @ 2560x1440. Tab8715 posted:How bad is the noise with the 290s? Surprisingly not as loud as reviews led me to believe. I guess when you have a decently sound-dampened case, the video card is barely audible over the 3x240mm fans I have running (900ish RPM).
|
# ¿ Nov 9, 2013 17:53 |
|
I think I'm gonna pick up two of those Acceleros. The 290s run great on the stock cooler for the first 2 or so hours of BF4, but after that I think things get too hot and the cards start to throttle. Going to run the fans off a 12v rail, so VRM cooling shouldn't be an issue. I might even opt for all-copper heatsinks for the VRAM/VRMs.
|
# ¿ Nov 12, 2013 13:28 |
|
Tab8715 posted:How do you know it's throttling? The first thing I noticed was choppy gameplay after about 2 hours of straight playing, and when I would check the overdrive panel I could see GPU activity wasn't pegged at 99% and the GPU core was being downclocked.
|
# ¿ Nov 12, 2013 16:12 |
|
Trip report on installing the Accelero Extreme III on 2 r9 290s. One card works amazingly (50C GPU temps under load; the VRMs stay nice and cool too). The fan is on a straight 12v rail. The other card has a short somewhere between the PCIe power connectors and the rest of the GPU. Had to take it apart and clean off all the thermal glue; will try again tonight. Hopefully I didn't kill the card.
|
# ¿ Nov 13, 2013 13:21 |
|
Rahu X posted:That's great to know. How did you remove the thermal adhesive though? I keep hearing that stuff is permanent once applied, or at least if you try to remove the glued parts, you could cause potential damage to the RAM and VRMs. Curing for only an hour means the adhesive is still kind of malleable and you can remove things rather easily.
|
# ¿ Nov 13, 2013 15:07 |
|
Heatsinks are re-curing. Here's hoping I don't have a dead card
|
# ¿ Nov 13, 2013 23:05 |
|
Woot, both cards are fine. Idling at 41C, load temps around 55C. So happy
|
# ¿ Nov 14, 2013 00:31 |
|
Rahu X posted:That's great to hear. There aren't enough individual heatsinks for all the memory chips, but you can more than make do with some of the other heatsinks in the kit. You also have to use one of the small ones on the memory chip closest to the PCIe connector, or else the mounting bracket won't fit. Outside of that I didn't have to "mod" anything; the installation is very straightforward. My only word of caution would be to make absolutely, positively sure you aren't shorting anything near the VRMs.
|
# ¿ Nov 14, 2013 03:53 |
|
So apparently my Corsair AX850 does not have enough power for my 290s in Crossfire. Each card works flawlessly by itself; with both in, they'll run fine until I start running stress tests, at which point they just black screen. Going to pick up an AX1200i tonight. My other option is the Seasonic 1250W or 1000W Platinum-rated beasties. For anyone on the fence about the default heatsinks that come with the Accelero Extreme III: they are more than adequate for cooling, and having the fan run off 12v cools the VRMs effectively (max temp seen on any VRM was 57C). Running BF4 on Ultra (minus MSAA) with the render setting at 150% makes the already stellar-looking game even more stellar.
|
# ¿ Nov 15, 2013 14:38 |
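For a sense of why an 850W unit ends up marginal with two 290s, here's a back-of-envelope power budget. All wattages are ballpark assumptions, not measured figures: roughly 300W per R9 290 under stress loads (custom cooling removes thermal throttling, so they boost harder), ~125W CPU, ~50W for everything else.

```python
# Rough power budget for a two-card R9 290 system on an 850 W PSU.
# Every figure here is a ballpark assumption, not a measurement.
loads_w = {"R9 290 #1": 300, "R9 290 #2": 300, "CPU": 125, "rest": 50}

total = sum(loads_w.values())
headroom = 850 - total
print(f"steady draw ~{total} W, headroom ~{headroom} W")
```

Steady-state that technically fits, but synthetic stress tests can push both cards past their average draw at the same moment, which lines up with black screens appearing only under stress and not in normal games.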
|
dataupload posted:I've seen it mentioned a couple of times about 2GB (e.g. with the GTX 670/680) "not being quite enough" anymore. What are these scenarios and is there any evidence that such an amount of VRAM can cause issues? Resolutions above 1080p, high levels of anti-aliasing, and so on can cause serious performance issues once you run out of VRAM. It depends on the game as well.
|
# ¿ Nov 15, 2013 15:18 |
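A rough sketch of why resolution and MSAA eat VRAM. This only counts the color and depth render targets (textures and geometry dominate real usage, so treat these numbers as lower bounds, and the buffer layout here is a simplifying assumption):

```python
def render_target_mb(width, height, msaa=1):
    # Assume 32-bit color plus 32-bit depth/stencil, each stored
    # once per MSAA sample. Real engines add more buffers on top.
    color = width * height * 4 * msaa
    depth = width * height * 4 * msaa
    return (color + depth) / 1024 ** 2

# 1080p with no AA vs. 1440p and triple-wide 1440p with 4x MSAA.
for (w, h), msaa in [((1920, 1080), 1), ((2560, 1440), 4), ((7680, 1440), 4)]:
    print(f"{w}x{h} {msaa}x MSAA: {render_target_mb(w, h, msaa):.0f} MB")
```

The render targets alone jump from ~16 MB to a few hundred MB, and games also scale texture streaming budgets with resolution, which is how 2GB cards come up short at 1440p and beyond.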
|
Nostrum posted:I'm planning on getting a 290X and keeping my 670 so I can do this: And if only Nvidia wasn't actively blocking hybrid PhysX we wouldn't have to jump through these hoops. It also looks like they've been able to stop the hybridization by directly coding games not to support it. Batman Origins and Hawken won't work. We won't know until newer PhysX-enabled games come out. I'm almost tempted to drop my 680 in with my 2 290s and see what kind of crazy trouble I can get into.
|
# ¿ Nov 18, 2013 17:20 |
|
Ghostpilot posted:I'm not sure if I'd feel comfortable doing that with anything less than a hazmat suit and a fallout shelter. Just got a Corsair AX1200i, I might as well make it sweat a bit
|
# ¿ Nov 18, 2013 17:37 |
|
HalloKitty posted:Huh, I don't really get why. NVIDIA could make a bit from people who have AMD cards. At the end of the day, you had to buy NVIDIA silicon to get the feature, why block it if it's not the renderer? Who the hell knows. But it pisses me off. I'm sure the same will be said for Mantle, if and when it makes any kind of splash/ripple. I'd love to throw in one of my older NV cards to act as a PPU.
|
# ¿ Nov 18, 2013 20:32 |
|
Oh holy crap. The Mod! I want to do this now. Triple H100s in my system. Already spent close to $1k on various upgrades over the last week, making me Agreed Jr., I guess. I have a bunch of unused 5 1/4" bays I could probably mount those to as well. This. loving. Thread
|
# ¿ Nov 19, 2013 16:26 |
|
deimos posted:Until you take the electricity prices from the monster pump I'd need for all that head pressure into account! Couldn't you use the heat pressure from the geo-thermal vent?
|
# ¿ Nov 29, 2013 21:05 |
|
El Scotch posted:I'm hardly any better. As Agreed and I poked fun at each other about last week, I have two 290s. I'm not much of a model of sensible choices. New drivers fixed my black screens. Can finally use both cards. It's nice being able to play BF4 + Ultra + 150% rendering. The game looks insanely good at 1440p.
|
# ¿ Dec 4, 2013 04:11 |
|
Agreed posted:I dunno, Alexey seemed like a pretty big fan! Why, he even had plans to integrate it. The ability to hit a broader market?
|
# ¿ Dec 4, 2013 18:55 |
|
Agreed posted:It seems like his English isn't absolutely stellar, so maybe I'm misinterpreting, but it sounds to me like he's changing it from QuickSync to ShadowPlay. That's going from Intel proprietary (which pretty much everyone has these days) to nVidia proprietary and only from the 600-series onward and not all cards. Maybe it's something he would have done all along if NVAPI had been sorta open to public developers before, maybe it really just is a much better capture technology (QuickSync videos don't look very good to me, while Shadowplay videos look great to me). I'm curious as to whether he means to replace it completely or just add support - the way it's worded sounds more like he's changing over entirely to Shadowplay. Hahaha, I read that completely backwards. Now I'm not even sure what he's attempting to do.
|
# ¿ Dec 4, 2013 19:47 |
|
I'm wondering how much I could get for my r9 290s WITH CUSTOM COOLING to make those buttcoins fly faster and further than ever before. Starting bid $1500 for the pair.
|
# ¿ Dec 5, 2013 16:50 |
|
People still use optical drives? You should all be procedurally generating ALL your content with a liquid nitrogen cooled GPU cluster.
|
# ¿ Dec 5, 2013 19:03 |
|
GrizzlyCow posted:Also, TechReport has some data on that R9 290X retailer vs test sample variance issue. It looks kind of bad for AMD. This is so blown out of proportion it's not even funny. Variance in the manufacturing? Say it ain't so
|
# ¿ Dec 13, 2013 22:39 |
|
Guni posted:Goddamit AMD and associated companies (ASUS and the like), where the gently caress are the semi-custom coolers? I've been waiting since launch for the 290 to get some, but this is getting pretty ridiculously long and I'm not going to accept the excessive (IMO) heat and noise that the blower generates. Pull up your pants and build your own
|
# ¿ Dec 17, 2013 17:49 |
|
AMD FreeSync? Hahahah holy crap, didn't see that coming. Hopefully all 3 of my LCDs at least get hacked firmware that supports it.
|
# ¿ Jan 7, 2014 15:49 |
|
So basically the way business is done every day, replacing AMD and NV with company X and Y.
|
# ¿ Jan 8, 2014 23:04 |
|
Magic Underwear posted:You are seriously way too invested in this. The pure number of words you have written about this, not to mention how dramatic you're being, is way out of proportion with the subject matter: a niche gaming graphics feature for high end enthusiasts. I was kinda alluding to this. Don't get me wrong, I'm VERY glad NV has started to push this issue so that LCD manufacturers get off their asses and start pushing out some good gaming hardware and not some poo poo 2fast2furious TN panels. Agreed posted:Honestly, though, it's never as simple as just "the way business is done every day," and oversimplifying can lead to mistaken intuition regarding the nature of the competition going on. Unfortunately you're wrong. Business in just about every single sector works like this. A few lead in innovation and the rest follow suit, or we see knee-jerk reactions. What we're seeing is AMD's knee-jerk reaction and a bit of verbal jousting levied at Nvidia.
|
# ¿ Jan 9, 2014 01:44 |
|
Agreed posted:There's a reason I keep a dedicated PhysX coprocessor and it's basically all Bs. Borderlands and Batmans. It seems like the problem has more to do with how the SMXes juggle the workload than actually just having enough processor power to do it all, but that's a surface read tbh - I don't actually understand SMXes and I think the list of people who really, properly get them is very small. SMs make more sense, but this Kepler stuff is... way out there. So I don't want to say it's DEFINITELY a workload scheduling thing or whatever, but qualitatively I can say that my gaming experience is vastly better if I use my 650Ti to do all the PhysX processing vs. running the same stuff but with my 780Ti handling both PhysX and rendering. It works okay on Low-Medium, but on High it goes to poo poo, especially minimum framerates (which is why I think it must be a workload juggling issue, some thing that forces it to do THIS then THAT in an inefficient way - PhysX is a light CUDA workload, and going from shaders to CUDA quickly seems difficult in the instance of PhysX). I'd agree, it really does feel like the pipeline gets very heavily saturated when set to High. Could just be bad coding on the game's part, or it could be, like you said, a scheduling issue where the PhysX work takes absolute priority. Could also be that the PhysX computations just end up "hogging" a good part of the GPU. I only really notice the slowdown in Borderlands 2 when there's a lot of blood, and I'm not even sure the blood has any fancy fluid computations.
|
# ¿ Jan 9, 2014 17:41 |
|
Starting to see the first real-world "numbers" for Mantle. http://www.rockpapershotgun.com/2014/01/16/see-a-10000-ship-battle-in-stardocks-insane-new-engine/
|
# ¿ Jan 17, 2014 19:11 |
|
beejay posted:The motion blur enabling/disabling and FPS gains were during the DirectX part, not Mantle. Watch the video again. The blur is enabled throughout the Mantle part, and when they switch to DX they have to turn it off.
|
# ¿ Jan 17, 2014 20:14 |
|
beejay posted:Yeah but I don't see where motion blur is causing a hit in Mantle which is what that post seemed to be getting at. To me it looks like they are saying Mantle can run everything turned on whereas with DX you have to turn off things to maintain fps. I mean motion blur itself is kind of dumb but whatever, that's up to the game designers. First bit of video = Mantle + motion blur with VERY smooth framerates. Rest of video = DX + blur for a brief moment, then no blur after. The DX code path runs like total poo poo with the blur enabled. Near the end they switch back to Mantle, which roughly doubles the DX framerate (8 to 16ish). They allude to allowing more "simulation" stuff going on in the background (the turrets, pew pew laser projectiles, etc). Stanley Pain fucked around with this message at 21:18 on Jan 17, 2014 |
# ¿ Jan 17, 2014 21:15 |
|
Jan posted:Which is exactly my point. If their implementation of motion blur causes an apparent 100ms hit, I don't think I can trust anything about their claims of DirectX vs Mantle performance. They're using a blur algorithm that normally needs to be rendered per frame, in realtime. True motion blur is an amazing effect; Crysis did it really well. It'll basically make 20FPS feel smoother than 60 when implemented properly.
|
# ¿ Jan 19, 2014 23:21 |
|
Agreed posted:TXAA Stuff In his defense, I too find that TXAA blurs textures entirely too much, and I'd usually run an injector just to add a bit of sharpness; at that point I might as well use the SMAA injector. It does do a fantastic job with the jaggies, though.
|
# ¿ Jan 29, 2014 19:44 |
|
Agreed posted:You're absolutely free to want whatever you want, of course. He is talking about image quality though and from an absolute perspective. It negatively impacts something that is clearly visible. Loss of texture fidelity is something I, and a lot of other people hate, regardless of how good the AA is. See quincunx AA.
|
# ¿ Jan 29, 2014 20:39 |
|
Agreed posted:Except that you're looking at one arguable upside or downside (different people take it differently) that literally can't be determined in a still frame comparison. Even the more advanced modes of SMAA (which... again, he doesn't understand, as someone points out later on when he mistakenly describes it in the followup discussion) include both spatial and temporal components that aren't going to show in a still frame. I agree that his review is a bit ham-fisted (everything at HardOCP is going to be that way; I do enjoy their PSU reviews, though). TXAA, SMAA, etc. do look good in motion, but that still comes at a cost to overall visual fidelity, as I mentioned earlier. The two games I've played that had it (Secret World and Crysis 3) didn't impress me much, but like you said, that's probably a case of bad fine-tuning and potentially the lack of a good sharpening filter to even out the texture blurring. Where it does look stunning is where you'd normally see a TON of alpha textures, or a lot of movement of objects with aliased shaders on them and whatnot.
|
# ¿ Jan 30, 2014 01:08 |
|
exquisite tea posted:Is it normal for the Geforce Experience optimization tool to randomly switch around its "optimal" configuration without any changes to the system? Last week it was telling me to play AC4 with high shadows, no SSAO and FXAA, now it's saying medium shadows with HBAO 2x and MSAA, neither of which seem to play any better or worse than the settings I was using initially. Don't use the optimization tool. It's awful.
|
# ¿ Jan 30, 2014 14:29 |