Stanley Pain
Jun 16, 2001

by Fluffdaddy

Jan posted:

Remind me, what's plaguing Crossfire this time? I haven't really noticed any issues with my 7850s in Bioshock Infinite.


I'd assume it's micro stutter.

Stanley Pain
Jun 16, 2001

by Fluffdaddy
If I can find 'em, Daddy's gonna have two 290s for CrossFire action tonight ;) I'm VERY impressed with the numbers the 290 puts up vs. the 290X. Should be a nice upgrade from my GTX 680.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

real_scud posted:

Apparently you don't even need to wait, since the card PCBs are unchanged from the 7970 and people have been using the Arctic Accelero III to get a much quieter solution.

Are those 2- or 3-slot coolers? I'm going to assume that, like other 3-fan designs, they're bad at exhausting hot air out the back the way blower cards do?

I was thinking of getting that cooler, but for CrossFire I might actually want the blower-style cooling.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Arzachel posted:

I call bullshit on this. The vast majority of launch-day review OC tests are hilariously awful. "The slider won't go any higher in Catalyst, so I guess that's the best it can do" awful. People are running 290Xs at 1100 MHz on stock volts with custom air cooling, and 1200+ slightly overvolted, so why would you think the 290 would do significantly worse?

Because he just got a 780 ;)

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Rahu X posted:

Argh. Fine. :argh:

Maybe you're right. Maybe I should get on the phone with Newegg and request they change that replacement RMA to a refund, and use the money I get back to buy a 290 and an Xtreme III instead.

While I'm not too keen on having a 3-slot card, I can't argue with the pressure. :sweatdrop:

That video puts it really well at the end. It boils down to what YOU value most: saving $100, or a cooler, quieter video card.

Stanley Pain
Jun 16, 2001

by Fluffdaddy
I'm in love with my R9 290. BF4 at 1440p on Ultra settings, 75 FPS pretty much constant.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

veedubfreak posted:

How is BF4? I went from low settings and 40ish FPS in MWO to high with the same framerate at 7680x1440.

So far I have the card at 1047/5200. Waiting to see how far I can get before voltage becomes an issue. Running at 46C at full load.


BF4 plays brilliantly. Amazingly smooth on Ultra @ 2560x1440.

Tab8715 posted:

How bad is the noise with the 290s?

Surprisingly, not as loud as the reviews led me to believe. I guess when you have a decently sound-dampened case, the video card is barely audible over the 3x240mm fans I have running (900ish RPM).

Stanley Pain
Jun 16, 2001

by Fluffdaddy
I think I'm gonna pick up two of those Acceleros. The 290s run great on the stock cooler for the first 2 or so hours of BF4, but after that I think things are getting too hot and the card starts to throttle. Going to run the fans off a 12V rail, so I don't think VRM cooling will be an issue. I might even opt for all-copper heatsinks for the VRAM/VRMs.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Tab8715 posted:

How do you know it's throttling?

The first thing I noticed was choppy gameplay after about 2 hours of straight playing; when I checked the OverDrive panel, I could see GPU activity wasn't pegged at 99% and the GPU core was being downclocked.

Stanley Pain
Jun 16, 2001

by Fluffdaddy
Trip report on installing the Accelero Xtreme III on 2 R9 290s. One card works amazingly (50C GPU temps under load; the VRMs stay nice and cool too). The fan is on a straight 12V rail.

The other card has a short somewhere between the PCIe power connectors and the rest of the GPU. Had to take it apart, clean off all the thermal glue and will try again tonight. :negative: Hopefully I didn't kill the card :(

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Rahu X posted:

That's great to know. How did you remove the thermal adhesive, though? I keep hearing that stuff is permanent once applied, or at least that if you try to remove the glued parts you could damage the RAM and VRMs.

Curing for only an hour means the adhesive is still kind of malleable, so you can remove things rather easily.

Stanley Pain
Jun 16, 2001

by Fluffdaddy
Heatsinks are re-curing. Here's hoping I don't have a dead card :(

Stanley Pain
Jun 16, 2001

by Fluffdaddy
Woot, both cards are fine. Idling at 41C, load temps around 55C. So happy :D

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Rahu X posted:

That's great to hear.

I have one other question though. Did you have to use any more heatsinks than the ones that came with the coolers? I'd assume not if Tom's was able to install it just fine, but I also hear conflicting reports on more heatsinks being required.

Just curious, as I'm getting sort of impatient waiting for the guide from Tom's, and from most of the installation videos I've seen on other GPUs it actually seems really simple. Just takes a while because of the curing is all.

There aren't enough single heatsinks for the memory, but you can more than make do with some of the other heatsinks you get in the kit. You also have to use one of the small ones on the memory chip closest to the PCIe connector, or else the mounting bracket won't fit. Outside of that I didn't have to "mod" anything.

The installation is very straightforward. My only word of caution would be to make absolutely, positively sure you aren't shorting anything near the VRMs. :)

Stanley Pain
Jun 16, 2001

by Fluffdaddy
So apparently my Corsair AX850 does not have enough power for my 290s in CrossFire. Each card works flawlessly by itself, but with both in, they'll run fine until I start some stress tests, at which point they just black screen. Going to pick up an AX1200i tonight. My other option is one of the Seasonic 1250W or 1000W Platinum-rated beasties.


For anyone on the fence about the default heatsinks that come with the Accelero Xtreme III: they are more than adequate for cooling, and having the fan run off 12V cools the VRMs effectively (max temp seen on any VRM was 57C). Running BF4 on Ultra (minus MSAA) with the render setting at 150% makes the already stellar-looking game even more stellar :haw:

Stanley Pain
Jun 16, 2001

by Fluffdaddy

dataupload posted:

I've seen it mentioned a couple of times that 2GB (e.g. with the GTX 670/680) is "not quite enough" anymore. What are these scenarios, and is there any evidence that that amount of VRAM can cause issues?

Resolutions above 1080p, high levels of anti-aliasing, etc. can cause serious performance issues once you run out of VRAM. It depends on the game as well.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

And if only Nvidia weren't actively blocking hybrid PhysX, we wouldn't have to jump through these hoops. It also looks like they've been able to stop the hybrid setup by having games directly coded not to support it: Batman: Arkham Origins and Hawken won't work. We won't know the full picture until newer PhysX-enabled games come out.


I'm almost tempted to drop my 680 in with my 2 290s and see what kind of crazy trouble I can get into :getin:

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Ghostpilot posted:

I'm not sure if I'd feel comfortable doing that with anything less than a hazmat suit and a fallout shelter.

Just got a Corsair AX1200i; I might as well make it sweat a bit ;)

Stanley Pain
Jun 16, 2001

by Fluffdaddy

HalloKitty posted:

Huh, I don't really get why. NVIDIA could make a bit of money from people who have AMD cards. At the end of the day you had to buy NVIDIA silicon to get the feature, so why block it if their card isn't the renderer?

Who the hell knows. But it pisses me off. I'm sure the same will be said for Mantle, if and when it makes any kind of splash/ripple. I'd love to throw in one of my older NV cards to act as a PPU.

Stanley Pain
Jun 16, 2001

by Fluffdaddy
Oh holy crap. The Mod! I want to do this now. Triple H100s in my system. :allears:


Already spent close to $1k on various upgrades over the last week, making me Agreed Jr., I guess. I have a bunch of unused 5.25" bays I could probably mount those in as well. This. loving. Thread :keke:

Stanley Pain
Jun 16, 2001

by Fluffdaddy

deimos posted:

Until you take the electricity prices from the monster pump I'd need for all that head pressure into account!

(Don't think I haven't looked into it)

Couldn't you use the heat pressure from the geothermal vent?

Stanley Pain
Jun 16, 2001

by Fluffdaddy

El Scotch posted:

I'm hardly any better. As Agreed and I poked fun at each other about last week, I have two 290s. I'm not much of a model of sensible choices.

New drivers fixed my black screens. Can finally use both cards :dance:


It's nice being able to play BF4 + Ultra + 150% Rendering. Game looks insanely good at 1440p.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Agreed posted:

I dunno, Alexey seemed like a pretty big fan! Why, he even had plans to integrate it.


But then something happened.


Hmm.

Guessing some money changed hands there, what do you think? ;)

Edit: Seriously, what do you think, it can be read two ways - is he going from the broadly compatible QuickSync solution found in, what, 5.3 forward to ShadowPlay only in 5.5 forward? If so, why? Because it's better? Because he got paid? Or am I just misreading him entirely?


The ability to hit a broader market?

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Agreed posted:

It seems like his English isn't absolutely stellar, so maybe I'm misinterpreting, but it sounds to me like he's changing it from QuickSync to ShadowPlay. That's going from Intel proprietary (which pretty much everyone has these days) to nVidia proprietary and only from the 600-series onward and not all cards. Maybe it's something he would have done all along if NVAPI had been sorta open to public developers before, maybe it really just is a much better capture technology (QuickSync videos don't look very good to me, while Shadowplay videos look great to me). I'm curious as to whether he means to replace it completely or just add support - the way it's worded sounds more like he's changing over entirely to Shadowplay.

Hahaha, I read that completely backwards. Now I'm not even sure what he's attempting to do.

Stanley Pain
Jun 16, 2001

by Fluffdaddy
I'm wondering how much I could get for my R9 290s WITH CUSTOM COOLING to make those buttcoins fly faster and further than ever before.


Starting bid $1500 for the pair.

Stanley Pain
Jun 16, 2001

by Fluffdaddy
People still use optical drives? :psyduck:


You should all be procedurally generating ALL your content with a liquid nitrogen cooled GPU cluster.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

GrizzlyCow posted:

Also, TechReport has some data on that R9 290X retailer vs test sample variance issue. It looks kind of bad for AMD.

This is so blown out of proportion it's not even funny. Variance in the manufacturing? Say it ain't so ;)

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Guni posted:

Goddammit AMD and associated companies (ASUS and the like), where the gently caress are the semi-custom coolers? I've been waiting since launch for the 290 to get some, but this is getting ridiculously long, and I'm not going to accept the excessive (IMO) heat and noise that the blower generates.

Pull up your pants and build your own :clint:

Stanley Pain
Jun 16, 2001

by Fluffdaddy
AMD FreeSync? Hahaha, holy crap, didn't see that coming. Hopefully all 3 of my LCDs at least get hacked firmware that supports it. :xd:

Stanley Pain
Jun 16, 2001

by Fluffdaddy
So, basically the way business is done every day; swap AMD and NV for company X and company Y.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Magic Underwear posted:

You are seriously way too invested in this. The sheer number of words you have written about this, not to mention how dramatic you're being, is way out of proportion to the subject matter: a niche gaming graphics feature for high-end enthusiasts.

I was kinda alluding to this.


Don't get me wrong, I'm VERY glad NV has started to push this issue so that LCD manufacturers get off their asses and start pushing out some good gaming hardware and not some poo poo 2fast2furious TN panels.


Agreed posted:

Honestly, though, it's never as simple as just "the way business is done every day," and oversimplifying can lead to mistaken intuition regarding the nature of the competition going on.

Unfortunately you're wrong. Business in just about every single sector works like this: a few lead in innovation, and the rest follow suit or react. What we're seeing is AMD's knee-jerk reaction and a bit of verbal jousting levied at Nvidia.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Agreed posted:

There's a reason I keep a dedicated PhysX coprocessor and it's basically all Bs. Borderlands and Batmans. It seems like the problem has more to do with how the SMXes juggle the workload than actually just having enough processor power to do it all, but that's a surface read tbh - I don't actually understand SMXes and I think the list of people who really, properly get them is very small. SMs make more sense, but this Kepler stuff is... way out there. So I don't want to say it's DEFINITELY a workload scheduling thing or whatever, but qualitatively I can say that my gaming experience is vastly better if I use my 650Ti to do all the PhysX processing vs. running the same stuff but with my 780Ti handling both PhysX and rendering. It works okay on Low-Medium, but on High it goes to poo poo, especially minimum framerates (which is why I think it must be a workload juggling issue, something that forces it to do THIS then THAT in an inefficient way - PhysX is a light CUDA workload, and going from shaders to CUDA quickly seems difficult in the instance of PhysX).


I'd agree, it really does feel like the pipeline gets very heavily saturated when set to High. Could just be bad coding on the game's part, or it could be, like you said, a scheduling issue where the PhysX work takes absolute priority. Could also be that the PhysX computations just end up "hogging" a good part of the GPU.

I only really notice the slowdown in Borderlands 2 when there's a lot of blood. I'm not even sure if the blood has any fancy fluid computations.

Stanley Pain
Jun 16, 2001

by Fluffdaddy
Starting to see the first real-world "numbers" for Mantle.

http://www.rockpapershotgun.com/2014/01/16/see-a-10000-ship-battle-in-stardocks-insane-new-engine/

Stanley Pain
Jun 16, 2001

by Fluffdaddy

beejay posted:

The motion blur enabling/disabling and FPS gains were during the DirectX part, not Mantle.
I don't even know what the above comment referencing CPUs is talking about.

Watch the video again. Motion blur is enabled throughout the Mantle part, and then when DX is enabled they have to turn it off.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

beejay posted:

Yeah but I don't see where motion blur is causing a hit in Mantle which is what that post seemed to be getting at. To me it looks like they are saying Mantle can run everything turned on whereas with DX you have to turn off things to maintain fps. I mean motion blur itself is kind of dumb but whatever, that's up to the game designers.


I see what you mean; I took it to be the engine, but I'm not so sure. Also, they were running that demo on an i7 :v: I think they were mostly just saying that with the GPU taking so much of the load, the CPU is freer to do other things.



First bit of video = Mantle + Motion Blur with VERY smooth framerates.


Rest of video = DX + blur for a brief moment, then no blur after. The DX code path runs like total poo poo with the blur enabled.

Near the end they switch back to Mantle, which doubles the DX framerate (8 to 16ish :v:)


They allude to allowing more "simulation" stuff going on in the background (the turrets and pew pew laser projectiles, etc).

Stanley Pain fucked around with this message at 21:18 on Jan 17, 2014

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Jan posted:

Which is exactly my point. If their implementation of motion blur causes an apparent 100ms hit, I don't think I can trust anything about their claims of DirectX vs Mantle performance.

They're using a blur algorithm that normally needs to be rendered on a scene-by-scene basis, in realtime. True motion blur is an amazing effect; Crysis did it really well. Implemented properly, it can basically make 20 FPS feel smoother than 60.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Agreed posted:

TXAA Stuff

In his defense, I too find that TXAA blurs textures entirely too much; I'd usually run an injector just to add a bit of sharpness back, at which point I might as well use the SMAA injector. It does do a fantastic job with the jaggies though ;)

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Agreed posted:

You're absolutely free to want whatever you want, of course.

I don't care that he has an opinion on what TXAA does re: "is that blurry? that looks blurry, I don't like blurry, I don't think I like TXAA in this game!," I care that he has no clue what TXAA does at all, or SMAA for that matter. You'd expect, I think rightfully, that someone who claims expertise in video card technologies (nearly seventeen years, according to his [H] badge?) would at least know what the hell he's talking about. But that's apparently just not ... nope. Nope nope nope. Better luck next reviewer, I guess.

He is talking about image quality though, and from an absolute perspective: it negatively impacts something that is clearly visible. Loss of texture fidelity is something I, and a lot of other people, hate, regardless of how good the AA is. See quincunx AA.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Agreed posted:

Except that you're looking at one arguable upside or downside (different people take it differently) that literally can't be determined in a still frame comparison. Even the more advanced modes of SMAA (which... again, he doesn't understand, as someone points out later on when he mistakenly describes it in the followup discussion) include both spatial and temporal components that aren't going to show in a still frame.

If anything, I think TXAA in Crysis 3 suffers from a lack of finer tuning and looks its best outdoors in wide open areas where it can be remarkably filmic. But there was a lot of contention when the idea of filmic AA versus "just better AA, please, we'd like less jaggies and that's it" started as a conversation. I think inevitably overall image quality is going to win simply as a byproduct of the fact that pixel density is going to make really high perf hit AA methods totally unnecessary, with only relatively minor AA for conventional visual acuity necessary; then what do we do with these powerful GPUs?

TXAA is an answer that's probably a bit early, but that's Tim Lottes for you, the guy thinks ahead (did you check his "next gen wishlist?" it's pretty much dead on for how to improve the user experience with some simple developer paradigm adjustments compared to how games are currently made). I also don't know how much work he was able to finish on it before the big career change, so I don't know if it's as close to being an ideal version of itself as SMAA is to being an ideal version of itself given that it has had a considerable amount of work done to keep it current. TXAA is probably on the "very very filmic" side of "how should videogames look?" question, whereas SMAA is still pretty balanced in terms of just removing jaggies but also allowing for the incorporation of some filmic aspects that aren't quite as consuming.

Of course, then enters the question of how engines should work as such, and it's a whole 'nother can of worms - should it be tiled deferred rendering? Forward+? Clustered shading? Which rendering model gets us closer to true global illumination, since talking about visual acuity, moving on from our current lighting models to even imperfect VSV would be an incredible leap in realism/artistic appearance (since, y'know, realism isn't always what it's about - TXAA doesn't enhance realism, for example, it just makes things look more cinematic, which isn't a goal every gamer shares, one particular visionary rendering guru's notions of how things should work aside)... It's not that the goalposts start moving, it's that the question blows wide open, and now is sort of the best time to do it since we're going to be able to really work with tech in a new way without former "ugh consoles!" limitations holding us back to anything like the same degree as before.

And I'd like for people who are going to be talking a lot about these technologies' practical implementations to have some grounding in what the hell they're talking about if they're getting paid for it. Maybe that [H] article is just one really bad example that came along at exactly the right time to push my "grrr tech journalism" buttons, I'm sure there have got to be some much better places out there that take the time to understand what it is they're talking about. I hope. Right?

I agree that his review is a bit ham-fisted (everything at HardOCP is going to be that way; I do enjoy their PSU reviews though). TXAA, SMAA, etc. do look good in motion, but that still comes at a cost to overall visual fidelity, as I mentioned earlier. The two games I've played that had it (The Secret World and Crysis 3) didn't impress me much, but like you said, that's probably a case of bad fine-tuning, and potentially the lack of a good sharpening filter to even out the texture blurring.

Where it does look stunning is where you'd normally see a TON of alpha textures, or a lot of movement of objects with aliased shaders on them and whatnot.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

exquisite tea posted:

Is it normal for the Geforce Experience optimization tool to randomly switch around its "optimal" configuration without any changes to the system? Last week it was telling me to play AC4 with high shadows, no SSAO and FXAA, now it's saying medium shadows with HBAO 2x and MSAA, neither of which seem to play any better or worse than the settings I was using initially.

Don't use the optimization tool. It's awful.
