Rosoboronexport
Jun 14, 2006

Get in the bath, baby!
Ramrod XTreme

Zero VGS posted:

Question, if I overclock a 760 to be about the same performance as a reference 770, would the 760 draw less watts than the 770, more watts, or about the same? I'm assuming more since running hotter would reduce efficiency but I've no idea in practice.

To achieve that you'd need to overclock the GPU by over 33% and the memory by 17% to make up the differences in SMX count (6 vs 8) and memory speed (6000 vs 7000 MHz) in shader-limited games, and that's not including the boost clock differences. However, in games that are ROP-limited you might have a chance to catch the 770, as both cards have the same number of ROPs.

As for power usage, even if you get a 33% overclock to work, the 760 would most likely eat more power at that point, as you'd have to raise the voltage a lot to reach it.
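The arithmetic above can be sanity-checked in a few lines. The SMX and memory figures come from the post; the voltage bump is a made-up illustrative number, since dynamic power roughly scales with frequency times voltage squared:

```python
# Back-of-the-envelope check of the overclock a GTX 760 (6 SMX, 6000 MHz
# effective memory) would need to match a GTX 770 (8 SMX, 7000 MHz), plus
# a rough dynamic-power estimate (P scales roughly with f * V^2).
# The 10% voltage bump below is a hypothetical illustrative assumption.

smx_760, smx_770 = 6, 8
mem_760, mem_770 = 6000, 7000  # effective memory clock, MHz

core_oc_needed = smx_770 / smx_760 - 1   # shader-limited deficit
mem_oc_needed = mem_770 / mem_760 - 1    # bandwidth deficit

print(f"core OC needed: {core_oc_needed:.0%}")   # ~33%
print(f"memory OC needed: {mem_oc_needed:.0%}")  # ~17%

# If a 33% core overclock also needs, say, a 10% voltage bump:
freq_scale, volt_scale = 1.33, 1.10
power_scale = freq_scale * volt_scale ** 2
print(f"rough power increase: {power_scale - 1:.0%}")  # ~61%
```

Even with an optimistic voltage assumption, the overclocked 760 ends up well past the 770's stock power, which is the point being made above.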


Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Endymion FRS MK1 posted:

Huh. FXAA actually worked. I never tried it because I assumed since forcing the stronger AA types didn't work FXAA wouldn't either.

I remember this being wicked awesome when FXAA-through-the-control-panel/nVidia Inspector first hit, because for all its imperfections, god drat was it nice to finally get AA that worked on CoP. At the time I didn't expect to be running a card twice as powerful as the GTX 580 with a hefty core OC in this same computer, but hardware has been pretty awesome lately, so now I can dick around with forcing SSAA too. Back then, though, getting roughly "4xMSAA quality" (assuming no additional coverage samples) and having it just work on transparent textures too was a :holy: moment for a game with so much pretty stuff but such a piss-poor implementation of garbage AA in every D3D rendering path. Gotta remember, shader-based AA working on interiors instead of just edges, and on transparent textures, without any significant performance hit compared to explicit transparency anti-aliasing, was a huge deal at the time.

FXAA may not float everyone's boat, but one very nice thing you can say about it is that it Just Works. Given the difficulty of getting other AA methods to work (be it trying various compatibility bits and putting up with image quality compromises, or trying to get an injector to function correctly without borking the game by accident), it is nice to have a backup plan you can count on, even if it isn't your favorite AA method around.
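For readers wondering why FXAA "just works" everywhere: it's a pure post-process on the finished frame, needing nothing from the renderer. A toy sketch of the idea (nothing like Lottes's actual shader; the threshold value and the box blur standing in for directional blending are simplifications for illustration):

```python
import numpy as np

# Toy, greatly simplified illustration of the FXAA idea: operate on the
# finished frame only, estimate luminance, find high-contrast pixels, and
# soften them. Because it needs nothing from the renderer, it works on any
# rendering path, transparent textures included.

def toy_fxaa(rgb, contrast_threshold=0.1):
    luma = rgb @ np.array([0.299, 0.587, 0.114])   # per-pixel luminance
    # local contrast: max minus min luma over the 4-neighbourhood
    padded = np.pad(luma, 1, mode="edge")
    neigh = np.stack([padded[:-2, 1:-1], padded[2:, 1:-1],
                      padded[1:-1, :-2], padded[1:-1, 2:]])
    contrast = neigh.max(axis=0) - neigh.min(axis=0)
    edge = contrast > contrast_threshold            # pixels to soften
    # blur edge pixels with a simple box filter (real FXAA blends along
    # the detected edge direction instead of blurring isotropically)
    blurred = (rgb + np.roll(rgb, 1, 0) + np.roll(rgb, -1, 0)
                   + np.roll(rgb, 1, 1) + np.roll(rgb, -1, 1)) / 5
    return np.where(edge[..., None], blurred, rgb)

# a hard vertical black/white edge gets softened; flat areas are untouched
frame = np.zeros((8, 8, 3))
frame[:, 4:] = 1.0
out = toy_fxaa(frame)
```

The downside follows directly from the design: it can't tell a jagged edge from intentional high-frequency texture detail, hence the mild blur people complain about.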

fletcher posted:

You guys are throwin' around a lot of anti-aliasing acronyms. Is there somewhere that has a nice comparison of all the different ones?

Here's a somewhat dated but still good list of AA types so I don't have to try to make a new one for no good reason: http://www.overclock.net/t/1329979/anti-aliasing-the-basics

That post gets the goal of TXAA a little wrong, but that can be forgiven: before it was released, not a lot of folks really got what it was trying to accomplish, because almost everyone is concerned with still image quality, while Tim Lottes was thinking about how to fix the persistent problems we've had for so long with the illusion of motion.

Not to mention that since that post was published we've had more practical examples, and most people with an nVidia card should have had an opportunity, assuming they like any of the big AAA franchises that make their way to PC, to actually try TXAA first-hand and see what's going on with it and why it's different from MSAA and FXAA (to the point of incompatibility with the latter).

Here's an older discussion of AA that's pretty topical here and still relevant today. There's some very good commentary (including what I personally believe to be a really key moment for Tim Lottes in the FXAA 4.0 vs. TXAA debate; god drat do I wish he'd either had permission or decided to finish FXAA 4.0, whichever was the key factor, because it was so close but ultimately so far), though it devolves into flaming and then just spamming after comment number 9 or so.

http://filmicgames.com/archives/860

I find this particular little bitter aside pretty funny and also, unfortunately, quite true, by the way: the way pretty games are talked about vs. ugly games often misses some really important points regarding rendering:

quote:

Side Note #5
Users and video game reviewers know virtually nothing about why an image looks good or bad. That’s why you always hear them say “texture resolution”. If a game looks good because it has good lighting, nice animation, and high-quality lighting models, most forum posters will say “Wow, this game has such detailed textures!”. But when a game has bad animation and gamma-space lighting they will say “Those textures look low-res!”. I don’t remember hearing a game reviewer say “Game A has a great anisotropic reflectance model on its brushed metal shaders!”. Instead, they’ll say “Game A has really crisp textures”, as if the developer magically found an extra 100 megs of texture RAM inside the console.

The same thing tends to happen with anti-aliasing. Oftentimes you’ll see two games, let’s call them Game A and Game B, where A looks much better than B. The reviews will say “Game A has detailed textures and clean edges” whereas “Game B has low-res textures and lots of jaggies”, even when they both use the exact same technique. Game reviewers don’t know what to say other than texture detail and jaggies, so they use those terms to justify what they already believe.

The author comes down very heavily in support of tiled deferred rendering, but that's not super important compared to the discussion of AA going on; I just think it's a sad-but-true comment about the state of the discourse. It's okay for users not to totally get it so long as the basic options are exposed for them to pick between methods and experience them first-hand, but you'd hope that a proper gaming press would do better, yeah?

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Just found this pretty nice article about AA in Crysis 3: http://hardocp.com/article/2013/03/12/crysis_3_video_card_performance_iq_review/6

Has benchmarks and extensive high-quality screenshots of the following methods, tested using in-game settings: No AA, FXAA, SMAA, TXAA, and MSAA (all in various modes)

HalloKitty fucked around with this message at 10:22 on Jan 29, 2014

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

HalloKitty posted:

Just found this pretty nice article about AA in Crysis 3: http://hardocp.com/article/2013/03/12/crysis_3_video_card_performance_iq_review/6

Has benchmarks and extensive high-quality screenshots of the following methods, tested using in-game settings: No AA, FXAA, SMAA, TXAA, and MSAA (all in various modes)

You can also see a startling lack of understanding of what in the actual gently caress the person is talking about. If you wanted to compare TXAA to other modes, you'd need to do so in motion. Grab a snippet from a tricky bit of a benchmark with a bunch of jaggies, and show that, oh huh, that's very odd, it looks very different in motion than it does in a static image; it's almost as if what it's doing can't be captured in a static image at all, because by design it's intended to rather drastically change how the game is perceived in motion!

And certainly not grounds for some bullshit like this:

quote:

The TXAA options are only supported on NVIDIA GPUs. Turning on the lowest option TXAA Medium 2XT drops 23% in performance. Moving to the highest setting drops 45% from No AA. TXAA is a very taxing option, and as you'll see, completely the wrong AA option to use unless you like your gameplay experience ruined.

...

Overall, TXAA should be avoided. We don't know how this quality of TXAA got implemented into the game and left there. It seems like somebody took the day off when quality control of TXAA image quality rolled around that day. TXAA destroys the gameplay experience in Crysis 3.

I personally think it looks amazing and delivers exactly what Tim Lottes promised - a filmic quality to motion without a greater performance hit than similarly implemented MSAA. In the [H] trial runs there, TXAA comes away with a 45% performance hit, which is on the "well that's rather high" side, but so does MSAA, with 44% for 4xMSAA.

Gaming journalism is dumb but it just goes to poo poo when they try to talk about tech.

quote:


The third mode is the worse and most useless mode in this game, TXAA. If you thought FXAA blurs your textures, you will be shocked at how much TXAA blurs everything in the game. Looking at this game with TXAA enabled is like looking at your display with Vaseline smeared all over it. It causes a drastic reduction in texture quality, object quality, grass, tree, vegetation quality, and character quality. It doesn't matter if you use the TXAA 1 or TXAA 2 mode, both are nasty, visually speaking.

TXAA does improve to a great extent alpha texture aliasing and specular or shader aliasing. In fact, it is the best at doing it, but at the cost of completely destroying your gameplay experience. TXAA should be avoided, you should never use it in Crysis 3.

Shaking my drat head. Dude is confronted with something that baffles him, and rather than explore what might lead a technique to produce such a different result than he wants (the information is certainly out there as to exactly what TXAA does and is for), his reaction is just "ewwww, the textures don't look like most videogames' textures with an edge sharpening algorithm running, gently caress this forever :cry:". A total misunderstanding of the technology itself and why it's there at all.

Edit: It's not just an nVidia thing either, he has no clue what the SMAA modes mean either.
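For context on what temporal AA schemes in the TXAA family actually do, the core idea can be sketched in a few lines (a bare-bones illustration, not NVIDIA's actual algorithm; the stand-in renderer, jitter range, and blend weight are all made-up values):

```python
import numpy as np

# Bare-bones temporal accumulation, the idea at the heart of TXAA-style
# AA. Each frame is rendered with a slightly jittered camera, then blended
# with the accumulated history. Edge pixels that flicker frame to frame
# converge to smooth fractional-coverage values; the very same averaging
# is what softens fine texture detail.

rng = np.random.default_rng(0)

def render(jitter):
    # stand-in renderer: a hard step edge at a sub-pixel-shifting position
    x = np.arange(32) + jitter
    return (x > 16).astype(float)

history = render(0.0)
alpha = 0.1   # weight of the new frame: lower = smoother but more ghosting
for _ in range(64):
    current = render(rng.uniform(-0.5, 0.5))   # sub-pixel jitter
    history = (1 - alpha) * history + alpha * current

# after accumulation the edge pixel holds a stable in-between value
# instead of flickering between 0 and 1
print(history[16])
```

The pixel at the edge stabilizes somewhere between black and white, which a still screenshot renders as "blur" while in motion it reads as the absence of crawling and shimmer; that trade-off is the whole argument above.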

Agreed fucked around with this message at 14:30 on Jan 29, 2014

CactusWeasle
Aug 1, 2006
It's not a party until the bomb squad says it is
Just upgraded from a Radeon 7950 to an EVGA GTX 780 ACX. Single 1080p screen. Quite simply, I'm shocked at the difference in some games. Benchmarks seemed to suggest 20-50% more FPS, but Metro: Last Light is now playable on max settings and is visually :stare:

So, question: I'm confused about GPU Boost 2.0 and EVGA Precision. The card seems to run at 1084MHz in games, but does GPU Boost 2.0 automatically OC the card, or do I need to overclock in EVGA Precision?

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

CactusWeasle posted:

Just upgraded from a Radeon 7950 to an EVGA GTX 780 ACX. Single 1080p screen. Quite simply, I'm shocked at the difference in some games. Benchmarks seemed to suggest 20-50% more FPS, but Metro: Last Light is now playable on max settings and is visually :stare:

So, question: I'm confused about GPU Boost 2.0 and EVGA Precision. The card seems to run at 1084MHz in games, but does GPU Boost 2.0 automatically OC the card, or do I need to overclock in EVGA Precision?

Here you go: http://forums.somethingawful.com/showthread.php?noseen=0&threadid=3484126&perpage=40&pagenumber=85#post417157441

CactusWeasle
Aug 1, 2006
It's not a party until the bomb squad says it is
Thanks!

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Agreed posted:

TXAA Stuff

In his defense, I too find that TXAA blurs textures entirely too much, and I would usually run an injector just to add a bit of sharpness, at which point I might as well use the SMAA injector. It does do a fantastic job with the jaggies though ;)

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Stanley Pain posted:

In his defense, I too find that TXAA blurs textures entirely too much, and I would usually run an injector just to add a bit of sharpness, at which point I might as well use the SMAA injector. It does do a fantastic job with the jaggies though ;)

You're absolutely free to want whatever you want, of course.

I don't care that he has an opinion on what TXAA does re: "is that blurry? that looks blurry, I don't like blurry, I don't think I like TXAA in this game!," I care that he has no clue what TXAA does at all, or SMAA for that matter. You'd expect, I think rightfully, that someone who claims expertise in video card technologies (nearly seventeen years, according to his [H] badge?) would at least know what the hell he's talking about. But that's apparently just not ... nope. Nope nope nope. Better luck next reviewer, I guess.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Agreed posted:

You're absolutely free to want whatever you want, of course.

I don't care that he has an opinion on what TXAA does re: "is that blurry? that looks blurry, I don't like blurry, I don't think I like TXAA in this game!," I care that he has no clue what TXAA does at all, or SMAA for that matter. You'd expect, I think rightfully, that someone who claims expertise in video card technologies (nearly seventeen years, according to his [H] badge?) would at least know what the hell he's talking about. But that's apparently just not ... nope. Nope nope nope. Better luck next reviewer, I guess.

He is talking about image quality, though, and from an absolute perspective. It negatively impacts something that is clearly visible. Loss of texture fidelity is something I, and a lot of other people, hate, regardless of how good the AA is. See quincunx AA.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Except that you're looking at one arguable upside or downside (different people take it differently) that literally can't be determined in a still frame comparison. Even the more advanced modes of SMAA (which... again, he doesn't understand, as someone points out later on when he mistakenly describes it in the followup discussion) include both spatial and temporal components that aren't going to show in a still frame.

If anything, I think TXAA in Crysis 3 suffers from a lack of finer tuning and looks its best outdoors in wide open areas where it can be remarkably filmic. But there was a lot of contention when the idea of filmic AA versus "just better AA, please, we'd like less jaggies and that's it" started as a conversation. I think inevitably overall image quality is going to win simply as a byproduct of the fact that pixel density is going to make really high perf hit AA methods totally unnecessary, with only relatively minor AA for conventional visual acuity necessary; then what do we do with these powerful GPUs?

TXAA is an answer that's probably a bit early, but that's Tim Lottes for you; the guy thinks ahead (did you check his "next gen wishlist"? it's pretty much dead on for how to improve the user experience with some simple developer paradigm adjustments compared to how games are currently made). I also don't know how much work he was able to finish on it before the big career change, so I don't know whether it's as close to an ideal version of itself as SMAA is, given the considerable amount of work that has been done to keep SMAA current. TXAA is probably on the "very, very filmic" side of the "how should videogames look?" question, whereas SMAA is still pretty balanced, mostly just removing jaggies while allowing for the incorporation of some filmic aspects that aren't quite as consuming.

Of course, then enters the question of how engines should work as such, and it's a whole 'nother can of worms: should it be tiled deferred rendering? Forward+? Clustered shading? Which rendering model gets us closer to true global illumination? Speaking of visual acuity, moving on from our current lighting models to even imperfect VSV would be an incredible leap in realism/artistic appearance (since, y'know, realism isn't always what it's about; TXAA doesn't enhance realism, for example, it just makes things look more cinematic, which isn't a goal every gamer shares, one particular visionary rendering guru's notions of how things should work aside)... It's not that the goalposts start moving, it's that the question blows wide open, and now is sort of the best time to ask it, since we're going to be able to really work with tech in a new way without former "ugh, consoles!" limitations holding us back to anything like the same degree as before.

And I'd like for people who are going to be talking a lot about these technologies' practical implementations to have some grounding in what the hell they're talking about if they're getting paid for it. Maybe that [H] article is just one really bad example that came along at exactly the right time to push my "grrr tech journalism" buttons, I'm sure there have got to be some much better places out there that take the time to understand what it is they're talking about. I hope. Right?

Agreed fucked around with this message at 21:03 on Jan 29, 2014

veedubfreak
Apr 2, 2005

by Smythe
I think I figured out why my middle card is doing all the work. Apparently, according to GPU-Z, it is running at 1030MHz while the other 2 cards are only running at 1000MHz. Now to figure out why that card has a different "stock" speed than the other 2. I installed the same BIOS on all 3 cards.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Agreed posted:

Except that you're looking at one arguable upside or downside (different people take it differently) that literally can't be determined in a still frame comparison. Even the more advanced modes of SMAA (which... again, he doesn't understand, as someone points out later on when he mistakenly describes it in the followup discussion) include both spatial and temporal components that aren't going to show in a still frame.

If anything, I think TXAA in Crysis 3 suffers from a lack of finer tuning and looks its best outdoors in wide open areas where it can be remarkably filmic. But there was a lot of contention when the idea of filmic AA versus "just better AA, please, we'd like less jaggies and that's it" started as a conversation. I think inevitably overall image quality is going to win simply as a byproduct of the fact that pixel density is going to make really high perf hit AA methods totally unnecessary, with only relatively minor AA for conventional visual acuity necessary; then what do we do with these powerful GPUs?

TXAA is an answer that's probably a bit early, but that's Tim Lottes for you; the guy thinks ahead (did you check his "next gen wishlist"? it's pretty much dead on for how to improve the user experience with some simple developer paradigm adjustments compared to how games are currently made). I also don't know how much work he was able to finish on it before the big career change, so I don't know whether it's as close to an ideal version of itself as SMAA is, given the considerable amount of work that has been done to keep SMAA current. TXAA is probably on the "very, very filmic" side of the "how should videogames look?" question, whereas SMAA is still pretty balanced, mostly just removing jaggies while allowing for the incorporation of some filmic aspects that aren't quite as consuming.

Of course, then enters the question of how engines should work as such, and it's a whole 'nother can of worms: should it be tiled deferred rendering? Forward+? Clustered shading? Which rendering model gets us closer to true global illumination? Speaking of visual acuity, moving on from our current lighting models to even imperfect VSV would be an incredible leap in realism/artistic appearance (since, y'know, realism isn't always what it's about; TXAA doesn't enhance realism, for example, it just makes things look more cinematic, which isn't a goal every gamer shares, one particular visionary rendering guru's notions of how things should work aside)... It's not that the goalposts start moving, it's that the question blows wide open, and now is sort of the best time to ask it, since we're going to be able to really work with tech in a new way without former "ugh, consoles!" limitations holding us back to anything like the same degree as before.

And I'd like for people who are going to be talking a lot about these technologies' practical implementations to have some grounding in what the hell they're talking about if they're getting paid for it. Maybe that [H] article is just one really bad example that came along at exactly the right time to push my "grrr tech journalism" buttons, I'm sure there have got to be some much better places out there that take the time to understand what it is they're talking about. I hope. Right?

I agree that his review is a bit ham-fisted (everything at HardOCP is going to be that way; I do enjoy their PSU reviews though). TXAA, SMAA, etc. do look good in motion, but that still comes at a cost to overall visual fidelity, as I mentioned earlier. The two games I've played that had it (Secret World and Crysis 3) didn't impress me much, but like you said, that's probably a case of bad fine-tuning, and potentially the lack of a good sharpening filter to even out the texture blurring.

Where it does look stunning is where you'd normally see a TON of alpha textures, or a lot of movement of objects with aliased shaders on them and whatnot.

forbidden dialectics
Jul 26, 2005

Gyrotica posted:

Unless this wins out :D

Just for shits and giggles I tried it out, and my mildly overclocked 780 Ti is pushing 700 kH/s, which is in the R9 280X ballpark. Not bad, and certainly a whole shitload better than where it was before, but not all that great. If I crank up the clocks/volts it starts drawing frightening amounts of power :stare:.

Burger McAngus
May 24, 2010

I just got a 780. Would I be able to use an old GT 520 I have lying around as a dedicated PhysX card, or would it not be up to the task?

slidebite
Nov 6, 2005

Good egg
:colbert:

How do you do that? I have a new GTX780 and an old GTX570 that's doing nothing and have been getting into ARMA3 lately.

Phuzun
Jul 4, 2007

Adding a second NVIDIA card for PhysX processing is as simple as installing it in an available slot, then reinstalling the drivers. By default the driver chooses the best card, but you can force it in the NVIDIA control panel.

John Lightning
Mar 10, 2012
If I were to purchase a 2GB GTX 770 to use with my 1080p screen, would I have to worry about running out of VRAM in soon-to-be-released games such as Titanfall, with as many settings maxed as possible?

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

John Lightning posted:

If I were to purchase a 2GB GTX 770 to use with my 1080p screen, would I have to worry about running out of VRAM in soon-to-be-released games such as Titanfall, with as many settings maxed as possible?

2gigs is fine for 1080p.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

The Lord Bude posted:

2gigs is fine for 1080p.

I was doing some research, and I'm not sure you need more than 2GB of VRAM at all in single-GPU situations, at least on cards that come with 2GB as the reference amount.

http://www.legitreviews.com/gigabyte-geforce-gtx-760-4gb-video-card-review-2gb-4gb_129062
http://alienbabeltech.com/main/gtx-770-4gb-vs-2gb-tested/3/

Two reviews where they actually test the same GPU with 2GB of VRAM vs. 4GB; neither one shows a scenario where the extra VRAM offers any worthwhile improvement in FPS at higher resolutions. The biggest difference in FPS was in Far Cry 3 on a GTX 760 at 4K, where it was 11 FPS versus 3 FPS, but at that point you're already in a situation where the game is unplayable.

There wasn't more than about a 2 FPS difference between a 2GB GTX 770 and a 4GB GTX 770 in all the games they ran, up to 5760x1080. It seems that in nearly any scenario where you actively need more than 2GB of VRAM, you'll also need a beefier GPU or multiple GPUs first. None of the absolute top-of-the-line cards have to worry about it, since the 780-series, the R9 280-series, and higher come with more than 2GB of VRAM by reference, and any card below that seems to need an SLI/Crossfire configuration to make use of more than 2GB.
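A rough sanity check on why resolution alone rarely blows past 2GB: the raw render targets are small compared to textures. A crude estimate (the bytes-per-pixel and buffer counts below are simplifying assumptions for illustration, not measurements from those reviews):

```python
# Very rough sanity check on framebuffer memory at various resolutions.
# Real VRAM use is dominated by textures, shadow maps, and other render
# targets the engine allocates, so treat these as illustrative lower
# bounds only.

def framebuffer_mb(width, height, msaa=1, bytes_per_pixel=4, buffers=3):
    # colour target (scaled by MSAA samples) + depth + extra swap-chain
    # buffers, very roughly
    pixels = width * height
    colour = pixels * bytes_per_pixel * msaa
    depth = pixels * 4 * msaa
    extra = pixels * bytes_per_pixel * (buffers - 1)
    return (colour + depth + extra) / 1024 ** 2

for name, w, h in [("1080p", 1920, 1080), ("1600p", 2560, 1600),
                   ("surround", 5760, 1080), ("4K", 3840, 2160)]:
    print(f"{name}: ~{framebuffer_mb(w, h, msaa=4):.0f} MB with 4xMSAA")
```

Even with 4xMSAA this comes to roughly 80 MB at 1080p and a couple hundred MB at 5760x1080; the rest of a card's VRAM goes to textures and streaming, which is consistent with the 2GB vs 4GB results barely differing until the GPU itself runs out of steam.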

Straker
Nov 10, 2005
that's really interesting, no sarcasm. thanks :)

I kinda can't believe that even in BF4 at 1600p there's no difference; what the hell. I'd be curious to see figures with SLI/Crossfired better cards with more/less RAM, just to be doubly certain it's not some weird GPU bottleneck.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

Beautiful Ninja posted:

I was doing some research, and I'm not sure you need more than 2GB of VRAM at all in single-GPU situations, at least on cards that come with 2GB as the reference amount.

http://www.legitreviews.com/gigabyte-geforce-gtx-760-4gb-video-card-review-2gb-4gb_129062
http://alienbabeltech.com/main/gtx-770-4gb-vs-2gb-tested/3/

Two reviews where they actually test the same GPU with 2GB of VRAM vs. 4GB; neither one shows a scenario where the extra VRAM offers any worthwhile improvement in FPS at higher resolutions. The biggest difference in FPS was in Far Cry 3 on a GTX 760 at 4K, where it was 11 FPS versus 3 FPS, but at that point you're already in a situation where the game is unplayable.

There wasn't more than about a 2 FPS difference between a 2GB GTX 770 and a 4GB GTX 770 in all the games they ran, up to 5760x1080. It seems that in nearly any scenario where you actively need more than 2GB of VRAM, you'll also need a beefier GPU or multiple GPUs first. None of the absolute top-of-the-line cards have to worry about it, since the 780-series, the R9 280-series, and higher come with more than 2GB of VRAM by reference, and any card below that seems to need an SLI/Crossfire configuration to make use of more than 2GB.

Well it's nice to have some figures on the subject. I'll be sure to recommend accordingly in the future.

exquisite tea
Apr 21, 2007

Carly shook her glass, willing the ice to melt. "You still haven't told me what the mission is."

She leaned forward. "We are going to assassinate the bad men of Hollywood."


Is it normal for the GeForce Experience optimization tool to randomly switch around its "optimal" configuration without any changes to the system? Last week it was telling me to play AC4 with high shadows, no SSAO, and FXAA; now it's saying medium shadows with HBAO 2x and MSAA, and neither seems to play any better or worse than the settings I was using initially.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

exquisite tea posted:

Is it normal for the GeForce Experience optimization tool to randomly switch around its "optimal" configuration without any changes to the system? Last week it was telling me to play AC4 with high shadows, no SSAO, and FXAA; now it's saying medium shadows with HBAO 2x and MSAA, and neither seems to play any better or worse than the settings I was using initially.

Don't use the optimization tool. It's awful.

Rolling Scissors
Jul 22, 2005

Turn off the fountain dear, it's just me.
Nap Ghost
So the Mantle update for BF4 is live:

http://battlelog.battlefield.com/bf4/news/view/bf4-mantle-live

quote:

Test case 1: Low-end single-player
CPU/GPU: AMD A10-7850K (‘Kaveri’ APU), 4 cores @ 3.7 GHz
Settings: 720p MEDIUM settings.
OS: Windows 7 64-bit
Result: 26.6 ms/f -> 23.3 ms/f = 14% faster

quote:

Test case 2: Standard 64-player multiplayer
CPU: AMD FX-8350, 8 cores @ 4 GHz
GPU: AMD Radeon 7970 3 GB (AMD will add support for the AMD Radeon™ HD 7970 in a later stage of Mantle’s release schedule, learn more)
Settings: 1080p ULTRA 1x MSAA
OS: Windows 8 64-bit
Result: 18.87 ms/f -> 15.08 ms/f = 25.1% faster

quote:

Test case 3: High-end single-player with multiple GPUs
CPU: Intel Core i7-3970x Extreme, 12 logical cores @ 3.5 GHz
GPU: 2x AMD Radeon R9 290x 4 GB
Settings: 1080p ULTRA 4x MSAA
OS: Windows 8 64-bit
Result: 13.24 ms/f -> 8.38 ms/f = 58% faster
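Since DICE quotes frame times rather than frame rates, the "X% faster" figures are old/new - 1 on the ms-per-frame numbers. Reproducing the three quoted cases (numbers taken from the post above):

```python
# DICE reports frame times in ms/frame; "X% faster" is the frame-rate
# speedup, i.e. old_ms / new_ms - 1. Reproducing the three quoted cases:

cases = [
    ("Kaveri APU, 720p medium", 26.6, 23.3),
    ("FX-8350 + 7970, 64-player", 18.87, 15.08),
    ("i7-3970X + 2x R9 290X", 13.24, 8.38),
]

for name, old_ms, new_ms in cases:
    speedup = old_ms / new_ms - 1
    print(f"{name}: {1000 / old_ms:.0f} -> {1000 / new_ms:.0f} fps "
          f"({speedup:.1%} faster)")
```

The figures check out against the post (roughly 14%, 25.1%, and 58%), with the biggest win, as you'd expect from an API aimed at draw-call overhead, in the CPU-heavy multi-GPU case.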

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE
Is there any other game on the horizon, other than BF4, that is going to be using Mantle?

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

That's pretty significant. I want to see some in-depth independent benchmarking and reviews.

Edit: Haha, oh dear. It only supports GCN 1.1 right now, yet they promised it would be compatible with GCN 1.0 and up. They even have a 7970 tested in those results... but they haven't released the relevant drivers to let the 7970 work with it.

Hardly anybody has a GCN1.1 card, especially due to the price gouging. Not a good start at all.

HalloKitty fucked around with this message at 14:47 on Jan 30, 2014

Stanley Pain
Jun 16, 2001

by Fluffdaddy

The Lord Bude posted:

Is there any other game on the horizon, other than BF4, that is going to be using Mantle?

Star Citizen, Thief, Nitrous Engine (Oxide, funded by Stardock), Frostbite Engine games, Plants vs Zombies 2 :shobon:

Stanley Pain
Jun 16, 2001

by Fluffdaddy
drat, I'm gonna have to install BF4 again to check this out on dual 290s. Looking forward to it ;)

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

Stanley Pain posted:

Star Citizen, Thief, Nitrous Engine (Oxide, funded by Stardock), Frostbite Engine games, Plants vs Zombies 2 :shobon:

This could be a big deal then. Probably not enough of a deal to sway me away from Nvidia, but that 290X is looking more promising than it was.

Phuzun
Jul 4, 2007

exquisite tea posted:

Is it normal for the GeForce Experience optimization tool to randomly switch around its "optimal" configuration without any changes to the system? Last week it was telling me to play AC4 with high shadows, no SSAO, and FXAA; now it's saying medium shadows with HBAO 2x and MSAA, and neither seems to play any better or worse than the settings I was using initially.

Maybe check the setting that auto-updates the game configurations. I've never used it myself, but that sounds like what it does.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

The Lord Bude posted:

This could be a big deal then. Probably not enough of a deal to sway me away from Nvidia, but that 290X is looking more promising than it was.

AMD brought their A game with the 290 (if you disregard the stock cooler): great price/performance AND they fixed a ton of their frame pacing issues. With Mantle I wonder if I can run BF4 with the render setting cranked to 150 or 200% (currently I can run it at 125% on ultra @ 1440p). Might not be enough VRAM available for that, though.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE
I think I still prefer the 780 Ti overall: Greenlight, PhysX in a handful of games, the possibility of a G-Sync monitor down the road, and not having to worry that it will take weeks for new drivers to come out that let you play a new game without crashes to desktop or game-breaking bugs (I understand this is improving, but it's only been a year since I last had an AMD card and I still haven't gotten round to fully trusting them yet). Have they fixed the frame pacing bullshit yet?

Still, a 290X is a couple hundred dollars cheaper, and while I've never given a poo poo about that sort of thing, I can only justify spending more if the 780 Ti remains objectively better across the board.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

The Lord Bude posted:

weeks for new drivers to come out that will let you play a new game without crashes to desktop or game breaking bugs, (I understand this is improving but it's only been a year since I last had an AMD card and I still haven't gotten round to fully trusting them yet). Have they fixed the frame pacing bullshit yet?


I think the "ATI drivers are poo poo" jokes died sometime in 1999. Both companies have awful drivers if you happen to fall into a bugged release for your particular hardware setup. This isn't anything new for NV or AMD.

I mentioned the frame pacing stuff just a post or two up. I had horrible frame pacing issues when I ran dual 6870s, which made me switch to NV for a while. My dual 290s haven't seen a hint of micro stutter or delayed frames.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

Stanley Pain posted:

I think the "ATI drivers are poo poo" jokes died sometime in 1999. Both companies have awful drivers if you happen to fall into a bugged release for your particular hardware setup. This isn't anything new for NV or AMD.

I mentioned the frame pacing stuff just a post or two up. I had horrible frame pacing issues when I ran dual 6870s, which made me switch to NV for a while. My dual 290s haven't seen a hint of micro stutter or delayed frames.

My personal experience on the subject:

I used Nvidia from March 2008 to July 2010, AMD from July 2010 to Nov 2012, and Nvidia again since then. Admittedly I don't play every game that comes out, but numerous games I bought whilst I had an AMD card had severe issues for a couple of weeks until drivers could be released - an issue that never once occurred whilst I was with Nvidia.

Dragon Age: Origins - frequent crashes to desktop; Crysis 2 - intense strobe-light flickering made the game unplayable till new drivers came out; Rage - flat out didn't work, needed to get a refund; Dishonored - a two-week wait till I could get past the main menu screen. And this is only a selection.

At one stage, AMD had different drivers available depending on which of two recently released games you wanted to play. Then there was their mess of a website, which made it impossible to find beta drivers and fixes without tracking down an obscure helpdesk article on a specific issue, and their habit of issuing hotfixes separately from drivers but not necessarily rolling them into each new driver release, so you had multiple driver versions that worked with different games.

They may have fixed all this in the past year, but it may be too soon for me to trust them.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Definitely impressed with the Mantle numbers, especially the test case where they took a game from 30fps to 60fps. That's the kind of performance it needs to impress, although more data on the types of tests run would help (and frankly just a little lab where I could do my own damned tests would be pretty alright too).

Soon as somebody can use this - veedub, can we count on you? - would appreciate numbers for a more standard quad core configuration plus A SINGLE 290x, if you can turn the other ones off, to see if their mantle results are in reality "Crossfire is inefficient as poo poo and comes with a gigantic bucket of CPU overhead" or not.

Stanley Pain
Jun 16, 2001

by Fluffdaddy
Reverse AMD with Nvidia and you'll have my story, then flip flop for everyone else in the forum :P

You had some bad luck :( Nvidia has also been guilty of the "two different driver sets depending on which game you're playing" thing. At the end of the day, the point is you get what you're comfortable with, at the price you're willing to pay. If that means ponying up more money for very similar performance, that's OK.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE
Absolutely. I've always been willing to pay more if I get more performance; I don't typically consider value at all. But if Mantle improves the 290X to the point where it is cheaper AND faster, I'm gonna be tempted to switch back.

Agreed posted:

Soon as somebody can use this - veedub, can we count on you? - would appreciate numbers for a more standard quad core configuration plus A SINGLE 290x, if you can turn the other ones off, to see if their mantle results are in reality "Crossfire is inefficient as poo poo and comes with a gigantic bucket of CPU overhead" or not.

I'm hoping it's this. gently caress crossfire forever.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Agreed posted:

Soon as somebody can use this - veedub, can we count on you? - would appreciate numbers for a more standard quad core configuration plus A SINGLE 290x, if you can turn the other ones off, to see if their mantle results are in reality "Crossfire is inefficient as poo poo and comes with a gigantic bucket of CPU overhead" or not.


Please make this so. I could run some numbers on my setup with dual and single 290s as well if I have time.


Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

The wording on their own, very much pro-290 site makes it pretty clear that most of the gains there are Crossfire related. I think they word it as something dumb like "the CPU can't keep the GPUs busy," which may be true, I dunno; those are very low frame times to begin with. The whole point of Mantle and Mantle-like improvements (nVidia wants you to just use OpenGL, though) is to remove what we definitely know to be some serious issues with CPU load management and constant state queries (in addition to the draw call overhead problem). I think the set of numbers they give us there is probably THE LEAST useful set we could possibly get for anyone to actually verify at home, though we'll presumably see some better analysis from the hardware tech community (except from [H], because their gurus are idiots apparently).
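Not Mantle code, just a toy model of the draw-call-overhead argument: if every API call carries a fixed CPU-side cost (validation, state tracking), then recording draws into a command buffer and submitting in batches pays that fixed cost once per batch instead of once per draw, even though the GPU does identical work. All the microsecond figures below are invented for illustration, not measurements:

```python
# Toy model of per-draw-call CPU overhead (numbers are invented, not measured).
# "Classic" API: every draw pays fixed validation/state-tracking cost.
# Mantle-style: draws are recorded cheaply, and the fixed cost is paid
# once per submitted batch rather than once per draw.
PER_CALL_OVERHEAD_US = 10   # hypothetical fixed CPU cost per API call
PER_DRAW_WORK_US = 2        # hypothetical cost of recording one draw

def classic_api_cpu_us(draws):
    return draws * (PER_CALL_OVERHEAD_US + PER_DRAW_WORK_US)

def batched_cpu_us(draws, batch_size=1000):
    batches = -(-draws // batch_size)  # ceiling division
    return draws * PER_DRAW_WORK_US + batches * PER_CALL_OVERHEAD_US

draws = 10_000
print(f"classic: {classic_api_cpu_us(draws)} us, batched: {batched_cpu_us(draws)} us")
```

Which is why the biggest Mantle wins should show up exactly where frames are CPU-bound (high draw-call counts, weak or busy CPUs), and why a Crossfire rig with sky-high CPU load is a flattering but unrepresentative test case.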

A test they should have run but didn't: High population server, single 4770, single R9 290, graphics set on the high side but keeping frame times below 16ms at all costs. Compare results with an AMD FX 8-"core" CPU.

A test they should have run but didn't: Single player, single 4770, single R9 290. Graphics maxed as possible. Compare results with the same AMD processor used above.

The fact that the best real-world hopeful results come from a scenario that users actually cannot either test or experience on their own right now is just silly. And the whole setup is so limited in the scope of testing and not representative of the kinds of systems people game on that it just isn't of much use.

So, as soon as anyone with GCN 1.1 cards here can get this patch installed, some more useful benches would be appreciated. Crossfire is a good thing to bench, too, but only in a situation where we've got non-Crossfire numbers to compare it to, as apples:apples as possible.
