Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

beejay posted:

What is "the AMD/nvidia software bonfire" and where does AMD come into play via that link?

edit: I think I get it, you are saying burn the extra software that comes with graphics drivers in general, in which case I agree.

No, just a long, storied history of "NO, YOUR BRAND loving SUCKS!" back and forth.

redstormpopcorn
Jun 10, 2007
Aurora Master

Zero VGS posted:

There's a thing where after enough coins are mined into the economy, the "difficulty" increases to the point where it is like twice as hard to mine coins with the same hardware, which means you're getting them twice as slow or burning double the electricity. When that happens people will give it a rest or try to develop dedicated mining hardware like ASICs and the GPUs will return to normal price.

Either that or AMD ramps up production enough to meet demand, or a combination of both.

IIRC nobody's buying AMD hardware to mine BitCoins anymore since ASICs have already cranked the difficulty to the point where not even the ASICs themselves really have a chance of breaking even. They're all getting snatched up for LiteCoins and Dogecoins and Coinye and whatever other stupid loving worthless pre-mined horseshit fork of Bitcoin is the new hot flavor this week. AMD's production issues will cease with either prohibitive legislation or the end of greed-based cryptography, whichever comes first.

Incidentally, my former boss is apparently running 8 Litecoin mining rigs with multiple R9-series cards in the shop's A/C-less warehouse; they're all throttling at 85°C when the temperature outside is under 20°C. It's gonna be awesome when summer rolls around, the ambient temperature hits 48°C, and every card dies simultaneously.
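(To put rough numbers on the break-even point being described: a minimal Python sketch of the usual mining-profitability arithmetic. The hashrate, difficulty, reward, price, and power figures below are illustrative placeholders, not real early-2014 numbers; the only point is that when difficulty doubles, the coin side of the ledger halves while the electricity side stays put.)

code:

# Back-of-envelope mining economics. All inputs are made-up placeholders.
def daily_profit(hashrate_hs, difficulty, block_reward, coin_price_usd,
                 watts, usd_per_kwh):
    # Expected blocks found per day: hashrate * 86400 / (difficulty * 2^32)
    expected_blocks = hashrate_hs * 86400.0 / (difficulty * 2**32)
    revenue = expected_blocks * block_reward * coin_price_usd
    power_cost = (watts / 1000.0) * 24.0 * usd_per_kwh
    return revenue - power_cost

# The same GPU rig, before and after a difficulty doubling:
print(daily_profit(500e3, 3000, 50, 20.0, 900, 0.12))   # roughly +0.76/day
print(daily_profit(500e3, 6000, 50, 20.0, 900, 0.12))   # roughly -0.92/day

ASIC farms shift that curve by driving the network difficulty up while using far less power per hash, which is the "GPUs can't break even anymore" situation described above.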

Hamburger Test
Jul 2, 2007

Sure hope this works!

Agreed posted:

No, just a long, storied history of "NO, YOUR BRAND loving SUCKS!" back and forth.

It came up again a couple of days ago that nVidia's drivers were more stable, and then came the counter. This is just another piece of evidence that both companies release lovely software; it's pretty even in my experience when you add it all up. I think ShadowPlay is neat, but it's not worth dealing with GeForce Experience.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

I haven't personally had any issues with Experience since its initial release, but I don't doubt that some haven't been so lucky, especially if they're running with Shadowplay enabled full-time, since that's a pretty significant thing to have going on non-stop. I use it as a "turn it on when I need it, turn it off when I don't" kind of thing, and in that usage I have had no problems. I have noticed it doesn't work with DX/D3D versions older than DX9 (it MIGHT work with DX8, but I'm not sure - definitely a no-go with DX2, DX3, and DX6 though), which is kind of a bummer since it limits its utility as a one-stop shop for Let's Plays if the game is a little older.

But yeah, anyone with a couple brain cells to rub together who has been paying attention to the history of things gets by now that neither nVidia nor AMD is without sin. Drivers and value-add, they both make some goofs along the way. Sometimes their goofs are really bad and bork the fan controller so that it doesn't work right, killing cards, though that hasn't happened in super recent memory (and maybe can't happen with today's much more effective over-everything protections in place, unless it's a card made for liquid nitrogen overclocking or something equally nuts out of the gate).

craig588
Nov 19, 2005

by Nyc_Tattoo

spasticColon posted:

I might go ahead and overclock the VRAM on my 660Ti in order to compensate for the gimped memory bus. But the individual VRAM chips on my video card don't have heatsinks, so would I have to buy VRAM heatsinks before safely overclocking? I wouldn't overvolt; I would just overclock until I hit the wall on the stock voltage.

You don't need them. The memory just doesn't get hot; it's practically ambient on my 680 overclocked to 7GHz. They get hotter from the heat radiated through the PCB from the VRMs and GPU than they do from their own dissipation.

Ignoarints
Nov 26, 2010
I just tried to unlock the voltage cap through a GPU BIOS mod but it didn't seem to work; I'm still stuck at 1.175 V. I realized the instructions I was following were a year old, along with pretty much all the links about this. Sure enough I crashed it at just 26 MHz over what I had before, which is why I stopped in the past. The temperature actually got up to 69°C in FurMark, but the fan duty was only 10%; then it kicked in and the temperature never went over that.

I can't help but think 1.175 V is killing any more overclock here. Did nvidia close this BIOS mod off?

Edit: nm, I looked at some of the nvflash options and one of them was remove protection, which did it. It boosts itself insane now. With just a 14 MHz offset it goes to 1267 MHz. It stays at 70°C at 42% fan at full load.



It crashes at 1293 MHz but it's been stable for 30 min so far at 1280. I haven't pushed the memory to crashing yet, but I'm very pleased with the results from pushing the voltage. So far, anyway, and I haven't tested anything in game yet. But I couldn't come anywhere near 1280 before.
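(The find-the-wall routine described in this post is just walking the offset up in small steps until the stress test falls over, then backing off one step. Here's a minimal sketch of that loop; set_core_offset() and run_stress_test() are hypothetical stand-ins for whatever overclocking utility and stress test you actually drive by hand, and the simulated numbers are only there to mirror the 1267/1280/1293 MHz anecdote above.)

code:

# Sketch only. The two helpers below are hypothetical stubs, not real APIs.
SIMULATED_WALL_MHZ = 1293   # pretend this card falls over at 1293 MHz

def set_core_offset(mhz):
    # Stand-in for applying the offset in your overclocking tool of choice.
    print("apply +%d MHz core offset" % mhz)

def run_stress_test(offset_mhz):
    # Stand-in for a FurMark/Heaven run; True means no crash or artifacts.
    return 1253 + offset_mhz < SIMULATED_WALL_MHZ   # +14 -> ~1267 MHz boost, as above

def find_max_stable_offset(start_mhz, step_mhz):
    offset, last_good = start_mhz, None
    while True:
        set_core_offset(offset)
        if not run_stress_test(offset):   # crashed, so the previous step was the wall
            return last_good
        last_good = offset
        offset += step_mhz                # creep up roughly one boost bin at a time

print(find_max_stable_offset(14, 13))     # returns 27, i.e. ~1280 MHz, one bin short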

Ignoarints fucked around with this message at 02:58 on Feb 1, 2014

jink
May 8, 2002

Drop it like it's Hot.
Taco Defender
Anandtech just posted their Mantle 'preview' using a high performance Intel setup (Intel Core i7-4960X @ 4.2GHz) coupled with a 290X.


quote:

Finally, we’ll quickly close with some of AMD’s performance numbers, which they’ve published in their reviewer’s guide. We feel that vendor-provided numbers should always be taken with a grain of salt, but they do serve their purpose, especially for getting an idea of what performance is like under a best case scenario. To that end we can quickly see that AMD was able to top out at a 41% performance improvement on a 290X paired with an A10-7700K. This is a greater performance gain than the peak gain of 30% we’ve seen in our own results, but not immensely so. More importantly it can give us a good idea of what to reasonably expect for performance under Battlefield 4. If AMD’s results are accurate, then a 40% performance improvement is the most we should be expecting out of Battlefield 4’s Mantle renderer.




http://www.anandtech.com/show/7728/battlefield-4-mantle-preview

jink fucked around with this message at 22:46 on Feb 1, 2014

Ignoarints
Nov 26, 2010

jink posted:

Anandtech just posted their Mantle 'preview' using a high performance Intel setup (Intel Core i7-4960X @ 4.2GHz) coupled with a 290X.





http://www.anandtech.com/show/7728/battlefield-4-mantle-preview

Sigh, what are the chances this is going to be used by nvidia?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
About 0%.

unpronounceable
Apr 4, 2010

You mean we still have another game to go through?!
Fallen Rib

jink posted:

Anandtech just posted their Mantle 'preview' using a high performance Intel setup (Intel Core i7-4960X @ 4.2GHz) coupled with a 290X.

http://www.anandtech.com/show/7728/battlefield-4-mantle-preview

I can't help but be amused that there's a Phenom II in AMD's performance graph.

craig588
Nov 19, 2005

by Nyc_Tattoo

Ignoarints posted:

Sigh, what are the chances this is going to be used by nvidia?

Real world performance increases are barely 10%. It's not really something sigh worthy. You can see bigger gains with a much slower CPU, but when are you ever going to play games on a 2GHz CPU?
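(A toy model of why that number swings so much between setups: per-frame time is roughly whichever is slower, the CPU's submission work or the GPU's render work, so trimming API overhead only shows up when the CPU side is the long pole. Illustrative Python with made-up millisecond figures, not measured data.)

code:

# Toy frame-time model; the millisecond numbers are illustrative, not measured.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)   # the slower side sets the frame rate

def api_gain_pct(cpu_ms, gpu_ms, overhead_cut=0.4):
    # Suppose a thinner API shaves ~40% off the CPU's per-frame submission cost.
    return (fps(cpu_ms * (1 - overhead_cut), gpu_ms) / fps(cpu_ms, gpu_ms) - 1) * 100

print(api_gain_pct(cpu_ms=10.0, gpu_ms=14.0))   # fast CPU, GPU-bound:  ~0% gain
print(api_gain_pct(cpu_ms=14.0, gpu_ms=10.0))   # slow CPU, CPU-bound: ~40% gain

Which lines up with the spread in the numbers being posted: a 4.2GHz 4960X is almost never the long pole, while an A10-7700K or a Phenom II often is.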

Ignoarints
Nov 26, 2010

craig588 posted:

Real world performance increases are barely 10%. It's not really something sigh worthy. You can see bigger gains with a much slower CPU, but when are you ever going to play games on a 2GHz CPU?

Considering how far people push for barely 10%, I'd be extremely pleased with a blanket 10% for free. I've seen a ton of people saying that now; I don't know why it's viewed as a trivial amount. Also I was interested in how it smoothed out fps spikes when the CPU gets momentarily slammed. If I got a patch that gave my CPU a 10% boost for free I'd flip out. I don't understand how this is much different from that.
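(For scale, here is what a flat 10% works out to at 60 fps; trivial arithmetic, but it's also where the interest in frame-time spikes rather than average fps comes from.)

code:

base_fps = 60.0
boosted_fps = base_fps * 1.10
print(1000.0 / base_fps)      # ~16.7 ms per frame
print(1000.0 / boosted_fps)   # ~15.2 ms per frame at +10%
# A momentary CPU spike to 25 ms (a dip to 40 fps) only smooths out if the CPU
# share of that spike is what the thinner API trims, which is why frame-time
# (FCAT) numbers are more interesting here than the average fps.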

craig588
Nov 19, 2005

by Nyc_Tattoo
It comes down to game support a lot more than hardware support. Unless it's really easy to implement, a lot of developers probably won't be enthusiastic about diverting resources for a 10% gain.

beejay
Apr 7, 2002

I would agree but BF4 is a pretty big name and they did it. Developers put PhysX in games. It probably won't be widespread for a while if ever but I think it will catch on decently.

Professor Science
Mar 8, 2006
diplodocus + mortarboard = party

beejay posted:

I would agree but BF4 is a pretty big name and they did it. Developers put PhysX in games. It probably won't be widespread for a while if ever but I think it will catch on decently.
they did it because repi has been pushing for a new 3D API for years and years and years and AMD probably paid a lot of money (that's how developer relations works in the game industry, PhysX is the same)

AccidentalFloss
Nov 4, 2005

celebratory gunfire
Do you guys recommend Driver Cleaner? I bought it but I can't register it, as they never gave me the info (serial #, etc.). I've been waiting to hear back from support since New Year's Eve. I can't run it until I register. I wanted to try it because the nvidia uninstall didn't work properly. I switched to an AMD Radeon HD 7790 but it wouldn't work until I manually deleted the nvidia files. I want to clean that poo poo up.

beejay
Apr 7, 2002

Driver Fusion is usually recommended around here.

EoRaptor
Sep 13, 2003

by Fluffdaddy

Professor Science posted:

they did it because repi has been pushing for a new 3D API for years and years and years and AMD probably paid a lot of money (that's how developer relations works in the game industry, PhysX is the same)

This is about Microsoft no longer being interested in Windows as a prime gaming platform. DX has stagnated on the PC, yet the leverage provided by the prime gaming platform (the Xbox) makes using other APIs a much higher development burden than the performance returned.

Though Mantle is NOT related to the Xbox platform API, the underlying similarity between the graphics processors means that optimizations applied to the Xbox build can be carried over to the Mantle build.

I don't know if AMD really hopes Mantle will take off, but 10% for a major title that had Mantle added on after development was mostly complete isn't bad. This is also Mantle's first appearance in a game, and developers will hopefully get better with it.

GrizzlyCow
May 30, 2011

craig588 posted:

Real world performance increases are barely 10%. It's not really something sigh worthy. You can see bigger gains with a much slower CPU, but when are you ever going to play games on a 2GHz CPU?

When you're on a laptop? Most laptop processors more closely resemble an Ivy Bridge 2C/4T @ ~2GHz than AnandTech's other preview setup. Really, anyone without a big budget will be using a 2C/4T @ 3GHz (i3) or a low-powered AMD processor in their computer (and a weaker card), so even a 10% boost will be very welcome for them.

It'll be interesting to see the FCAT data. If Mantle similarly improves the frame times, it'll be much more impressive in my mind.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

craig588 posted:

Real world performance increases are barely 10%. It's not really something sigh worthy. You can see bigger gains with a much slower CPU, but when are you ever going to play games on a 2GHz CPU?
It's as little as ~10% if you have put a lot of care and money into building a balanced system that doesn't have any bottlenecks. If you didn't have the money for an Intel quad-core for example, 30-40% is a nice performance uplift.

AccidentalFloss posted:

Do you guys recommend Driver Cleaner? I bought it but I can't register it, as they never gave me the info (serial #, etc.). I've been waiting to hear back from support since New Year's Eve. I can't run it until I register. I wanted to try it because the nvidia uninstall didn't work properly. I switched to an AMD Radeon HD 7790 but it wouldn't work until I manually deleted the nvidia files. I want to clean that poo poo up.
File a chargeback with your credit card company and try not to buy scam software in the future.

veedubfreak
Apr 2, 2005

by Smythe
Well I took the cards out and swapped some stuff around, the "middle" card still insists on being the primary card no matter which card is in the slot. This board is weird.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Professor Science posted:

they did it because repi has been pushing for a new 3D API for years and years and years and AMD probably paid a lot of money (that's how developer relations works in the game industry, PhysX is the same)

Around $8,000,000 changed hands to secure the deal for AMD to get to do Mantle.

Well, we get a pretty good idea of why AMD press broke with the numbers that they did.

8%-10% performance gains in high end scenarios is not the (paraphrasing) "with Mantle, we will embarrass a Titan" level of performance difference advertised. Hell, it's not even the "maybe 25%-ish" that would certainly be really impressive. It's... not a lot of return on all those man-hours, frankly, wow that kinda sucks.

But at least, according to AMD's numbers, it strongly helps reduce Crossfire CPU overhead to the tune of mega performance gains, so veedub if you play a lot of BF4 I guess you're about to get way higher framerates, assuming the CPU was definitely the limiting factor in that benchmark whose details we don't really have to make any accurate comparison.

8%-10%, after all that? And this is the flagship example?...

:raise:

jink
May 8, 2002

Drop it like it's Hot.
Taco Defender

unpronounceable posted:

I can't help but be amused that there's a Phenom II in AMD's performance graph.

I think it is pretty telling that the main gains are for the weak AMD cores. It makes sense that DICE would announce the Mantle release using them for the biggest impact.

I hope that Mantle's performance starts some changes in how DirectX works, but I don't understand enough of the pipeline to know if that's feasible or if Microsoft even cares. PCs are dead, remember? :D

Fallows
Jan 20, 2005

If he waits long enough he can use his accrued interest from his savings to bring his negative checking balance back into the black.

veedubfreak posted:

Well I took the cards out and swapped some stuff around, the "middle" card still insists on being the primary card no matter which card is in the slot. This board is weird.

Is your middle card in the black slot? That one doesn't use the PLX, so it will be x16 while the others are running at x8. I don't know if that makes any difference to which card is primary, though.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Agreed posted:

8%-10%, after all that? And this is the flagship example?...

:raise:

Again, up to 40% when CPU limited.

beejay
Apr 7, 2002

Agreed posted:

Around $8,000,000 changed hands to secure the deal for AMD to get to do Mantle.

Well, we get a pretty good idea of why AMD press broke with the numbers that they did.

8%-10% performance gains in high end scenarios is not the (paraphrasing) "with Mantle, we will embarrass a Titan" level of performance difference advertised. Hell, it's not even the "maybe 25%-ish" that would certainly be really impressive. It's... not a lot of return on all those man-hours, frankly, wow that kinda sucks.

But at least, according to AMD's numbers, it strongly helps reduce Crossfire CPU overhead to the tune of mega performance gains, so veedub if you play a lot of BF4 I guess you're about to get way higher framerates, assuming the CPU was definitely the limiting factor in that benchmark whose details we don't really have to make any accurate comparison.

8%-10%, after all that? And this is the flagship example?...

:raise:

This is a little fanboyish. :(

veedubfreak
Apr 2, 2005

by Smythe

Fallows posted:

Is your middle card in the black slot? That one doesn't use the PLX, so it will be x16 while the others are running at x8. I don't know if that makes any difference to which card is primary, though.

No, it's in the 2nd orange slot. I did try to make the black slot my primary slot, but that ends up pushing the third card to the edge of the board and complicates the tubing.

iuvian
Dec 27, 2003
darwin'd



Battlefield benchmarks that aren't from a 64 player server are pretty pointless, but even a 10% boost in a mode nobody plays is pretty good.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

beejay posted:

This is a little fanboyish. :(

No, just disappointed that it's not more. I had hoped for more. I had expected more after the most recent Oxide presentation.

It seems like there aren't that many situations where BF4 becomes CPU limited in these benches, I guess. Hopefully it'll let developers of the other games using Frostbite design for the situations that'd otherwise cause problems, at the level where we actually see substantial gains, though choosing to do so solely to reward AMD card buyers is a big problem for developers.

It exposes some interesting, strong fringe case benefits to a less constrained graphics API, and hopefully in a direct enough way that it actually encourages broader movement toward something better (or maybe gets Microsoft to improve how GPUs work with WDDM, hey, that'd be awesome) - but we really need a cross-platform thing here, not a proprietary thing. And I definitely had hoped for a higher average framerate improvement.

I don't see that as "fanboyish" at all. Please explain.

GrizzlyCow
May 30, 2011
It's higher for less powerful CPUs. You were only going to see a large benefit from Mantle with weaker processors anyway. It's also a little early to be disappointed. According to DICE, they were able to get up to a 25% increase with an FX-8350 and a 7970 on one of the multiplayer maps, and the 60% figure came from a CF build. If that's true, we could see these bigger benefits on systems with fairly adequate processors, too.

Mulva
Sep 13, 2011
It's about time for my once per decade ban for being a consistently terrible poster.

Agreed posted:

I don't see that as "fanboyish" at all. Please explain.

8% from a badly coded piece of poo poo, from the first game to really use an entirely new API, is pretty good, and in fact even Anandtech has them going up to 30% at times. It's not like it's 8% period; that's the floor. If the floor was much higher, that'd be insane. Saaaaaaaaay... a 15% improvement as the floor of the performance increase, from a bad team of coders being slotted into a badly coded game? That would have been one of the most impressive things to happen in computer programming in the last decade. 8% is fine. It's not like an r2[7/8x/9/9x]0 is a bad card as is; getting an extra 8 to 30% increase just for showing up is pure bonus on top of making a good decision.

Or what was a good decision before bitminers jacked prices up a hundred bucks at least.

gemuse
Oct 13, 2005

iuvian posted:

Battlefield benchmarks that aren't from a 64 player server are pretty pointless, but even a 10% boost in a mode nobody plays is pretty good.

Don't know how reliable this site is, but the gains seem pretty good: http://www.golem.de/news/amds-mantle-api-im-test-der-prozessor-katalysator-1402-104261-3.html

Hace
Feb 13, 2012

<<Mobius 1, Engage.>>
Honestly I'm more interested in seeing what the gains are on lower/mid-range 7XXX series cards.

Ignoarints
Nov 26, 2010

Agreed posted:

Around $8,000,000 changed hands to secure the deal for AMD to get to do Mantle.

Well, we get a pretty good idea of why AMD press broke with the numbers that they did.

8%-10% performance gains in high end scenarios is not the (paraphrasing) "with Mantle, we will embarrass a Titan" level of performance difference advertised. Hell, it's not even the "maybe 25%-ish" that would certainly be really impressive. It's... not a lot of return on all those man-hours, frankly, wow that kinda sucks.

But at least, according to AMD's numbers, it strongly helps reduce Crossfire CPU overhead to the tune of mega performance gains, so veedub if you play a lot of BF4 I guess you're about to get way higher framerates, assuming the CPU was definitely the limiting factor in that benchmark whose details we don't really have to make any accurate comparison.

8%-10%, after all that? And this is the flagship example?...

:raise:

$8 million to replace DirectX with a minimum 8% performance increase seems like a very good bargain. It's almost hard to believe, even. I'm not aware of anything that was released and implemented on existing hardware that produced those kinds of gains across the board. Ever.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Some more real-worldish multiplayer numbers:

http://pclab.pl/art55953-3.html

Anyone speak Polish?

Ignoarints posted:

$8 million to replace DirectX with a minimum 8% performance increase seems like a very good bargain. It's almost hard to believe, even. I'm not aware of anything that was released and implemented on existing hardware that produced those kinds of gains across the board. Ever.

Designed for their cards from the ground up. Hell, you can find examples of games that have a bigger performance difference than 8% running on AMD vs. nVidia hardware without anything like Mantle. Unless they make this genuinely cross-platform, it's not what you're saying it is.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.
At least the preliminary Mantle benchmarks aren't making me regret going with a GTX 770 paired with an overclocked Core i5-3570K. Mantle was the primary reason I was looking at an R9 280X/7970 GHz when I was looking to upgrade, but miners put the kibosh on that plan. Hopefully Anand and TechReport will start benchmarking more 'sane' builds in the near future; R9 290Xs in uber mode running at 1080p is quite the edge case. A Core i5 with a 7870/7970 seems to be the sweet spot many gamers hit for performance, and probably a Core i3 with a 7850 or 7770 for the more budget-limited gamers; for Radeon cards, the Steam Hardware Survey has the 7850 as the most popular AMD card, followed by the 7770.

Straker
Nov 10, 2005
also, "I'm not CPU limited because my CPU doesn't suck" is just another way of saying "I skimped on GPUs" :v:

I mean CPUs aren't getting much better much faster, so being able to support more GPU than you otherwise could (or even just minimizing CF overhead) sounds like a good deal. Haven't actually tried BF4 since this news though.

One Eye Open
Sep 19, 2006
Am I awake?
The point is, not everyone can afford an i5 or an i7, let alone a 290(X) or 780/Titan. For those people, who make up a lot of the gaming public, the gains are a lot higher with Mantle. I know this thread likes to only really discuss the high end stuff, but most people who enjoy gaming can't afford that kind of thing, and are happy with anything that improves their experience.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

One Eye Open posted:

The point is, not everyone can afford an i5 or an i7, let alone a 290(X) or 780/Titan. For those people, who make up a lot of the gaming public, the gains are a lot higher with Mantle. I know this thread likes to only really discuss the high end stuff, but most people who enjoy gaming can't afford that kind of thing, and are happy with anything that improves their experience.

This thread is the exact opposite of 'the high end stuff'. This thread, if anything, is very close to hostile towards people that have a looser interpretation of 'value' and want to spend more money on their PC. Most people who post here are trying to get the best bang for their buck at <1k budgets. The 290x and 780 aren't even on the list of graphics card recommendations. An i5 is not high end, and we make it explicitly clear not to buy an i7 for gaming.

One Eye Open
Sep 19, 2006
Am I awake?

The Lord Bude posted:

This thread is the exact opposite of 'the high end stuff'. This thread, if anything, is very close to hostile towards people that have a looser interpretation of 'value' and want to spend more money on their PC. Most people who post here are trying to get the best bang for their buck at <1k budgets. The 290x and 780 aren't even on the list of graphics card recommendations. An i5 is not high end, and we make it explicitly clear not to buy an i7 for gaming.

I must be reading a different thread then, where people are constantly talking about those very cards, recommended or not. Also, an i5 may not be high end, but it is still out of a lot of people's budgets and, by the figures I linked, not used by the majority of gamers.
