Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Twerk from Home posted:

Look at the R9-290, a GTX 780 competitor that falls somewhere between a 970 and 980 today. It doesn't make any goddamn sense to me either, but the 7970 and R9-290 definitely perform way better on the same benchmarks now vs around their release time.

It's probably just AMD taking years to actually write properly performing drivers for their GPUs. Nvidia GPUs don't seem to 'age' as well because Nvidia actually manages to wring most of the performance out of the cards early on.


Rukus
Mar 13, 2007

Hmph.

Beautiful Ninja posted:

It's probably just AMD taking years to actually write properly performing drivers for their GPUs. Nvidia GPUs don't seem to 'age' as well because Nvidia actually manages to wring most of the performance out of the cards early on.

That and no doubt Nvidia is encouraged to show how much better their new cards are compared to the older ones, so focus shifts on optimizing the upcoming cards instead of using resources on their old product.

TenementFunster
Feb 20, 2003

The Cooler King
when the hell is the price on the 1070 coming down? or is that just bullshit nerd rumors?

Verizian
Dec 18, 2004
The spiky one.
AMD's dx11 drivers were so bad they claimed that only 3% of the 290's potential was being utilised properly.

What we're seeing is understandable when AMD's architecture was significantly better than Nvidia's pre-Maxwell but their software was cold poo poo on a hotplate.


TenementFunster posted:

when the hell is the price on the 1070 coming down? or is that just bullshit nerd rumors?

When retailers sell their massive stocks of 900 series cards and they get more than 5-15 10x0 cards per SKU in a delivery.

Unless you're in the UK then expect prices to rise, and rise.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

TenementFunster posted:

when the hell is the price on the 1070 coming down? or is that just bullshit nerd rumors?

That depends on what you mean by the price coming down. If you are talking about the prices you see on Amazon and such, a lot of those are due to scalpers reselling cards at very high prices. If you mean to ask when we will see cards at the MSRP of $379, there are a couple near that, but they come with blower coolers so I can't recommend them. There is no reason to believe that the decent ones with open coolers will start selling for anything less than ~$420 any time soon. If someone told you that the MSRP is dropping soon then :lol:, they were either making poo poo up or confused scalper prices with the MSRP.

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

AVeryLargeRadish posted:

That depends on what you mean by the price coming down. If you are talking about the prices you see on Amazon and such, a lot of those are due to scalpers reselling cards at very high prices. If you mean to ask when we will see cards at the MSRP of $379, there are a couple near that, but they come with blower coolers so I can't recommend them. There is no reason to believe that the decent ones with open coolers will start selling for anything less than ~$420 any time soon. If someone told you that the MSRP is dropping soon then :lol:, they were either making poo poo up or confused scalper prices with the MSRP.

The high Pascal prices have everything to do with Nvidia/retailer price gouging and high demand, and totally nothing to do with AMD's complete non-competition in the GTX 1070-1080 tier and massively screwing up the RX480 launch.

According to AMD fanboys, anyway.

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...
Yeah, AMD GPUs "improving" over time has everything to do with their awful DX11 optimization getting better (though still not great). Ironically, if you buy that DX12 is going to be a big deal in the future then by that logic NVIDIA is going to age "better" as they have (or at least had) more perf left on the table in the driver.

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

Hubis posted:

Yeah, AMD GPUs "improving" over time has everything to do with their awful DX11 optimization getting better (though still not great). Ironically, if you buy that DX12 is going to be a big deal in the future then by that logic NVIDIA is going to age "better" as they have (or at least had) more perf left on the table in the driver.

I'm not so sure of that. AMD GPUs still have bad performance compared to their theoretical GFLOPS numbers. Additionally, I thought the understanding was that AMD DX11 was just bad and DX12 avoids a lot of the badness, causing them to improve relatively. I don't see how you can extrapolate from that that NVIDIA must somehow be bad at DX12 or have much more performance left on the table.

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

Skuto posted:

I'm not so sure of that. AMD GPUs still have bad performance compared to their theoretical GFLOPS numbers. Additionally, I thought the understanding was that AMD DX11 was just bad and DX12 avoids a lot of the badness, causing them to improve relatively. I don't see how you can extrapolate from that that NVIDIA must somehow be bad at DX12 or have much more performance left on the table.

I don't actually think you can, I just mean that extrapolating how well AMD GPUs will age is probably not that useful.

Their DX11 is still bad, it's just gotten a little bit less so. Mantle (and later DX12) was their plan to avoid that.

I do think you can extrapolate that NVIDIA has perf left on the table to some degree, though, because there are benchmarks where they are faster in DX11 than DX12. It's just a question of whether that's a DX12 driver issue, or because of optimizations that exist in the driver in DX11 but must be implemented manually in the application in DX12 (because it's more explicit).

Drakhoran
Oct 21, 2012

Now I'm not really a GPU expert, but my understanding was that a large part of AMD's advantage in DX12 comes from the fact that Maxwell lacked hardware for the kind of asynchronous compute GCN can do. I presume either Pascal has similar circuitry or else Nvidia will make certain The Way It Was Meant To Be Played games do not make use of that capability.

repiv
Aug 13, 2009

Drakhoran posted:

Now I'm not really a GPU expert, but my understanding was that a large part of AMD's advantage in DX12 comes from the fact that Maxwell lacked hardware for the kind of asynchronous compute GCN can do. I presume either Pascal has similar circuitry or else Nvidia will make certain The Way It Was Meant To Be Played games do not make use of that capability.

It's not just that GCN benefits from aggressive use of async compute, it's also that aggressive use of async compute makes Maxwell slower. So any DX12 game sponsored by AMD (most of them so far) is probably hitting Maxwell right where it hurts.

The main improvement in Pascal seems to be that it's indifferent to the use of async - it doesn't gain much but it doesn't lose anything either.

lDDQD
Apr 16, 2006

Hubis posted:

NVIDIA is going to age "better" as they have (or at least had) more perf left on the table in the driver.

You really believe nVidia deliberately leaves performance on the table? Anyway, Maxwell was extremely poo poo at DX12 in some cases, because it lacks the hardware to implement some of the features of that API. And it will never, ever get significantly better at it with driver updates. Everyone assumed that Pascal would implement those features, but that hasn't been the case: it still lacks them. So once again, there's no way to get the kind of performance you can get out of having an ASIC, when you're doing it in software. Meanwhile, AMD has had this hardware since the 290s, and added some improvements in more recent versions of GCN, so the argument could be made for the longevity of AMD's graphics cards, if anything.

On the other hand, if you're buying a 480 or something, you'll probably upgrade it by the time DX12 / Vulkan titles actually start becoming commonplace, so who cares, really?

Risky Bisquick
Jan 18, 2008

PLEASE LET ME WRITE YOUR VICTIM IMPACT STATEMENT SO I CAN FURTHER DEMONSTRATE THE CALAMITY THAT IS OUR JUSTICE SYSTEM.



Buglord
Where are the RX470 and GTX1060s? Release them to quell all the RX480 demand so we have more choices already.

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

jm20 posted:

Where are the RX470 and GTX1060s? Release them to quell all the RX480 demand so we have more choices already.

Rumors on the China Interwebs for GTX 1060: 7th July launch, 14th July availability. Stock 980 performance.

More or less confirmed: 1x6-pin power, no SLI, 192-bit memory bus.

https://www.techpowerup.com/223883/nvidia-geforce-gtx-1060-doesnt-support-sli-reference-pcb-difficult-to-mod

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

Drakhoran posted:

Now I'm not really a GPU expert, but my understanding was that a large part of AMD's advantage in DX12 comes from the fact that Maxwell lacked hardware for the kind of asynchronous compute GCN can do. I presume either Pascal has similar circuitry or else Nvidia will make certain The Way It Was Meant To Be Played games do not make use of that capability.

Async compute is a small part of DX12. The main benefit of DX12 is reducing the CPU overhead from the API calls setting up draws (which is why it doesn't help at all in purely GPU-bound applications, unless you are dealing with GPU-GPU dependencies or something).
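A toy model of that distinction, with made-up per-draw costs (nothing here is a measured number):

```python
# Toy frame-time model: the frame takes as long as the slower of the
# CPU (submitting draw calls) and the GPU (rendering them).
# All costs are invented for illustration.

def frame_ms(draws, cpu_us_per_draw, gpu_ms):
    cpu_ms = draws * cpu_us_per_draw / 1000.0
    return max(cpu_ms, gpu_ms)

DRAWS = 8000
GPU_MS = 12.0  # fixed GPU-side work per frame

dx11 = frame_ms(DRAWS, cpu_us_per_draw=2.5, gpu_ms=GPU_MS)  # heavy driver overhead
dx12 = frame_ms(DRAWS, cpu_us_per_draw=0.5, gpu_ms=GPU_MS)  # cheap explicit submission

print(f"DX11-ish: {dx11:.1f} ms/frame")  # 20.0 ms -> CPU-bound, big win available
print(f"DX12-ish: {dx12:.1f} ms/frame")  # 12.0 ms -> now GPU-bound

# With few draws the CPU was never the bottleneck, so nothing changes:
print(frame_ms(1000, 2.5, GPU_MS) == frame_ms(1000, 0.5, GPU_MS))  # True
```

Same reason cheaper draw submission does nothing for a GPU-bound scene: the frame time is already pinned by the GPU term of the max().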

lDDQD posted:

You really believe nVidia deliberately leaves performance on the table?

Deliberately? Not at all. But there are some driver optimizations that require a level of resource control not available in DX12, so they need to get app developers to change behavior to get the same efficiency. In other cases it might be possible, but it takes a lot of engineering to refactor those optimizations between driver models so that they can still work invisibly, which they are actively investing in. I'd just expect the DX12 gap to pretty much disappear.

quote:

Anyway, Maxwell was extremely poo poo at DX12 in some cases, because it lacks the hardware to implement some of the features of that API. And it will never, ever get significantly better at it with driver updates. Everyone assumed that Pascal would implement those features, but that hasn't been the case: it still lacks them. So once again, there's no way to get the kind of performance you can get out of having an ASIC, when you're doing it in software. Meanwhile, AMD has had this hardware since the 290s, and added some improvements in more recent versions of GCN, so the argument could be made for the longevity of AMD's graphics cards, if anything.

What DX12 features are you talking about in Pascal? I think you are right about the hardware aging "better" to the extent that since GCN has been a much more stable evolution between generations, optimizations for new GPUs tend to trickle down to older ones much more.

BTW, the reason AMD benefits so much from Async Compute is precisely because their GPU perf tends to not map well to their theoretical GFLOP numbers. Their hardware tends to have more "bubbles" in it when pushing draws through the pipeline, resulting in less-than-theoretical utilization. Async compute is a great weapon for them because it lets them address those idle resources and fill them with useful work. You could potentially extrapolate how much NVIDIA GPUs could gain from the feature by comparing their theoretical GFLOPs to regular rendering performance.
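That bubble-filling argument can be sketched numerically; every figure below is invented for illustration:

```python
# Sketch: why async compute helps a GPU whose graphics pipeline leaves
# "bubbles" (idle shader cycles). All numbers are illustrative, not
# measured from any real card.

THEORETICAL_GFLOPS = 5800    # a GCN-class marketing number (assumed)
GRAPHICS_UTILIZATION = 0.70  # fraction of cycles doing useful graphics work (assumed)

achieved = THEORETICAL_GFLOPS * GRAPHICS_UTILIZATION
idle = THEORETICAL_GFLOPS - achieved

# Async compute schedules independent compute work into idle cycles.
# Assume it can reclaim half of them:
reclaimed = 0.5 * idle
with_async = achieved + reclaimed

print(f"graphics only: {achieved:.0f} GFLOPS effective")
print(f"with async:    {with_async:.0f} GFLOPS effective "
      f"(+{100 * reclaimed / achieved:.0f}%)")
```

The wider the gap between theoretical and achieved throughput, the more async compute has to reclaim - which is the comparison suggested above for guessing NVIDIA's headroom.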

quote:

On the other hand, if you're buying a 480 or something, you'll probably upgrade it by the time DX12 / Vulkan titles actually start becoming commonplace, so who cares, really?

While I get the sentiment, my assumption is that the $200-300 GPU market is on a roughly three-year upgrade cycle. I'd expect DX12/Vulkan to be pretty common within about two years (especially as third-party developers pick up more modern versions of licensed engines).

e:

repiv posted:

<Async Compute Stuff>

also this

Tokamak
Dec 22, 2004

Quick question,

My old GPU died (an AMD 7850) a few weeks ago and I'd rather not wait a month+ for the mid-range launch. Should I go with a launch RX 480 or a slightly cheaper GTX 970? I have a 1920x1200 monitor, and plan to use the card for 3+ years.

Palladium posted:

Rumors on the China Interwebs for GTX 1060: 7th July launch, 14th July availability. Stock 980 performance.
More or less confirmed: 1x6-pin power, no SLI, 192-bit memory bus.

https://www.techpowerup.com/223883/nvidia-geforce-gtx-1060-doesnt-support-sli-reference-pcb-difficult-to-mod

Or maybe I should just suck it up and wait a couple weeks and see how this rumour pans out?

Tokamak fucked around with this message at 16:26 on Jul 4, 2016

Bardeh
Dec 2, 2004

Fun Shoe
At this point it's probably best to wait for AIB 480's to appear, and to see what the price and availability of the 1060 is going to be like. That's what I'll be doing. As much as I want to order a reference 480 right now, I'm going to wait until all the options are on the table.

Setset
Apr 14, 2012
Grimey Drawer

Tokamak posted:

Quick question,

My old GPU died (an AMD 7850) a few weeks ago and I'd rather not wait a month+ for the mid-range launch. Should I go with a launch RX 480 or a slightly cheaper GTX 970? I have a 1920x1200 monitor, and plan to use the card for 3+ years.


Or maybe I should just suck it up and wait a couple weeks and see how this rumour pans out?

the correct answer is always to wait. wait for benchmarks, wait for new cards to release, wait for announcements, wait for NDAs to lift... but yeah, I wouldn't buy a reference RX480 unless you can get that $175 deal. The custom ones are almost guaranteed to be better

repiv
Aug 13, 2009

Yeah, don't buy a reference RX480 until the power draw situation is resolved.

Background if you're out of the loop: http://www.pcper.com/reviews/Graphics-Cards/Power-Consumption-Concerns-Radeon-RX-480

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

Tokamak posted:

Quick question,

My old GPU died (an AMD 7850) a few weeks ago and I'd rather not wait a month+ for the mid-range launch. Should I go with a launch RX 480 or a slightly cheaper GTX 970? I have a 1920x1200 monitor, and plan to use the card for 3+ years.

If you *must* buy now, are willing to bear a tiny risk of frying your motherboard with the currently out-of-spec RX480, and believe AMD can fix that satisfactorily by this week: RX 480.

If not, 970.

Palladium fucked around with this message at 16:37 on Jul 4, 2016

mobby_6kl
Aug 9, 2009

by Fluffdaddy
^^^
It's summer, go outside and meet people. There will be more choice and lower prices on cards later when the weather sucks.

Ars has some tests of the RX480s in crossfire. Weirdly the data don't match up exactly with some of the conclusions in the text, but there are some interesting charts nevertheless: http://arstechnica.com/gadgets/2016/07/amd-rx-480-crossfire-vs-nvidia-gtx-1080-ashes/

tl;dr version: Ashes of Singularity is best case, as expected. Where CF works, it's somewhere between the 1070 and 1080. You're still dumb if you buy two 480s.

E: Techpowerup: https://www.techpowerup.com/reviews/AMD/RX_480_CrossFire/9.html

mobby_6kl fucked around with this message at 16:45 on Jul 4, 2016

repiv
Aug 13, 2009

mobby_6kl posted:

Ars has some tests of the RX480s in crossfire. Weirdly the data don't match up exactly with some of the conclusions in the text, but there are some interesting charts nevertheless: http://arstechnica.com/gadgets/2016/07/amd-rx-480-crossfire-vs-nvidia-gtx-1080-ashes/

tl;dr version: Ashes of Singularity is best case, as expected. Where CF works, it's somewhere between the 1070 and 1080. You're still dumb if you buy two 480s.

I'm not surprised the performance is more or less what AMD claimed, but I can't figure out what contortion of logic they used to claim better efficiency.



82% more power for 6% more frames doesn't really add up.
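Taking the slide's figures at face value, the arithmetic is straightforward:

```python
# Sanity check on the efficiency claim, using the figures quoted above:
# the CrossFire pair draws 1.82x the power for 1.06x the frame rate.

power_ratio = 1.82
fps_ratio = 1.06

perf_per_watt = fps_ratio / power_ratio
print(f"perf/W vs the single card: {perf_per_watt:.2f}x")        # ~0.58x
print(f"i.e. about {100 * (1 - perf_per_watt):.0f}% worse efficiency")
```

Whatever "efficiency" meant on that slide, it wasn't performance per watt.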

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

repiv posted:

I'm not surprised the performance is more or less what AMD claimed, but I can't figure out what contortion of logic they used to claim better efficiency.



82% more power for 6% more frames doesn't really add up.

have to be talking about perf/$

(in Ashes)

repiv
Aug 13, 2009

Must be. They pushed the power efficiency narrative so hard through the presentation that I assumed that's what they meant there as well.

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

repiv posted:

Must be. I assumed they meant power efficiency after they pushed the power efficiency narrative so hard through the presentation.

Yes; well, we saw how that turned out.

Moola
Aug 16, 2006

Verizian posted:

When retailers sell their massive stocks of 900 series cards and they get more than 5-15 10x0 cards per SKU in a delivery.

Unless you're in the UK then expect prices to rise, and rise.

god freakin drat it

mobby_6kl
Aug 9, 2009

by Fluffdaddy
To be honest GPU prices are the last thing I'd be worried about if I lived in the UK.

Hubis posted:

Yes; well, we saw how that turned out.
They did significantly improve the power efficiency. Compared to their own previous gen, and it's almost up to Maxwell level!

repiv posted:

I'm not surprised the performance is more or less what AMD claimed, but I can't figure out what contortion of logic they used to claim better efficiency.



82% more power for 6% more frames doesn't really add up.

I actually watched the stream back then but what the hell is that slide supposed to say? If two 480s result in 62fps at 51% GPU utilization... isn't one of them being wasted? Then why doesn't a single 480 perform like the 1080??
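A back-of-envelope reading of that utilization number, assuming it's an average across both GPUs:

```python
# If the slide's ~51% is average utilization across a two-card
# CrossFire setup, the pair is doing roughly one card's worth of work.
# The interpretation of the number is an assumption, not confirmed.

cards = 2
avg_utilization = 0.51

effective_cards = cards * avg_utilization
print(f"effective GPUs working: {effective_cards:.2f}")  # ~1.02

# Which is exactly the puzzle: near-zero CrossFire scaling should put
# the pair close to single-card performance, not near a GTX 1080.
```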

Tokamak
Dec 22, 2004

Thanks everyone.
After looking a little more into it, I think I'll wait a few weeks and see if the 1060 is promising. I guess I'm stuck playing whatever this ancient, spare 9800 GT can run. On paper it is still three times faster than the iGPU...

mobby_6kl posted:

It's summer, go outside and meet people.
It is winter :negative: :australia:

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

Hubis posted:

have to be talking about perf/$

(in Ashes)

You mean the game nobody cared about other than as a pure DX12 canned benchmark?

Lungboy
Aug 23, 2002

NEED SQUAT FORM HELP

repiv posted:

Yeah, don't buy a reference RX480 until the power draw situation is resolved.

Background if you're out of the loop: http://www.pcper.com/reviews/Graphics-Cards/Power-Consumption-Concerns-Radeon-RX-480

In this OCUK thread about the Sapphire Nitro 480, an OCUK staff member is saying the power issue is a non-issue and hugely downplaying the whole situation. Seems very dodgy if it's as bad as it could be. Rather surprising he's not just saying to wait and see, but I guess OCUK have cards to shift.

Moola
Aug 16, 2006

mobby_6kl posted:

To be honest GPU prices are the last thing I'd be worried about if I lived in the UK.

its actually very important because we need new cards to play better games to escape the reality of this lovely isle

should I bite the bullet on this?

http://www.ebuyer.com/750762-msi-geforce-gtx-1070-aero-8g-oc-8gb-gddr5-dual-link-dvi-d-hdmi-gtx-1070-aero-8g-oc

seems like a reasonable price

mobby_6kl
Aug 9, 2009

by Fluffdaddy

Tokamak posted:

Thanks everyone.
After looking a little more into it, I think I'll wait a few weeks and see if the 1060 is promising. I guess I'm stuck playing whatever this ancient, spare 9800 GT can run. On paper it is still three times faster than the iGPU...

It is winter :negative: :australia:
Yeah, just play some older games you might've missed the first time around. The world won't end :)

Sorry, showing my northern-hemisphere bias again!

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

Lungboy posted:

In this OCUK thread about the Sapphire Nitro 480, an OCUK staff member is saying that the power issue is no such thing and hugely downplaying the whole situation. Seems very dodgy if it's as bad as it could be. Rather surprising he's not just saying to wait and see, but I guess OCUK have cards to shift.

It's one thing to say the issue is unlikely to hurt anybody and buyers should understand the potential risk, and another to claim there is no problem when the card is proven conclusively to blatantly violate the PCIE spec. The latter is pure dishonesty.

Anyway: http://www.legitreviews.com/amd-radeon-rx-480-undervolting-performance_183699

quote:

By undervolting the card we were able to keep the clock speed at the top boost clock of 1266MHz for the entire duration of the benchmark run and that meant better performance, lower GPU temperatures and less power consumption. We noticed 10-30W power reductions at the power outlet meter by undervolting the Radeon RX 480! A one voltage fits all approach obviously leaves some room for adjustment!

If this is the fix that AMD is going to implement I'm gonna :lol: at them for not doing it right in the first place.
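For what it's worth, the 10-30W figure is about what a naive dynamic-power model predicts; the voltages and baseline wattage below are illustrative assumptions, not measurements:

```python
# Rough sketch of why undervolting at the same clock saves power:
# dynamic power scales roughly with frequency * voltage^2, so at a
# fixed 1266MHz only the voltage term changes.
# Stock/undervolt figures here are assumed, not measured.

def dynamic_power(base_w, v, v_ref):
    """Scale a baseline dynamic power by (V / V_ref)^2 at fixed clock."""
    return base_w * (v / v_ref) ** 2

STOCK_V = 1.15      # a commonly reported RX 480 stock voltage
UNDERVOLT_V = 1.05  # a typical stable undervolt people reported
BASE_W = 110        # assumed voltage-scaled share of board power

saved = BASE_W - dynamic_power(BASE_W, UNDERVOLT_V, STOCK_V)
print(f"estimated saving: ~{saved:.0f} W")  # ~18 W, inside the 10-30 W range
```

It also explains the "one voltage fits all" complaint: a chip that's stable at 1.05V is paying a quadratic power penalty for the safety margin baked into 1.15V.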

Palladium fucked around with this message at 17:40 on Jul 4, 2016

Naffer
Oct 26, 2004

Not a good chemist

Palladium posted:

It's one thing to say the issue is unlikely to hurt anybody and buyers should understand the potential risk, and another to claim there is no problem when the card is proven conclusively to blatantly violate the PCIE spec. The latter is pure dishonesty.

Anyway: http://www.legitreviews.com/amd-radeon-rx-480-undervolting-performance_183699


If this is fix that AMD is going to implement I'm gonna :lol: at them for not doing it right in the first place.

They might have just been trying to help their yields.
I wouldn't be surprised if this ruckus causes them to start triaging 480 dies that require 1.15V to hit 1266MHz down to lower-clocked 470s, assuming they're going to ship the 470 at a lower clockrate.

Craptacular!
Jul 9, 2001

Fuck the DH

Palladium posted:

The high Pascal prices have everything to do with Nvidia/retailer price gouging and high demand, and totally nothing to do with AMD's complete non-competition in the GTX 1070-1080 tier and massively screwing up the RX480 launch.

According to AMD fanboys, anyway.

Nobody cares. The 480 will eventually have cards with aftermarket coolers and modified clocks by Asus, MSI, etc. Those are the models consumers actually buy most often.

NVidia doesn't have high prices on a product because of this limited audience having power problems on the first batch of reference hardware; the high prices are because NVidia always competes against itself and ignores AMD in pricing. They've always done it that way.

It's about not cannibalizing the 1070 audience, so price/perf gets adjusted accordingly. Competition would make one model or another the red-headed stepchild, and for some reason they can't have that. That's probably why they got rid of ###Ti cards.

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

Naffer posted:

They might have just been trying to help their yields.
I wouldn't be surprised if this ruckus causes them to start triaging 480 dies that require 1.15V to hit 1266MHz down to lower-clocked 470's, assuming they're going to ship the 470 at a lower clockrate.

Everyone with an RX480 seems to have great success undervolting their cards, so I don't believe overall binning was that bad to begin with.

Besides, if there were some truly bad Polaris chips, why not just dump them instead of unnecessarily risking a PR disaster? So what if the 1060 was right around the corner; it's not like the RX480 is going to be discontinued right after that.

And there is evidence the card's VRM can be programmed to shift power loading on-the-fly from the slot to the 6-pin, but someone at AMD decided to violate the slot spec rather than the ridiculously overbuilt 6-pin because :psyduck:

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

I think they accidentally implemented Dr. Frankenstein, and here's hoping he's in software, not hardware.

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

xthetenth posted:

I think they accidentally implemented Dr. Frankenstein, and here's hoping he's in software, not hardware.

Preview of the Vega launch event:
https://www.youtube.com/watch?v=w1FLZPFI3jc

wicka
Jun 28, 2007


didn't the 480 launch with way, way more stock than the 1070? and isn't it still sold out pretty much everywhere?


Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

wicka posted:

didn't the 480 launch with way, way more stock than the 1070? and isn't it still sold out pretty much everywhere?

[Citation Needed]

There were lots of rumors and WCCFTech articles about how much stock the 480 had, but there were also the same kind of rumors claiming it had 980 Ti-level performance, so...

It's not surprising that a smaller chip would have better stock than a larger one, all things being equal. Given that they are different processes from different fabs that may have launched at very different points in their production schedules, though, who knows? All you can really say is that there isn't enough of either to meet demand, apparently.
