1gnoirents
Jun 28, 2014

hello :)
If the 20nm process talk started from what I posted, I really didn't mean that. I meant a new/sexy architecture from AMD

mikemelbrooks
Jun 11, 2012

One tough badass

1gnoirents posted:

This trend seems to finally be breaking this year, although I wonder how long 1440p will stay popular when 4K has a real chance of being viable with DX12, from my understanding.

Don't forget game companies need to write games with 4k in mind, gently caress peering at a tiny HUD because lazy coders.

veedubfreak
Apr 2, 2005

by Smythe

mikemelbrooks posted:

Don't forget game companies need to write games with 4k in mind, gently caress peering at a tiny HUD because lazy coders.

This is actually an awesome feature of Borderlands: The Pre-Sequel. It lets you adjust your HUD size, which is super handy on triple-wide monitors.

Mr SoupTeeth
Jan 16, 2015

mikemelbrooks posted:

Don't forget game companies need to write games with 4k in mind, gently caress peering at a tiny HUD because lazy coders.

poo poo, Microsoft still needs to make an OS with 4K in mind. It blows my mind how the market is becoming saturated with these displays yet the scaling is still trash.

Koramei
Nov 11, 2011

I have three regrets
The first is to be born in Joseon.

cisco privilege posted:

That indicates a bad fan so replacing it should fix it. Most semi-stock GPU fans just screw into the heatsink, so you'd need to remove the top cover and replace it. You can oil the fan but it's only a temporary fix.

I'd probably go with this fan assembly instead, since the fans daisy-chain their power cables and I have no idea if the one you linked is the exact type required (although theoretically any matching PWM connector should work):
http://www.ebay.com/itm/75mm-Dual-X...549332234&rt=nc

Just wanted to check in to say thanks, I got this and replaced the fans last week and it seems to have fixed it.

Was pretty fiddly replacing them though, drat, my card manufacturer didn't make it easy.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Mr SoupTeeth posted:

poo poo, Microsoft still needs to make an OS with 4K in mind. It blows my mind how the market is becoming saturated with these displays yet the scaling is still trash.

Is Windows 10 not better on that score?

Kazinsal
Dec 13, 2011

Subjunctive posted:

Is Windows 10 not better on that score?

Yeah, it's better. Much better than previous versions, but not perfect. Program developers need to start doing icons in SVG to take full advantage of it, though.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

HalloKitty posted:

Aren't you just willing that to happen, though? Because you toxxed yourself based on it. There's nothing massively wrong with AMD GPUs. The 290 and 290X are somewhat power hungry, but the main reason for their decline is people constantly saying things like "oh, NVIDIA drivers are the best", and only recommending NVIDIA.

AMD CPUs on the other hand are a desperate cause.

I didn't do a full toxx because Swiss Army Druid didn't set the parameters.

I wouldn't say Nvidia drivers are the best; I've had problems with them being just as finicky as AMD's. The only card to ever die on me was an 8400GS in a media center; otherwise, Nvidia or AMD, the cards have just worked. The issue is performance per watt, and AMD simply cannot keep up. I know we want to keep talking about how AMD will survive like some low-price cockroach, but there comes a point where the added expense of trying to support an AMD card outweighs any benefit to both the OEM and AMD.

The 300 series isn't promising anything good, and since that's an easier and quicker toxx to resolve for now, :toxx: I'll buy an avatar/username combo of the mods' picking if the 300 series beats the 200 series in performance per watt while having a better price/performance ratio than comparable Nvidia cards. This isn't an impossible standard to set, correct?

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva
^^^
Dunno if they can pull that off without a die shrink, but it'd be nice to see at least. Doubt we'll see anything like what happened with the 3850/3870 cards that basically fixed the inefficient 2900 XTs, or the 6xxx-series cards that fixed the shortcomings of and improved on the 5xxx-series.

Koramei posted:

Just wanted to check in to say thanks, I got this and replaced the fans last week and it seems to have fixed it.

Was pretty fiddly replacing them though, drat, my card manufacturer didn't make it easy.

Glad you got it sorted. Sapphire fans just seem to like falling apart after a couple years. Since you posted that I've read 3-4 recent posts on different forums where Dual-X fans failed for people and they needed replacements. The good thing is the cooler's beefy enough most of the time you can strap a 120mm fan to it (or a couple 92mm's) and be set.

future ghost fucked around with this message at 00:19 on Feb 26, 2015

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

FaustianQ posted:

The 300 series isn't promising anything good, and since that's an easier and quicker toxx to resolve for now, :toxx: I'll buy an avatar/username combo of the mods' picking if the 300 series beats the 200 series in performance per watt while having a better price/performance ratio than comparable Nvidia cards. This isn't an impossible standard to set, correct?

Comparing worst 300 to best 200? Mean of the whole lines?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Hmm, so, people are pretty down on the potential of the 300 series then? I thought they might do pretty OK - memory is one of the key bottlenecks for GPUs, and having what amounts to a fuckoff huge L3 cache on-die (1MB L2 on the 290X, to hundreds of megs/gigs of L3 on the R300) should really help.

On the other hand - AMD definitely has not been advertising the R300s yet. And they're like 5 months overdue at this point, which is not good news either. Any idea when AMD is going to release these chips?

I bought a 4K monitor and I'm looking to do a pretty complete system upgrade within 1-4 months to support that. I'm thinking it probably makes sense to spring for a 980 and eventually SLI it, but I will wait and see in case AMD makes a big leap in performance (and hopefully take advantage of price breaks on the 980 as the R300s release).

Paul MaudDib fucked around with this message at 01:36 on Feb 26, 2015

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Subjunctive posted:

Comparing worst 300 to best 200? Mean of the whole lines?

I'm thinking of comparing an R7 350 to an R7 250, an R7 360 to an R7 260, etc. for performance/watt improvement, and equivalent-market GPUs between Nvidia and AMD, so 390/980, 380/970, 370/960, etc. for whether or not the buy is worth it. Maybe factor in the added expense of increased power usage from requiring beefier PSUs?

If this is a good metric to go by then sure, but what I'm really trying to toxx on is whether AMD is going to make a competitive product without destroying their margins to achieve it. Maybe I am being too specific on this and should just go with reviews, but eh, it'll only be a $20 idiot tax on my Radeon purchase.
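
(For illustration only: a minimal sketch of how that toxx could be scored. Every card name, fps figure, wattage, and price below is an invented placeholder, not real data.)

```python
# Toy scoring of the proposed toxx: generation-over-generation perf/watt,
# plus price/performance against the equivalent Nvidia card.
def perf_per_watt(fps, watts):
    return fps / watts

def price_per_frame(price, fps):
    return price / fps

# (avg fps, board power in watts, price in USD) -- all invented numbers
r7_250  = (30, 65, 90)
r7_350  = (38, 60, 100)    # hypothetical successor
gtx_960 = (40, 120, 200)
r9_370  = (38, 150, 180)   # hypothetical successor to the 270

ppw_gain = perf_per_watt(*r7_350[:2]) / perf_per_watt(*r7_250[:2])
print(f"R7 350 vs R7 250 perf/watt: {ppw_gain:.2f}x")

amd_ppf = price_per_frame(r9_370[2], r9_370[0])
nv_ppf = price_per_frame(gtx_960[2], gtx_960[0])
print(f"$/frame -- 370: {amd_ppf:.2f}, 960: {nv_ppf:.2f}")

# The toxx passes only if both conditions hold across the lineup:
print("toxx condition met:", ppw_gain > 1.0 and amd_ppf < nv_ppf)
```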

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
I'm loath to buy anything that *requires* a CLC system to operate. The 390X could be the single best GPU of all time, but a CLC - no matter how reliable they might have gotten in recent years - is still a potential point of failure that will almost certainly fail out of warranty. But this doesn't matter to most, as those who buy ~the best~ generally stick to a 1-2 year upgrade cycle.

About the only thing that'd get me to bite on a new graphics card in the next 6-9 months would be something like a 970Ti or 975 with the memory issue fixed with a larger frame buffer and less-iffy HDMI 2.0 and DP 1.4a support, the latter of which I'd grant might be iffy, since the standard was *just* submitted, and 8K displays are insane to even think about now.

BIG HEADLINE fucked around with this message at 02:59 on Feb 26, 2015

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

BIG HEADLINE posted:

About the only thing that'd get me to bite on a new graphics card in the next 6-9 months would be something like a 970Ti or 975 with the memory issue fixed with a larger frame buffer and less-iffy HDMI 2.0 and DP 1.4a support, the latter of which I'd grant might be iffy, since the standard was *just* submitted, and 8K displays are insane to even think about now.

Apart from hardware-accelerated video playback/encoding I don't even know what you could conceivably do with it that a 970/Ti wouldn't be horrifically underpowered for. We're barely at the point where 4K is feasible with games; 4K barely fits in 4GB of memory, and 8K is 4x as much stuff. As much as 4K is a toy right now*, 8K would have even fewer practical uses.

* Except for some specific niches like graphics/video work, of course
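
(Back-of-the-envelope numbers behind the "4x as much stuff" point - a minimal sketch assuming plain 32-bit RGBA buffers and ignoring everything else a game actually keeps in VRAM.)

```python
# Raw pixel counts and single-buffer sizes; real VRAM use is far higher
# (textures, G-buffers, shadow maps, etc.), but the 4x ratio holds.
resolutions = {
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

for name, (w, h) in resolutions.items():
    pixels = w * h
    rgba8_mib = pixels * 4 / 2**20  # 4 bytes per pixel
    print(f"{name}: {pixels/1e6:5.1f} Mpixels, one RGBA8 buffer = {rgba8_mib:6.1f} MiB")

print("8K / 4K pixel ratio:", (7680 * 4320) / (3840 * 2160))  # -> 4.0
```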

Ragingsheep
Nov 7, 2009
What happens if an OEM CLC breaks? Does the card maker or AMD pay for a new system, or is the whole thing at your own risk?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Ragingsheep posted:

What happens if an OEM CLC breaks? Does the card maker or AMD pay for a new system, or is the whole thing at your own risk?

Assuming the warranty covers the CLC at all (there may be shortened/no coverage) it probably just covers the card. I severely doubt they would cover damage resulting from failure.

You might have better luck with a pre-built system where one company is accountable for the entire thing, but if they were smart they'd probably limit the coverage on the CLC again.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Paul MaudDib posted:

Apart from hardware-accelerated video playback/encoding I don't even know what you could conceivably do with it that a 970/Ti wouldn't be horrifically underpowered for. We're barely at the point where 4K is feasible with games; 4K barely fits in 4GB of memory, and 8K is 4x as much stuff. As much as 4K is a toy right now*, 8K would have even fewer practical uses.

* Except for some specific niches like graphics/video work, of course

Yeah, I'm not saying you'd use an updated 9xx for 8K, just that it'd be nice to have DP 1.4a and less-wonky HDMI 2.0 support.

BurritoJustice
Oct 9, 2012

What is wonky about the HDMI 2.0 support on the 900 series? It does 3840x2160@60Hz with 4:4:4; what more do you want?

The only cards that needed to use 4:2:0 to get 4K60 were the older Nvidia cards with HDMI 1.4, and even that is a step up from AMD's complete lack of 4K60 support over HDMI.
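
(Rough math behind the 4:4:4 vs 4:2:0 distinction; a sketch using the commonly quoted nominal limits - about 8.16 Gbps of video data for HDMI 1.4 and 14.4 Gbps for HDMI 2.0 after coding overhead - and the standard 4K60 timing including blanking.)

```python
# Does 4K60 fit over HDMI 1.4 vs 2.0?  Nominal figures; a sketch, not a spec.
TOTAL_W, TOTAL_H, REFRESH = 4400, 2250, 60    # 3840x2160 active + blanking
PIXEL_CLOCK = TOTAL_W * TOTAL_H * REFRESH     # 594 MHz

# Approximate usable video data rates after TMDS coding overhead
HDMI_1_4_GBPS = 8.16
HDMI_2_0_GBPS = 14.40

def data_rate_gbps(bits_per_pixel):
    return PIXEL_CLOCK * bits_per_pixel / 1e9

for mode, bpp in [("8-bit 4:4:4", 24), ("8-bit 4:2:0", 12)]:
    rate = data_rate_gbps(bpp)
    print(f"{mode}: {rate:.2f} Gbps "
          f"(fits HDMI 1.4: {rate <= HDMI_1_4_GBPS}, fits HDMI 2.0: {rate <= HDMI_2_0_GBPS})")
```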

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
I'd heard HDMI 2.0 was sort of an iffy thing on the 9xx cards, and since I don't have any displays to test it for myself, I'd just been under that notion.

sauer kraut
Oct 2, 2004

Paul MaudDib posted:

On the other hand - AMD definitely has not been advertising the R300s yet. And they're like 5 months overdue at this point, which is not good news either. Any idea when AMD is going to release these chips?

AMD and their partners must be sitting on huge piles of 290s, probably 280s since that card still shows no signs of being phased out for Tonga, and most likely 270s too after the 750 Ti smashed the light gaming/HTPC market.

Double Punctuation
Dec 30, 2009

Ships were made for sinking;
Whiskey made for drinking;
If we were made of cellophane
We'd all get stinking drunk much faster!

Kazinsal posted:

Yeah, it's better. Much better than previous versions, but not perfect. Program developers need to start doing icons in SVG to take full advantage of it, though.

Have they fixed the rendering in Office yet? Because right now, it looks like absolute garbage.

Kazinsal
Dec 13, 2011

dpbjinc posted:

Have they fixed the rendering in Office yet? Because right now, it looks like absolute garbage.

I don't have access to a 4K monitor presently but my WAG is no, that'll take an Office 2015 (and optionally a corresponding update for Office 365).

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

cisco privilege posted:

Sapphire fans just seem to like falling apart after a couple years.

This is interesting, because this is basically what happened to my Sapphire 5970 after 2.5 years. I just returned it for a refund and bought a GTX 680 though (pocketing the $200 price difference).

veedubfreak
Apr 2, 2005

by Smythe
Well looks like despite my previous decision to NOT go SLI, the deal is just too good to pass up.

Even if I decide to sell one I'll still come out ahead vs the card I bought from Microcenter.

1gnoirents
Jun 28, 2014

hello :)

Paul MaudDib posted:

Hmm, so, people are pretty down on the potential of the 300 series then? I thought they might do pretty OK - memory is one of the key bottlenecks for GPUs, and having what amounts to a fuckoff huge L3 cache on-die (1MB L2 on the 290X, to hundreds of megs/gigs of L3 on the R300) should really help.

On the other hand - AMD definitely has not been advertising the R300s yet. And they're like 5 months overdue at this point, which is not good news either. Any idea when AMD is going to release these chips?

I bought a 4K monitor and I'm looking to do a pretty complete system upgrade within 1-4 months to support that. I'm thinking it probably makes sense to spring for a 980 and eventually SLI it, but I will wait and see in case AMD makes a big leap in performance (and hopefully take advantage of price breaks on the 980 as the R300s release).

The cool memory thing is only on the 390X, unfortunately. If that's what you're looking at then heck yeah, but it's the only 300 series card that looks interesting.

I would get into an argument here about whether memory is a key bottleneck or not, but I feel like I've run that into the ground, and to be fair we are approaching uncharted territory too. Certainly nothing wrong with more bandwidth, of course, regardless.

Methylethylaldehyde
Oct 23, 2004

BAKA BAKA

Paul MaudDib posted:

On the other hand - AMD definitely has not been advertising the R300s yet. And they're like 5 months overdue at this point, which is not good news either. Any idea when AMD is going to release these chips?

If they're moving to a Global Foundries FinFET process, the tapeout, validation, and volume production cycle is a year long. Given the thermal and power constraints they're running into, moving to a new process would be a pretty compelling reason to hold off one or two quarters.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Methylethylaldehyde posted:

If they're moving to a Global Foundries FinFET process, the tapeout, validation, and volume production cycle is a year long. Given the thermal and power constraints they're running into, moving to a new process would be a pretty compelling reason to hold off one or two quarters.

I'm not sure that the current 300-series will be Samsung FinFET. The deal was announced April 17th, 2014. Even assuming a best-case scenario of implementing Samsung's process tech, there's no way that it was done in time to permit GloFo to tape out, validate, and ramp up production for anything in 2015H1.

No, if we are to see anything FinFET from AMD, it likely won't be until 2016.

veedubfreak
Apr 2, 2005

by Smythe

1gnoirents posted:

The cool memory thing is only on the 390X, unfortunately. If that's what you're looking at then heck yeah, but it's the only 300 series card that looks interesting.

I would get into an argument here about whether memory is a key bottleneck or not, but I feel like I've run that into the ground, and to be fair we are approaching uncharted territory too. Certainly nothing wrong with more bandwidth, of course, regardless.

Unfortunately there are also rumors that HBM is only going to be capable of 4GB on the card. So while they may get more bandwidth, the cards are still going to be limited on actual VRAM. The info coming out about the 390X makes me glad I went ahead and just got 980s to hold me over. I mean, if the 390X comes out and totally blows away the 980 I'll still upgrade, because I'm just that stupid.

Gwaihir
Dec 8, 2009
Hair Elf

veedubfreak posted:

Unfortunately there are also rumors that HBM is only going to be capable of 4GB on the card. So while they may get more bandwidth, the cards are still going to be limited on actual VRAM. The info coming out about the 390X makes me glad I went ahead and just got 980s to hold me over. I mean, if the 390X comes out and totally blows away the 980 I'll still upgrade, because I'm just that stupid.

Ironic speculation: the cards will have 8 gigs of RAM, but only 4 gigs of it will be HBM on-chip "fast" RAM :v:

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Gwaihir posted:

Ironic speculation: the cards will have 8 gigs of RAM, but only 4 gigs of it will be HBM on-chip "fast" RAM :v:

I actually think that's pretty likely. I think functionally it'll more or less be 4GB of on-chip cache, or at least have a mode which presents itself to the graphics APIs like that. It may expose the underlying memory arrangement to OpenCL, who knows, but I really think they'll go with the cache thing first and foremost.

I'd hazard a guess that there'll be a 4GB 390 model (2+2 or 4+0), a 4+4GB 390x model, and 6-12 months down the road a 4+8GB 390xl model for multi-GPU setups.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Gwaihir posted:

Ironic speculation: the cards will have 8 gigs of RAM, but only 4 gigs of it will be HBM on-chip "fast" RAM :v:

That's essentially a given.

It remains to be seen just what kind of performance gains will be derived from having any HBM at all, though. My cynicism is kicking in. It can't be THAT good, if they're not bringing it to their entire lineup.

I fully expect Pascal's entire lineup to have HBM from top-to-bottom from day 1, though.

veedubfreak
Apr 2, 2005

by Smythe

SwissArmyDruid posted:

That's essentially a given.

It remains to be seen just what kind of performance gains will be derived from having any HBM at all, though. My cynicism is kicking in. It can't be THAT good, if they're not bringing it to their entire lineup.

I fully expect Pascal's entire lineup to have HBM from top-to-bottom from day 1, though.

Welp, my credit card just got heavier. 2 g1 980s and a 2nd waterblock on the way. Now the question is, do I give enough of a poo poo to run benches while I have 3 before returning the Microcenter card.

Mr SoupTeeth
Jan 16, 2015

Paul MaudDib posted:

I actually think that's pretty likely. I think functionally it'll more or less be 4GB of on-chip cache, or at least have a mode which presents itself to the graphics APIs like that. It may expose the underlying memory arrangement to OpenCL, who knows, but I really think they'll go with the cache thing first and foremost.

I'd hazard a guess that there'll be a 4GB 390 model (2+2 or 4+0), a 4+4GB 390x model, and 6-12 months down the road a 4+8GB 390xl model for multi-GPU setups.

Joke's on all of you, it will boast an incredible 32MB of ESRAM for maximum 900p fidelity. The rest of the RAM will be powered by The Cloud(tm).

Mr SoupTeeth fucked around with this message at 05:21 on Feb 27, 2015

SwissArmyDruid
Feb 14, 2014

by sebmojo

veedubfreak posted:

Welp, my credit card just got heavier. 2 g1 980s and a 2nd waterblock on the way. Now the question is, do I give enough of a poo poo to run benches while I have 3 before returning the Microcenter card.

I'm sorry for your wallet. 4 gigabytes of HBM is a physical limitation, not a design decision. With the current version, HBM comes in one configuration: 1 gigabyte packages, each package being 4 layers of silicon tall, with each layer having a capacity of 2 gigabits.

HBM 2 will come in 1 gigabyte per layer densities, in 2-, 4-, and 8-layer packages, making it a trivial thing to have 8 GB of HBM on the die. I suspect that it won't be one package for thermal and maybe bandwidth reasons, because hey, why the hell not parallelize the hell out of an already massively-parallel memory format? It's an appealing idea to consider the potential of a 32 gigabyte ITX- or low-profile form factor card.
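
(A trivial sketch of the capacity arithmetic above: first-gen HBM stacks four 2-gigabit layers per package, while HBM2 goes up to eight 1-gigabyte layers. Four stacks per card is an assumption, implied by the 4GB limit and the 32GB daydream.)

```python
# Per-stack and per-card HBM capacity, following the layer math in the post.
def stack_gb(layers, gbit_per_layer):
    return layers * gbit_per_layer / 8  # 8 gigabits per gigabyte

hbm1_stack = stack_gb(layers=4, gbit_per_layer=2)   # 1 GB per stack
hbm2_stack = stack_gb(layers=8, gbit_per_layer=8)   # 8 GB per stack

print("HBM1, 4 stacks:", 4 * hbm1_stack, "GB")   # 4 GB -- the rumored limit
print("HBM2, 4 stacks:", 4 * hbm2_stack, "GB")   # 32 GB -- the ITX daydream above
```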

Here's hoping that Arctic Islands's name is a hint at revising for power and temperatures.

SwissArmyDruid fucked around with this message at 07:20 on Feb 27, 2015

PC LOAD LETTER
May 23, 2005
WTF?!

SwissArmyDruid posted:

My cynicism is kicking in. It can't be THAT good, if they're not bringing it to their entire lineup. I fully expect Pascal's entire lineup to have HBM from top-to-bottom from day 1, though.

Cost and production issues with interposers and HBM are probably the limit right now. It'll probably be a long time before you see interposers and HBM across either Nvidia's or AMD's lineup from top to bottom. It's too new and inherently expensive.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
I know it's a kind of ironic speculation that the 390 will have fast and slow RAM, but wouldn't it be more correct to say it's fast and faster RAM? Would we even notice a bottleneck if it needs to switch over to GDDR5?

BurritoJustice
Oct 9, 2012

FaustianQ posted:

I know it's a kind of ironic speculation that the 390 will have fast and slow RAM, but wouldn't it be more correct to say it's fast and faster RAM? Would we even notice a bottleneck if it needs to switch over to GDDR5?

I think it's inherent in HBM providing a speed increase that anything less would be a bottleneck by comparison.

PC LOAD LETTER
May 23, 2005
WTF?!
How so? If they have some sort of caching scheme, which should be easy to pull off given a 4GB HBM buffer, there should be little to no bottleneck in performance. If they do 4GB HBM VRAM + 2-4GB GDDR5 on a 512- or 256-bit bus they should be fine almost no matter what. Even with 'only' 4GB of VRAM at 4K on current cards I don't think there is much issue with the GPU having to go to system RAM due to buffer overruns.
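
(A minimal sketch of why a split pool might barely matter: if only a small fraction of the bytes touched each frame spill past the HBM into GDDR5, the blended bandwidth stays close to the fast pool. The bandwidth figures below are placeholders, not leaked specs.)

```python
# Effective bandwidth when a fraction of traffic spills to the slower pool.
# Time per byte is a weighted average, so the blend is a harmonic mean.
def effective_bw(fast_gbps, slow_gbps, spill_fraction):
    return 1.0 / ((1 - spill_fraction) / fast_gbps + spill_fraction / slow_gbps)

HBM_BW, GDDR5_BW = 512.0, 160.0  # GB/s, hypothetical
for spill in (0.0, 0.05, 0.20, 0.50):
    print(f"{spill:4.0%} of traffic in GDDR5 -> {effective_bw(HBM_BW, GDDR5_BW, spill):6.1f} GB/s")
```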

SwissArmyDruid
Feb 14, 2014

by sebmojo

FaustianQ posted:

I know it's a kind of ironic speculation that the 390 will have fast and slow RAM, but wouldn't it be more correct to say it's fast and faster RAM? Would we even notice a bottleneck if it needs to switch over to GDDR5?

Fast and faster is probably correct.

Now, I was ABOUT to say, "At its very worst, you will have GDDR5," but as we've seen with the 970, if the interface between the interposer and GDDR5 is no good, then it doesn't matter if it's GDDR5, GDDR4, DDR4, or anything, it's worthless.

Fingers crossed that they haven't hosed this up.

BurritoJustice
Oct 9, 2012

I almost want them to do a two-layer RAM setup and then cock it up, just for the schadenfreude after the amount of poo poo they threw about the 970. Which is the worst case for the consumer of course, but it'd be a silver lining.
