|
If the 20nm process talk started from what I posted, I really didn't mean that. I meant a new/sexy architecture from AMD
|
# ? Feb 25, 2015 20:49 |
|
1gnoirents posted:This trend seems to finally be breaking this year, although I wonder how long 1440 will be popular when 4K has a real chance of being viable with DX12, from my understanding. Don't forget game companies need to write games with 4K in mind; gently caress peering at a tiny HUD because of lazy coders.
|
# ? Feb 25, 2015 23:14 |
|
mikemelbrooks posted:Don't forget game companies need to write games with 4K in mind; gently caress peering at a tiny HUD because of lazy coders. This is actually an awesome feature of Borderlands: The Pre-Sequel. It lets you adjust your HUD size, which is super handy on triple-wide monitors.
|
# ? Feb 25, 2015 23:17 |
|
mikemelbrooks posted:Don't forget game companies need to write games with 4K in mind; gently caress peering at a tiny HUD because of lazy coders. poo poo, Microsoft still needs to make an OS with 4K in mind. It blows my mind how the market is becoming saturated with these displays, yet the scaling is still trash.
|
# ? Feb 25, 2015 23:22 |
|
cisco privilege posted:That indicates a bad fan so replacing it should fix it. Most semi-stock GPU fans just screw into the heatsink, so you'd need to remove the top cover and replace it. You can oil the fan but it's only a temporary fix. Just wanted to check in to say thanks, I got this and replaced the fans last week and it seems to have fixed it. Was pretty fiddly replacing them though, drat, my card manufacturer didn't make it easy.
|
# ? Feb 25, 2015 23:27 |
|
Mr SoupTeeth posted:poo poo, Microsoft still needs to make an OS with 4K in mind. It blows my mind how the market is becoming saturated with these displays yet the scaling is still trash. Is Windows 10 not better on that score?
|
# ? Feb 25, 2015 23:32 |
|
Subjunctive posted:Is Windows 10 not better on that score? Yeah, it's better. Much better than previous versions, but not perfect. Program developers need to start doing icons in SVG to take real advantage of it, though.
|
# ? Feb 25, 2015 23:46 |
|
HalloKitty posted:Aren't you just willing that to happen, though? Because you toxxed yourself based on it. There's nothing massively wrong with AMD GPUs. The 290 and 290X are somewhat power hungry, but the main reason for their decline is people constantly saying things like "oh, NVIDIA drivers are the best", and only recommending NVIDIA. I didn't do a full toxx because Swiss Army Druid didn't set the parameters. I wouldn't say Nvidia drivers are the best; I've had problems with them being just as finicky as AMD's, and the only card to ever die on me was an 8400GS in a media center. Otherwise, Nvidia or AMD, the cards have just worked. The issue is performance per watt, and AMD simply cannot keep up. I know we want to keep talking about how AMD will survive like some low-price cockroach, but there comes a point where the added expense of trying to support an AMD card outweighs any benefit to both the OEM and AMD. The 300 series isn't promising anything good, and since that's an easier and quicker toxx to resolve for now, I'll buy an avatar/username combo of the mods' picking if the 300 series beats the 200 series in performance per watt while having a better price/performance ratio than comparable Nvidia cards. This isn't an impossible standard to set, correct?
|
# ? Feb 25, 2015 23:49 |
|
^^^ Dunno if they can pull that off without a die shrink, but it'd be nice to see at least. Doubt we'll see anything like what happened with the 3850/3870 cards that basically fixed the inefficient 2900 XTs, or the 6xxx-series cards that fixed the shortcomings of and improved on the 5xxx-series. Koramei posted:Just wanted to check in to say thanks, I got this and replaced the fans last week and it seems to have fixed it. future ghost fucked around with this message at 00:19 on Feb 26, 2015 |
# ? Feb 26, 2015 00:13 |
|
FaustianQ posted:The 300 series isn't promising anything good, and since that's an easier and quicker to resolve toxx for now, I'll buy an avatar/username combo of the mods picking if the 300 series beats the 200 series in performance per watt while having a better price/performance ratio than comparable Nvidia cards. This isn't an impossible standard to set, correct? Comparing worst 300 to best 200? Mean of the whole lines?
|
# ? Feb 26, 2015 01:29 |
|
Hmm, so, people are pretty down on the potential of the 300 series then? I thought they might do pretty OK - memory is one of the key bottlenecks for GPUs, and having what amounts to a fuckoff huge L3 cache on-die (1MB L2 on the 290X, to hundreds of megs/gigs of L3 on the R300) should really help. On the other hand - AMD definitely has not been advertising the R300s yet. And they're like 5 months overdue at this point, which is not good news either. Any idea when AMD is going to release these chips? I bought a 4K monitor and I'm looking to do a pretty complete system upgrade within 1-4 months to support that. I'm thinking it probably makes sense to spring for a 980 and eventually SLI it, but I will wait and see in case AMD makes a big leap in performance (and hopefully take advantage of price breaks on the 980 as the R300s release). Paul MaudDib fucked around with this message at 01:36 on Feb 26, 2015 |
# ? Feb 26, 2015 01:31 |
|
Subjunctive posted:Comparing worst 300 to best 200? Mean of the whole lines? I'm thinking of comparing an R7 350 to an R7 250, R7 360 to R7 260, etc. for performance/watt improvement, and equivalent-market GPUs between Nvidia and AMD, so 390/980, 380/970, 370/960, etc. for whether or not the buy is worth it. Maybe factor in the added expense of increased power usage by requiring beefier PSUs? If this is a good metric to go by then sure, but what I'm really trying to toxx on is whether AMD is going to make a competitive product without destroying their margins to achieve it. Maybe I'm being too specific on this and should just go with reviews, but eh, it'll only be a $20 idiot tax on my Radeon purchase.
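For what it's worth, the metric itself is simple enough to pin down. Here's a quick sketch with completely made-up numbers (none of these are real benchmarks, and the card names are just stand-ins) of how the perf/watt comparison would work:

```python
# Perf/watt as a simple ratio: some performance number over board power.
# All figures below are hypothetical, NOT real benchmark results.
def perf_per_watt(fps, watts):
    return fps / watts

old_card = perf_per_watt(fps=60, watts=150)   # e.g. an R7 260-class card
new_card = perf_per_watt(fps=75, watts=125)   # e.g. its 300-series successor

# Generation-over-generation improvement, as the toxx would measure it
improvement = new_card / old_card - 1
print(f"{improvement:.0%}")   # 50% better perf/watt in this toy case
```

The same ratio with price in the denominator instead of watts gives the price/performance half of the bet.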
|
# ? Feb 26, 2015 02:05 |
|
I'm loath to buy anything that *requires* a CLC system to operate. The 390X could be the single best GPU of all time, but a CLC - no matter how reliable they might've gotten in recent years - is still a potential point of failure that will almost certainly fail out of warranty. But this doesn't matter to most, as those who buy ~the best~ generally stick to a 1-2 year upgrade cycle. About the only thing that'd get me to bite on a new graphics card in the next 6-9 months would be something like a 970Ti or 975 with the memory issue fixed, a larger frame buffer, and less-iffy HDMI 2.0 and DP 1.4a support, the latter of which I'd grant might be iffy, since the standard was *just* submitted, and 8K displays are insane to even think about now. BIG HEADLINE fucked around with this message at 02:59 on Feb 26, 2015 |
# ? Feb 26, 2015 02:42 |
|
BIG HEADLINE posted:About the only thing that'd get me to bite on a new graphics card in the next 6-9 months would be something like a 970Ti or 975 with the memory issue fixed with a larger frame buffer and less-iffy HDMI 2.0 and DP 1.4a support, the latter of which I'd grant might be iffy, since the standard was *just* submitted, and 8K displays are insane to even think about now. Apart from hardware-accelerated video playback/encoding, I don't even know what you could conceivably do with it that a 970/Ti wouldn't be horrifically underpowered for. We're barely at the point where 4K is feasible with games; 4K barely fits in 4 GB of memory, and 8K is 4x as much stuff. As much as 4K is a toy right now*, 8K would have even fewer practical uses. * Except for some specific niches like graphics/video work, of course
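The 4x claim is just pixel math. A quick back-of-the-envelope sketch (framebuffers only; real VRAM use also includes textures, shadow maps, G-buffers, and so on, which is why 4K games push past 4 GB):

```python
# Rough framebuffer size at a given resolution. Games allocate many such
# buffers (plus textures and geometry), so this is a floor, not a total.
def framebuffer_bytes(width, height, bytes_per_pixel=4):
    """One 32-bit color buffer at the given resolution."""
    return width * height * bytes_per_pixel

uhd_4k = framebuffer_bytes(3840, 2160)   # "4K" UHD
uhd_8k = framebuffer_bytes(7680, 4320)   # "8K" UHD

print(uhd_4k / 2**20)     # ~31.6 MiB per buffer at 4K
print(uhd_8k / uhd_4k)    # 4.0 -- 8K really is 4x the pixels
```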
|
# ? Feb 26, 2015 03:07 |
|
What happens if an OEM CLC breaks? Does the card maker or AMD pay for a new system, or is the whole thing at your own risk?
|
# ? Feb 26, 2015 03:08 |
|
Ragingsheep posted:What happens if an OEM CLC breaks? Does the card marker or AMD pay for a new system or is the whole thing at your own risk? Assuming the warranty covers the CLC at all (there may be shortened/no coverage) it probably just covers the card. I severely doubt they would cover damage resulting from failure. You might have better luck with a pre-built system where one company is accountable for the entire thing, but if they were smart they'd probably limit the coverage on the CLC again.
|
# ? Feb 26, 2015 03:10 |
|
Paul MaudDib posted:Apart from hardware accelerated video playback/encoding I don't even know what you could conceivably do with it that a 970/Ti wouldn't be horrifically underpowered for. We're barely at the point where 4K is feasible with games, 4K barely fits in 4gb of memory, 8K is 4x as much stuff. As much as 4K is a toy right now*, 8K would have even fewer practical uses. Yeah, I'm not saying you'd use an updated 9xx for 8K, just that it'd be nice to have DP 1.4a and less-wonky HDMI 2.0 support.
|
# ? Feb 26, 2015 03:13 |
|
What is wonky about the HDMI 2.0 support on the 900 series? It does 3840x2160@60hz with 4:4:4, what more do you want? The only cards that needed to use 4:2:0 to get 4K60 were the older Nvidia cards with HDMI 1.4, and even that is a step up from AMD's complete lack of 4K60 support over HDMI.
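The 4:2:0 vs 4:4:4 thing is straight bandwidth math. A rough sketch, ignoring blanking intervals and link-layer overhead (so the absolute numbers are approximate, but the comparison holds):

```python
# Why HDMI 1.4 cards needed 4:2:0 to hit 4K60 while HDMI 2.0 does 4:4:4.
# Active pixel data only; blanking and encoding overhead are ignored.
def video_gbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e9

full_chroma = video_gbps(3840, 2160, 60, 24)  # 4:4:4, 8 bits per channel
sub_chroma  = video_gbps(3840, 2160, 60, 12)  # 4:2:0 halves the chroma data

print(round(full_chroma, 1))  # ~11.9 Gbit/s -- beyond HDMI 1.4's budget
print(round(sub_chroma, 1))   # ~6.0 Gbit/s -- fits, at a cost in chroma detail
```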
|
# ? Feb 26, 2015 03:51 |
|
I'd heard HDMI 2.0 was sort of an iffy thing on the 9xx cards, and since I don't have any displays to test it for myself, I'd just been under that notion.
|
# ? Feb 26, 2015 03:54 |
|
Paul MaudDib posted:On the other hand - AMD definitely has not been advertising the R300s yet. And they're like 5 months overdue at this point, which is not good news either. Any idea when AMD is going to release these chips? AMD and their partners must be sitting on huge piles of 290s, probably 280s since it still shows no signs of being phased out for Tonga, and most likely 270's too after the 750ti smashed the light gaming/HTPC market.
|
# ? Feb 26, 2015 06:43 |
|
Kazinsal posted:Yeah, it's better. Much better than previous versions, but not perfect. Program developers need to start doing icons in SVG in order to take decent advantage of it totally though. Have they fixed the rendering in Office yet? Because right now, it looks like absolute garbage.
|
# ? Feb 26, 2015 06:54 |
|
dpbjinc posted:Have they fixed the rendering in Office yet? Because right now, it looks like absolute garbage. I don't have access to a 4K monitor presently but my WAG is no, that'll take an Office 2015 (and optionally a corresponding update for Office 365).
|
# ? Feb 26, 2015 07:58 |
|
cisco privilege posted:Sapphire fans just seem to like falling apart after a couple years. This is interesting, because this is basically what happened to my Sapphire 5970 after 2.5 years. I just returned it for a refund and bought a GTX 680 instead, though (pocketing the $200 price difference).
|
# ? Feb 26, 2015 12:28 |
|
Well looks like despite my previous decision to NOT go SLI, the deal is just too good to pass up. Even if I decide to sell one I'll still come out ahead vs the card I bought from Microcenter.
|
# ? Feb 26, 2015 16:28 |
|
Paul MaudDib posted:Hmm, so, people are pretty down on the potential of the 300 series then? I thought they might do pretty OK - memory is one of the key bottlenecks for GPUs, and having what amounts to a fuckoff huge L3 cache on-die (1MB L2 on the 290X, to hundreds of megs/gigs of L3 on the R300) should really help. The cool memory thing is only the 390x unfortunately. If that's what you're looking at then heck yeah, but it's the only 300 series card that looks interesting. I would get into an argument here about whether memory is a key bottleneck or not, but I feel like I've run that into the ground, and to be fair we're approaching uncharted territory too. Certainly nothing wrong with more bandwidth, of course, regardless.
|
# ? Feb 26, 2015 19:41 |
|
Paul MaudDib posted:On the other hand - AMD definitely has not been advertising the R300s yet. And they're like 5 months overdue at this point, which is not good news either. Any idea when AMD is going to release these chips? If they're moving to a Global Foundries FinFET process, tapeout, validation, and volume production are a year-long process. Given the thermal and power constraints they're running into, moving to a new process would be a pretty compelling reason to hold off one or two quarters.
|
# ? Feb 26, 2015 21:32 |
|
Methylethylaldehyde posted:If they're moving to a Global Foundries FinFET process, tapeout, validation, and volume production is a year long process. Given the thermal and power constraints they're running into, moving to a new process would be a pretty compelling reason to hold off one or two quarters. I'm not sure that the current 300-series will be Samsung FinFET. The deal was announced April 17th, 2014. Even assuming a best-case scenario of implementing Samsung's process tech, there's no way it was done in time to permit GloFo to tape out, validate, and ramp up production in time for anything in 2015H1. No, if we are to see anything FinFET from AMD, it likely won't be until 2016.
|
# ? Feb 26, 2015 21:48 |
|
1gnoirents posted:The cool memory thing is only the 390x unfortunately. If that's what you're looking at then heck yeah, but its the only 300 series card that looks interesting. Unfortunately, there are also rumors that HBM is only going to be capable of 4 GB on the card. So while they may get more bandwidth, the cards are still going to be limited on actual VRAM. The more info coming out about the 390x makes me glad I went ahead and just got 980s to hold me over. I mean, if the 390x comes out and totally blows away the 980 I'll still upgrade, because I'm just that stupid.
|
# ? Feb 26, 2015 21:59 |
|
veedubfreak posted:Unfortunately there are also rumors that HBM memory is only going to be capable of 4gb on the card. So while they may get more bandwidth, the cards are still going to be limited on actual vram. The more info coming out about the 390x makes me glad I went ahead and just got 980s to hold me over. I mean if the 390x comes out and totally blows away the 980 I'll still upgrade, because I'm just that stupid. Irony speculation: The cards will have 8 gigs of ram, but only 4 gigs of it will be HBM on chip "fast" ram
|
# ? Feb 26, 2015 22:30 |
|
Gwaihir posted:Irony speculation: The cards will have 8 gigs of ram, but only 4 gigs of it will be HBM on chip "fast" ram I actually think that's pretty likely. I think functionally it'll more or less be 4GB of on-chip cache, or at least have a mode which presents itself to the graphics APIs like that. It may expose the underlying memory arrangement to OpenCL, who knows, but I really think they'll go with the cache thing first and foremost. I'd hazard a guess that there'll be a 4GB 390 model (2+2 or 4+0), a 4+4GB 390x model, and 6-12 months down the road a 4+8GB 390xl model for multi-GPU setups.
|
# ? Feb 27, 2015 02:19 |
|
Gwaihir posted:Irony speculation: The cards will have 8 gigs of ram, but only 4 gigs of it will be HBM on chip "fast" ram That's essentially a given. It remains to be seen just what kind of performance gains will be derived from having any HBM at all, though. My cynicism is kicking in. It can't be THAT good, if they're not bringing it to their entire lineup. I fully expect Pascal's entire lineup to have HBM from top-to-bottom from day 1, though.
|
# ? Feb 27, 2015 02:33 |
|
SwissArmyDruid posted:That's essentially a given. Welp, my credit card just got heavier. 2 g1 980s and a 2nd waterblock on the way. Now the question is, do I give enough of a poo poo to run benches while I have 3 before returning the Microcenter card.
|
# ? Feb 27, 2015 04:43 |
|
Paul MaudDib posted:I actually think that's pretty likely. I think functionally it'll more or less be 4GB of on-chip cache, or at least have a mode which presents itself to the graphics APIs like that. It may expose the underlying memory arragement to OpenCL, who knows, but I really think they'll go with the cache thing first and foremost. Joke's on all of you, it will boast an incredible 32MB of ESRAM for maximum 900p fidelity. The rest of the RAM will be powered by The Cloud(tm). Mr SoupTeeth fucked around with this message at 05:21 on Feb 27, 2015 |
# ? Feb 27, 2015 05:18 |
|
veedubfreak posted:Welp, my credit card just got heavier. 2 g1 980s and a 2nd waterblock on the way. Now the question is, do I give enough of a poo poo to run benches while I have 3 before returning the Microcenter card. I'm sorry for your wallet. 4 gigabytes of HBM is a physical limitation, not a design decision. With the current version, HBM comes in one configuration: 1-gigabyte packages, each package being 4 layers of silicon tall, with each layer having a capacity of 2 gigabits. HBM2 will come in 1-gigabyte-per-layer densities, in 2-, 4-, and 8-layer packages, making it a trivial thing to have 8 GB of HBM on the die. I suspect that it won't be one package, for thermal and maybe bandwidth reasons, because hey, why the hell not parallelize the hell out of an already massively-parallel memory format? It's an appealing idea to consider the potential of a 32-gigabyte ITX- or low-profile form factor card. Here's hoping that Arctic Islands' name is a hint at revising for power and temperatures. SwissArmyDruid fucked around with this message at 07:20 on Feb 27, 2015 |
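The stack math from that post, written out (per-die densities as stated above; the exact stack counts per card are speculation):

```python
# HBM stack capacity: first-gen HBM stacks are 4 dies of 2 Gbit each;
# HBM2 moves to 1 GB (8 Gbit) per die in 2-, 4-, and 8-die stacks.
def stack_gb(dies, gbit_per_die):
    return dies * gbit_per_die / 8  # 8 Gbit = 1 GB

hbm1 = stack_gb(4, 2)                        # 1.0 GB per stack
hbm2 = [stack_gb(n, 8) for n in (2, 4, 8)]   # [2.0, 4.0, 8.0] GB

print(hbm1)   # 1.0 -- four stacks gives the rumored 4 GB cap
print(hbm2)   # [2.0, 4.0, 8.0] -- four 8-die HBM2 stacks would be 32 GB
```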
# ? Feb 27, 2015 06:25 |
|
SwissArmyDruid posted:My cynicism is kicking in. It can't be THAT good, if they're not bringing it to their entire lineup. I fully expect Pascal's entire lineup to have HBM from top-to-bottom from day 1, though.
|
# ? Feb 27, 2015 07:27 |
|
I know it's a kind of ironic speculation that the 390 will have fast and slow RAM, but wouldn't it be more correct to say it's fast and faster RAM? Would we even notice a bottleneck if it needs to switch over to GDDR5?
|
# ? Feb 27, 2015 07:59 |
|
FaustianQ posted:I know it's a kind of ironic speculation that the 390 will have fast and slow RAM, but wouldn't it be more correct to say it's fast and faster RAM? Would we even notice a bottleneck if it needs to switch over to GDDR5? I think it's inherent: if HBM provides a real speed increase, then anything slower would by definition be a bottleneck.
|
# ? Feb 27, 2015 09:05 |
|
How so? If they have some sort of caching scheme, which should be easy to pull off given a 4GB HBM buffer, there should be little to no bottleneck in performance. If they do 4GB of HBM VRAM + 2-4GB of GDDR5 on a 512- or 256-bit bus they should be fine almost no matter what. Even with 'only' 4GB of VRAM at 4K on current cards, I don't think there's much issue with the GPU having to go to system RAM due to buffer overruns.
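To illustrate the caching-scheme idea, here's a toy model (purely illustrative: the resource names and sizes are made up, and a real driver's residency management is vastly more complex than an LRU list):

```python
# Toy model of the tiered-VRAM idea: hot resources live in the fast HBM
# pool, and evictions spill to the slower GDDR5 pool.
from collections import OrderedDict

class TieredVram:
    def __init__(self, fast_capacity):
        self.fast = OrderedDict()   # resource -> size, in LRU order
        self.slow = set()           # resources spilled to the GDDR5 pool
        self.capacity = fast_capacity
        self.used = 0

    def touch(self, name, size):
        if name in self.fast:
            self.fast.move_to_end(name)   # recently used stays hot
            return
        self.slow.discard(name)           # promote back out of GDDR5
        while self.used + size > self.capacity and self.fast:
            victim, vsize = self.fast.popitem(last=False)
            self.slow.add(victim)         # evict the coldest to GDDR5
            self.used -= vsize
        self.fast[name] = size
        self.used += size

vram = TieredVram(fast_capacity=4)        # "4 GB" of HBM
vram.touch("shadow_maps", 2)
vram.touch("textures", 2)
vram.touch("g_buffer", 1)                 # forces the coldest resource out
print(sorted(vram.slow))                  # ['shadow_maps']
```

As long as the working set fits in the fast tier, the slow tier only sees cold data, which is why the bottleneck question comes down to how often you miss.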
|
# ? Feb 27, 2015 09:12 |
|
FaustianQ posted:I know it's a kind of ironic speculation that the 390 will have fast and slow RAM, but wouldn't it be more correct to say it's fast and faster RAM? Would we even notice a bottleneck if it needs to switch over to GDDR5? Fast and faster is probably correct. Now, I was ABOUT to say, "At its very worst, you will have GDDR5.", but as we've seen with the 970, if the interface between the interposer and GDDR5 is no good, then it doesn't matter if it's GDDR5, GDDR4, DDR4, or anything, it's worthless. Fingers crossed that they haven't hosed this up.
|
# ? Feb 27, 2015 09:26 |
|
I almost want them to do a two-layer RAM setup and then cock it up, just for the schadenfreude after the amount of poo poo they threw about the 970. Which is the worst case for the consumer, of course, but it'd be a silver lining.
|
# ? Feb 27, 2015 09:42 |