|
RealWorldTech has posted a low-level article about Steamroller's adaptive clock throttling in response to voltage droop. If it detects CPU core voltage dropping by more than 2.5%, it responds with <1 nanosecond latency to lower the CPU clockspeed by 7%. I'd be interested to see someone test performance with motherboards of varying power delivery quality, especially when overclocked. AMD really pulled out all the stops to clock Steamroller as high as possible; it's too bad the underlying architecture couldn't make enough use of those clocks to deliver good performance.
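To make the mechanism concrete, here's a toy software model of the droop response. Purely illustrative: the real detector is an analog circuit reacting in under a nanosecond, and while the 2.5%/7% figures come from the article, the 4000MHz base clock and the sample voltages are made up.

```python
# Toy model of droop-triggered clock throttling. Purely illustrative:
# the real detector is analog hardware reacting in under a nanosecond.
NOMINAL_V = 1.00         # normalized core voltage
DROOP_THRESHOLD = 0.025  # throttle if the rail sags more than 2.5%
THROTTLE_FACTOR = 0.93   # clock drops by 7% while the droop persists

def effective_clock(vdd, base_clock_mhz):
    """Return the clock the core runs at for one sampled voltage."""
    droop = (NOMINAL_V - vdd) / NOMINAL_V
    if droop > DROOP_THRESHOLD:
        return base_clock_mhz * THROTTLE_FACTOR
    return base_clock_mhz

# A load transient sags the rail; the clock dips only while Vdd is low.
samples = [1.00, 0.99, 0.97, 0.96, 0.98, 1.00]
print([effective_clock(v, 4000) for v in samples])
# -> [4000, 4000, 3720.0, 3720.0, 4000, 4000]
```

The point of the scheme is that you can run closer to the limit the rest of the time, since the throttle catches the worst-case transient instead of the voltage margin having to.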
|
# ? May 7, 2014 17:56 |
|
Rastor posted:They are developing a 64-bit ARM core, called K12. With the shift to ARM-architecture APUs (which this presumably is) AMD is now in direct competition with Nvidia designs like the Tegra series. Nvidia is ahead on the 64-bit ARM architecture, but if I remember correctly they still haven't quite made it to a fully unified memory architecture. I'm really liking the trend of heterogeneous processors working independently inside a unified memory architecture - that really seems like a logical design choice to me. You have your general-purpose processors for random-access stuff and the ability to deploy bigger guns on computationally-intensive tasks. The unified memory architecture means there's little penalty for this processor switch; you avoid stuff like time spent copying data down to the coprocessor. The Bulldozer/Steamroller/etc architecture was just a bad idea and it looks like even AMD is giving up on fixing it. I'm reading this as capitulation and trying to figure out where to go from here. At least they're aware how hosed they are.
|
# ? May 7, 2014 18:12 |
|
edit: nah, delete me
|
# ? May 7, 2014 19:41 |
|
Paul MaudDib posted:The Bulldozer/Steamroller/etc architecture was just a bad idea and it looks like even AMD is giving up on fixing it. I'm reading this as capitulation and trying to figure out where to go from here. At least they're aware how hosed they are. The combo of patent issues, long development times, highly competent competition, and high production costs in making a high-end x86 chip is brutally risky from a business perspective, which is why no one but AMD tries (tried?) anymore to compete in that arena against Intel.
|
# ? May 8, 2014 06:01 |
|
PC LOAD LETTER posted:AMD always planned on having some sort of new arch. out by 2015/2016 since Excavator was supposed to be the last BD revision. They probably knew they were hosed all the way back in mid to late 2011 since by then they'd have been able to do plenty of testing on samples from the fabs to see what yields they could get and how well power and performance scaled with clocks. It just takes a long time to design a new architecture and K10h was probably out of scaling room so they had to go with what they had and try to make the best of it. Let's not kid ourselves: while AMD was spending money like a company much larger than its real size, Intel was busy bribing the PC manufacturers and engaging in anticompetitive behaviour. Edit: I sometimes think that AMD actually died in 2009 when they spun off GloFo and that they just don't know it yet, and that the purchase of ATI was the life support that's keeping the coma patient going. $1.25 billion in lawsuit settlement doesn't exactly pay for much. SwissArmyDruid fucked around with this message at 23:15 on May 8, 2014 |
# ? May 8, 2014 07:53 |
|
Yeah the anti-competitive practices by Intel certainly didn't help at all but AFAIK that was more of a way to hamstring AMD during the K7/A64 days. By the time AMD was pushing Phenom IIs vs Core 2 I don't think Intel was even bothering to try that stuff anymore. e: Even during the 'good days' when their chips sold well and for good ASPs AMD was always fab constrained, so I kinda hate to say it but going forward, due to the increasing costs of shrinking the processes down, they would've been 'doomed' to go fabless eventually. PC LOAD LETTER fucked around with this message at 08:10 on May 8, 2014 |
# ? May 8, 2014 08:07 |
|
The insider trading Ruiz did while cutting back on R&D funds and overpaying for ATI certainly didn't help either.
|
# ? May 9, 2014 11:01 |
|
I feel like AMD's biggest mistake in recent years was not crushing Intel in the low-power market. The netbook was born with the launch of the Intel Atom in 2008, and then withered and died because it took Intel six years to release a decent successor. AMD Brazos processors were launched at the beginning of 2011 and were the perfect netbook SoCs, yet AMD devoted limited manufacturing resources to Brazos and availability was poor, so it mostly appeared in low-end notebooks. I believe that if AMD had really pushed Brazos in the netbook segment, netbooks would still be considered a viable mainstream form factor today. Selling cheap netbook SoCs probably isn't very profitable, which is likely why they chose to make more Radeon GPUs with the TSMC wafers instead, but it seems like there's value in keeping AMD's CPU division operating outside of consoles.
|
# ? May 9, 2014 17:24 |
|
Alereon posted:I feel like AMD's biggest mistake in recent years was not crushing Intel in the low-power market. The netbook was born with the launch of the Intel Atom in 2008, and then withered and died because it took Intel six years to release a decent successor. AMD Brazos processors were launched at the beginning of 2011 and were the perfect netbook SoCs, yet AMD devoted limited manufacturing resources to Brazos and availability was poor, so it mostly appeared in low-end notebooks. I believe that if AMD had really pushed Brazos in the netbook segment they would still be considered a viable mainstream form factor today. See, I like APUs as an idea. For your thin-and-lights and your ultrabooks, it makes 100% sense. But where AMD keeps killing me is that they keep trying to scale this stupid poo poo upwards, and instead of devoting that extra die space to improving single-threaded performance-per-core on chips that are most certainly going to be paired with a discrete GPU anyway, they keep wasting it on an anemic on-die iGPU. And every few years or so they trot out the old hybrid Crossfire buzzword again (and let's face it, when has hybrid Crossfire ever meant poo poo?), and just like every other time, they haven't done squat with it this time with APUs either.
|
# ? May 9, 2014 21:03 |
|
They can't really do much to improve single-threaded performance without a major revision (which is what Excavator is supposed to be), and that takes years to do. They keep throwing more die space at the iGPU for their APUs because for some reason they seem totally unable to increase the bandwidth to the iGPU and can't do much to improve CPU performance beyond what they're already doing, so more iGPU is about the only way left for them to differentiate their product vs Intel's offerings and their own older APUs.

The funny thing is, if they could just feed it enough data the iGPU would actually get some fairly respectable performance, and it would add a lot of value to their products even if they continued to do little to improve single-thread performance on the CPU side. Many of their cheaper APUs in the thin n' light category are incredibly hamstrung by single-channel DDR3-1333 levels of bandwidth though.

Quad-channel DDR3 probably isn't practical for them to do for cost reasons, same goes for on-package DDR3, but maybe they could've hung some extra DDR3 or GDDR5 off of the chipset to feed their iGPUs better. They did do something like that for a while back when they were still putting the iGPU in the north bridge, in the 780G or some such.
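For a rough sense of the numbers involved, peak DRAM bandwidth is just transfer rate times bus width times channels. A quick sketch (the configurations below are illustrative back-of-envelope cases, not tied to any specific SKU):

```python
def dram_bandwidth_gbs(mt_per_s, bus_bits, channels=1):
    """Peak theoretical bandwidth in GB/s: transfers/s x bytes/transfer x channels."""
    return mt_per_s * 1e6 * (bus_bits // 8) * channels / 1e9

# Single-channel DDR3-1333, the thin n' light worst case: ~10.7 GB/s
print(dram_bandwidth_gbs(1333, 64))
# Dual-channel DDR3-2133, about the best a desktop APU of the era saw: ~34.1 GB/s
print(dram_bandwidth_gbs(2133, 64, channels=2))
# A hypothetical 128-bit GDDR5 sideport at 5 GT/s: ~80 GB/s
print(dram_bandwidth_gbs(5000, 128))
```

Even the best DDR3 case is a fraction of what a midrange discrete card of the time had to itself, and the CPU cores are competing for that same pool.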
|
# ? May 9, 2014 23:54 |
|
PC LOAD LETTER posted:They can't really do much to improve single threaded performance without a major revision (which is what Excavator is supposed to be) which takes a long time (years) to do. Well, DDR4 is starting to ship.
|
# ? May 10, 2014 04:07 |
|
Paul MaudDib posted:Nvidia is ahead on the 64-bit ARM architecture In what way? Tegra K1 is still based on A15 and Project Denver is nothing but vague press releases at the moment.
|
# ? May 10, 2014 04:49 |
|
SwissArmyDruid posted:Well, DDR4 is starting to ship. From the looks of it, early DDR4 chips aren't going to bring significant bandwidth increases right away while costing quite a bit more. Stacked memory can't come soon enough.
|
# ? May 10, 2014 10:48 |
|
SwissArmyDruid posted:Well, DDR4 is starting to ship. On-package or soldered-to-mobo HMC/eDRAM would definitely solve their bandwidth problems for their APUs but would probably still be too expensive. You'd still have to deal with the pin/pad limit issue too.
|
# ? May 10, 2014 15:52 |
|
PC LOAD LETTER posted:Quad channel DDR3 probably isn't practical for them to do for cost reasons, same goes for on package DDR3, but maybe they could've hung some extra DDR3 or GDDR5 off of the chipset to feed their iGPU's better. They did do something like that for a while back when they were still putting the iGPU in the north bridge in the 780G or some such. I always wondered about this - could they theoretically work this out like they would a laptop with discrete graphics that has its own dedicated memory, versus sharing the system memory? Is it a cost decision or an implementation decision that would keep them doing that (or is it just not possible)? Seems it would make more sense as you said to give the iGPU its own dedicated memory to use, if it's possible and not too costly to do.
|
# ? May 12, 2014 19:07 |
|
Ozz81 posted:I always wondered about this - could they theoretically work this out like they would a laptop with discrete graphics that has its own dedicated memory, versus sharing the system memory? Is it a cost decision or an implementation decision that would keep them doing that (or is it just not possible)? Seems it would make more sense as you said to give the iGPU its own dedicated memory to use, if it's possible and not too costly to do. I'd imagine raw size on the die to add memory would have to be a consideration, the Xbox One is a pretty good example of how much space it takes on their SoC to put eSRAM on there. The memory would also require extra power and thermal considerations.
|
# ? May 12, 2014 19:40 |
|
If you're going to the effort and cost of adding an additional memory interface to get more bandwidth, it would be easier and better to just increase the bus width. Compare the Xbox One and PS4: like the 360, the Xbox One uses slow main RAM supplemented by an eSRAM cache, while the PS4 simply uses fast main RAM. Despite the consoles having nearly identical CPU and GPU hardware, the fact that the PS4 just has enough memory bandwidth without resorting to tricks is a big deal. The best solution for AMD APUs would have been mGDDR5, which would have placed ultra-fast GDDR5 memory on removable DIMMs, and as GDDR5 is based on DDR3 it would have maintained backwards compatibility. Unfortunately mGDDR5 did not attract sufficient manufacturer interest and never entered the market.
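The back-of-envelope math on the two consoles, using the widely published specs (5500 MT/s GDDR5 on a 256-bit bus for the PS4; DDR3-2133 on a 256-bit bus for the Xbox One, main RAM only, ignoring the eSRAM), looks like:

```python
def peak_bw_gbs(mt_per_s, bus_bits):
    """Peak theoretical bandwidth in GB/s for a single memory pool."""
    return mt_per_s * 1e6 * (bus_bits // 8) / 1e9

ps4 = peak_bw_gbs(5500, 256)  # GDDR5-5500, 256-bit bus
xb1 = peak_bw_gbs(2133, 256)  # DDR3-2133, 256-bit bus (main RAM only)
print(ps4, xb1)  # -> 176.0 68.256
```

Same bus width, but the faster memory type gives the PS4 well over twice the main-memory bandwidth, which is the whole argument against cache-on-the-side workarounds.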
|
# ? May 12, 2014 20:19 |
|
Anyone know how DDR4 solved row hammer?
|
# ? May 12, 2014 20:27 |
|
JawnV6 posted:Anyone know how ddr4 solved row hammer?
|
# ? May 12, 2014 20:42 |
|
I still have a near-allergic reaction to reading patents, but reading between the lines on claim 26 makes it sound like there's an extra command that makes the DIMM go figure out which rows are victims? But the abstract still puts it on the MC to figure it out? A big part of the problem, IIRC, is that the DIMM will swap rows around and it isn't clear to the MC which rows are victims of any potential hammer.
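For what it's worth, the basic idea behind counter-based mitigation (what TRR-style schemes do in hardware) can be sketched in a few lines. This is a toy model, not the actual DDR4 mechanism, whose details are vendor-specific; the threshold and the neighbor mapping here are made up, and as noted above the DIMM's internal row remapping is exactly what makes the real victim mapping invisible to the MC.

```python
# Toy activation-counting mitigation, the idea behind targeted row refresh.
# Real DDR4 parts do this in vendor-specific hardware, and the internal
# row remapping hides the true victim rows from the memory controller.
from collections import Counter

HAMMER_LIMIT = 50_000  # illustrative threshold, not a real disturb count

class RowTracker:
    def __init__(self):
        self.activations = Counter()

    def activate(self, row):
        """Count one ACT; if a row looks hammered, refresh its neighbors."""
        self.activations[row] += 1
        if self.activations[row] >= HAMMER_LIMIT:
            self.activations[row] = 0
            return [row - 1, row + 1]  # victim rows to refresh early
        return []

tracker = RowTracker()
refreshed = []
for _ in range(HAMMER_LIMIT):
    refreshed += tracker.activate(7)
print(refreshed)  # -> [6, 8]
```

The hard part in practice isn't the counting, it's knowing which physical rows are actually adjacent, which is why pushing the detection into the DIMM (that extra command) makes sense.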
|
# ? May 12, 2014 21:57 |
|
Alereon posted:The best solution for AMD APUs was to use mGDDR5, which placed ultra-fast GDDR5 memory on removable DIMMs.... Unfortunately mGDDR5 did not attract sufficient manufacturer interest and never entered the market.
|
# ? May 15, 2014 13:58 |
|
PC LOAD LETTER posted:I wonder if they were planning on something like that. AMD did start releasing their own re-branded RAM not too long ago which struck me as fairly pointless for them. Maybe they had some ideas that they just weren't able to get realized because of market inertia and/or financial reasons? Well there is this: http://www.anandtech.com/show/7702/amd-kaveri-docs-reference-quadchannel-memory-interface-gddr5-option
|
# ? May 15, 2014 14:31 |
|
PC LOAD LETTER posted:I wonder if they were planning on something like that. AMD did start releasing their own re-branded RAM not too long ago which struck me as fairly pointless for them. Maybe they had some ideas that they just weren't able to get realized because of market inertia and/or financial reasons? I actually bought that a few years ago. It was the cheapest one with 8-8-8-24 timings and didn't have idiotic fins all over it. So far it's worked well. NullPtr4Lunch fucked around with this message at 18:53 on May 15, 2014 |
# ? May 15, 2014 18:50 |
|
PC LOAD LETTER posted:I wonder if they were planning on something like that. AMD did start releasing their own re-branded RAM not too long ago which struck me as fairly pointless for them. Maybe they had some ideas that they just weren't able to get realized because of market inertia and/or financial reasons?
|
# ? May 15, 2014 19:50 |
|
Hace posted:Well there is this: http://www.anandtech.com/show/7702/amd-kaveri-docs-reference-quadchannel-memory-interface-gddr5-option Alereon posted:It's not actually AMD RAM,
|
# ? May 15, 2014 22:17 |
|
Anyone here have a 740 Athlon? Thinking about replacing my HP with one, currently have an A8-6800K I believe.
|
# ? May 15, 2014 22:23 |
|
PC LOAD LETTER posted:I know, hence the 're-branded' comment. Didn't know it was just re-branded Patriot RAM but I wasn't paying too much attention to what they were doing with it. It was just typical 'WOOOO WE GOT OUR OWN RAM NOW' hype with no info on where they were going with it in the future, and the specs looked ordinary, so I paid it no mind. Dilbert As gently caress posted:Anyone here have a 740 Athlon? Thinking about replacing my HP with one, currently have an A8-6800K I believe.
|
# ? May 16, 2014 00:13 |
|
Alereon posted:you still have to deal with Dataram for everything.
|
# ? May 16, 2014 02:12 |
|
Alereon posted:
poo poo sorry I was on the goon app, I am on the A8-6500. I am looking to upgrade? to the AMD Athlon X4 740 Trinity. To my understanding the 740 has 4 physical cores, unlike the A8-6500 which is 2 cores and 2 FPUs. Should I just gently caress it and go to Intel AM3+?
|
# ? May 16, 2014 02:34 |
|
Dilbert As gently caress posted:poo poo sorry I was on the goon app, I am on the A8-6500. I am looking to upgrade? to the AMD Athlon X4 740 Trinity.
|
# ? May 16, 2014 03:02 |
|
Alereon posted:The Athlon X4 740 is same tier APU as what you have but from the previous generation and with the GPU part disabled. If you need more CPU performance you really do need to go Intel. You can't upgrade to an AMD 7000-series APU (new socket), and you couldn't overclock an A10-6800K (OEM motherboard), and a straight upgrade probably wouldn't get you enough to be worth it (~15% max). Ah okay thanks.
|
# ? May 16, 2014 03:08 |
|
That is so many megahertz we are talking about.
|
# ? May 19, 2014 04:36 |
|
I had the 1.2GHz Thunderbird and holy poo poo was Half-Life butter smooth
|
# ? May 19, 2014 04:39 |
|
dud root posted:I had the 1.2Ghz Thunderbird and holy poo poo was Half Life butter smooth I should hope so, the minimum spec for Half-Life was a 233MHz Pentium 1
|
# ? May 19, 2014 04:47 |
|
dud root posted:I had the 1.2Ghz Thunderbird and holy poo poo was Half Life butter smooth I used to love that during that period: you could get a mobile CPU that was the same socket type, ran at lower power, and toss it in your desktop to overclock the hell out of it. I had a Barton-based mobile Athlon XP that I got up to 2.2GHz on air; at the time that thing was one of the best on the block. The only problem I ever had with those older AMD procs was the lack of a heatspreader and the very high likelihood of either damaging the die or jamming a flathead screwdriver into your motherboard putting the drat heatsink clip in place.
|
# ? May 19, 2014 05:10 |
|
Ozz81 posted:I used to love that during that period, you could get a mobile CPU that was the same socket type, ran at a lower power, and toss it in your desktop to overclock the hell out of it. I had a Barton based mobile Athlon XP that I got up to 2.2Ghz on air, at the time that thing was one of the best on the block. Only problem I ever had with those older AMD procs was the lack of a heatspreader and the very high likelihood of either damaging the die, or jamming a flathead screwdriver into your motherboard putting the drat heatsink clip in place. I had one of those Mobile Bartons as well; 2.2GHz sounds about right for what I hit too, I remember being in the 3200+ range of speeds. God those CPUs were amazing. I shall shed a single tear for the boxes of computers I have sitting in my closet that were using AMD processors. Maybe with AMD designing a new core and the cross-licensing deal so they have access to 14nm, we might see something worth buying from them in a few years.
|
# ? May 19, 2014 05:17 |
|
Sup Barton bros, I had a Barton 2500+ @ 2.2GHz also, in an Abit NF7-S v2.0 motherboard. It actually became my mom's living room computer after I upgraded, and I finally retired it yesterday and replaced it with an Intel NUC with a Core i5 4250U. Before that I had a Thunderbird 900A in an Abit KT7A-RAID that didn't quite make 1GHz. That fuckin' CPU, not a MHz past 10%. In retrospect I really like owning high-end motherboards; the Asus P5E-Deluxe has served me well since I replaced that Barton system, and I expect I'll use an Asus ROG Maximus VII Hero in my next build. I really wish AMD could pull it together, at the very least stop having these stupid issues with motherboard power delivery. Alereon fucked around with this message at 06:36 on May 19, 2014 |
# ? May 19, 2014 06:30 |
|
canyoneer posted:
Check out this thing I have sitting next to me on my desk: I remembered "the biggest upset in processor technology" when my friend's computer got struck by lightning, so I made a point to rip this off the motherboard. (I wanted to keep the pins straight; it came off disturbingly easy, and it makes a dandy carrying case for the processor.)
|
# ? May 19, 2014 08:02 |
|
Ozz81 posted:lack of a heatspreader and the very high likelihood of either damaging the die, or jamming a flathead screwdriver into your motherboard putting the drat heatsink clip in place. And to think, people get terrified of loving up their processors NOW with the Intel LGA sockets.
|
# ? May 19, 2014 09:11 |
|
Alereon posted:I really wish AMD could pull it together, at the very least stop having these stupid issues with motherboard power delivery. Me too, I just recently finally gave in and replaced my X6 1090T with an i5. It's pretty shocking how much better it performs whilst using a quarter of the power my 1090T was drawing whilst overclocked. Those days of the awesome mobile Bartons or the Opterons that could be used in 939 desktop boards kept me hoping that they would eventually make something decent to compete with Intel. I think I waited long enough. SwissArmyDruid posted:And to think, people get terrified of loving up their processors NOW with the Intel LGA sockets. Imagine those people having to do a CPU pin mod to overclock instead of being able to just raise a multiplier a little and get an extra GHz of clockspeed. Shitty Treat fucked around with this message at 12:08 on May 19, 2014 |
# ? May 19, 2014 11:58 |