Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
RealWorldTech has posted a low-level article about Steamroller's adaptive clock throttling in response to voltage droop. If it detects CPU core voltage dropping by more than 2.5%, it responds with <1 nanosecond latency to lower CPU clockspeed by 7%. I'd be interested to see someone test performance with motherboards of varying power delivery quality, especially when overclocked.
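The mechanism is conceptually dead simple. Here's a minimal sketch in C of the control loop (the 2.5% threshold and 7% step are from the article; sample_vcore() and set_clock() are hypothetical stand-ins, and the real thing is fixed-function hardware reacting in under a nanosecond, not software):

    /* Sketch of Steamroller-style adaptive clocking: throttle on droop.
     * sample_vcore() and set_clock() are hypothetical stand-ins. */
    #include <stdbool.h>

    #define DROOP_THRESHOLD 0.025 /* react when Vcore sags more than 2.5% */
    #define CLOCK_STEP      0.07  /* shed 7% of clock while drooping */

    extern double sample_vcore(void);    /* hypothetical voltage sensor */
    extern void   set_clock(double ghz); /* hypothetical clock control */

    void droop_monitor(double v_nominal, double f_nominal)
    {
        for (;;) { /* in silicon this "loop" reacts in under 1 ns */
            bool droop = sample_vcore() < v_nominal * (1.0 - DROOP_THRESHOLD);
            set_clock(droop ? f_nominal * (1.0 - CLOCK_STEP) : f_nominal);
        }
    }

The payoff is that the chip can run closer to its voltage margin the rest of the time, which is presumably how they squeezed the clocks up.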

AMD really pulled out all the stops to clock Steamroller as high as possible; it's too bad the underlying architecture couldn't make enough use of those clocks to deliver good performance.


Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Rastor posted:

They are developing a 64-bit ARM core, called K12.

They are also developing a new x86 core which is "a new design built from the ground up". Rumor has it that for this new core AMD is giving up on the CMT design used in Bulldozer/Piledriver/Steamroller/Excavator and will instead go back to something more like what Intel has been using.

With the shift to ARM-architecture APUs (which this presumably is), AMD is now in direct competition with Nvidia designs like the Tegra series. Nvidia is ahead on the 64-bit ARM architecture, but if I remember right they still haven't quite made it to a fully unified memory architecture. I'm really liking the trend of heterogeneous processors working independently inside a unified memory architecture - that really seems like a logical design choice to me. You have your general-purpose processors for random-access stuff and the ability to deploy bigger guns on computationally intensive tasks. The unified memory architecture means there's little penalty for this processor switch; you avoid stuff like wasting time copying data down to the coprocessor.
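To put the copy penalty in concrete terms, here's a rough C sketch of the two models (device_alloc/copy_to_device/offload_run are hypothetical stand-ins for whatever real offload API you like, not anybody's actual interface):

    #include <stddef.h>

    extern float *device_alloc(size_t bytes);          /* hypothetical */
    extern void   device_free(float *p);               /* hypothetical */
    extern void   copy_to_device(float *dst, const float *src, size_t bytes);
    extern void   copy_from_device(float *dst, const float *src, size_t bytes);
    extern void   offload_run(float *data, size_t n);  /* the bigger guns */

    /* Split memory: the heavy kernel is bracketed by staging copies. */
    void process_split(float *data, size_t n)
    {
        float *dev = device_alloc(n * sizeof *dev);
        copy_to_device(dev, data, n * sizeof *dev);   /* copy down...  */
        offload_run(dev, n);                          /* ...crunch...  */
        copy_from_device(data, dev, n * sizeof *dev); /* ...copy back  */
        device_free(dev);
    }

    /* Unified memory: the coprocessor chews on the same pointers. */
    void process_unified(float *data, size_t n)
    {
        offload_run(data, n); /* no staging copies in either direction */
    }

For small or latency-sensitive tasks the two copies can easily cost more than the compute, which is why unified memory makes the "switch processors per task" model actually viable.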

The Bulldozer/Steamroller/etc architecture was just a bad idea, and it looks like even AMD is giving up on fixing it. I'm reading this as AMD capitulating and trying to figure out where to go from here. At least they're aware of how hosed they are.

Professor Science
Mar 8, 2006
diplodocus + mortarboard = party
edit: nah, delete me

PC LOAD LETTER
May 23, 2005
WTF?!

Paul MaudDib posted:

The Bulldozer/Steamroller/etc architecture was just a bad idea, and it looks like even AMD is giving up on fixing it. I'm reading this as AMD capitulating and trying to figure out where to go from here. At least they're aware of how hosed they are.
AMD always planned on having some sort of new architecture out by 2015/2016, since Excavator was supposed to be the last BD revision. They probably knew they were hosed all the way back in mid-to-late 2011, since by then they'd have been able to do plenty of testing on samples from the fabs to see what yields they could get and how well power and performance scaled with clocks. It just takes a long time to design a new architecture, and K10h was probably out of scaling room, so they had to go with what they had and try to make the best of it.

The combo of patent issues, long development times, highly competent competition, and high production costs makes building a high-end x86 chip brutally risky from a business perspective, which is why no one but AMD tries (tried?) anymore to compete in that arena against Intel.

SwissArmyDruid
Feb 14, 2014

by sebmojo

PC LOAD LETTER posted:

AMD always planned on having some sort of new architecture out by 2015/2016, since Excavator was supposed to be the last BD revision. They probably knew they were hosed all the way back in mid-to-late 2011, since by then they'd have been able to do plenty of testing on samples from the fabs to see what yields they could get and how well power and performance scaled with clocks. It just takes a long time to design a new architecture, and K10h was probably out of scaling room, so they had to go with what they had and try to make the best of it.

The combo of patent issues, long development times, highly competent competition, and high production costs makes building a high-end x86 chip brutally risky from a business perspective, which is why no one but AMD tries (tried?) anymore to compete in that arena against Intel.

Let's not kid ourselves: while AMD was spending money like a company much larger than its real size, Intel was busy bribing the PC manufacturers and engaging in anticompetitive behaviour.

Edit: I sometimes think that AMD actually died in 2009 when they spun off GloFo, that they just don't know it yet, and that the purchase of ATI was the life support keeping the coma patient going. $1.25 billion in lawsuit settlement doesn't exactly pay for much.

SwissArmyDruid fucked around with this message at 23:15 on May 8, 2014

PC LOAD LETTER
May 23, 2005
WTF?!
Yeah, the anticompetitive practices by Intel certainly didn't help at all, but AFAIK that was more of a way to hamstring AMD during the K7/A64 days. By the time AMD was pushing Phenom IIs vs Core 2, I don't think Intel was even bothering to try that stuff anymore.

e: Even during the 'good days', when their chips sold well and for good ASPs, AMD was always fab constrained, so I kinda hate to say it, but going forward, with the increasing costs of shrinking processes down, they would've been 'doomed' to go fabless eventually.

PC LOAD LETTER fucked around with this message at 08:10 on May 8, 2014

Arzachel
May 12, 2012
The insider trading Ruiz did while cutting back on R&D funds and overpaying for ATI certainly didn't help either.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
I feel like AMD's biggest mistake in recent years was not crushing Intel in the low-power market. The netbook was born with the launch of the Intel Atom in 2008, and then withered and died because it took Intel six years to release a decent successor. AMD Brazos processors launched at the beginning of 2011 and were the perfect netbook SoCs, yet AMD devoted limited manufacturing resources to Brazos and availability was poor, so it mostly appeared in low-end notebooks. I believe that if AMD had really pushed Brazos in the netbook segment, netbooks would still be considered a viable mainstream form factor today.

Selling cheap netbook SoCs probably isn't very profitable, which is likely why they chose to make more Radeon GPUs with their TSMC wafers instead, but it seems like there's value in keeping AMD's CPU division operating outside of consoles.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Alereon posted:

I feel like AMD's biggest mistake in recent years was not crushing Intel in the low-power market. The netbook was born with the launch of the Intel Atom in 2008, and then withered and died because it took Intel six years to release a decent successor. AMD Brazos processors launched at the beginning of 2011 and were the perfect netbook SoCs, yet AMD devoted limited manufacturing resources to Brazos and availability was poor, so it mostly appeared in low-end notebooks. I believe that if AMD had really pushed Brazos in the netbook segment, netbooks would still be considered a viable mainstream form factor today.

Selling cheap netbook SoCs probably isn't very profitable, which is likely why they chose to make more Radeon GPUs with their TSMC wafers instead, but it seems like there's value in keeping AMD's CPU division operating outside of consoles.

See, I like APUs as an idea. For your thin-and-lights and your ultrabooks, they make 100% sense. But where AMD keeps killing me is that they keep trying to scale this stupid poo poo upwards: instead of devoting that die space to improving single-threaded performance-per-core on chips that are most certainly going to be paired with a discrete GPU anyway, they keep wasting it on an anemic iGPU on-die.

And every few years or so, they trot out the old hybrid Crossfire buzzword again (and let's face it, when has hybrid Crossfire ever meant poo poo?), and just like every other time, they haven't done squat with it this time with APUs either.

PC LOAD LETTER
May 23, 2005
WTF?!
They can't really do much to improve single-threaded performance without a major revision (which is what Excavator is supposed to be), and that takes a long time (years) to do.

They keep throwing more die space at the iGPU on their APUs because for some reason they seem totally unable to increase the bandwidth to the iGPU, and they can't do much to improve CPU performance beyond what they're already doing, so more iGPU is about the only way left for them to differentiate their product vs Intel's offerings and their own older APUs.

The funny thing is, if they could just feed it enough data, the iGPU would actually deliver some fairly respectable performance, and it would add a lot of value to their products even if they continued to do little to improve single-thread performance on the CPU side. Many of their cheaper APUs in the thin n' light category are incredibly hamstrung by single-channel DDR3-1333 levels of bandwidth though.
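For scale, the arithmetic (mine, not from any review): one DDR3 channel is 64 bits wide, so

    /* Peak theoretical bandwidth = transfer rate x bus width (8 bytes). */
    #include <stdio.h>

    int main(void)
    {
        double single = 1333e6 * 8 / 1e9; /* DDR3-1333, one 64-bit channel */
        printf("single channel: %.1f GB/s\n", single);     /* ~10.7 */
        printf("dual channel:   %.1f GB/s\n", single * 2); /* ~21.3 */
        return 0;
    }

Call it ~10.7 GB/s shared between the CPU cores and the iGPU, a small fraction of what even a cheap GDDR5 card keeps to itself.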

Quad-channel DDR3 probably isn't practical for them for cost reasons, and the same goes for on-package DDR3, but maybe they could've hung some extra DDR3 or GDDR5 off the chipset to feed their iGPUs better. They did something like that for a while back when they were still putting the iGPU in the north bridge, in the 780G or some such.

SwissArmyDruid
Feb 14, 2014

by sebmojo

PC LOAD LETTER posted:

They can't really do much to improve single-threaded performance without a major revision (which is what Excavator is supposed to be), and that takes a long time (years) to do.

They keep throwing more die space at the iGPU on their APUs because for some reason they seem totally unable to increase the bandwidth to the iGPU, and they can't do much to improve CPU performance beyond what they're already doing, so more iGPU is about the only way left for them to differentiate their product vs Intel's offerings and their own older APUs.

The funny thing is, if they could just feed it enough data, the iGPU would actually deliver some fairly respectable performance, and it would add a lot of value to their products even if they continued to do little to improve single-thread performance on the CPU side. Many of their cheaper APUs in the thin n' light category are incredibly hamstrung by single-channel DDR3-1333 levels of bandwidth though.

Quad-channel DDR3 probably isn't practical for them for cost reasons, and the same goes for on-package DDR3, but maybe they could've hung some extra DDR3 or GDDR5 off the chipset to feed their iGPUs better. They did something like that for a while back when they were still putting the iGPU in the north bridge, in the 780G or some such.

Well, DDR4 is starting to ship.

gemuse
Oct 13, 2005

Paul MaudDib posted:

Nvidia is ahead on the 64-bit ARM architecture

In what way? Tegra K1 is still based on the A15, and Project Denver is nothing but vague press releases at the moment.

Arzachel
May 12, 2012

SwissArmyDruid posted:

Well, DDR4 is starting to ship.

From the looks of it, early DDR4 chips aren't going to bring significant bandwidth increases right away while costing quite a bit more. Stacked memory can't come soon enough.

PC LOAD LETTER
May 23, 2005
WTF?!

SwissArmyDruid posted:

Well, DDR4 is starting to ship.
Initially, on a per-channel basis, it probably won't be much (if any) faster than current DDR3, and they'll probably still be stuck with only 2 channels of it due to chip package (still using FM2+, so not enough pins) and cost restrictions. If they could do 4 channels of DDR4 it'd be a different story, of course. They are supposed to be doing some sort of BGA-package APU in 2015, so maybe they could pull it off then, since it's relatively cheap to add lots more pins to a BGA than a PGA package. Who knows, they might still end up pad-limited on the die itself, which would nix that idea.
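Quick sanity check on the per-channel point (my numbers, assuming launch DDR4 at 2133 MT/s against commodity DDR3-2133):

    /* Per 64-bit channel, peak bandwidth is just transfer rate x 8 bytes,
     * so launch-speed DDR4-2133 only ties existing DDR3-2133. */
    #include <stdio.h>

    int main(void)
    {
        double per_ch = 2133e6 * 8 / 1e9;              /* ~17.1 GB/s */
        printf("DDR3-2133 or DDR4-2133: %.1f GB/s per channel\n", per_ch);
        printf("2 channels: %.1f GB/s\n", 2 * per_ch); /* ~34.1 */
        printf("4 channels: %.1f GB/s\n", 4 * per_ch); /* ~68.3 */
        return 0;
    }

Four channels is where it would start to look like a real iGPU feed.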

On-package or soldered-to-mobo HMC/eDRAM would definitely solve the bandwidth problems for their APUs, but it would probably still be too expensive. You'd still have to deal with the pin/pad limit issue too.

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT

PC LOAD LETTER posted:

Quad-channel DDR3 probably isn't practical for them for cost reasons, and the same goes for on-package DDR3, but maybe they could've hung some extra DDR3 or GDDR5 off the chipset to feed their iGPUs better. They did something like that for a while back when they were still putting the iGPU in the north bridge, in the 780G or some such.

I always wondered about this - could they theoretically work this out like they would a laptop with discrete graphics that has its own dedicated memory, versus sharing the system memory? Is it a cost decision or an implementation decision that keeps them from doing that (or is it just not possible)? It seems like it would make more sense, as you said, to give the iGPU its own dedicated memory to use, if it's possible and not too costly to do.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Ozz81 posted:

I always wondered about this - could they theoretically work this out like they would a laptop with discrete graphics that has its own dedicated memory, versus sharing the system memory? Is it a cost decision or an implementation decision that keeps them from doing that (or is it just not possible)? It seems like it would make more sense, as you said, to give the iGPU its own dedicated memory to use, if it's possible and not too costly to do.

I'd imagine the raw die space needed to add memory would have to be a consideration - the Xbox One is a pretty good example of how much space eSRAM takes up on an SoC. The memory would also bring extra power and thermal considerations.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
If you're going to the effort and cost of adding an additional memory interface to get more bandwidth, it would be easier and better to just increase the bus width. Compare the Xbox One and PS4: like the 360 did with eDRAM, the Xbox One uses slow main RAM supplemented by a small on-die cache (eSRAM this time), while the PS4 simply uses fast main RAM. Despite the two having nearly identical CPUs and closely related GPUs, the fact that the PS4 just has enough memory bandwidth without resorting to tricks is a big deal.
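The publicly quoted figures make the point; the arithmetic is mine:

    /* Peak main-memory bandwidth = (bus width in bytes) x transfer rate. */
    #include <stdio.h>

    int main(void)
    {
        double ps4 = (256 / 8) * 5500e6 / 1e9; /* 256-bit GDDR5 @ 5.5 GT/s */
        double xb1 = (256 / 8) * 2133e6 / 1e9; /* 256-bit DDR3-2133        */
        printf("PS4 main RAM:      %.0f GB/s\n", ps4); /* 176 GB/s */
        printf("Xbox One main RAM: %.0f GB/s\n", xb1); /* ~68 GB/s */
        /* The Xbox One's 32MB of eSRAM adds roughly 109-204 GB/s, but only
         * for whatever fits in 32MB -- that's the "tricks" part. */
        return 0;
    }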

The best solution for AMD APUs would have been mGDDR5, which placed ultra-fast GDDR5 memory on removable DIMMs; as GDDR5 is based on DDR3, it would have maintained backwards compatibility. Unfortunately, mGDDR5 did not attract sufficient manufacturer interest and never entered the market.

JawnV6
Jul 4, 2004

So hot ...
Anyone know how DDR4 solved row hammer?

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

JawnV6 posted:

Anyone know how ddr4 solved row hammer?
This company selling protocol analyzers says it's still an issue. I did find this patent from Intel that is over my head, but it seems to describe the memory controller triggering a refresh of the "victim" rows that have been subject to hammering.
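If I'm reading it right, the concept looks something like this toy sketch (mine, not the patent's actual logic; the counter threshold and the assumption that victims are the physically adjacent rows are made up for illustration):

    /* Toy memory-controller-side row hammer mitigation: count ACTIVATEs
     * per row and, past a threshold, refresh the two adjacent "victims".
     * refresh_row() is a hypothetical stand-in for a targeted refresh. */
    #include <stdint.h>

    #define NUM_ROWS   65536
    #define HAMMER_MAX 50000 /* made-up threshold per refresh window */

    static uint32_t activations[NUM_ROWS];

    extern void refresh_row(uint32_t row); /* hypothetical DRAM command */

    void on_row_activate(uint32_t row)
    {
        if (++activations[row] >= HAMMER_MAX) {
            if (row > 0)            refresh_row(row - 1);
            if (row < NUM_ROWS - 1) refresh_row(row + 1);
            activations[row] = 0;
        }
    }

The catch is that "adjacent" is a physical-layout question, and the controller doesn't necessarily know the DIMM's internal row mapping.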

JawnV6
Jul 4, 2004

So hot ...
I still have a near-allergic reaction to accessing patents, but reading between the lines on claim 26 makes it sound like there's an extra command that makes the DIMM go figure out which rows are victims? But the abstract still puts it on the MC to figure it out? A big part of the problem, IIRC, is that the DIMM will swap rows around internally, so it isn't clear to the MC which rows are victims of any potential hammer.

PC LOAD LETTER
May 23, 2005
WTF?!

Alereon posted:

The best solution for AMD APUs would have been mGDDR5, which placed ultra-fast GDDR5 memory on removable DIMMs.... Unfortunately, mGDDR5 did not attract sufficient manufacturer interest and never entered the market.
I wonder if they were planning on something like that. AMD did start releasing their own re-branded RAM not too long ago, which struck me as fairly pointless for them. Maybe they had some ideas that they just weren't able to realize because of market inertia and/or financial reasons?

Hace
Feb 13, 2012

<<Mobius 1, Engage.>>

PC LOAD LETTER posted:

I wonder if they were planning on something like that. AMD did start releasing their own re-branded RAM not too long ago, which struck me as fairly pointless for them. Maybe they had some ideas that they just weren't able to realize because of market inertia and/or financial reasons?

Well there is this: http://www.anandtech.com/show/7702/amd-kaveri-docs-reference-quadchannel-memory-interface-gddr5-option

NullPtr4Lunch
Jun 22, 2012

PC LOAD LETTER posted:

I wonder if they were planning on something like that. AMD did start releasing their own re-branded RAM not too long ago, which struck me as fairly pointless for them. Maybe they had some ideas that they just weren't able to realize because of market inertia and/or financial reasons?

I actually bought that a few years ago. It was the cheapest RAM with 8-8-8-24 timings that didn't have idiotic fins all over it. So far it's worked well.

NullPtr4Lunch fucked around with this message at 18:53 on May 15, 2014

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

PC LOAD LETTER posted:

I wonder if they were planning on something like that. AMD did start releasing their own re-branded RAM not too long ago, which struck me as fairly pointless for them. Maybe they had some ideas that they just weren't able to realize because of market inertia and/or financial reasons?
It's not actually AMD RAM; they just licensed their brand to sub-vendors that provide their own sales and warranty support. They used to use Patriot, which is reasonably decent RAM - I think that's when it was a pretty good value. Today they use Dataram, and their products don't seem particularly competitive with G.Skill.

PC LOAD LETTER
May 23, 2005
WTF?!
Yeah, I remember that. Looks like they were trying to do something but for some reason just weren't able to pull it off.

Alereon posted:

It's not actually AMD RAM,
I know, hence the 're-branded' comment. I didn't know it was just re-branded Patriot RAM, but I wasn't paying too much attention to what they were doing with it. It was just typical 'WOOOO WE GOT OUR OWN RAM NOW' hype with no info on where they were going with it in the future, and the specs looked ordinary, so I paid it no mind.

Dilbert As FUCK
Sep 8, 2007

by Cowcaster
Pillbug
Anyone here have a 740 Athlon? Thinking about replacing my HP with one, currently have an A8-6800K I believe.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

PC LOAD LETTER posted:

I know, hence the 're-branded' comment. I didn't know it was just re-branded Patriot RAM, but I wasn't paying too much attention to what they were doing with it. It was just typical 'WOOOO WE GOT OUR OWN RAM NOW' hype with no info on where they were going with it in the future, and the specs looked ordinary, so I paid it no mind.
My point was that AMD doesn't have anything to do with the RAM; they just licensed their name to it, and you still have to deal with Dataram for everything. It's not like the old ATI video cards, which were Sapphire-manufactured but branded, sold, and warrantied by ATI.

Dilbert As gently caress posted:

Anyone here have a 740 Athlon? Thinking about replacing my HP with one, currently have an A8-6800K I believe.
Please confirm, because none of that makes sense. An Athlon X4 740 is just an A8-5500 minus the integrated graphics - the previous generation of CPU to what you have now.

PC LOAD LETTER
May 23, 2005
WTF?!

Alereon posted:

you still have to deal with Dataram for everything.
OK my bad, missed your point.

Dilbert As FUCK
Sep 8, 2007

by Cowcaster
Pillbug

Alereon posted:


Please confirm, because none of that makes sense. An Athlon X4 740 is just an A8-5500 minus the integrated graphics - the previous generation of CPU to what you have now.

poo poo sorry, I was on the goon app - I am on the A8-6500. I am looking to upgrade? to the AMD Athlon X4 740 Trinity.

To my understanding the 740 has 4 physical cores, unlike the A8-6500, which is 2 cores and 2 FPUs.

Should I just gently caress it and go to Intel AM3+?

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Dilbert As gently caress posted:

poo poo sorry, I was on the goon app - I am on the A8-6500. I am looking to upgrade? to the AMD Athlon X4 740 Trinity.

To my understanding the 740 has 4 physical cores, unlike the A8-6500, which is 2 cores and 2 FPUs.

Should I just gently caress it and go to Intel AM3+?
The Athlon X4 740 is the same tier of APU as what you have, but from the previous generation and with the GPU part disabled. If you need more CPU performance you really do need to go Intel. You can't upgrade to an AMD 7000-series APU (new socket), you couldn't overclock an A10-6800K (OEM motherboard), and a straight upgrade probably wouldn't get you enough to be worth it (~15% max).

Dilbert As FUCK
Sep 8, 2007

by Cowcaster
Pillbug

Alereon posted:

The Athlon X4 740 is the same tier of APU as what you have, but from the previous generation and with the GPU part disabled. If you need more CPU performance you really do need to go Intel. You can't upgrade to an AMD 7000-series APU (new socket), you couldn't overclock an A10-6800K (OEM motherboard), and a straight upgrade probably wouldn't get you enough to be worth it (~15% max).

Ah okay thanks.

canyoneer
Sep 13, 2005


I only have canyoneyes for you

That is so many megahertz we are talking about.

dud root
Mar 30, 2008
I had the 1.2GHz Thunderbird and holy poo poo was Half Life butter smooth :smug:

Panty Saluter
Jan 17, 2004

Making learning fun!

dud root posted:

I had the 1.2GHz Thunderbird and holy poo poo was Half Life butter smooth :smug:

I should hope so, the minimum spec for Half Life was a 233 MHz Pentium 1 :v:

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT

dud root posted:

I had the 1.2GHz Thunderbird and holy poo poo was Half Life butter smooth :smug:

I used to love that during that period you could get a mobile CPU that was the same socket type and ran at lower power, and toss it in your desktop to overclock the hell out of it. I had a Barton-based mobile Athlon XP that I got up to 2.2GHz on air; at the time that thing was one of the best on the block. The only problem I ever had with those older AMD procs was the lack of a heatspreader and the very high likelihood of either damaging the die or jamming a flathead screwdriver into your motherboard putting the drat heatsink clip in place.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Ozz81 posted:

I used to love that during that period you could get a mobile CPU that was the same socket type and ran at lower power, and toss it in your desktop to overclock the hell out of it. I had a Barton-based mobile Athlon XP that I got up to 2.2GHz on air; at the time that thing was one of the best on the block. The only problem I ever had with those older AMD procs was the lack of a heatspreader and the very high likelihood of either damaging the die or jamming a flathead screwdriver into your motherboard putting the drat heatsink clip in place.

I had one of those mobile Bartons as well; 2.2GHz sounds about right for what I hit too, I remember being in the 3200+ range of speeds. God, those CPUs were amazing. I shall shed a single tear for the boxes of computers sitting in my closet that were using AMD processors. Maybe with AMD designing a new core, and the cross-licensing deal giving them access to 14nm, we might see something worth buying from them in a few years.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Sup Barton bros, I had a Barton 2500+ @ 2.2GHz also, in an Abit NF7-S v2.0 motherboard. It actually became my mom's living room computer after I upgraded, and I finally retired it yesterday and replaced it with an Intel NUC with a Core i5 4250U. Before that I had a Thunderbird 900A in an Abit KT7A-RAID that didn't quite make 1GHz. That fuckin' CPU, not a MHz past 10%. In retrospect I really like owning high-end motherboards: the Asus P5E-Deluxe has served me well since I replaced that Barton system, and I expect I'll use an Asus ROG Maximus VII Hero in my next build.

I really wish AMD could pull it together, and at the very least stop having these stupid issues with motherboard power delivery.

Alereon fucked around with this message at 06:36 on May 19, 2014

Cat Hatter
Oct 24, 2006

Hatters gonna hat.

canyoneer posted:


That is so many megahertz we are talking about.

Check out this thing I have sitting next to me on my desk:


I remembered "the biggest upset in processor technology" when my friend's computer got struck by lightning, so I made a point to rip this off the motherboard. (I wanted to keep the pins straight - it came off disturbingly easily, and it makes a dandy carrying case for the processor.)

SwissArmyDruid
Feb 14, 2014

by sebmojo

Ozz81 posted:

lack of a heatspreader and the very high likelihood of either damaging the die or jamming a flathead screwdriver into your motherboard putting the drat heatsink clip in place.

And to think, people get terrified of loving up their processors NOW with the Intel LGA sockets.


Shitty Treat
Feb 21, 2012

Stoopid?

Alereon posted:

I really wish AMD could pull it together, and at the very least stop having these stupid issues with motherboard power delivery.

Me too. I just recently gave in and replaced my X6 1090T with an i5. It's pretty shocking how much better it performs whilst using a quarter of the power my 1090T was drawing whilst overclocked.
Those days of the awesome mobile Bartons, or the Opterons that could be used in Socket 939 desktop boards, kept me hoping that they would eventually make something decent to compete with Intel.
I think I waited long enough.

SwissArmyDruid posted:

And to think, people get terrified of loving up their processors NOW with the Intel LGA sockets.

Imagine those people having to do a CPU pin mod to overclock, instead of being able to just raise a multiplier a little and get an extra GHz of clockspeed.

Shitty Treat fucked around with this message at 12:08 on May 19, 2014
