  • Locked thread
No Gravitas
Jun 12, 2013

by FactsAreUseless

Lord Windy posted:

In this thread, does anyone explain the problems that Bulldozer had? I am interested and want to read up on it.

http://www.agner.org/optimize/microarchitecture.pdf

Section 15.19 is of interest to you.

Long story short: poo poo instruction decode in early models, poo poo execution unit balance which harms integer-based performance, long latencies for many operations, long pipelines causing problems with mispredicted branches aaaaaand finally some issues with caches (more severe on some models).
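If you want a feel for why the long pipeline hurts, the usual back-of-envelope model is: effective CPI ≈ base CPI + (branch fraction × mispredict rate × flush penalty). Toy Python sketch with made-up numbers — nothing here is a measured Bulldozer figure:

```python
# Toy model: cost of branch mispredictions vs. pipeline depth.
# All inputs are illustrative, not measured Bulldozer numbers.

def effective_cpi(base_cpi, branch_frac, mispredict_rate, flush_penalty):
    """Average cycles per instruction once mispredict flushes are added."""
    return base_cpi + branch_frac * mispredict_rate * flush_penalty

# Shorter pipeline: ~14-cycle flush on a mispredicted branch
short = effective_cpi(1.0, branch_frac=0.20, mispredict_rate=0.05, flush_penalty=14)
# Longer pipeline: ~20-cycle flush on a mispredicted branch
long_ = effective_cpi(1.0, branch_frac=0.20, mispredict_rate=0.05, flush_penalty=20)

print(round(short, 2))  # 1.14
print(round(long_, 2))  # 1.2
```

Same code, same branch behaviour — the deeper pipe pays more for every miss, which is the whole story in one line of arithmetic.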


Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

No Gravitas posted:

http://www.agner.org/optimize/microarchitecture.pdf

Section 15.19 is of interest to you.

Long story short: poo poo instruction decode in early models, poo poo execution unit balance which harms integer-based performance, long latencies for many operations, long pipelines causing problems with mispredicted branches aaaaaand finally some issues with caches (more severe on some models).

All of this is pretty shocking to me, especially that AMD's best years from 2003-2006 were primarily because Intel had an overly long pipeline and long latencies that made their chips less power efficient and less competitive. I guess AMD failed to learn a single thing from watching Intel flounder with the P4.

Proud Christian Mom
Dec 20, 2006
READING COMPREHENSION IS HARD
Bulldozer was the epitome of 'hurry up and wait'

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

No Gravitas posted:

http://www.agner.org/optimize/microarchitecture.pdf

Section 15.19 is of interest to you.

Long story short: poo poo instruction decode in early models, poo poo execution unit balance which harms integer-based performance, long latencies for many operations, long pipelines causing problems with mispredicted branches aaaaaand finally some issues with caches (more severe on some models).

Was anything but the pipeline theoretically fixable, and if so would fixing it have brought any real improvement in performance? I get the impression long pipelines are intrinsic to the design, but I'm not savvy enough to read the pdf as much more than words in a comparative sense; I completely lack the reference points to draw conclusions from.

atomicthumbs
Dec 26, 2010


We're in the business of extending man's senses.

Twerk from Home posted:

All of this is pretty shocking to me, especially that AMD's best years from 2003-2006 were primarily because Intel had an overly long pipeline and long latencies that made their chips less power efficient and less competitive. I guess AMD failed to learn a single thing from watching Intel flounder with the P4.

Wasn't part of their success in that time period due to folks from DEC who worked on Alpha getting jobs at AMD and using the stuff they learned from that?

PC LOAD LETTER
May 23, 2005
WTF?!
They were also a lot closer to Intel when it came to process tech back then too. They even had a lead on Intel for a brief time during the transition from aluminum to copper interconnects. AMD got there first by at least a few months.

Today they're almost 2 yr behind Intel on process tech and they don't have control of the fab they use like they used to.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
I'm still confused on why Intel thought Netburst was going to be a thing. I think it speaks volumes that they could force their way through that disaster yet come out fine. Even weirder is that at no point did they really need such a radical new architecture, the PIII was fine. In theory they could have moved the PIII to 90nm and the P4 would have had no reason to exist; if 479 replaced 478 in this new timeline, AMD would have been sunk even before Piledriver, as the Dothans and Yonahs were perfectly capable of keeping up with equivalent AMD stuff.

Maybe there was some deepcover AMD shill at Intel which managed to convince everyone to shove their heads up their asses until they saw daylight again.

Further reading on the PDF Gravitas linked has me in stitches at how enormously awful the Atom must be to lose to the VIA Nano. Or I am just not being appreciative enough at how capable the Nano actually is :shrug:?

Nintendo Kid
Aug 4, 2011

by Smythe

FaustianQ posted:

I'm still confused on why Intel thought Netburst was going to be a thing. I think it speaks volumes that they could force their way through that disaster yet come out fine. Even weirder is that at no point did they really need such a radical new architecture, the PIII was fine. In theory they could have moved the PIII to 90nm and the P4 would have had no reason to exist; if 479 replaced 478 in this new timeline, AMD would have been sunk even before Piledriver, as the Dothans and Yonahs were perfectly capable of keeping up with equivalent AMD stuff.

Maybe there was some deepcover AMD shill at Intel which managed to convince everyone to shove their heads up their asses until they saw daylight again.

Further reading on the PDF Gravitas linked has me in stitches at how enormously awful the Atom must be to lose to the VIA Nano. Or I am just not being appreciative enough at how capable the Nano actually is :shrug:?

Atom's whole bag was "low power x86 at any cost, and do it as quickly as possible" so it's no surprise it kinda sucked until the latest revs.

PC LOAD LETTER
May 23, 2005
WTF?!

FaustianQ posted:

I'm still confused on why Intel thought Netburst was going to be a thing.
Everyone underestimated how much power and heat would be a problem at smaller process nodes + they overestimated how high they could clock the chips.

Even when cooled with liquid nitrogen I don't think the P4 ever got to the 10GHz it was expected to hit on air at some point during its lifespan. The highest OC that I can recall for that chip is around 7GHz stable with extreme cooling.

What gets me is why the hell AMD decided to take the same route with Bulldozer and do a speed demon architecture after the failures of the P4 were clear and it was also clear that power and heat were going to still be a big problem in future process shrinks.
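The reason the speed-demon plan keeps blowing up is the classic dynamic-power relation, P ≈ C·V²·f: chasing clocks usually means raising voltage too, and the V² term eats you alive. Toy Python numbers, purely illustrative — not real P4 or Bulldozer figures:

```python
# Dynamic CMOS power: P = C * V^2 * f (switching capacitance, volts, hertz).
# The capacitance and voltage values below are made up for illustration.

def dynamic_power(c_farads, v_volts, f_hertz):
    return c_farads * v_volts ** 2 * f_hertz

base = dynamic_power(1e-9, 1.2, 3.0e9)    # ~3 GHz at 1.2 V
pushed = dynamic_power(1e-9, 1.4, 4.0e9)  # ~4 GHz, but it needs more voltage

print(round(pushed / base, 2))  # 1.81 -> ~81% more power for a 33% clock bump
```

That's the wall both the P4 and BD ran into: frequency scales power linearly, but the voltage bump needed to reach that frequency scales it quadratically on top.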

adorai
Nov 2, 2002

10/27/04 Never forget
Grimey Drawer

PC LOAD LETTER posted:

What gets me is why the hell AMD decided to take the same route with Bulldozer and do a speed demon architecture after the failures of the P4 were clear and it was also clear that power and heat were going to still be a big problem in future process shrinks.
I believe the initial idea for BD was that their fpu sharing was going to be a home run. It was only after it was apparent internally that it was not that they started going for clock speed. They needed SOMETHING until the next arch.

Space Gopher
Jul 31, 2006

BLITHERING IDIOT AND HARDCORE DURIAN APOLOGIST. LET ME TELL YOU WHY THIS SHIT DON'T STINK EVEN THOUGH WE ALL KNOW IT DOES BECAUSE I'M SUPER CULTURED.

Nintendo Kid posted:

Atom's whole bag was "low power x86 at any cost, and do it as quickly as possible" so it's no surprise it kinda sucked until the latest revs.

It also wasn't intended for general-purpose PCs at launch.

Intel's idea for Atom was that it would give them a credible competitor to ARM in the broad middle class of devices that need way more processing power than a microcontroller can give them, but not enough to make the cost and packaging of a full x86 PC setup worth the hassle. Think set-top boxes, routers, car infotainment, smart appliances, and things like that. This has been a weak point for Intel for a long time. They didn't care so much when they could just pump out high-margin PC and server processors and leave cheap embedded stuff to others, but with a lot of consumer demand moving from PCs to smart devices, ARM was looking like a much bigger threat.

Because Intel was getting eaten alive in that market and wanted to jump-start a competitive product, they sold bargain-price Atoms to OEMs under a term that was supposed to lock out PCs: a maximum display size restriction in the complete hardware. Ten inches would be more than enough for a barcode scanner, industrial control interface, or something like that, but nobody wanted a laptop that small, right?

Of course, OEMs saw dirt-cheap CPUs that they could slap together with legacy chipsets to run Windows or Linux, and it was off to the races in a machine that just barely met the licensing specs. The original Atom netbooks sucked because the CPUs were never intended to run a full-fat consumer OS on a PC. Now that Intel has actually focused on the "cheap low-end PC" market for Windows tablets, Atoms are actually pretty good at it.

This also explains why "netbook" came and went as a size category; it was entirely driven by Intel (and Microsoft) licensing restrictions. When AMD started building netbook-class processors without the display size limit, OEMs immediately slapped them into larger chassis, and they became super-cheap 15" laptops instead of netbooks. And, because Intel and Microsoft are fighting for the tablet market, you can get cheap Atom x86 tablets right now.

PC LOAD LETTER
May 23, 2005
WTF?!

adorai posted:

I believe the initial idea for BD was that their fpu sharing was going to be a home run. It was only after it was apparent internally that it was not that they started going for clock speed.
Yea the module approach was supposed to give them better multi threaded performance than SMT for a minor hit in single thread performance while using less die space than a 'full' dual or quad core CPU. The idea sounded good and I was quite optimistic about it at first.

BD was always supposed to have higher clocks than K10 though to make up for the loss in IPC, and the module idea, or at least AMD's implementation of it, panned out worse than expected for single thread performance, so BD turned into a real mess. On top of that multi threaded applications never took off as fast as AMD expected them to either, so there are very few programs even today that allow BD to shine at the one thing it was meant to be, and sorta is, good at.
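Rough throughput-per-area arithmetic shows why the module pitch sounded good on paper. The figures below are illustrative stand-ins for the marketing claims, not measurements:

```python
# Back-of-envelope for AMD's CMT pitch vs. SMT. Illustrative numbers only:
# CMT (Bulldozer module): ~1.12x the area of one core for ~1.8x the throughput.
# SMT (Intel-style):      ~1.05x the area of one core for ~1.3x the throughput.

def throughput_per_area(throughput, area):
    return throughput / area

cmt = throughput_per_area(1.8, 1.12)
smt = throughput_per_area(1.3, 1.05)

print(round(cmt, 2))  # 1.61
print(round(smt, 2))  # 1.24
```

On these (made-up) numbers the module wins easily on multithreaded throughput per mm² — the catch, as above, is that the single-thread number the whole plan leaned on never materialized.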

WhyteRyce
Dec 30, 2001

All I remember about BD was that JF guy from AMD marketing ultimately poisoning the well

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

PC LOAD LETTER posted:

Yea the module approach was supposed to give them better multi threaded performance than SMT for a minor hit in single thread performance while using less die space than a 'full' dual or quad core CPU. The idea sounded good and I was quite optimistic about it at first.

BD was always supposed to have higher clocks than K10 though to make up for the loss in IPC, and the module idea, or at least AMD's implementation of it, panned out worse than expected for single thread performance, so BD turned into a real mess. On top of that multi threaded applications never took off as fast as AMD expected them to either, so there are very few programs even today that allow BD to shine at the one thing it was meant to be, and sorta is, good at.

It'd be weird if AMD Thubans and Visheras age better than their Intel cousins if we head into a heavily multithreaded era. Considering the challenges of this, I doubt it, but it'd be kinda funny. Is it me or does AMD always seem to kind of trip over itself in its dash to "The Future"? Maybe I'm just not hearing enough groundbreaking stuff from Intel, or it's just that any activity from AMD stands out.

Nintendo Kid
Aug 4, 2011

by Smythe
I suspect that a lot of future heavily multithreaded stuff is going to be hampered on Bulldozer-type chips by the shared-FPU thing.

PC LOAD LETTER
May 23, 2005
WTF?!
Maybe for FP intensive work loads sure. There is a lot of stuff that is still integer/branchy as heck out there though.

FaustianQ posted:

Is it me or does AMD always seem to kind of trip over itself in its dash to "The Future"?
I'd say no. BD is their 1st true screw up along those lines of targeting the hardware for the wrong work loads. K5, K6, K7, and K8 mostly got that sort of thing right. K7 and K8 in particular pretty much nailed it.

edit:\/\/\/\/Well sure but lots of stuff isn't vectorized either. You can always nitpick stuff but what is the common case scenario?

PC LOAD LETTER fucked around with this message at 03:07 on Mar 22, 2015

Oblivion590
Nov 23, 2010

PC LOAD LETTER posted:

Maybe for FP intensive work loads sure. There is a lot of stuff that is still integer/branchy as heck out there though.

Floating-point hardware frequently is reused for integer SIMD instructions, so vectorized integer code has just as much trouble.
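For anyone wondering what "vectorized integer code" means here: it's just integer loops that a compiler turns into packed-integer SIMD instructions, and on BD those issue to the shared FP/SIMD pipes. A toy Python version of the pattern (real codebases would be C with autovectorization or intrinsics):

```python
# The kind of integer loop compilers auto-vectorize into packed-integer
# SIMD instructions -- which on Bulldozer execute on the shared FPU pipes,
# so two integer-SIMD-heavy threads in one module can contend for them.

def saxpy_int(a, xs, ys):
    # out[i] = a * xs[i] + ys[i], elementwise over the whole array
    return [a * x + y for x, y in zip(xs, ys)]

out = saxpy_int(3, list(range(8)), [1] * 8)
print(out)  # [1, 4, 7, 10, 13, 16, 19, 22]
```

Branchy pointer-chasing code stays on the per-core integer units; it's this regular elementwise stuff that ends up fighting over the module's one FP/SIMD block.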

Longinus00
Dec 29, 2005
Ur-Quan

FaustianQ posted:

I'm still confused on why Intel thought Netburst was going to be a thing. I think it speaks volumes that they could force their way through that disaster yet come out fine. Even weirder is that at no point did they really need such a radical new architecture, the PIII was fine. In theory they could have moved the PIII to 90nm and P4 would have had no reason to exist, if 479 replaced 478 in this new timeline AMD would have been sunk even before Piledriver, as the Dothans and Yonahs we're perfectly capable of keeping up with equivalent AMD stuff.

Maybe there was some deepcover AMD shill at Intel which managed to convince everyone to shove their heads up their asses until they saw daylight again.

Further reading on the PDF Gravitas linked has me in stitches at how enormously awful the Atom must be to lose to the VIA Nano. Or I am just not being appreciative enough at how capable the Nano actually is :shrug:?

Intel loves new architectures.

https://en.wikipedia.org/wiki/Intel_iAPX_432
https://en.wikipedia.org/wiki/Intel_i860
https://en.wikipedia.org/wiki/Itanium

You might notice a common pattern behind the failure of those projects.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Longinus00 posted:

Intel loves new architectures.

https://en.wikipedia.org/wiki/Intel_iAPX_432
https://en.wikipedia.org/wiki/Intel_i860
https://en.wikipedia.org/wiki/Itanium

You might notice a common pattern behind the failure of those projects.

They all start with the letter "i"? :v:

Riso
Oct 11, 2008

by merry exmarx
Over-reliance on compilers to make software go fast.

Lord Windy
Mar 26, 2010
Running slower than the 68000 series?

Rastor
Jun 2, 2001

Man, WCCFTech is known for posting some really stupid rumors but the rumor that Samsung is planning to acquire AMD may be the stupidest one yet.

SwissArmyDruid
Feb 14, 2014

by sebmojo
Jesus, that old chestnut again?

Lord Windy
Mar 26, 2010
How good are the Samsung fabs? Wouldn't they be better off doing something else than attempt to make AMD designs better?

SwissArmyDruid
Feb 14, 2014

by sebmojo

Lord Windy posted:

How good are the Samsung fabs? Wouldn't they be better off doing something else than attempt to make AMD designs better?

They already made ARM-based Exynos chips, as well as both planar and 3D TLC NAND. And that's only at their wholly-owned facilities. Recent financials from Nvidia also suggest that they are making chips for Nvidia too, although whether this is GPUs or Tegra is uncertain.

Lord Windy
Mar 26, 2010

SwissArmyDruid posted:

They already made ARM-based Exynos chips, as well as both planar and 3D TLC NAND. And that's only at their wholly-owned facilities. Recent financials from Nvidia also suggest that they are making chips for Nvidia too, although whether this is GPUs or Tegra is uncertain.

Since we're already in crazy land, couldn't they just make ARM desktop chips instead of fighting with Intel on x86? If there is even a market to fight for, that is, but Android could probably work as a desktop thing.

Nintendo Kid
Aug 4, 2011

by Smythe

Lord Windy posted:

Since we're already in crazy land, couldn't they just make ARM desktop chips instead of fighting with Intel on x86? If there is even a market to fight for, that is, but Android could probably work as a desktop thing.

Samsung doesn't need AMD to make a lovely ARM desktop if they want, since AMD has no fabs and has less experience with ARM than Samsung themselves have.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Lord Windy posted:

Since we're already in crazy land, couldn't they just make ARM desktop chips instead of fighting with Intel on x86? If there is even a market to fight for, that is, but Android could probably work as a desktop thing.

Lemme ask you a question. If Intel wants to get into the mobile SoC market with their sub-5-watt parts and heavily subsidizing these parts, why is it crazy for Samsung to want to get into the x86 market? The idea isn't for Samsung to make ARM desktops, it's to get their fingers into the cross-licensing agreement that AMD has with Intel. Intel lets AMD use the x86 free of charge, because AMD lets Intel use the x86-64 free of charge.

This was the crux of a pretty big lawsuit a few years ago where Intel's lawyers tried to break the agreement when Samsung buying AMD rumors came up.

SwissArmyDruid fucked around with this message at 01:24 on Mar 26, 2015

JawnV6
Jul 4, 2004

So hot ...

SwissArmyDruid posted:

Lemme ask you a question. If Intel wants to get into the mobile SoC market with their sub-5-watt parts and heavily subsidizing these parts, why is it crazy for Samsung to want to get into the x86 market?

Want? x86?

Nintendo Kid
Aug 4, 2011

by Smythe

SwissArmyDruid posted:

Lemme ask you a question. If Intel wants to get into the mobile SoC market with their sub-5-watt parts and heavily subsidizing these parts, why is it crazy for Samsung to want to get into the x86 market? The idea isn't for Samsung to make ARM desktops, it's to get their fingers into the cross-licensing agreement that AMD has with Intel. Intel lets AMD use the x86 free of charge, because AMD lets Intel use the x86-64 free of charge.

This was the crux of a pretty big lawsuit a few years ago where Intel's lawyers tried to break the agreement when Samsung buying AMD rumors came up.

Samsung doesn't have the magic ability to make AMD's current lineup perform as well as Intel's stuff, so they'd still need to buy a lot of intel parts for any of their desktops/laptops/servers that are worth a poo poo.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Nintendo Kid posted:

Samsung doesn't have the magic ability to make AMD's current lineup perform as well as Intel's stuff, so they'd still need to buy a lot of intel parts for any of their desktops/laptops/servers that are worth a poo poo.

Agreed. But having three other fabs whose 14nm FinFET process can be directly applied to an AMD product is going to improve TDPs relative to current parts, at least, so it's *sort of* like magic. (Especially since they were going to be using Samsung's FinFET at GloFo anyways.)

Remember how AMD was said to have lost an Apple contract because of concerns how they wouldn't be able to keep them supplied a few years back?

Remember how AMD couldn't keep up with production when demand spiked heavily for bitcoin mining last year?

I'm not saying that Samsung buying AMD would make everything better. But it wouldn't be completely without benefit or benefit-neutral for AMD. They would actually get something out of it, which, I assume, is why the rumors persist and keep coming back every few years or so.

SwissArmyDruid fucked around with this message at 02:29 on Mar 26, 2015

Nintendo Kid
Aug 4, 2011

by Smythe

SwissArmyDruid posted:

Agreed. But having three other fabs whose 14nm FinFET process can be directly applied to an AMD product is going to improve TDPs relative to current parts, at least, so it's *sort of* like magic. (Especially since they were going to be using Samsung's FinFET at GloFo anyways.)

Remember how AMD was said to have lost an Apple contract because of concerns how they wouldn't be able to keep them supplied a few years back?

Remember how AMD couldn't keep up with production when demand spiked heavily for bitcoin mining last year?

I'm not saying that Samsung buying AMD would make everything better. But it wouldn't be completely without benefit or benefit-neutral for AMD. They would actually get something out of it, which, I assume, is why the rumors persist and keep coming back every few years or so.

Samsung's fabs are largely locked up with producing all their current products and fabbing things on contract. They would need to invest quite a bit more money to build extra capacity to try running off x86 stuff as well.

It was also because AMD's performance was terrible. Lack of capacity was just the cherry on top.

AMD couldn't keep up with production for graphics cards because all the AMD cards good for bitcoin mining were obsolete cards that were not being produced in large numbers anymore. The then current cards were worse for mining, since it was a fluke that any of the GPUs they made were good for mining at all.

AMD would get something out of being bought, but Samsung gets about dick out of buying them. Which is why they ain't gonna buy.

Lord Windy
Mar 26, 2010
Does AMD have any good IP, like Sun did? (I assume Sun did, I wasn't old enough, or in the loop enough to understand what happened there).

I'm surprised nobody has bought AMD out, they have to be only worth a few billion with all their debt.

Nintendo Kid
Aug 4, 2011

by Smythe

Lord Windy posted:

Does AMD have any good IP, like Sun did? (I assume Sun did, I wasn't old enough, or in the loop enough to understand what happened there).

I'm surprised nobody has bought AMD out, they have to be only worth a few billion with all their debt.

Well they have the x64 patent/IP and the license to use x86 which is pretty major. The issue is that in order to use either you have to work off of their current offerings, since Intel sure ain't going to sell you some Core designs.

Lord Windy
Mar 26, 2010
Maybe Zen will be super fantastic and solve all problems forever and ever. I'd like that.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

Lord Windy posted:

Maybe Zen will be super fantastic and solve all problems forever and ever. I'd like that.
All I want is a general consumer architecture type that can at least be improved upon to compete in its current year market (2016?). I don't want something that will blow the world away, just something that at least follows the trends for what's practical to improve upon in design. Bulldozer couldn't even do that in time.

Wistful of Dollars
Aug 25, 2009

Lord Windy posted:

Maybe Zen will be super fantastic and solve all problems forever and ever. I'd like that.

I'd be delighted if Zen provided an Intel alternative that you wouldn't feel compelled to make excuses for buying.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

El Scotch posted:

I'd be delighted if Zen provided an Intel alternative that you wouldn't feel compelled to make excuses for buying.
Most people will attempt to defend their own luxury purchases and often associate them with preferring one brand over another.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Samsung buying AMD makes very little sense, and I doubt it's real. There's not a lot worth having at AMD: the embedded systems contracts, the GPU business, low-power x86, some IP, and I guess maybe a fixer-upper chip design to jump into the desktop market. None of those comes without a significant caveat; they'd all take significant elbow grease to utilize successfully.

The dreamer in me wishes it was true, it would be great if there were a competitor to keep Intel moving forward. But I don't think it really makes sense for Samsung to buy out a client who is barely holding on to profitability with very specific niches, especially when that client is locked into using their fabs to produce high-performance chips.

OK, fanboy time is over. Gotta go drag my new 4690K into the den. :unsmigghh:


Rastor
Jun 2, 2001

I suspect that at most there is some kind of deal being worked on, maybe for Samsung to do some fabbing for AMD, and the rumor mill then blew it up into "Samsung is buying AMD!!!!"
