Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

WhyteRyce posted:

No one was talking about performance of Medfield vs. ARM's latest offerings, just that the opinion of a lot of armchair architects seemed (or still seems) to be that x86 is just too inefficient and wasteful to ever be used in small, low-power applications.
To be fair, that IS a valid criticism of all existing x86 processors (with the potential exception of AMD's Bobcat microarchitecture, but they are functionally bankrupt). It will remain a correct and valid criticism until Intel develops a completely new microarchitecture that isn't fundamentally inferior to ARM in terms of performance and power efficiency, which they plan to do in 2013. It's an ignorant mistake to assume that this gap exists because Intel uses x86, but since they haven't bothered to build or demonstrate an x86 processor that can be competitive with ARM, the fact that it's theoretically possible is kind of academic.

To be clear, the comparison I was referencing between Medfield and ARM processors was against the ARM processor generation previous to Medfield, which is why it turned in numbers in the same ballpark. Medfield devices (were there more than one?) were released at about the same time that higher-clocked 32nm Cortex A9 SoCs replaced the 45nm ones, and today we have 32nm Krait cores (Qualcomm's A9/A15 hybrid) in shipping phones and Cortex A15 on the way. 2013-2014 are going to be pretty fun years if Intel doesn't completely gently caress up Silvermont. I think we'll find Silvermont-based SoCs delivering significantly faster single-threaded CPU performance than ARM, but with ARM SoCs continuing to rack up far more design wins due to better and more flexible bundled IP blocks. Intel's switch to an in-house GPU will also give the ARM SoC vendors a long lead time during which they can rely on lovely graphics drivers to keep Intel from being competitive.

Colonel Sanders posted:

I think Apple certainly has enough $$$ to buy AMD, and I don't think there are many other tech companies with that much cash. Assuming Apple bought AMD and its R&D department, I think Apple has the funds needed to compete against Intel. I don't think it would ever happen, though. Why would Apple want to design their own CPUs? It would mean diverting funds away from developing crazy poo poo like retina displays. But more importantly, to make a profit off research $$$, Apple would have to sell CPU designs to 3rd parties, and I really don't believe Apple wants to do that.
Apple already designs their own CPUs for the iDevices; the iPhone 5 even uses an in-house designed ARM core. They do this because ARM SoCs are commodities for handset makers: designing their own SoC gives them a unique advantage that can't be equaled just by buying the same SoC in an Android handset. For the same reason, they wouldn't be interested in selling their chips to other companies. On the PC side, they are not at all happy with Intel, because Intel refuses to release CPU models customized to their needs. For example, Intel had prepared a version of Ivy Bridge with on-die GPU memory, but Apple was the only company that was interested, so Intel never produced it. I DON'T believe that Apple will buy AMD or develop their own x86 processors, but I DO think that previous rumors are correct and that Apple will switch as many of their products as they can to their custom ARM SoCs long-term.


Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Alereon posted:

By "busted" I guess you meant "confirmed", as Medfield solidly proved that it will take a completely new microarchitecture for Intel's x86 offerings to be competitive with ARM. Keep in mind that the phones 32nm Medfield was tested against were using last-gen 45nm dual-core Cortex A9 processors. Intel did write excellent x86 browser code that is responsible for a meaningful browsing performance difference versus older ARM products, but currently-shipping ARM processors beat it because they are just that much faster (and other SoC vendors can and do write custom ARM browser code to improve performance). When you consider the obsolete-at-launch GPU (PowerVR SGX 540, which Intel doesn't seem likely to improve on in the future) and the poor battery life compared to dual-core 45nm ARM SoCs, it's clear why Intel isn't a player in the smartphone market.

Maybe I'm missing something. In AnandTech's iPhone 5 review, the Motorola RAZR i shows itself to be a significant performer, even compared to the Apple A6, despite being a single-core, dumpy ol' Atom.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Those are browser benchmarks, so what they're doing is comparing the performance of the Intel-written native x86 browser code versus the stock Android browser's Java code being JIT-compiled for ARM (and versus Safari on iOS). Intel definitely did a lot of work to deliver a superior experience in the Android stock browser, but of course it's still the stock browser, so those benefits don't translate to Chrome or Firefox (and why the hell would you use the stock browser when those exist?). You also wouldn't see similar performance in regular applications, as they aren't using Intel-optimized native code.

AnandTech had an article with some benchmarks of similar OEM-delivered optimizations for the stock Android browser, provided by Qualcomm I believe, but I'm having trouble finding it.

JawnV6
Jul 4, 2004

So hot ...
I agree. x86 isn't imposing any limitations on the user experience.

syzygy86
Feb 1, 2008

I think this is interesting:
http://www.tomshardware.com/news/amd-ultramobile-tablet-apu-cpu,18546.html

In particular:

Tom's Hardware posted:

As a result, he is not only reducing AMD's cost, he is also moving away from the PC market as a whole. "40 to 50 percent" of AMD's future business will not be focused on PCs. Instead, he will aim half of AMD's business at three areas: at servers, which will leverage AMD's own CPUs as well as "third-party" CPUs, and will count on SeaMicro's server fabric to provide custom solutions; at "semi-custom" APUs for the gaming, industrial and communications markets; and at ultramobile devices, where AMD will also be aiming its APUs.

AMD expects to generate 40-50% of their business in Q3 2013 in areas where they have little or no presence? And by "third-party" CPUs I assume they mean ARM, but other than the TrustZone thing, they haven't announced any work with ARM cores. There are already a lot of companies producing ARM chips, so I'm not sure what AMD could really bring to that market.

Sure, the market for low-power chips has been expanding, and even Intel is taking low power seriously, but it seems like AMD is a bit late to the party and will just be squeezed out on both ends: on the high end they can't match Intel's performance, and on the low end they can't match the power consumption of Intel or ARM chips.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
I think the "third party CPUs" thing is Tom's Hardware being lovely when summarizing the call. Xbitlabs outlined the growth areas as:

1. Low-power servers
2. Game consoles and embedded/industrial applications
3. Ultra-mobile computing

I think THG got the "third party" reference from AMD sticking its chips together with third-party (SeaMicro, and formerly Cray Research) glue logic. The idea is that low-power APUs have the potential to be awesome in servers, because you get >Atom performance and power efficiency combined with capable GPUs that can be used for hardware acceleration.
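
As a sketch of what that GPU acceleration looks like in practice, here's a minimal OpenCL vector add against the standard 1.x C API; nothing here is AMD-specific, and error handling is stripped for brevity:

code:
#include <CL/cl.h>
#include <stdio.h>

static const char *src =
    "__kernel void vadd(__global const float *a,"
    "                   __global const float *b,"
    "                   __global float *c) {"
    "    size_t i = get_global_id(0);"
    "    c[i] = a[i] + b[i];"
    "}";

int main(void) {
    enum { N = 1024 };
    float a[N], b[N], c[N];
    for (int i = 0; i < N; i++) { a[i] = (float)i; b[i] = 2.0f * i; }

    /* Grab the first platform's first GPU device. */
    cl_platform_id plat; cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    /* Build the kernel and stage the buffers. */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "vadd", NULL);

    cl_mem da = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                               sizeof a, a, NULL);
    cl_mem db = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                               sizeof b, b, NULL);
    cl_mem dc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof c, NULL, NULL);

    clSetKernelArg(k, 0, sizeof da, &da);
    clSetKernelArg(k, 1, sizeof db, &db);
    clSetKernelArg(k, 2, sizeof dc, &dc);

    size_t global = N;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, dc, CL_TRUE, 0, sizeof c, c, 0, NULL, NULL);

    printf("c[42] = %.1f\n", c[42]);  /* expect 126.0 */
    return 0;
}

On an APU the kernel runs on the on-die Radeon instead of tying up CPU cores, which is the whole pitch for these in dense servers.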

The 50% target doesn't seem too unreasonable when you consider that AMD is providing at least the GPU for all three next-gen consoles, and shrinking CPU revenues will actually help them reach this goal. I think they have had some traction with their embedded Radeons, so Trinity APUs and their successors have some potential there.

In the Ultramobile market, AMD has Bobcat and its successors positioned as higher-performance, graphically capable alternatives to Atom. Silvermont might improve Atom's CPU competitiveness in 2013, but I think AMD's GPU advantage will likely be compelling for at least two years. 10W Haswells are a risk here, however.

syzygy86
Feb 1, 2008

Alereon posted:

The 50% target doesn't seem too unreasonable when you consider that AMD is providing at least the GPU for all three next-gen consoles, and shrinking CPU revenues will actually help them reach this goal. I think they have had some traction with their embedded Radeons, so Trinity APUs and their successors have some potential there.

I hadn't considered the consoles, which would certainly help AMD. Especially if they have a full APU rather than just the graphics, as the new Xbox is rumored to have.

GRINDCORE MEGGIDO
Feb 28, 1985


e - I'll post that in the GPU thread.

GRINDCORE MEGGIDO fucked around with this message at 10:20 on Oct 22, 2012

Bob Morales
Aug 18, 2006


Just wear the fucking mask, Bob

I don't care how many people I probably infected with COVID-19 while refusing to wear a mask, my comfort is far more important than the health and safety of everyone around me!

It's nice to see AMD is still doing things for their hardcore customers like throwing LAN parties, given the current situation.

http://www.tomshardware.com/picturestory/610-extravalanza-lan-party-byoc.html

unpronounceable
Apr 4, 2010

You mean we still have another game to go through?!
Fallen Rib
So, The Tech Report has just released their review of the FX 8350. It does well in some multi-threaded tests, like x264 encoding and some of the rendering benchmarks, in a couple of instances even beating the 3770K. Single-threaded performance is poo poo though.

Overclocking was more of the same. There's not much headroom in the CPUs, and they require ludicrous voltage. The sample wasn't stable at 4.5 GHz even with 1.5375V. Apparently AMD said that you can "...expect something closer to 5 GHz". Power consumption at this setting was about 60W higher than at stock (262W when overclocked).
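
That power jump is about what you'd expect from the voltage and clock bumps alone. As a back-of-the-envelope sketch, assuming stock is roughly 1.3V at 4.0 GHz (I don't have the exact stock voltage handy), dynamic power scales with V²f:

code:
(1.5375 / 1.3)^2 x (4.5 / 4.0) ≈ 1.40 x 1.13 ≈ 1.57

So on the order of 57% more CPU power draw, which is in the same ballpark as the ~60W increase they measured on a 125W-class part (even granting that's measured at the wall, not just the CPU).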

For most of us, don't bother, because any i5 is a better gaming value.

EDIT: Anand's review basically corroborates the Tech Report review, although they had a better overclocking experience, with the 8350 reaching 4.8 GHz. He didn't give an exact voltage (just "10% above stock"), but power consumption jumped 100W over stock.

unpronounceable fucked around with this message at 06:22 on Oct 23, 2012

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
The FX-8350 is really the product Bulldozer should have been. Unfortunately, Ivy Bridge is still better, but at least it's no longer humiliating. I'm not as confident about AMD making up more ground with the next generation; I fully expect Haswell to exceed expectations, with improved clockspeed scaling due to greater experience with the 22nm process. I'm also hoping Intel goes back to using soldered heat spreaders, or at least metallic thermal paste; the paste was a big limiter of overclocking with Ivy Bridge, and Haswell's improved IPC should make it a MONSTER when overclocked.

teagone
Jun 10, 2003

That was pretty intense, huh?

On the lower end of things, at least the FX 4300 competes nicely with the i3-3220 (http://www.anandtech.com/bench/Product/700?vs=677), and at a similar/lower price point too. Shame it's still stuck at 95W TDP though.

syzygy86
Feb 1, 2008

Alereon posted:

I'm not as confident about AMD making up more ground with the next generation; I fully expect Haswell to exceed expectations, with improved clockspeed scaling due to greater experience with the 22nm process.

If I remember correctly, AMD's next process isn't even 22nm. They'll be moving from 32nm to 28nm, which, as you said, will only widen Intel's lead.

chocolateTHUNDER
Jul 19, 2008

GIVE ME ALL YOUR FREE AGENTS

ALL OF THEM
I ask this not as a troll, but as a genuinely curious person:

How is it that AMD fell this far behind in the CPU arms race? I mean 10 years ago they were neck-and-neck with Intel.

Nintendo Kid
Aug 4, 2011

by Smythe

chocolateTHUNDER posted:

I ask this not as a troll, but as a genuinely curious person:

How is it that AMD fell this far behind in the CPU arms race? I mean 10 years ago they were neck-and-neck with Intel.

Intel woke up and stopped pulling stupid poo poo, and had the advantage of all their money to follow through.

Killer robot
Sep 6, 2010

I was having the most wonderful dream. I think you were in it!
Pillbug

Install Gentoo posted:

Intel woke up and stopped pulling stupid poo poo, and had the advantage of all their money to follow through.

Add to that the fact that performance CPU design and manufacturing are not only fantastically expensive, but become more so with each successive generation. Also, even when Intel was behind in performance, it still leveraged its name to keep a lot of the most lucrative contracts, so even when AMD was on top, making lots of money was hard for them. Finally, AMD made some big bets on risky gambles that didn't pan out.

chocolateTHUNDER
Jul 19, 2008

GIVE ME ALL YOUR FREE AGENTS

ALL OF THEM

Killer robot posted:

Add to that the fact that performance CPU design and manufacturing are not only fantastically expensive, but become more so with each successive generation. Also, even when Intel was behind in performance, it still leveraged its name to keep a lot of the most lucrative contracts, so even when AMD was on top, making lots of money was hard for them. Finally, AMD made some big bets on risky gambles that didn't pan out.

Care to name some of those risky bets that didn't pan out? This kind of stuff fascinates me :)

unpronounceable
Apr 4, 2010

You mean we still have another game to go through?!
Fallen Rib

chocolateTHUNDER posted:

Care to name some of those risky bets that didn't pan out? This kind of stuff fascinates me :)

Bulldozer

movax
Aug 30, 2008

Install Gentoo posted:

Intel woke up and stopped pulling stupid poo poo, and had the advantage of all their money to follow through.

Not to mention the large variety of products Intel makes: Ethernet controllers, flash, etc. Lots of cash sources.

Ragingsheep
Nov 7, 2009
Wasn't Intel also doing illegal things like paying their customers to not buy AMD CPUs?

adorai
Nov 2, 2002

10/27/04 Never forget
Grimey Drawer

Ragingsheep posted:

Wasn't Intel also doing illegal things like paying their customers to not buy AMD CPUs?

This would be the number one reason AMD was unable to get more revenue when they were on top. Intel had to pay a relatively paltry sum because of it.

FSMC
Apr 27, 2003
I love to live this lie

chocolateTHUNDER posted:

I ask this not as a troll, but as a genuinely curious person:

How is it that AMD fell this far behind in the CPU arms race? I mean 10 years ago they were neck-and-neck with Intel.

Part of it may have been the spinning off of manufacturing. At this level, making a CPU involves both design and manufacturing considerations; you can't do them separately. Many of AMD's designs just couldn't be manufactured with decent yield by GlobalFoundries.

SYSV Fanfic
Sep 9, 2003

by Pragmatica
Well, AMD's lackluster single-threaded performance finally caused me to jump ship after 12 years of AMD processors, starting with the K6-2. I'll keep an eye on AMD's lineup, but I just don't see how they can recover from Bulldozer. I guess with all their rumored design wins in the console market they could scrape up the cash to come back.


I was reminded of an interesting article about Bulldozer. I don't know if it's been posted, but:

http://www.xbitlabs.com/news/cpu/display/20111013232215_Ex_AMD_Engineer_Explains_Bulldozer_Fiasco.html

SYSV Fanfic fucked around with this message at 14:31 on Oct 24, 2012

Bob Morales
Aug 18, 2006


Just wear the fucking mask, Bob

I don't care how many people I probably infected with COVID-19 while refusing to wear a mask, my comfort is far more important than the health and safety of everyone around me!

chocolateTHUNDER posted:

Care to name some of those risky bets that didn't pan out? This kind of stuff fascinates me :)

They must have figured that 8 integer cores with poo poo FPU performance would run today's software faster. That's what you get by assuming programmers and compilers were going to magically cause all software to be incredibly multi-threaded. Unfortunately it's still true that fewer, faster cores are better than more, slower cores.

Unless of course you're running a handful of applications that can be easily multi-threaded.
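
A quick Amdahl's law sketch shows where the crossover is (a hedged illustration with made-up numbers, not real chip specs): compare four fast cores against eight cores that are each 20% slower, for a workload with parallel fraction p.

code:
#include <stdio.h>

/* Amdahl's law: speedup on n cores of relative per-core speed s,
   for a workload whose parallel fraction is p. */
static double speedup(double p, int n, double s) {
    return s / ((1.0 - p) + p / n);
}

int main(void) {
    /* Hypothetical: 4 fast cores (speed 1.0) vs. 8 cores at 0.8x speed. */
    printf("p=0.50: 4 fast = %.2fx, 8 slow = %.2fx\n",
           speedup(0.50, 4, 1.0), speedup(0.50, 8, 0.8));
    printf("p=0.95: 4 fast = %.2fx, 8 slow = %.2fx\n",
           speedup(0.95, 4, 1.0), speedup(0.95, 8, 0.8));
    return 0;
}

That prints 1.60x vs. 1.42x for a half-parallel workload, and 3.48x vs. 4.74x for a 95%-parallel one: fewer-but-faster wins until the workload is almost entirely parallel, which most desktop software still isn't.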

JawnV6
Jul 4, 2004

So hot ...

keyvin posted:

I was reminded of an interesting article about Bulldozer. I don't know if it's been posted, but:

http://www.xbitlabs.com/news/cpu/display/20111013232215_Ex_AMD_Engineer_Explains_Bulldozer_Fiasco.html

That seems like a rather myopic take. You're asking the guy who was in charge of X why a massive engineering effort involving hundreds of people across dozens of unrelated disciplines failed, and surprise, surprise: it's some decision about X!

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

keyvin posted:

I was reminded of an interesting article about Bulldozer. I don't know if it's been posted, but:

http://www.xbitlabs.com/news/cpu/display/20111013232215_Ex_AMD_Engineer_Explains_Bulldozer_Fiasco.html

Yeah, posted, digested, and puked back up as largely nonsense. It's a facile explanation for the product's many substantial failures at market, and it comes from a "told you so!" perspective. The guy saying that if they'd only done things the way HE thinks they should have been done, well, by golly, it'd have turned out differently has the benefit of hindsight making him look more correct than he actually is. Really, automated layout is pretty much necessary; we're talking about a staggering number of incredibly tiny parts. Humans working in conjunction with automated layout tools can produce exceptional products, see: everything else of similar or greater technological sophistication.

Really, it's a whole mess of different problems, from inefficient resource utilization to thread scheduling failures (leading to, basically, an either/or "fix" that helped some things and hurt others), but I think a lot of bad decisions rolled up together made for a product that nobody was particularly interested in.

Piledriver's current performance should have been Bulldozer's performance when Bulldozer launched. Except with greater power efficiency and better relative IPC.

And yeah, a ton of it goes back to Intel's outright anticompetitive practices back when AMD had some lean, badass CPUs that were cleaning Intel's clock in efficiency, performance, and multitasking (even with Hyper-Threading, the shorter execution pipeline and drastically less wasteful processing of the AMD Athlon XP series made them better at serialized multitasking than any single-core iteration of the Pentium 4). At no point did AMD ever go above 25% market share in servers despite dramatic superiority in their processors, and that's all on Intel, which used bags of money, forced bundling, and a lot of pressure on partners at all levels to block what could have been AMD's ascendancy in the market.

AMD never really got a leg up after that, while Intel was able to keep selling Pentium 4 processors even though they sucked, all the while using their superior resources to go back to the drawing board and build on the P6 development branch. The first sign that Intel had something exciting coming was the Pentium M, a processor that was sickeningly close to AMD's designs at the time in terms of the basic engineering principles behind it. Past that, Intel never got a substantial punishment for what they had done to AMD, and AMD had trouble growing beyond the Athlon 64 processors. Hell, they were already tapped out with the last of the Athlon XPs; they never got the kind of performance they hoped for out of the Barton-core parts. They barely saved the 64-bit Athlon/Phenom development track by increasing the cache in the Athlon II/Phenom II, addressing a serious design flaw, but that was the K10 generation, and it went on forever. Bulldozer was supposed to succeed it. Meanwhile, Intel had successfully tick-tocked their way to processors that were eating AMD's lunch while being more power efficient besides. Bulldozer was pushed back, and pushed back again... how many times?

By the time Bulldozer launched, it would have had to compete on every level, and AMD's miscalculations torpedoed the chances of that happening. Even their marketing material was desperate, putting it up against processors that predated Sandy Bridge by a generation or more. Then the first reviews came in, and the evidence piled up that Bulldozer wasn't even able to MATCH, let alone BEAT, the single-threaded performance of their own years-old Phenom II X4 processors.

Dark loving times.

Longinus00
Dec 29, 2005
Ur-Quan

Bob Morales posted:

They must have figured that 8 integer cores with poo poo FPU performance would run today's software faster. That's what you get by assuming programmers and compilers were going to magically cause all software to be incredibly multi-threaded. Unfortunately it's still true that fewer, faster cores are better than more, slower cores.

Unless of course you're running a handful of applications that can be easily multi-threaded.

Genuinely curious: if Bulldozer/Piledriver has poo poo FPU performance, then how come it's actually competitive in 3D rendering applications like POV-Ray and in video compression? That's 4 FPUs vs. 4 FPUs, right?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
It's four double-wide (256-bit) FPUs that can be automagically partitioned into two 128-bit pipelines in the same situations where something like Hyper-Threading would be useful. Intel's FPUs, while also 256-bit wide, cannot subdivide like that.

On workloads that aren't as parallelizable, or that rely on 256-bit-wide FP instructions, those double-wide FPUs can't be split that way, and the chip works a lot more like four cores than "eight."
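
In ISA terms it looks like this (a minimal sketch of the instruction shapes involved, not a claim about how real silicon schedules them):

code:
/* Compile with: gcc -mavx -c fpu.c
   One 256-bit AVX add handles eight floats in a single op; the same
   total work can be issued as two independent 128-bit adds, which is
   the shape that lets a Bulldozer module split its FPU between two
   threads. */
#include <immintrin.h>

__m256 add256(__m256 a, __m256 b) {           /* eight floats, one op */
    return _mm256_add_ps(a, b);
}

void add_2x128(__m128 a0, __m128 b0, __m128 a1, __m128 b1,
               __m128 *c0, __m128 *c1) {      /* four floats per op */
    *c0 = _mm_add_ps(a0, b0);
    *c1 = _mm_add_ps(a1, b1);
}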

Maxwell Adams
Oct 21, 2000

T E E F S
Didn't Bulldozer have slightly better performance/power draw on Windows 8? Is that still the case with Piledriver?

Nintendo Kid
Aug 4, 2011

by Smythe

Maxwell Adams posted:

Didn't Bulldozer have slightly better performance/power draw on Windows 8? Is that still the case with Piledriver?

Emphasis on the "slightly" here; most computing tasks still aren't suited to it, but the OS can schedule things onto it slightly better.

Any recent Intel chip still wipes the floor with it in most uses.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
The same scheduling patches made it into Windows 7, reducing the difference between the two OSes to near-nothing.

Interestingly, Bulldozer can be scheduled differently depending on your priorities, using the power management settings. You can schedule each module like two full cores, which increases per-core performance and power efficiency because it allows higher turbo states while other modules are parked. Alternatively, you can schedule one thread per module before loading up a second, which yields higher threaded performance at the expense of some power efficiency, since modules can't be idled as much.
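
You can also force the spread-out policy by hand if you want to experiment. Here's a hedged Linux sketch, assuming the common enumeration where logical CPUs 0/1 share module 0, CPUs 2/3 share module 1, and so on (check your topology before trusting that on a given box):

code:
/* Build with: gcc -pthread affinity.c */
#define _GNU_SOURCE
#include <pthread.h>
#include <sched.h>

static void *worker(void *arg) {
    long module = (long)arg;
    cpu_set_t set;
    CPU_ZERO(&set);
    /* Pin to the first "core" of each module (CPUs 0, 2, 4, 6) so
       each thread gets a module's FPU and frontend to itself. */
    CPU_SET((int)(module * 2), &set);
    pthread_setaffinity_np(pthread_self(), sizeof set, &set);
    /* ... FP-heavy work goes here ... */
    return NULL;
}

int main(void) {
    pthread_t t[4];
    for (long m = 0; m < 4; m++)
        pthread_create(&t[m], NULL, worker, (void *)m);
    for (int m = 0; m < 4; m++)
        pthread_join(t[m], NULL);
    return 0;
}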

Pegged Lamb
Nov 5, 2007
Probation
Can't post for 3 years!
Is an A10-5800K with a 6670 going to cut it for playing Battlefield 3 at 1680x1050 on high?

Pegged Lamb fucked around with this message at 15:06 on Oct 25, 2012

Maxwell Adams
Oct 21, 2000

T E E F S

wikipe tama posted:

Is an A10-5800K going to cut it playing Battlefield 3 at 1680x1050 on high?

http://www.tomshardware.com/reviews/trinity-gaming-performance,3304-7.html

Nope.

Pegged Lamb
Nov 5, 2007
Probation
Can't post for 3 years!

Sorry, I forgot to mention the passive card. Looks like the answer, from what I can find, is no here as well.

Pegged Lamb fucked around with this message at 15:15 on Oct 25, 2012

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

wikipe tama posted:

Sorry, I forgot to mention the passive card. Looks like the answer, from what I can find, is no here as well.
You'd need a minimum of a Radeon HD 7750 with the Catalyst 12.11 Beta drivers to get above 30fps average. I'd go with a 7770 or better for decent average performance. If you're not using the integrated GPU, you're much better off with an Intel CPU anyway.

Pegged Lamb
Nov 5, 2007
Probation
Can't post for 3 years!
I'll probably go with that, but the box says the GPU can combine with a 6670 (or 6570, or 7670) for some kind of CrossFire boost.

If it can't even play Skyrim on medium, let alone high, it's pretty much crap. I need to do more research before buying stuff :(

Pegged Lamb fucked around with this message at 21:47 on Oct 25, 2012

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

wikipe tama posted:

I'll probably go with that, but the box says the GPU can combine with a 6670 (or 6570, or 7670) for some kind of CrossFire boost.

If it can't even play Skyrim on medium, let alone high, it's pretty much crap. I need to do more research before buying stuff :(

That capability is technologically neat and also a useless gimmick for all practical intents and purposes.

Nalin
Sep 29, 2007

Hair Elf
Speaking from experience (an A8-3500M laptop with a Radeon 6750M discrete GPU), CrossFire only serves to make everything slow. I get much better performance just running the discrete GPU by itself. It really is useless.

Mill Village
Jul 27, 2007

I'm really interested in the FX 8350. It still has trouble competing with Intel in some areas, but it looks like it performs significantly better than my lovely FX 6100. Video encoding on that processor is painfully slow.


HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Mill Village posted:

I'm really interested in the FX 8350. It still has trouble competing with Intel in some areas, but it looks like it performs significantly better than my lovely FX 6100. Video encoding on that processor is painfully slow.

The 8350 is OK to a certain degree, it seems, but it's still ridiculous in terms of power consumption and has almost no overclocking headroom on air, unlike the Intel gear.
How did you end up with a Bulldozer in the first place?
