  • Locked thread
Ervin K
Nov 4, 2010

by Jeffrey of YOSPOS
God drat, AMD processors are so absurdly complicated. I've never bought an AMD cpu before and I'm trying to build a cheap ($300-400) PC with one since AMD is considered the low cost option. I can't even begin to comprehend what I should buy, I haven't spent nearly as much time choosing an Intel CPU for my gaming rig. Anybody have suggestions?

e: I should mention that likely the most powerful program it will run is photoshop.

Ervin K fucked around with this message at 12:25 on Oct 11, 2012


Nystral
Feb 6, 2002

Every man likes a pretty girl with him at a skeleton dance.
What's your CPU + Mobo budget?

I just pulled the trigger for a hackintosh build but I also bought:

AMD A6-3650 with a BIOSTAR TA75M, which is roughly $165 at Newegg. This is one of the older Llano FM1 processors.

Alternatively, I was going to pick up a new Trinity-based FM2 socket rig: an A10-5600K paired with a GA-F2A85X, which would have been $260.


NOTE: My budget was a bit higher - $500 without a new case - so there are probably better options for you based on your needs. Look at the stickied hardware parts thread for more opinions.

But then I went for a hackintosh because all my computers at home must run OSX or some dumb poo poo and I'm working on getting that issue currently sorted out :downs:

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT
If you're on a shoestring budget in the short term, then you can pick up an AMD motherboard + CPU at a Microcenter for a really good price. You're still better off choosing Intel if you can swing it, however.

Gwaihir
Dec 8, 2009
Hair Elf

Ervin K posted:

God drat, AMD processors are so absurdly complicated. I've never bought an AMD cpu before and I'm trying to build a cheap ($300-400) PC with one since AMD is considered the low cost option. I can't even begin to comprehend what I should buy, I haven't spent nearly as much time choosing an Intel CPU for my gaming rig. Anybody have suggestions?

e: I should mention that likely the most powerful program it will run is photoshop.

Just get the cheap Pentium-branded Sandy/Ivy Bridge chip and call it a day.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Ervin K posted:

God drat, AMD processors are so absurdly complicated. I've never bought an AMD cpu before and I'm trying to build a cheap ($300-400) PC with one since AMD is considered the low cost option. I can't even begin to comprehend what I should buy, I haven't spent nearly as much time choosing an Intel CPU for my gaming rig. Anybody have suggestions?

e: I should mention that likely the most powerful program it will run is photoshop.
You want the system-building, upgrading, and parts-picking megathread.

SYSV Fanfic
Sep 9, 2003

by Pragmatica
AMD is preparing to lay off 20-30% of their workforce: http://allthingsd.com/20121012/exclusive-amd-to-cut-up-to-30-percent-of-workforce/.

syzygy86
Feb 1, 2008


Well, it was bound to happen sooner or later based on their financial performance for the past few years. Ultimately, I think AMD will survive for quite a while in some form. If the worst does happen, AMD could be a good acquisition target if someone wanted to vertically integrate their entire computing process including chip design.

If Intel is worried about anti-trust concerns, they could make an investment in AMD like Microsoft did with Apple in the late 90s. I'm not sure this would happen though, since it would require a significant investment. Especially if ARM can scale up performance to match lower-end x86 chips, that may provide enough "competition" to keep regulators at bay.

pixaal
Jan 8, 2004

All ice cream is now for all beings, no matter how many legs.


I'm hoping they at least continue to make video cards. The last thing we need is Intel being the number 2 for video cards; Nvidia would love to just stand still and charge $800 for a 2% increase every year. I at least have some faith that large investors will help Intel evolve for supercomputers and servers because of demand, and that will trickle down even if it's just Intel in the market for processors. We will lose on prices, but at least we will continue to see improvements that can be stolen and reproduced by some new company later.

WhyteRyce
Dec 30, 2001


I like the guy in the comments who was being totally serious about trying to get a hold of ARM management to give them advice

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

WhyteRyce posted:

I like the guy in the comments who was being totally serious about trying to get a hold of ARM management to give them advice

I am very puzzled when that guy refers to AMD's "ATI 3D Graphics Engine." It's like he maybe knew what video cards did at the turn of the millennium and then stopped paying attention. Reading further, apparently he is an idiot who cannot distinguish between hardware, drivers, game engines, and porting software to Linux.

Anyway, chew on this, see how it tastes: Intel Radeon HD 8000 series.

Chuu
Sep 11, 2004

Grimey Drawer

Factory Factory posted:

Anyway, chew on this, see how it tastes: Intel Radeon HD 8000 series.

That seems like the express road to an antitrust case considering how hard they've fought against nVidia acquiring an x86 license.

Mr.Radar
Nov 5, 2005

You guys aren't going to believe this, but that guy is our games teacher.

Chuu posted:

That seems like the express road to an antitrust case considering how hard they've fought against nVidia acquiring an x86 license.

Yeah, the only way I could see Intel being able to get the AMD graphics division is if someone else acquired the AMD processor division, including their x86 patents and perpetual license. The one company that I think has both the resources and inclination to do so is nVidia which is the last company Intel wants to have to compete against in the processor space.

Edit: Thinking about it a little more, Apple could also make a play for the AMD processor division, so they wouldn't be dependent on Intel for the chips in their desktops and laptops (similarly to how they make their own tablet/phone SoCs). They certainly have the resources, but Apple and Intel seem to have a very good relationship so I don't know if Apple has the motivation to do that right now.

Mr.Radar fucked around with this message at 21:58 on Oct 14, 2012

Goon Matchmaker
Oct 23, 2003

I play too much EVE-Online

Mr.Radar posted:

Edit: Thinking about it a little more, Apple could also make a play for the AMD processor division, so they wouldn't be dependent on Intel for the chips in their desktops and laptops (similarly to how they make their own tablet/phone SoCs). They certainly have the resources, but Apple and Intel seem to have a very good relationship so I don't know if Apple has the motivation to do that right now.

Two reasons this will never happen:

1) Intel's power consumption in the mobile space annihilates what AMD currently has, and with Haswell there won't be any competition.

2) Apple seems to be heavily investing in ARM. They're even making their own processor designs. If anything I see Apple scaling their processors up to the point where they're comparable to whatever Intel offers and then giving Intel the finger.

I wouldn't be surprised if AMD gets eaten piecemeal, with nVidia taking the x86 license and Apple buying out the graphics division. Then we'll have two big processor manufacturers, Intel and Apple, unless Nvidia can do something magical, which I think they could. If anything, it looks like in the long run ARM is going to come to dominate and x86 will fall by the wayside.

Ragingsheep
Nov 7, 2009
AMD's x86 licence is potentially void if it's taken over or bankrupted unless Intel says otherwise though.

Goon Matchmaker
Oct 23, 2003

I play too much EVE-Online

Ragingsheep posted:

AMD's x86 licence is potentially void if it's taken over or bankrupted unless Intel says otherwise though.

That reeks of anti-trust lawsuit though.

hobbesmaster
Jan 28, 2008

Ragingsheep posted:

AMD's x86 licence is potentially void if it's taken over or bankrupted unless Intel says otherwise though.

And what would this mean for Intel's AMD64 license?

Nintendo Kid
Aug 4, 2011

by Smythe

Goon Matchmaker posted:

If anything I see Apple scaling their processors up to the point where they're comparable to whatever Intel offers and then giving Intel the finger.

This is close to impossible, ARM stuff is way slower than what Intel can put out and Apple does not have the money or R&D knowledge to remedy that on their own.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Does anybody? Nvidia?

Chuu
Sep 11, 2004

Grimey Drawer

Install Gentoo posted:

This is close to impossible, ARM stuff is way slower than what Intel can put out and Apple does not have the money or R&D knowledge to remedy that on their own.

Apple has over $100B in cash, and the Apple A6 CPU shows their hardware engineers know what they're doing.

I really doubt that Apple is looking to sever their relationship with Intel, and Haswell looks like it's going to be an incredible architecture for mobile computing, but if Apple really was looking at developing a high performance ARM processor they have the resources.

edit: now it makes sense.

Chuu fucked around with this message at 04:39 on Oct 15, 2012

Nintendo Kid
Aug 4, 2011

by Smythe

Chuu posted:

Apple has over $100B in cash, and the Apple A6 CPU shows their hardware engineers know what they're doing.

I really doubt that Apple is looking to sever their relationship with Intel, and Haswell looks like it's going to be an incredible architecture for mobile computing, but if Apple really was looking at developing a high performance ARM processor they have the resources.

Do you have any idea how much money it took to get Intel or AMD processors to the performance they are today? $100 billion is nothing in comparison, and frankly the A6 doesn't show poo poo as far as getting an ARM based architecture to x86-64 performance in laptop/desktop applications.

If I remember right the fastest ARM-based cpu anyone's got still performs like a Core Duo or early Core 2 Duo while sucking down more power - and it certainly wasn't Apple who made it.

Chuu
Sep 11, 2004

Grimey Drawer

Install Gentoo posted:

Do you have any idea how much money it took to get Intel or AMD processors to the performance they are today? $100 billion is nothing in comparison, and frankly the A6 doesn't show poo poo as far as getting an ARM based architecture to x86-64 performance in laptop/desktop applications.

If I remember right the fastest ARM-based cpu anyone's got still performs like a Core Duo or early Core 2 Duo while sucking down more power - and it certainly wasn't Apple who made it.

There is no market for high performance ARM, hence no good high performance ARM processor. Apple doesn't really have a motive to try to create said processor since Intel is working so closely with them already. It's sort of pointless trying to argue about what would be possible if we lived in a world where high performance computing was anything but a niche market.

WhyteRyce
Dec 30, 2001

Install Gentoo posted:

Do you have any idea how much money it took to get Intel or AMD processors to the performance they are today? $100 billion is nothing in comparison, and frankly the A6 doesn't show poo poo as far as getting an ARM based architecture to x86-64 performance in laptop/desktop applications.


Also just throwing a pile of money at something is not guaranteed to get results. Although that is a lot of money.

I figured if Apple were to throw that money at something, it would be at fabs.

WhyteRyce fucked around with this message at 04:56 on Oct 15, 2012

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Install Gentoo posted:

If I remember right the fastest ARM-based cpu anyone's got still performs like a Core Duo or early Core 2 Duo while sucking down more power - and it certainly wasn't Apple who made it.
I don't think anyone's got ARM processors that are at Core 2 Duo performance yet, but they are CERTAINLY vastly more power efficient. The power usage thing is a factoid that got made up by scaling the power of entire SoCs by the factor of performance someone wanted, which is ridiculous. ARM Cortex A15 will significantly improve things, but that doesn't change the fact that ARM processors are aimed at the low power market. However, the reality is that low power is where the x86 market is anyway, so the loss of the market share currently held by Atom and potentially some other low-end products is rather significant.

ARM-based servers are also a thing that will happen, which is going to seriously impact both Intel and AMD.

Professor Science
Mar 8, 2006
diplodocus + mortarboard = party

Alereon posted:

ARM-based servers are also a thing that will happen, which is going to seriously impact both Intel and AMD.
how much of an impact there will be is debatable. the flat cost of x86 instruction decode is pretty minimal these days--otherwise Intel wouldn't have a chance in mobile, but devices thus far have been OK power consumption wise. in servers, ARM isn't going to have a massive power consumption advantage. the advantage ARM will have is different (and disruptive) kinds of servers that compete with virtualization, which might hurt Intel's margins going forward if people are buying commoditized ARM instead of Super Fancy Xeon 9000.

WhyteRyce
Dec 30, 2001

Professor Science posted:

the flat cost of x86 instruction decode is pretty minimal these days

You mean those guys who post in the comments on Engadget and Anandtech are wrong??

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

WhyteRyce posted:

You mean those guys who post in the comments on Engadget and Anandtech are wrong??

Only SA posts consistently get it right when it comes to internet manbabies posting authoritatively about CPUs. :smuggo:

By the way, what's the difference between superscalar and vector?

Gwaihir
Dec 8, 2009
Hair Elf

WhyteRyce posted:

You mean those guys who post in the comments on Engadget and Anandtech are wrong??

That myth got pretty solidly busted as soon as Intel put out their first Medfield-based phone, but the whole "OMG x86 CAN NEVER BE AS POWER EFFICIENT AS ARM" thing still persists. IIRC that phone got above average (although not spectacular or class-leading) battery life relative to battery capacity, putting it ahead of several of the ARM-based phones. Add in Intel's process advantage over everyone else in the industry (current Medfield chips are built on their last-gen process, not 22nm), and the picture isn't terrible for them at all.

Zhentar
Sep 28, 2003

Brilliant Master Genius
There are even advantages to the modern x86 way of doing things - the ISA the CPU pipeline runs on is not part of its published interface, which means the chip designers can change whatever they want when it's convenient for them, without requiring any compiler/software changes.

Factory Factory posted:

By the way, what's the difference between superscalar and vector?

Superscalar means executing more than one instruction in parallel, and is for the most part just magic that happens behind the scenes. Vector is a single instruction with multiple sets of arguments, and only happens when the compiler says so.
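To make the distinction concrete, here's an illustrative C sketch (function names made up, SSE intrinsics assumed): the scalar loop gets whatever superscalar overlap the hardware can find for free, while the vector version exists only because the source explicitly asks for it.

```c
#include <emmintrin.h>  /* SSE/SSE2 intrinsics, baseline on x86-64 */

/* Scalar add: independent iterations that a superscalar core can
   overlap behind the scenes -- no source changes required. */
void add_scalar(const float *a, const float *b, float *out, int n) {
    for (int i = 0; i < n; i++)
        out[i] = a[i] + b[i];
}

/* Vector add: one SSE instruction operates on four floats at once,
   but only because the code (or an auto-vectorizing compiler) says so. */
void add_vector(const float *a, const float *b, float *out, int n) {
    int i = 0;
    for (; i + 4 <= n; i += 4)
        _mm_storeu_ps(out + i,
                      _mm_add_ps(_mm_loadu_ps(a + i), _mm_loadu_ps(b + i)));
    for (; i < n; i++)      /* scalar tail for leftover elements */
        out[i] = a[i] + b[i];
}
```

Both produce identical results; the difference is just who decides the parallelism, the hardware (superscalar) or the code (vector).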

Goon Matchmaker posted:

That reeks of anti-trust lawsuit though.

The FTC-approved clause is that Intel is only required to enter good-faith negotiations with the purchaser.

Killer robot
Sep 6, 2010

I was having the most wonderful dream. I think you were in it!
Pillbug

Zhentar posted:


Superscalar means executing more than one instruction in parallel, and is for the most part just magic that happens behind the scenes. Vector is a single instruction with multiple sets of arguments, and only happens when the compiler says so.

To give a further example here, x86 processors have been superscalar since the original Pentium. If you had stuff written for the scalar 386 or 486 you didn't need to change anything; the Pentium just plain ran it faster by being able to deal with multiple instructions in a clock cycle.

Vector instructions started off with MMX, and continued with SSE. Those let one instruction do multiple things, but you need applications that are written/compiled with the feature in mind. Otherwise a Pentium MMX was just a Pentium, and a Pentium III was just a Pentium II at the same clock speed, as far as performance goes.
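That "written/compiled with the feature in mind" requirement is why runtime dispatch exists: one binary tests the CPUID feature flags and picks a code path. A minimal sketch (uses the GCC/Clang-specific `__builtin_cpu_supports` builtin, x86 only; the function name is invented):

```c
/* __builtin_cpu_supports() (GCC/Clang, x86) checks CPUID feature flags
   at runtime, so one binary can choose a code path per CPU. */
const char *pick_simd_path(void) {
    if (__builtin_cpu_supports("avx2")) return "avx2";
    if (__builtin_cpu_supports("sse2")) return "sse2"; /* always set on x86-64 */
    if (__builtin_cpu_supports("mmx"))  return "mmx";
    return "scalar"; /* no vector path: MMX/SSE silicon sits unused */
}
```

This is the "a Pentium MMX was just a Pentium" situation in code: without a path keyed to the feature flag, the new instructions simply never run.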

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Gwaihir posted:

That myth got pretty solidly busted as soon as Intel put out their first Medfield-based phone, but the whole "OMG x86 CAN NEVER BE AS POWER EFFICIENT AS ARM" thing still persists. IIRC that phone got above average (although not spectacular or class-leading) battery life relative to battery capacity, putting it ahead of several of the ARM-based phones. Add in Intel's process advantage over everyone else in the industry (current Medfield chips are built on their last-gen process, not 22nm), and the picture isn't terrible for them at all.
By "busted" I guess you meant "confirmed", as Medfield solidly proved that it will take a completely new microarchitecture for Intel x86 offerings to be competitive with ARM. Keep in mind that the phones 32nm Medfield was tested against were using last-gen 45nm dual-core Cortex A9 processors. Intel did write excellent x86 browser code that is responsible for a meaningful browsing performance difference versus older ARM products, but currently-shipping ARM processors beat it because they are just that much faster (and other SoC vendors can and do write custom ARM browser code to improve performance). When you consider that the GPU (PowerVR SGX 540) was obsolete at launch (and Intel doesn't seem likely to make any improvements in the future), and the poor battery life compared to dual-core 45nm ARM SoCs, it's clear why Intel isn't a player in the smartphone market.

The problem is that the Atom architecture is in-order, and in-order designs are fundamentally slow and power-inefficient compared to out-of-order designs. As soon as the Cortex A9 came out (and AMD Bobcat in the x64 world), Atom was obsolete. There's nothing Intel can do with process shrinks or clockspeed scaling to fix this, they are just hosed until the Silvermont microarchitecture in 2013. Intel could have ADDRESSED the issue and played to their strengths by combining the CPU with some good licensed IP blocks, but Medfield was their first real attempt at this and it's obvious how half-assed it was. The key question is whether Intel is willing to make a sincere attempt at the smartphone market with their 2013 architectures or if they'll keep phoning it in.

nmfree
Aug 15, 2001

The Greater Goon: Breaking Hearts and Chains since 2006

Alereon posted:

The key question is whether Intel is willing to make a sincere attempt at the smartphone market with their 2013 architectures or if they'll keep phoning it in.
I'm sorry, but I can't stop :v: at this. Well done.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Goddamn I totally did not do that on purpose. I don't know if that makes it better or worse though.

Zhentar
Sep 28, 2003

Brilliant Master Genius

Alereon posted:

in-order designs are fundamentally slow compared to out-of-order designs.

While this is true...

Alereon posted:

in-order designs are fundamentally power-inefficient compared to out-of-order designs.

This is not (necessarily). It takes a lot of extra logic on the chip to make out-of-order work. You will spend significantly more power doing your computations on an out-of-order architecture. You can still come out ahead if finishing sooner lets you stop wasting power on other stuff, but it's not inherently better.


Also, in-order-ness is not the source of Intel's low power offering problems. Every processor they're competing with is in-order. There won't be any shipping out-of-order ARM processors until sometime next year, with the A15, and it sucks up enough power that they had to come up with big.LITTLE to offset it (not that big.LITTLE isn't a good idea).

Edit: d'oh, misread the chart. The A9 is OOO as well. A15's problem is that it shoots for desktop level pipeline depth and width.

Zhentar fucked around with this message at 01:06 on Oct 16, 2012

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Zhentar posted:

This is not (necessarily). It takes a lot of extra logic on the chip to make out-of-order work. You will spend significantly more power doing your computations on an out-of-order architecture. You can still come out ahead if finishing sooner lets you stop wasting power on other stuff, but it's not inherently better.
I'll certainly agree that there are minimum die area and power requirements to run an OoO architecture, but if you can afford those minimums an OoO architecture will use the hardware more efficiently and thus do more work in the same amount of time and/or energy compared to an in-order architecture.

JawnV6
Jul 4, 2004

So hot ...
It's all just race to idle.

WhyteRyce
Dec 30, 2001

Alereon posted:

By "busted" I guess you meant "confirmed", as Medfield solidly proved that it will take a completely new microarchitecture for Intel x86 offerings to be competitive with ARM.

No one was talking about performance of Medfield vs. ARM's latest offerings, just that the opinion of a lot of armchair architects seemed (or still seems) to be that x86 is just too inefficient and wasteful to ever be used in small, low power applications.

Killer robot
Sep 6, 2010

I was having the most wonderful dream. I think you were in it!
Pillbug

WhyteRyce posted:

No one was talking about performance of Medfield vs. ARM's latest offerings, just that the opinion of a lot of armchair architects seemed (or still seems) to be that x86 is just too inefficient and wasteful to ever be used in small, low power applications.

It's sort of the fallback position for the people who spent the 1990s saying that the x86 was doomed and hopeless in the performance/desktop market because reasons.

Not Wolverine
Jul 1, 2007
I think Apple certainly has enough $$$ to buy AMD, and I don't think there are many other tech companies with that much cash. Assuming Apple bought AMD and the R&D department, I think Apple has the funds needed to compete against Intel. I don't think it would ever happen, though. Why would Apple want to design their own CPUs? It would mean diverting funds away from developing crazy poo poo like retina displays. But more importantly, to make a profit off research $$$, Apple would have to sell CPU designs to 3rd parties, and I really don't believe Apple wants to do that.

Would AMD be in a better position financially if they did not spin off GloFo? What is the future outlook for GloFo?

VVVV No poo poo, who financed the research.

Not Wolverine fucked around with this message at 03:55 on Oct 17, 2012

Nintendo Kid
Aug 4, 2011

by Smythe
Apple didn't develop high-DPI displays; they don't even make them. They just buy them.

Colonel Sanders posted:

No poo poo, who financed the research.

Not Apple, dude.

Nintendo Kid fucked around with this message at 05:41 on Oct 17, 2012


forbidden dialectics
Jul 26, 2005





Killer robot posted:

It's sort of the fallback position for the people who spent the 1990s saying that the x86 was doomed and hopeless in the performance/desktop market because reasons.

It's the classic RISC vs. CISC argument from way back when RISC architectures were nearing the 1 IPC asymptote. Modern x86 processors are functionally RISC anyway due to micro-op/macro-op fusion and the like, and the actual circuitry in no way resembles the original 8086. Also, x86 basically hit 1 IPC per pipeline in the P5 days, so it's even more irrelevant now that it's 20 years later.
