|
God drat, AMD processors are so absurdly complicated. I've never bought an AMD CPU before and I'm trying to build a cheap ($300-400) PC with one, since AMD is considered the low-cost option. I can't even begin to comprehend what I should buy; I never spent nearly this much time choosing an Intel CPU for my gaming rig. Anybody have suggestions? e: I should mention that likely the most powerful program it will run is Photoshop. Ervin K fucked around with this message at 12:25 on Oct 11, 2012 |
# ? Oct 11, 2012 11:27 |
|
|
What's your CPU + mobo budget? I just pulled the trigger on a hackintosh build, for which I bought an AMD A6 3650 with a BIOSTAR TA75M, roughly $165 at Newegg. That's the older Llano FM1 platform. Alternatively I was going to pick up a new Trinity-based FM2 socket rig, an A10-5600K paired with a GA-F2A85X, which would have been $260. NOTE: My budget was a bit higher - $500 without a new case - so there are probably better options for you based on your needs. Look at the stickied hardware parts thread for more opinions. But then I went for a hackintosh because all my computers at home must run OSX or some dumb poo poo, and I'm currently working on getting that issue sorted out
|
# ? Oct 11, 2012 16:20 |
|
If you're on a shoestring budget in the short term, you can pick up an AMD motherboard + CPU at a Microcenter for a really good price. You're still better off choosing Intel if you can swing it, however.
|
# ? Oct 11, 2012 18:20 |
|
Ervin K posted:God drat, AMD processors are so absurdly complicated. I've never bought an AMD cpu before and I'm trying to build a cheap ($300-400) PC with one since AMD is considered the low cost option. I can't even begin to comprehend what I should buy, I haven't spent nearly as much time choosing an Intel CPU for my gaming rig. Anybody have suggestions? Just get a cheap Pentium-branded Sandy/Ivy Bridge chip and call it a day.
|
# ? Oct 11, 2012 18:45 |
|
Ervin K posted:God drat, AMD processors are so absurdly complicated. I've never bought an AMD cpu before and I'm trying to build a cheap ($300-400) PC with one since AMD is considered the low cost option. I can't even begin to comprehend what I should buy, I haven't spent nearly as much time choosing an Intel CPU for my gaming rig. Anybody have suggestions?
|
# ? Oct 11, 2012 20:42 |
|
AMD is preparing to lay off 20-30% of their workforce: http://allthingsd.com/20121012/exclusive-amd-to-cut-up-to-30-percent-of-workforce/.
|
# ? Oct 14, 2012 02:07 |
|
keyvin posted:AMD Is preparing to lay off 20-30% of their workforce: http://allthingsd.com/20121012/exclusive-amd-to-cut-up-to-30-percent-of-workforce/. Well, it was bound to happen sooner or later given their financial performance over the past few years. Ultimately, I think AMD will survive for quite a while in some form. If the worst does happen, AMD could be a good acquisition target for someone who wanted to vertically integrate their entire computing stack, including chip design. If Intel is worried about anti-trust concerns, they could make an investment in AMD, like what Microsoft did with Apple in the late '90s. I'm not sure that would happen, though, since it would require a significant investment. And if ARM can scale up performance to match that of lower-end x86 chips, that may provide enough "competition" to keep regulators at bay.
|
# ? Oct 14, 2012 02:38 |
|
I'm hoping they at least continue to make video cards. The last thing we need is Intel being the number two for video cards; Nvidia would love to just stand still and charge $800 for a 2% increase every year. I at least have some faith that large investors will help Intel evolve for supercomputers and servers because of demand, and that will trickle down even if it's just Intel in the market for processors. We will lose on prices, but at least we will continue to see improvements that can be stolen and reproduced by some new company later.
|
# ? Oct 14, 2012 02:43 |
|
keyvin posted:AMD Is preparing to lay off 20-30% of their workforce: http://allthingsd.com/20121012/exclusive-amd-to-cut-up-to-30-percent-of-workforce/. I like the guy in the comments who was being totally serious about trying to get hold of ARM management to give them advice
|
# ? Oct 14, 2012 03:50 |
|
WhyteRyce posted:I like the guy in the comments who was being totally serious about trying to get a hold of ARM management to give them advise I am very puzzled when that guy refers to AMD's "ATI 3D Graphics Engine." It's like he maybe knew what video cards did at the turn of the millennium and then stopped paying attention. Reading further, apparently he is an idiot who cannot distinguish between hardware, drivers, game engines, and porting software to Linux. Anyway, chew on this, see how it tastes: Intel Radeon HD 8000 series.
|
# ? Oct 14, 2012 05:29 |
|
Factory Factory posted:Anyway, chew on this, see how it tastes: Intel Radeon HD 8000 series. That seems like the express road to an antitrust case considering how hard they've fought against nVidia acquiring an x86 license.
|
# ? Oct 14, 2012 21:21 |
|
Chuu posted:That seems like the express road to an antitrust case considering how hard they've fought against nVidia acquiring an x86 license. Yeah, the only way I could see Intel being able to get the AMD graphics division is if someone else acquired the AMD processor division, including their x86 patents and perpetual license. The one company that I think has both the resources and inclination to do so is nVidia which is the last company Intel wants to have to compete against in the processor space. Edit: Thinking about it a little more, Apple could also make a play for the AMD processor division, so they wouldn't be dependent on Intel for the chips in their desktops and laptops (similarly to how they make their own tablet/phone SoCs). They certainly have the resources, but Apple and Intel seem to have a very good relationship so I don't know if Apple has the motivation to do that right now. Mr.Radar fucked around with this message at 21:58 on Oct 14, 2012 |
# ? Oct 14, 2012 21:51 |
|
Mr.Radar posted:Edit: Thinking about it a little more, Apple could also make a play for the AMD processor division, so they wouldn't be dependent on Intel for the chips in their desktops and laptops (similarly to how they make their own tablet/phone SoCs). They certainly have the resources, but Apple and Intel seem to have a very good relationship so I don't know if Apple has the motivation to do that right now. Two reasons this will never happen: 1) Intel's power consumption in the mobile space annihilates what AMD currently has, and with Haswell there won't be competition. 2) Apple seems to be heavily investing in ARM. They're even making their own processor designs. If anything I see Apple scaling their processors up to the point where they're comparable to whatever Intel offers and then giving Intel the finger. I wouldn't be surprised if AMD gets eaten piecemeal, with nVidia taking the x86 license and Apple buying out the graphics division. Then we'll have two big processor manufacturers, Intel and Apple. Unless Nvidia can do something magical, which I think they could. If anything it looks like in the long run ARM is going to come to dominate and x86 will fall by the wayside.
|
# ? Oct 15, 2012 02:47 |
|
AMD's x86 licence is potentially void if it's taken over or bankrupted, unless Intel says otherwise.
|
# ? Oct 15, 2012 02:51 |
|
Ragingsheep posted:AMD's x86 licence is potentially void if its taken over or bankrupted unless Intel says otherwise though. That reeks of anti-trust lawsuit though.
|
# ? Oct 15, 2012 02:53 |
|
Ragingsheep posted:AMD's x86 licence is potentially void if its taken over or bankrupted unless Intel says otherwise though. And what would this mean for Intel's AMD64 license?
|
# ? Oct 15, 2012 02:58 |
|
Goon Matchmaker posted:If anything I see Apple scaling their processors up to the point where they're comparable to whatever Intel offers and then giving Intel the finger. This is close to impossible: ARM stuff is way slower than what Intel can put out, and Apple does not have the money or R&D knowledge to remedy that on their own.
|
# ? Oct 15, 2012 03:03 |
|
Does anybody? Nvidia?
|
# ? Oct 15, 2012 04:21 |
|
Install Gentoo posted:This is close to impossible, ARM stuff is way slower than what Intel can put out and Apple does not have the money or R&D knowledge to remedy that on their own. Apple has over $100B in cash, and the Apple A6 CPU shows their hardware engineers know what they're doing. I really doubt that Apple is looking to sever their relationship with Intel, and Haswell looks like it's going to be an incredible architecture for mobile computing, but if Apple really was looking at developing a high performance ARM processor they have the resources. edit: now it makes sense. Chuu fucked around with this message at 04:39 on Oct 15, 2012 |
# ? Oct 15, 2012 04:21 |
|
Chuu posted:Apple has over $100B in cash, and the Apple A6 CPU shows their hardware engineers know what they're doing. Do you have any idea how much money it took to get Intel or AMD processors to the performance they have today? $100 billion is nothing in comparison, and frankly the A6 doesn't show poo poo as far as getting an ARM-based architecture to x86-64 performance in laptop/desktop applications. If I remember right, the fastest ARM-based CPU anyone's got still performs like a Core Duo or early Core 2 Duo while sucking down more power - and it certainly wasn't Apple who made it.
|
# ? Oct 15, 2012 04:28 |
|
Install Gentoo posted:Do you have any idea how much money it took to get Intel or AMD processors to the performance they are today? $100 billion is nothing in comparison, and frankly the A6 doesn't show poo poo as far as getting an ARM based architecture to x86-64 performance in laptop/desktop applications. There is no market for high-performance ARM, hence no good high-performance ARM processor. Apple doesn't really have a motive to try to create said processor since Intel is working so closely with them already. It's sort of pointless trying to argue about what would be possible if we lived in a world where high-performance ARM computing was anything but a niche market.
|
# ? Oct 15, 2012 04:37 |
|
Install Gentoo posted:Do you have any idea how much money it took to get Intel or AMD processors to the performance they are today? $100 billion is nothing in comparison, and frankly the A6 doesn't show poo poo as far as getting an ARM based architecture to x86-64 performance in laptop/desktop applications. Also just throwing a pile of money at something is not guaranteed to get results. Although that is a lot of money. I figured if Apple were to throw that money at something, it would be at fabs. WhyteRyce fucked around with this message at 04:56 on Oct 15, 2012 |
# ? Oct 15, 2012 04:53 |
|
Install Gentoo posted:If I remember right the fastest ARM-based cpu anyone's got still performs like a Core Duo or early Core 2 Duo while sucking down more power - and it certainly wasn't Apple who made it. ARM-based servers are also a thing that will happen, which is going to seriously impact both Intel and AMD.
|
# ? Oct 15, 2012 04:52 |
|
Alereon posted:ARM-based servers are also a thing that will happen, which is going to seriously impact both Intel and AMD.
|
# ? Oct 15, 2012 05:13 |
|
Professor Science posted:the flat cost of x86 instruction decode is pretty minimal these days You mean those guys who post in the comments on Engadget and Anandtech are wrong??
|
# ? Oct 15, 2012 16:33 |
|
WhyteRyce posted:You mean those guys who post in the comments on Engadet and Anandtech are wrong?? Only SA posts consistently get it right when it comes to internet manbabies posting authoritatively about CPUs. By the way, what's the difference between superscalar and vector?
|
# ? Oct 15, 2012 16:44 |
|
WhyteRyce posted:You mean those guys who post in the comments on Engadget and Anandtech are wrong?? That myth got pretty solidly busted as soon as Intel put out their first Medfield-based phone, but the whole "OMG x86 CAN NEVER BE AS POWER EFFICIENT AS ARM" thing still persists. IIRC that phone got above-average (although not spectacular, class-leading) battery life relative to battery capacity, putting it ahead of several of the ARM-based phones. Add in Intel's process advantage over everyone else in the industry (current Medfield chips are built on their last-gen process, not 22nm), and the picture isn't terrible for them at all.
|
# ? Oct 15, 2012 19:05 |
|
There are even advantages to the modern x86 way of doing things - the ISA the CPU pipeline actually runs on is not part of its published interface, which means the chip designers can change whatever they want whenever it's convenient for them, without requiring any compiler/software changes. Factory Factory posted:By the way, what's the difference between superscalar and vector? Superscalar means executing more than one instruction in parallel, and is for the most part just magic that happens behind the scenes. Vector is a single instruction with multiple sets of arguments, and only happens by a compiler saying so. Goon Matchmaker posted:That reeks of anti-trust lawsuit though. The FTC-approved clause is that Intel is only required to enter good-faith negotiations with the purchaser.
|
# ? Oct 15, 2012 21:16 |
|
Zhentar posted:
To give a further example here, x86 processors have been superscalar since the original Pentium. If you had stuff written for the scalar 386 or 486 you didn't need to change anything, the Pentium just plain ran it faster by being able to deal with multiple instructions in a clock cycle. Vector instructions started off with MMX, and continued with SSE. Those let one instruction do multiple things, but you need applications that are written/compiled with the feature in mind. Otherwise a Pentium MMX was just a Pentium, and a Pentium III was just a Pentium II at the same clock speed, as far as performance goes.
|
# ? Oct 15, 2012 22:25 |
|
Gwaihir posted:That myth got pretty solidly busted as soon as Intel put out their first Medfield based phone, but the whole "OMG x86 CAN NEVER BE AS POWER EFFICIENT AS ARM" thing still persists. Iirc that phone got above average (although not spectacular class leading) battery life relative to battery capacity, putting it ahead of several of the arm based phones. Add in Intel's process advantage over everyone else in the industry (Current medfield chips are build on their last gen process, not 22nm), and the picture isn't terrible for them at all. The problem is that the Atom architecture is in-order, and in-order designs are fundamentally slow and power-inefficient compared to out-of-order designs. As soon as the Cortex A9 came out (and AMD Bobcat in the x64 world), Atom was obsolete. There's nothing Intel can do with process shrinks or clockspeed scaling to fix this, they are just hosed until the Silvermont microarchitecture in 2013. Intel could have ADDRESSED the issue and played to their strengths by combining the CPU with some good licensed IP blocks, but Medfield was their first real attempt at this and it's obvious how half-assed it was. The key question is whether Intel is willing to make a sincere attempt at the smartphone market with their 2013 architectures or if they'll keep phoning it in.
|
# ? Oct 16, 2012 00:16 |
|
Alereon posted:The key question is whether Intel is willing to make a sincere attempt at the smartphone market with their 2013 architectures or if they'll keep phoning it in.
|
# ? Oct 16, 2012 00:25 |
|
Goddamn I totally did not do that on purpose. I don't know if that makes it better or worse though.
|
# ? Oct 16, 2012 00:33 |
|
Alereon posted:in-order designs are fundamentally slow compared to out-of-order designs. While this is true... Alereon posted:in-order designs are fundamentally power-inefficient compared to out-of-order designs. This is not (necessarily). It takes a lot of extra logic on the chip to make out-of-order work. You will spend significantly more power doing your computations on an out-of-order architecture. You can still come out ahead if finishing sooner lets you stop wasting power on other stuff, but it's not inherently better. Edit: d'oh, misread the chart. The A9 is OOO as well. A15's problem is that it shoots for desktop level pipeline depth and width. Zhentar fucked around with this message at 01:06 on Oct 16, 2012 |
# ? Oct 16, 2012 00:54 |
|
Zhentar posted:This is not (necessarily). It takes a lot of extra logic on the chip to make out-of-order work. You will spend significantly more power doing your computations on an out-of-order architecture. You can still come out ahead if finishing sooner lets you stop wasting power on other stuff, but it's not inherently better.
|
# ? Oct 16, 2012 01:33 |
|
It's all just race to idle.
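Race to idle boils down to simple energy arithmetic: watts times seconds, active plus idle. A quick sketch with made-up wattage numbers (purely illustrative, not measured figures for any real core):

```python
def total_energy(active_power_w, task_seconds, idle_power_w, window_seconds):
    """Joules consumed over a fixed window: run the task, then idle out the rest."""
    assert task_seconds <= window_seconds
    active = active_power_w * task_seconds
    idle = idle_power_w * (window_seconds - task_seconds)
    return active + idle

# Hypothetical frugal in-order core: 1 W while active, needs 8 s for the task.
slow = total_energy(1.0, 8.0, 0.05, 10.0)   # roughly 8.1 J
# Hypothetical out-of-order core: 3 W while active, done in 2 s, then sleeps.
fast = total_energy(3.0, 2.0, 0.05, 10.0)   # roughly 6.4 J
print(slow, fast)
```

With these (invented) numbers the OoO core burns 3x the power while working but still comes out ahead on total energy because it spends most of the window asleep; whether that actually holds depends on the real active/idle figures, which is exactly what the posts above are arguing about.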
|
# ? Oct 16, 2012 15:47 |
|
Alereon posted:By "busted" I guess you meant "confirmed", as Medfield solidly proved that it will take a completely new microarchitecture for Intel x86 offerings to be competitive with ARM. No one was talking about performance of Medfield vs. ARM's latest offerings, just that the opinion of a lot of armchair architects seemed (or still seems) to be that x86 is just too inefficient and wasteful to ever be used in small, low-power applications.
|
# ? Oct 16, 2012 16:21 |
|
WhyteRyce posted:No one was talking about performance of Medfield vs. ARM's latest offerings, just that opinion of a lot of arm chair architects seemed (or still seems) to be that x86 is just too inefficient and wasteful to be ever used in a small, low power applications. It's sort of the fallback position for the people who spent the 1990s saying that the x86 was doomed and hopeless in the performance/desktop market because reasons.
|
# ? Oct 16, 2012 20:42 |
|
I think Apple certainly has enough $$$ to buy AMD, and I don't think there are many other tech companies with that much cash. Assuming Apple bought AMD and its R&D department, I think Apple has the funds needed to compete against Intel. I don't think it would ever happen, though. Why would Apple want to design their own CPUs? It would mean diverting funds away from developing crazy poo poo like retina displays. But more importantly, to make a profit off research $$$, Apple would have to sell CPU designs to third parties, and I really don't believe Apple wants to do that. Would AMD be in a better position financially if they had not spun off GloFo? What is the future outlook for GloFo? VVVV No poo poo, who financed the research. Not Wolverine fucked around with this message at 03:55 on Oct 17, 2012 |
# ? Oct 17, 2012 00:36 |
|
Apple didn't develop high-DPI displays; they don't even make them. They just buy them. Colonel Sanders posted:No poo poo, who financed the research. Not Apple, dude. Nintendo Kid fucked around with this message at 05:41 on Oct 17, 2012 |
# ? Oct 17, 2012 00:39 |
|
|
Killer robot posted:It's sort of the fallback position for the people who spent the 1990s saying that the x86 was doomed and hopeless in the performance/desktop market because reasons. It's the classic RISC vs. CISC argument from way back when RISC architectures were nearing the 1 IPC asymptote. Modern x86 processors are functionally RISC anyways due to micro-op/macro-op fusion stuff, and the actual circuitry in no way resembles the original 8086. Also x86 basically hit 1 IPC per pipeline in the P5 days, so it's even more irrelevant now that it's 20 years later.
|
# ? Oct 17, 2012 01:08 |