SwissArmyDruid
Feb 14, 2014

by sebmojo

Rastor posted:

They are also developing a new x86 core which is "a new design built from the ground up". Rumor has it that for this new core AMD is giving up on the CMT design used in Bulldozer/Piledriver/Steamroller/Excavator and will instead go back to something more like what Intel has been using.

14nm Samsung FinFET K13 with no more than four physical cores? I can only hope.

SwissArmyDruid fucked around with this message at 21:28 on May 5, 2014


SwissArmyDruid
Feb 14, 2014

by sebmojo

PC LOAD LETTER posted:

AMD always planned on having some sort of new arch. out by 2015/2016 since Excavator was supposed to be the last BD revision. They probably knew they were hosed all the way back in mid to late 2011 since by then they'd have been able to do plenty of testing on samples from the fabs to see what yields they could get and how well power and performance scaled with clocks. It just takes a long time to design a new architecture and K10h was probably out of scaling room so they had to go with what they had and try to make the best of it.

The combination of patent issues, long development times, highly competent competition, and the high production cost of making a high-end x86 chip is brutally risky from a business perspective, which is why no one but AMD tries (tried?) anymore to compete in that arena against Intel.

Let's not kid ourselves: while AMD was spending money like a company much larger than its real size, Intel was busy bribing PC manufacturers and engaging in anticompetitive behaviour.

Edit: I sometimes think that AMD actually died in 2009 when they spun off GloFo and that they just don't know it yet, and that the purchase of ATI was the life support that's keeping the coma patient going. $1.25 billion in lawsuit settlement doesn't exactly pay for much.

SwissArmyDruid fucked around with this message at 23:15 on May 8, 2014

SwissArmyDruid
Feb 14, 2014

by sebmojo

Alereon posted:

I feel like AMD's biggest mistake in recent years was not crushing Intel in the low-power market. The netbook was born with the launch of the Intel Atom in 2008, and then withered and died because it took Intel six years to release a decent successor. AMD Brazos processors were launched at the beginning of 2011 and were the perfect netbook SoCs, yet AMD devoted limited manufacturing resources to Brazos and availability was poor, so it mostly appeared in low-end notebooks. I believe that if AMD had really pushed Brazos in the netbook segment they would still be considered a viable mainstream form factor today.

Selling cheap netbook SoCs probably isn't very profitable, which is likely why they didn't do this versus just making more Radeon GPUs with the TSMC wafers, but it seems like there's value in keeping AMD's CPU division operating outside of consoles.

See, I like APUs as an idea. For your thin-and-lights and your ultrabooks, they make 100% sense. But where AMD keeps killing me is that they keep trying to scale this stupid poo poo upwards: on chips that are most certainly going to be paired with a discrete GPU anyway, die space that could go toward improving single-threaded performance-per-core keeps getting wasted on an anemic on-die iGPU.

And every few years they trot out the old Hybrid CrossFire buzzword again (and let's face it, when has Hybrid CrossFire ever meant poo poo?), and just like every other time, they haven't done squat with it on APUs either.

SwissArmyDruid
Feb 14, 2014

by sebmojo

PC LOAD LETTER posted:

They can't really do much to improve single-threaded performance without a major revision (which is what Excavator is supposed to be), and that takes a long time (years) to do.

They keep throwing more die space at the iGPU in their APUs because, for some reason, they seem totally unable to increase bandwidth to the iGPU, and they can't do much to improve CPU performance beyond what they're already doing, so more iGPU is about the only way left for them to differentiate their product vs. Intel's offerings and their own older APUs.

The funny thing is, if they could just feed it enough data, the iGPU would actually get some fairly respectable performance, and that would add a lot of value to their products even if they continued to do little to improve single-threaded performance on the CPU side. Many of their cheaper APUs in the thin-n'-light category are incredibly hamstrung by single-channel DDR3-1333 levels of bandwidth, though.

Quad-channel DDR3 probably isn't practical for them for cost reasons, and the same goes for on-package DDR3, but maybe they could've hung some extra DDR3 or GDDR5 off the chipset to feed their iGPUs better. They did something like that back when they were still putting the iGPU in the north bridge, in the 780G or some such.

Well, DDR4 is starting to ship.
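(For a sense of scale on the bandwidth complaint above, here's a back-of-envelope Python sketch. The formula is just transfer rate times bus width times channel count; these are theoretical peaks, not measured figures.)

# Theoretical peak bandwidth for DDR memory:
# transfer rate (MT/s) x bus width (bytes) x channel count.
def peak_bandwidth_gbs(mts, bus_bytes=8, channels=1):
    return mts * bus_bytes * channels / 1000

print(peak_bandwidth_gbs(1333))              # single-channel DDR3-1333: ~10.7 GB/s
print(peak_bandwidth_gbs(1333, channels=2))  # dual-channel DDR3-1333:  ~21.3 GB/s
print(peak_bandwidth_gbs(2133, channels=2))  # dual-channel DDR4-2133:  ~34.1 GB/s

(Even a low-end discrete GDDR5 card of this era gets several times the single-channel figure, which is the gap being described.)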

SwissArmyDruid
Feb 14, 2014

by sebmojo

Ozz81 posted:

lack of a heatspreader and the very high likelihood of either damaging the die or jamming a flathead screwdriver into your motherboard while putting the drat heatsink clip in place.

And to think, people get terrified of loving up their processors NOW with the Intel LGA sockets.

SwissArmyDruid
Feb 14, 2014

by sebmojo

lovely Treat posted:

Imagine those people having to do a CPU pin mod to overclock instead of being able to just raise a multiplier a little and get an extra GHz clockspeed.

Sticker, wire-wrap, or solder? :colbert:

SwissArmyDruid
Feb 14, 2014

by sebmojo
I think we all already knew that our K13 with no more than four physical cores, no integrated graphics, and 14nm Samsung FinFET was in the pipeline, but apparently the ambidextrous x86/ARM thing is not it.

http://fudzilla.com/home/item/34769-amd-plans-new-architecture-for-2016

The big takeaway: CMT is dead.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Alereon posted:

I did this too, but multipliers kept dropping off after a couple months until I put a square of scotch tape over the L1 bridges. I guess it was due to the graphite flaking off slowly, as it shouldn't have been hot enough for it to evaporate.

And that's why conductive paint was used.

SwissArmyDruid
Feb 14, 2014

by sebmojo

HalloKitty posted:

Intel has the money, but at the end of the day, if AMD can push out some CPUs that get recommended in the parts picking thread from top to bottom (much like their GPUs), then that would be a good start.

They'd need to significantly increase single thread performance and significantly reduce power usage to do this.

This is why GloFo licensing Samsung's 14nm FinFET process is so exciting. The shrink means they'll be able to increase transistor count in general, and the new direction on their high-end parts will come from cutting core count, freeing up area for more transistors per core.
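(As a rough illustration of what a full-node shrink buys: ideal density scales with the inverse square of the feature size. Treat this as an upper bound, since by this point marketing node names no longer track literal feature sizes.)

# Idealized transistor-density gain from a linear shrink:
# density scales as (old_feature_size / new_feature_size)^2.
def ideal_density_gain(old_nm, new_nm):
    return (old_nm / new_nm) ** 2

print(ideal_density_gain(28, 14))  # 4.0x in theory; real 14nm gains are smaller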

SwissArmyDruid
Feb 14, 2014

by sebmojo

Crotch Fruit posted:

All this time I thought "better luck next platform" was just a joke. :( Is there any AMD CPU I should buy for any reason? Or at least an excuse I can tell people for why I didn't just buy Intel? I'm still holding out hope for big green. :smith:

Only if you want an inexpensive SFF HTPC. That's about all AMD APU chips are good for right now, since they'll give you... oh, let's say 75% of the power of an Intel chip, with equal or better graphics (Intel has really stepped up their game in the past year or so), without the price premium that Intel commands. And since it'll be plugged into the mains, no considerations about battery life need be made.

edit: That said, I *would* consider the use of non-APU parts in gaming computers on a very destitute budget. The Pentium name still holds too many bad memories for me.

SwissArmyDruid fucked around with this message at 21:44 on May 24, 2014

SwissArmyDruid
Feb 14, 2014

by sebmojo

Alereon posted:

And this isn't even so compelling now that Intel is including the QuickSync and ClearVideo engines on Celeron and Pentium processors, rather than just the Core i3. A Celeron G1840 for $42 is a pretty capable HTPC CPU now.

Heh, funny you mention the Pentiums and Celerons. Those names hold very painful memories for me, having worked in IT during the P4 HT days. loving Prescott.

SwissArmyDruid
Feb 14, 2014

by sebmojo
Alright, so mobile Kaveri? Kinda sexy.



If someone takes an Intel ultrabook, rips out the Intel bits, drops AMD bits into the space created, and drops the price accordingly, without piling on extra cost-cutting like a worse screen, swapping the SSD for an inferior HDD, or other corner-trimming? That's a pretty nice laptop for a pretty good price. If only I didn't already have the Precision...

http://techreport.com/review/26528/a-first-look-at-amd-kaveri-apu-for-notebooks

http://www.anandtech.com/show/8119/amd-launches-mobile-kaveri-apus

SwissArmyDruid fucked around with this message at 14:10 on Jun 4, 2014

SwissArmyDruid
Feb 14, 2014

by sebmojo

HalloKitty posted:

Yeah, read the article earlier, looks pretty decent. No battery life results yet, though.

I'm just expecting there to be no good AMD laptops though, simply because normally AMD laptops are built cheaply, which is a shame. For my money, I'm liking the AMD APU balance for laptops (worse CPU, better GPU) because overall it's better in light gaming use cases, where the lower end Intel graphics still fall apart.

Agreed. But I think it's got a shot in the next laptop refresh cycle. If this succeeds, it will not be on its own merits, I think, so much as it will be thanks to Intel's continued delay of their 14nm parts.

SwissArmyDruid
Feb 14, 2014

by sebmojo
Welp, spoke too soon. HP's got three Kaveri-based EliteBooks for under 8 bills, and they look to be *exactly* the thing I just described: an ultrabook minus the Intel bits, without the cost-cutting. http://techreport.com/news/26566/kaveri-apus-land-in-three-hp-elitebook-laptops

SwissArmyDruid
Feb 14, 2014

by sebmojo

Malderi posted:

The AMD whitebook everyone tested a few weeks ago was apparently quoted at being able to retail at about $700. It was a 15.6" 1080p screen, top of the line FX-7600p, and a 256GB SSD. The build quality was reasonable if not perfect. If someone comes out with a machine that hits those specs and doesn't suck in some other way, I can imagine it selling quite well.

Ding ding! That is, in fact, what these HPs are retailing at! Starting at $799 for the smallest, $739 for the 14", and $749 for the 15.6"!

SwissArmyDruid
Feb 14, 2014

by sebmojo
EU appeals court upholds the $1.4 billion antitrust fine against Intel.

http://www.engadget.com/2014/06/12/intel-loses-eu-antitrust-appeal/

SwissArmyDruid fucked around with this message at 00:48 on Jun 13, 2014

SwissArmyDruid
Feb 14, 2014

by sebmojo

El Scotch posted:

Is any of that going to AMD?

....oh poo poo, you know what? I never thought to check. I assumed the EU courts were dysfunctional and purely the tool of corporate lawyers, like they are in the US, rather than actually regulating business. >.<

I need to research whether any of that is going to AMD, and I'll edit my last post.

SwissArmyDruid
Feb 14, 2014

by sebmojo
A10-7800 is reviewed.

Second verse, same as the first. Nothing to see here, folks.

SwissArmyDruid
Feb 14, 2014

by sebmojo

orange juche posted:

AMD APUs in laptops definitely aren't all that terrible as long as you keep in mind what their strengths are. I just recommended my sister buy a 15-inch HP Envy with a Richland A10 in it. It is faster than her old laptop (a 2010 C2D ULV chip) and she can run games on it at acceptable quality (TF2, other Source games).

[emphasis mine] That's more thanks to the engine, and how well it scales across hardware, than it is to the actual hardware.

SwissArmyDruid
Feb 14, 2014

by sebmojo

sweart gliwere posted:

And yeah, Source is crazy versatile, mostly because it was mass-market practically a decade ago.

I'd note that Titanfall uses Source, but I'm not convinced that Respawn hasn't ripped out and recoded > 50% of the engine.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Alereon posted:

The Source engine (and TF2 in particular) is actually pretty good at scaling across multiple cores, I just tested and during multiplayer combat I had pretty even load across four logical cores, somewhat less load on a fifth, and light load across the other three (I verified all were flat before launching TF2). Multi-core rendering used to be disabled by default due to hitching and freezing issues, though. I think there's room for surprisingly good performance on Broadwell-Y, though "surprisingly good" may not mean playable given that we're talking about single-digit watts. I'd mention some AMD products, but it almost seems like they've given up selling the products they do launch.

Just waiting on "K13" (something that doesn't have loving wasted die space on graphics that could be going toward stronger cores) and a 10-series chipset.

SwissArmyDruid fucked around with this message at 22:37 on Aug 18, 2014
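(Alereon's per-core load check above is easy to reproduce yourself; here's a minimal Python sketch using the third-party psutil package. Run it while the game is going and watch the per-core columns.)

# Print per-logical-core CPU load once a second, like watching Task Manager
# during a TF2 match. Requires: pip install psutil
import psutil

for _ in range(10):
    loads = psutil.cpu_percent(interval=1.0, percpu=True)  # one % per logical core
    print(" ".join(f"{load:5.1f}" for load in loads))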

SwissArmyDruid
Feb 14, 2014

by sebmojo

Agreed posted:

Anyone feel like helping me gain some perspective on this development if you feel my read is unnecessarily bleak?

Little bit of column A, little bit of column B. It should also be noted that AMD just recently gave Khronos the entire Mantle spec for free and said, "here, use whatever you like for OpenGL."

SwissArmyDruid
Feb 14, 2014

by sebmojo

orange juche posted:

Oh god, 2 more years without an architectural refresh is really gonna put the hurt on AMD.

As if not having products they can sell NOW isn't putting the hurt on them already. I have already replaced one AMD system with an Intel, and will be doing the same with another very shortly in order to resolve a seasonal thermal issue with the one I'm typing on as we speak.

One of these machines will not be updated or replaced again within the next five years. The other remains to be seen.

SwissArmyDruid fucked around with this message at 20:58 on Oct 6, 2014

SwissArmyDruid
Feb 14, 2014

by sebmojo

orange juche posted:

There's still a year and change between now and Zen

☜(゚ヮ゚☜)

SwissArmyDruid
Feb 14, 2014

by sebmojo
How likely are we to see AMD make a concerted jump straight to Samsung's 14nm-or-whatever sub-20nm process? GPU, APU, and CPU, I mean. I might actually entertain buying an AMD product at that point, but right now I want to take my all-AMD rig and fling it off the top of my building. (No CPUs worth writing home about, and gently caress atikmdag.sys: destinythegame.com's web video is a 100% guaranteed BSOD on my computer/drivers/video card/whatever.)

SwissArmyDruid fucked around with this message at 04:50 on Oct 18, 2014

SwissArmyDruid
Feb 14, 2014

by sebmojo

Aleksei Vasiliev posted:

Vine videos on Tumblr in Firefox BSOD me within ~1 minute of watching them, but Chrome is immune. Might want to try switching browsers at least for some content/sites if this is the same for you.

Vine doesn't do any of that to me, but Chrome is the only browser I use and it isn't immune to BSOD.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Civil posted:

Anyone else have this friend? He blows $1000+ at Fry's and has to show off his build. I tried talking him into exchanging his CPU before opening everything up. He got an R9 270, so the slight benefits of getting an APU are irrelevant.

This is AMD's customer.



AMD's current offerings are what I recommend to people I don't like very much, but am obligated to support for whatever reason.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Beautiful Ninja posted:

Is your plan to slowly roast a person in his home using an AMD processor?

My plan is to trap them on 990FX.

SwissArmyDruid
Feb 14, 2014

by sebmojo

canyoneer posted:

I used to be a fanboy, but once you take a life in defense of your fandom you become a fanman. No coming back from that.

There are a couple subreddits for building PCs, and people in there recommend AMD stuff all the time. I have no idea why.

I think there's a niche that, for anyone with a modicum of computer experience, is occupied by the Pentium AE. Why would you go for the Athlon II X4 whatever or the FX 6- or 4-core bullshit when you can get an AE and crank that poo poo to 4.2 GHz on air with a modest aftermarket cooler like a Hyper 212?

For anyone who *doesn't* have that experience (and if you're asking in a PC-building subreddit, you probably don't), sure, I can see where the AMD would make more sense on a non-overclocking price/performance basis, because it's not really until you put the spurs to the G3258 that it shines.

SwissArmyDruid fucked around with this message at 19:11 on Oct 22, 2014

SwissArmyDruid
Feb 14, 2014

by sebmojo

keyvin posted:

It's a recurring theme that when people ITT want to look at something positive, we start talking about Intel.

And that's why the Pentium AE was such a genius masterstroke. It single-handedly cut the legs out from under AMD in the only remaining price/performance category they had going for them.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Civil posted:

Oh, I dunno. I still love the 1090t machine I handed down to my parents. It even competes favorably against AMD's 2014 parts.

That's not a good thing, and it's also the reason why I'm still on a 965 BE and 880FX.

SwissArmyDruid
Feb 14, 2014

by sebmojo

keyvin posted:

You sure it's not because you're a techno-masochist?

My workload and the games I am currently playing have not yet demanded that I upgrade. I do, however, have a new machine budgeted for if and when Star Citizen comes out.

SwissArmyDruid
Feb 14, 2014

by sebmojo
Well, this could either be absolutely brilliant, or complete folly for AMD. It's hard to tell at this point.

http://techreport.com/news/27259/cpu-startup-claims-to-achieve-3x-ipc-gains-with-visc-architecture

It should be noted that AMD is a major investor in this venture, as is Mubadala (the company that owns GloFo, which just bought IBM's chip unit; read: a RISC business they now have leverage to shift toward VISC).

If AMD can take this VISC architecture and integrate it with the HSA work they've already done, then yeah, they could completely obviate the need for things like OpenCL libraries: the VISC/HSA middleware would ideally compose a virtual core out of one or more CPU cores mixed with one or more GPU cores, break whatever work is appropriate out to the GPU, and present a single nondescript virtual core to applications for ease of programming.

It could also obviate the need to make applications more multithreaded in the first place.
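(To make that concrete, this is the kind of explicit work-splitting a programmer has to write today, and which a VISC/HSA virtual core would in theory do in hardware behind a single-core facade. A generic Python sketch, nothing VISC-specific.)

# Today: the programmer hand-partitions the work and manages the parallelism.
# The VISC pitch is that a "virtual core" would do this scheduling invisibly.
from concurrent.futures import ProcessPoolExecutor

def work(chunk):
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    chunks = [data[i::4] for i in range(4)]  # hand-split for four cores
    with ProcessPoolExecutor(max_workers=4) as pool:
        total = sum(pool.map(work, chunks))
    print(total)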

Mad props for AMD if this was the end game all along. I'm excited.

SwissArmyDruid fucked around with this message at 22:25 on Nov 4, 2014

SwissArmyDruid
Feb 14, 2014

by sebmojo

Thanks, Menacer! I can always count on you to cut through the fluff and crap and zoom in on what's really important!

SwissArmyDruid
Feb 14, 2014

by sebmojo
So, imagine my surprise when I heard that it was the HBM project AMD had been working on with Hynix that actually panned out, as opposed to the HMC approach Nvidia had been backing.

Now, let's face it: it's new technology, and like all new technologies, the initial rollout is going to be expensive. Nvidia may even have announced that they're also going to use AMD/Hynix's HBM, but they're going to be a year behind in getting those products out the door.

But 12 months down the line, when Nvidia starts migrating to stacked memory, we could see AMD APUs with a single stack of HBM on the package as dedicated graphics memory for low-end, non-enthusiast devices. I'm even thinking that eventually this makes its way into consoles via AMD's semi-custom silicon business, since they just won another contract with Nintendo to provide the hardware for their entire ecosystem.
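(For why a single stack is a big deal: first-generation HBM is specced at a 1024-bit interface per stack at 1 Gb/s per pin. The arithmetic, against the DDR3 numbers discussed earlier:)

# Peak bandwidth of one first-gen HBM stack:
# bus width (bits) x per-pin rate (Gb/s) / 8 bits-per-byte.
def hbm_stack_gbs(bus_bits=1024, gbps_per_pin=1.0):
    return bus_bits * gbps_per_pin / 8

print(hbm_stack_gbs())  # 128.0 GB/s per stack, vs ~21.3 GB/s for dual-channel DDR3-1333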

Discuss.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Bloody Antlers posted:

One of my favorite things to daydream about is imagining what the best AMD engineers could come up with if they had the same funding and process nodes as Intel.

Or reverse that situation: take the number of Intel engineers you could pay with AMD's salary limitations and give them GF or TSMC (when it was struggling with 32nm) to design for, with a similar R&D budget.

Given how incredibly mismanaged AMD was under the former CEO and how comparatively little cash they've had for R&D, you just have to KNOW there are some badass engineers on board to have gotten AMD to a point where they briefly delivered a superior platform vs. Intel and managed to stay alive this long after all the anti-competitive practices Intel used against them.

It reminds me of the space race in a way.

Everybody loves an underdog.

SwissArmyDruid
Feb 14, 2014

by sebmojo

HalloKitty posted:

Oh, nice, didn't hear about this.

It would be amusing if Nintendo, with its forgotten and little-loved Wii U, suddenly had the most powerful of the 3 machines. Maybe they could include a controller designed for human hands with a good battery life!

I need to clarify my previous statement. It appears that AMD has only won a contract to provide the guts for the next living room console, which differs from what I was originally told.

SwissArmyDruid
Feb 14, 2014

by sebmojo

I am not a book posted:

I run various Linux devel channels. I want an AMD proc specifically because fewer people buy them.

You're a saint. I think the FX-6350 and FX-8320 are in that range: 6 and 8 cores, respectively.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Riso posted:

My Athlon II x4 640 is still good enough to play all games I want at 1080p, thanks.

Likewise, but with a Phenom II 965 BE. The only problem is that there are no AM3 motherboards in an ITX form factor, so I can't repurpose this perfectly good chip into a smallish HTPC a la Silverstone ML07 and use a riser card for the GPU. It's all gotta be mid-tower sized cases turned on their sides, and that's just rubbish.

Fun fact: I recommend AMD parts to people I don't like. I get to maintain the appearance of helping them while, at the same time, trapping them on AM3 and 880FX.


SwissArmyDruid
Feb 14, 2014

by sebmojo

keyvin posted:

When will AMD update their roadmap?

I would assume sometime in February. The confluence of the shift to HBM products, the potential announcement of the 300-series GPUs, the move to a 20nm process in 2015, and the scheduled investor relations conference is probably a good sign. They've been hinting that we might not see any 16nm FinFET products until 2016, though.
