  • Locked thread
Ryokurin
Jul 14, 2001

Wanna Die?

Nostrum posted:

AMD was also given a golden ticket by nVidia; I would be willing to bet that there are still a ton of people out there running A7N8X motherboards with a Barton 3500+.

Well, when you think about it, it's pretty easy to shine in a sea of poo poo. VIA had problems with every chipset from the KT400 on (AGP or overclocking stability), and SiS chipsets were hit or miss; much like the early ATI chipsets, they suffered from various performance issues. nVidia chipsets had their own problems that people admitted to later on, but they were clearly the best at the time.

HalloKitty posted:

I stand corrected, that time when RDRAM was on P3 before the early P4s was something I forgot about..

Most people forgot because they stuck with VIA chipsets or overclocked BX boards. That was the golden age of Abit and the Oskar Wu-designed boards that pulled off the impossible.

Alereon posted:

It seems like it's going to come down to how efficient AMD's Turbo Core 2.0 is. Turbo Boost on the Sandy Bridge has turned into a huge platform advantage, and there's basically nothing out there that's going to let AMD leverage more than four cores. AMD has been trying ever since Cool'n'Quiet without much success, but maybe they closed the gap. I'm more excited about the Llano APU, it's like getting a free Radeon 6500-class videocard with your Athlon II X4, which isn't a bad deal. The CPU can also probably stand up with an external videocard a lot better than an i3.

Didn't they admit a few weeks back that Cool'n'Quiet came fairly late in the Athlon 64's design phase, so they had only been able to make minor improvements since, but that this time they were able to design it in from the start? I really hope it is truly better, as I would love to update my HTPC with one of these, or build a fileserver.


Ryokurin
Jul 14, 2001

Wanna Die?

spasticColon posted:

So let me get this straight, you can put a BD chip in an AM3 motherboard with a bios update but you can't put a AM3 CPU in an AM3+ motherboard?

No, an AM3 chip can go into an AM3+ board, similar to how you could put an AM2 chip into an AM2+ board.

What it sounds like is that they may have added an undocumented feature that allows some AM3 boards to work with an AM3+ chip, but probably at some small disadvantage, similar to how some K6-2 chips internally remapped a multiplier so old 66MHz boards could use them (obviously at a speed disadvantage compared to a real 100MHz board).

Keep in mind they did make some AM3 boards based on 790-series chipsets, and I highly doubt there will be BIOS updates for those boards; even then, having an 800-series chipset isn't a guarantee.

Ryokurin
Jul 14, 2001

Wanna Die?

conntrack posted:

Sounded like most computer stores before the K7 cpus from amd. Sure you could buy amd if you wished your system to catch fire and kill your dog or any such helpful hints.

Some computer users too. I remember a heated argument I had with one guy in the Maximum PC Delphi forums circa 1999 who swore AMD processors were incompatible with some part of x86. He would post something as proof, and I would post a screenshot of it running alongside the Win98 System Properties screen to show the processor. This went on at least 7-8 times and he still swore I was doing something to make it work.

The time before K7 was interesting indeed when people swore that anything that wasn't Intel sucked.

Ryokurin
Jul 14, 2001

Wanna Die?

TreFitty posted:

Yea, SiS was always around and always horrible. They were around even pre-K7. AMD even had their own chipset for the first-gen K7's to fill the void which was also bad for I forgot what reasons.

If I recall correctly, it didn't support 4x AGP, and they tended to burn out, especially on systems past 1GHz.

Ryokurin
Jul 14, 2001

Wanna Die?
Yep. Going from tape-out to a shipping product can take a year: http://www.fudzilla.com/processors/item/13286-from-cpu-tape-out-to-shipping-takes-a-year

And it's very unlikely that AMD is going to want to be first in line for TSMC's 28nm process, even though TSMC claims to be ahead of where they thought they would be right now.

Ryokurin
Jul 14, 2001

Wanna Die?
Unless Intel steps it up, Ivy Bridge won't happen until next year. I don't expect AMD to match it, but at least to shorten the gap. If it's within 10% I'll be happy.

Ryokurin
Jul 14, 2001

Wanna Die?

eames posted:

Then it’ll probably be their last screwup, because I don’t see how they have a chance in the x86 game once Ivy Bridge is out.
See Intel’s 22nm Tri-Gate Transistors and AMD’s lack of having something comparable within the near future.

FinFETs have been around for a while and weren't invented by Intel, so the only thing they can protect is trade secrets on how they design chips around their use. TSMC also demoed similar technology around the same time Intel did, a decade ago, and has said they will start using it at 20nm. The big holdout is GlobalFoundries, which hasn't really said much about their FinFET plans at all.

Then there's also the statement from one of the developers of FinFETs that they won't result in better performance once we go below 22nm anyway. I think the key is going to be whether Intel's process allows a performance advantage below 22nm, or whether they are pushing it just to lower power and better compete with ARM.

Ryokurin
Jul 14, 2001

Wanna Die?

Agreed posted:

That's horrible loving news. I haven't built an AMD system in years and years but they're the only balancing factor keeping the arms race going for consumers. Who else is even remotely positioned to offer an alternative to Intel? What's the future of AMD (and ATI) if they lose the CPU race this many generations in a row?

While it's likely correct that it's going to disappoint, you need to take Theo Valich's articles with a grain of salt. He's been dead wrong several times in the past.

Ryokurin
Jul 14, 2001

Wanna Die?
Yep. If I recall, Paul Thurrott stated that programs will be able to be recompiled to work on ARM, but otherwise there is no compatibility layer or emulator. Even during the D8 demo, they stated that Office had been recompiled to work on ARM. In other words, you'll be at the mercy of developers providing a binary.

Ryokurin
Jul 14, 2001

Wanna Die?

Alereon posted:

nVidia's GPUs are overly compute-focused for the console market, and their power efficiency is significantly worse. This article from SemiAccurate about the development of nVidia's upcoming Kepler GPU has a bit more information, especially about the manufacturing and engineering challenges nVidia is facing on TSMC's 28nm process that AMD seems to have surpassed. Basically AMD can deliver a graphics solution that's smaller, cooler, and less expensive for the same amount of performance, and they seem to be in a better position to actually deliver upcoming products sooner (remember how late the Geforce 400-series was). I'm mostly comparing the Radeon HD 6870 to the Geforce GTX 560 Ti here since I don't think anyone's putting a GTX 580/R6970 in a console, but I think the same relative proportions are likely to apply to the next generation of mid-range parts.

It will be interesting to see which generation of GPU they use: VLIW4, or the vector + scalar design that Southern Islands is alleged to be based on. VLIW5 and VLIW4 were much more efficient than Nvidia's designs, but were also pretty hard to fully utilize, which explains why they never really walked all over anything Nvidia put out. Vector + scalar is closer to what Nvidia uses, but with some improvements. I doubt it will run hotter than what Nvidia will produce, but we also don't know yet whether the improvements they have made will actually be seen in the real world.

Ryokurin
Jul 14, 2001

Wanna Die?
If AMD ever got to the point where bankruptcy was a real possibility, Intel would probably throw them an indirect lifeline, similar to how Microsoft purchased Apple stock in the 90s, simply because AMD is the only thing that keeps an antitrust inquiry from gaining traction. Not to mention Intel can't afford to have the patents go to another party who may not be as flexible when it comes to agreements.

Unless it's a world where x86 is dying, or AMD slides so far that they are well below 10% like everyone else, they'll survive somehow.

Ryokurin
Jul 14, 2001

Wanna Die?

Space Gopher posted:

Sure, but when it was designed Microsoft placed a lot more importance on getting something out there to compete now than on an optimum solution. They went with x86 not because it was the best choice for the task at hand, but because they had a lot of people very good with x86, and they could do it all with off-the-shelf parts. Nobody pretended it was the best choice.

When it came time to develop the 360, which wasn't nearly as much of a rush job, they went with a custom Power chip just like the rest of the industry. Now, if they are in fact going with a Bulldozer-based APU, it seems like their best and brightest have sat down and said, "starting from a clean sheet, x86 is clearly the best option for our new console." If AMD's starting to do some really interesting stuff with CPU/GPU hybrids behind closed doors, then it might be realistic, but it still feels kind of weird.

About the only thing I can say against it is that the main reason they killed the original Xbox so quickly was that Nvidia wanted more money for their chips, and there were rumors that Intel didn't want to make that processor any longer. What's stopping AMD from making a similar call a few years into Xbox Next production? The 360 wasn't really about following everyone else; it was about making sure they owned as much of the process as possible.

And then again, they need to make sure whatever they make doesn't cost more than the current system. The days of charging $600 for a console are over, and Bulldozer is probably one of the easiest ways to do one for $400.

Ryokurin
Jul 14, 2001

Wanna Die?

Mr Chips posted:

How much of this is a result of AMD spinning off their fabs?

28nm has been hard for everyone, and they were already having trouble getting 32nm going for Llano, so AMD should have seen this coming. That fact gives me pause about the idea that they would go with them for Wichita and Krishna, especially since TSMC built Brazos. It just seems weird to switch like that generation-wise.

edit: completing a thought.

Ryokurin fucked around with this message at 16:25 on Nov 22, 2011

Ryokurin
Jul 14, 2001

Wanna Die?

rscott posted:

TSMC having process issues? Well I never!

Looks like it's the opposite. They are dropping GlobalFoundries for TSMC. http://www.extremetech.com/computing/106217-manufacturing-bombshell-amd-cancels-28nm-apus-starts-from-scratch-at-tsmc

Ryokurin
Jul 14, 2001

Wanna Die?
They may very well be, as that market is going to get smaller over the years, or at the very least it will become harder and harder to justify an upgrade every 2-3 years as it has been. Either way, I don't see it happening in the near term (that is, the next year), as the alternatives still need work. It would be different if they hadn't sold off their ARM division a few years back, still made their own memory instead of rebranding, or had someone who could execute their low-powered chip production properly. They owe it to their shareholders and their future to take a hard look at where they see things going.

Ryokurin
Jul 14, 2001

Wanna Die?

Install Gentoo posted:

There really isn't a good reason to assume the market for x86 computers is going to get smaller, only that it will continue to get bigger while other stuff also gets bigger faster, but without reducing the x86 market.

I was talking more about the desktop computer market than x86 as a whole. Notebooks already outsell desktops, and tablets will eventually become a bigger part of the market. Desktops won't go away, but they will move more into a server role, or into heavy-duty tasks that need higher resolution or more power than the average notebook/tablet can provide.

Ryokurin
Jul 14, 2001

Wanna Die?

Ragingsheep posted:

Does the ARM version of Win8 need specifically complied binaries or can it run x86 code?

ARM will only run Metro (WinRT) applications.

Ryokurin
Jul 14, 2001

Wanna Die?

Devian666 posted:

Most failed companies diversify into other areas when they aren't profitable at their core businesses. They should be focusing on making ATI profitable first, then look at getting more profit out of the CPU business. At least they have admitted they aren't really competing with Intel anymore.

Look at the third-quarter results. Their biggest area of growth has been mobile, where they can't keep up with demand for Llano; revenue increased 35% over the previous quarter. ATI is profitable, but that largely depends on whether they have the fastest card in a given quarter, and that market is not going to grow and may possibly shrink as more people are exposed to integrated graphics. Going toward mobile is the right choice right now.

Ryokurin
Jul 14, 2001

Wanna Die?

Coredump posted:

The headline and the first part of the article make it sound like AMD reduced the number of transistors on the chip instead of just updating the incorrect information on the number of transistors. Its misleading with how they present the information and good for a laugh with how misleading it is.

A classic definition of linkbait. Unfortunately, most people don't tend to read further. That's what started the whole "AMD is leaving x86 behind" business in the first place.

Ryokurin
Jul 14, 2001

Wanna Die?

SRQ posted:

Does it work on a wacky logarithmic warp scale or is it just linear?

I don't think they have officially said, but they did state when Vista came out that they expect to add a number every 12 to 18 months. That's to account for new hardware, and it does not mean your numbers will eventually drop to 1.


Ryokurin
Jul 14, 2001

Wanna Die?

SwissCM posted:

That wasn't really an issue anymore when the nforce series was released though.

The nForce didn't have some of those stability issues, but it did tend to have other problems with its Ethernet, firewall, and real-time clock. For an AMD chipset it was awesome, but it still didn't rival Intel's best at the time.
