Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Oh, Lord, they're copying the K from Intel. :ughh:

Still, if the secondhand marketing mumbo-jumbo is correct, they both look like very attractive alternatives to an i3 at that price.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Which they really need. If AMD can stop making the ancient crap they're churning out now, these chips don't have to stomp all over Intel for performance, they just have to make money. A better, competitively priced, well-performing option for the low end that brings in more profit and doesn't tie up manufacturing on crap they don't need to make? Sounds like a feasible path to some black ink, and to working their way into brand-name machines better. Wal-Mart PCs, you know what I mean, but it'd also be a big step up from an i3 for mildly graphically intensive stuff without requiring a separate card, would it not?

Could be grasping at straws, I just want AMD to make some money so I can stop feeling like an Intel fanboy for putting forward what I feel is a pretty solid argument that there isn't a good reason to make an AMD-based computer at any budget :(

TOOT BOOT
May 25, 2010

I was building a mid-range gaming PC with a fairly strict budget and got a new quad core Phenom II for like $60. I haven't regretted it.

Space Gopher
Jul 31, 2006

BLITHERING IDIOT AND HARDCORE DURIAN APOLOGIST. LET ME TELL YOU WHY THIS SHIT DON'T STINK EVEN THOUGH WE ALL KNOW IT DOES BECAUSE I'M SUPER CULTURED.

Factory Factory posted:

Oh, Lord, they're copying the K from Intel. :ughh:

Still, if the secondhand marketing mumbo-jumbo is correct, they both look like very attractive alternatives to an i3 at that price.

AMD was actually doing the Black Edition unlocked-multiplier thing well before Intel brought out the K models. I don't really see a point in this price range, though. Even if an unlocked multiplier is only a few bucks, it's money better spent saving up for a discrete video card.

e: :doh: You're talking about the model numbers themselves, aren't you?

Agreed posted:

Which they really need. If AMD can stop making the ancient crap they're churning out now, these chips don't have to stomp all over Intel for performance, they just have to make money. A better, competitively priced, well-performing option for the low end that brings in more profit and doesn't tie up manufacturing on crap they don't need to make? Sounds like a feasible path to some black ink, and to working their way into brand-name machines better. Wal-Mart PCs, you know what I mean, but it'd also be a big step up from an i3 for mildly graphically intensive stuff without requiring a separate card, would it not?

Could be grasping at straws, I just want AMD to make some money so I can stop feeling like an Intel fanboy for putting forward what I feel is a pretty solid argument that there isn't a good reason to make an AMD-based computer at any budget :(

The mobile APUs are actually really good for low- and mid-budget netbooks and laptops, and the low-end desktop CPUs are still a decent budget choice. The bigger problem for AMD is the future: their new high-end architecture is a tremendous flop, they're going to get squeezed hard on the low end (where they really do have a compelling product) by ARM, and they still haven't managed much with the Fusion stuff beyond putting a decent GPU and a decent CPU on the same piece of silicon.

Space Gopher fucked around with this message at 19:35 on Dec 17, 2011

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Space Gopher posted:

AMD was actually doing the Black Edition unlocked-multiplier thing well before Intel brought out the K models. I don't really see a point in this price range, though. Even if an unlocked multiplier is only a few bucks, it's money better spent saving up for a discrete video card.

Yes, but they denoted that with Black Edition or BE. There doesn't seem to be any reason to go with K other than to leech off the success of Intel's unlocked CPUs and blur the line between brands.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Uh, you could say the K does stand for black, as in CMYK: technically it's "key", but in reality, black.

Oh never mind, this is a winding path of pointlessness. Of course they stole it from Intel.

Why not just U for Unlocked?

KillHour
Oct 28, 2007


HalloKitty posted:

Uh, you could say the K does stand for black, as in CMYK: technically it's "key", but in reality, black.

Oh never mind, this is a winding path of pointlessness. Of course they stole it from Intel.

Why not just U for Unlocked?

Honestly, I'm glad they "stole" it like that. It makes it easier for laymen to decipher what the hell all the model numbers mean if the unlocked versions are the same across vendors.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

KillHour posted:

Honestly, I'm glad they "stole" it like that. It makes it easier for laymen to decipher what the hell all the model numbers mean if the unlocked versions are the same across vendors.

The common person doesn't buy a motherboard that allows them to overclock, let alone try overclocking even if they knew what it was.

In reality it makes very little difference, or that's my feeling. It's not important in the slightest, but my point is that an overclocker, picking out a chip for this purpose, isn't going to get confused by any of the numbers and letters, since they'll do their research anyway.

Everyone else just buys whatever piece of poo poo they happen to see in the store that has a big yellow sign saying that the PC is on a cheap deal.

Moey
Oct 22, 2010

I LIKE TO MOVE IT

HalloKitty posted:

Everyone else just buys whatever piece of poo poo they happen to see in the store that has a big yellow sign saying that the PC is on a cheap deal.

That's my logic too. 90% of people who use computers don't understand any of the "voodoo" that goes on inside. Do companies like Dell and HP use unlocked processors in their machines? If so, wtf, I still have not seen a bios from a computer manufacturer that allows any real changes.

Space Gopher
Jul 31, 2006

BLITHERING IDIOT AND HARDCORE DURIAN APOLOGIST. LET ME TELL YOU WHY THIS SHIT DON'T STINK EVEN THOUGH WE ALL KNOW IT DOES BECAUSE I'M SUPER CULTURED.

HalloKitty posted:

Why not just U for Unlocked?

Hard consonants are more x-treme.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Moey posted:

That's my logic too. 90% of people who use computers don't understand any of the "voodoo" that goes on inside. Do companies like Dell and HP use unlocked processors in their machines? If so, wtf, I still have not seen a bios from a computer manufacturer that allows any real changes.

Dell offers unlocked processors on its Alienware machines. It also offers non-unlocked processors, but gets more juice out of them by sticking them on P67 boards for +4 bins of Turbo.

Of course, by default, the overclock is only 1 bin, or a whopping 1x multiplier/100 MHz. But the BIOS does allow adjustments.
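To make the bin math concrete, here's a quick back-of-the-envelope sketch. It assumes the standard Sandy Bridge 100 MHz base clock, and the 34x stock multiplier is just an illustrative example, not a specific SKU:

```python
# Rough sketch of how Turbo "bins" map to clock speed on a Sandy Bridge-era part.
# The 34x stock multiplier is an illustrative example, not any particular SKU.
BCLK_MHZ = 100  # Sandy Bridge base clock; 1 bin = +1 multiplier = +100 MHz

def effective_clock_mhz(stock_multiplier: int, extra_bins: int) -> int:
    """Effective clock in MHz after bumping the multiplier by some Turbo bins."""
    return (stock_multiplier + extra_bins) * BCLK_MHZ

print(effective_clock_mhz(34, 0))  # 3400 - stock
print(effective_clock_mhz(34, 1))  # 3500 - the default +1 bin
print(effective_clock_mhz(34, 4))  # 3800 - the +4 bins Dell gets out of a P67 board
```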

Spiderdrake
May 12, 2001



Space Gopher posted:

Hard consonants are more x-treme.
Ultra is extreme, but Unlocked is unextreme!

Joking aside isn't U part of 'UM' or 'ULV' or whatever the low power mobile chips are? I think those have been around for ages, haven't they?

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Spiderdrake posted:

Ultra is extreme, but Unlocked is unextreme!

Joking aside isn't U part of 'UM' or 'ULV' or whatever the low power mobile chips are? I think those have been around for ages, haven't they?

ULV and CULV do indeed signify ultra low voltage and.. I think consumer ultra low voltage (although that doesn't make much sense), on Intel chips.

It was only a suggestion. Maybe they could use an X. X is just loving extreme. Hell, Z is even more extreme. It's right at the end and it still doesn't give a gently caress.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Next time on Dragon Core Z, our heroes gather the seven dragon cores and assemble them onto one die to revive Goku, who has learned the Turbo technique from King Kai.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Alereon posted:

This has been pulled due to unexpected performance drops.
So turns out the Bulldozer scheduling patch was Microsoft's fuckup. They released a partial patch that was missing components, which is what caused the performance drop. They'll be releasing a correct patch at some point in the near future.

evil_bunnY
Apr 2, 2003

They can't seem to catch a break :(

Chas McGill
Oct 29, 2010

loves Fat Philippe

Factory Factory posted:

Next time on Dragon Core Z, our heroes gather the seven dragon cores and assemble them onto one die to revive Goku, who has learned the Turbo technique from King Kai.

I'd watch that.

People have talked about hardware for games consoles being part of AMD's revenue stream, but how significant is it? Is it quite a low margin enterprise?

movax
Aug 30, 2008

Chas McGill posted:

I'd watch that.

People have talked about hardware for games consoles being part of AMD's revenue stream, but how significant is it? Is it quite a low margin enterprise?

At this point, I think it's a decent cash stream. The chips were taped out years ago, process improvements are always marching on, and you have a guaranteed customer until they EOL the console and stop production.

As the consoles shrink too, you can migrate those chips to your newer processes so you can decommission your older lines/furnaces/lithography/etc.

Chafe
Dec 17, 2009
Some Chinese leaks of HD 7900 slides. If these features actually work, without any huge driver issues, I'll definitely be buying one of these cards if the price is right.

tijag
Aug 6, 2002

Alereon posted:

I'm really pretty excited about the Radeon 7800-series myself. While the 7900-series looks cool, it's a brand new architecture so I'd rather wait until the kinks get ironed out. The leaked specs for the 7800-series seem to be exactly what I was hoping for, a die-shrunk 6900-series with much lower power consumption (and hopefully great overclocking!).

The 77xx, 78xx, and 79xx are all based on the new GCN architecture. Rumor is that the 78xx will have exactly the same number of CUs as the 69xx series, though, which will allow us to make some interesting comparisons.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

tijag posted:

The 77xx, 78xx, and 79xx are all based on the new GCN architecture. Rumor is that the 78xx will have exactly the same number of CUs as the 69xx series, though, which will allow us to make some interesting comparisons.
While we probably won't know for sure until Thursday, if not later, all indications are that only the 7900-series will be GCN, with all other new cards being VLIW4. Just like how only the 6900-series was VLIW4 and all other models were VLIW5. There's been a lot of confusing press generated by this SemiAccurate story, though that seems like an overinterpretation of information about which cards will be new 28nm models versus recycled 40nm VLIW5. For example, you can see a Radeon HD 7700 card here, and it has the 128-bit memory bus expected of a VLIW4 card, not the 192-bit bus expected of a GCN card.
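If you're wondering why the memory bus width is such a useful tell, a quick bandwidth calculation shows how much it moves the needle. The 5 Gbps GDDR5 data rate below is a round illustrative figure, not a confirmed spec for either card:

```python
# Peak memory bandwidth = (bus width in bits / 8) * per-pin data rate.
# The 5.0 Gbps GDDR5 data rate is a round illustrative number, not a real spec.
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gb_s(128, 5.0))  # 80.0 GB/s  - midrange VLIW4 territory
print(peak_bandwidth_gb_s(192, 5.0))  # 120.0 GB/s - what you'd expect feeding a bigger GCN chip
```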

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
As long as they all have die shrinks and clock bumps, I'm sure they'll be priced well and still be very competitive. This could be an interesting round..

Fuzzy Mammal
Aug 15, 2001

Lipstick Apathy
I'm not the biggest fan of tomshardware but the results of the latest system builder guide are just so depressing...

http://www.tomshardware.com/reviews/fx-overclock-crossfire-ssd,3098.html

Shaocaholica
Oct 29, 2002

Fig. 5E

Fuzzy Mammal posted:

I'm not the biggest fan of tomshardware but the results of the latest system builder guide are just so depressing...

http://www.tomshardware.com/reviews/fx-overclock-crossfire-ssd,3098.html



Err, I'd rather have that than whatever you think would be 'not depressing'.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
^^ Sorry, I think you misunderstood the whole point of it. It's a build at $1000 based around Bulldozer to see what happens vs. a machine of the same cost from September (based on Intel).

It only completely cements how bad Bulldozer is. There really is absolutely no place in which it can fit, other than the waste disposal.

Tom's Hardware posted:

PCMark clearly favors the Intel-based system, and even the overclocked 4.5 GHz FX-6100 cannot match the stock Core i5-2500K's results.

I think we should go back to talking about Southern Islands instead of Bulldozer. Bulldozer is far too depressing.

HalloKitty fucked around with this message at 21:15 on Dec 20, 2011

Goon Matchmaker
Oct 23, 2003

I play too much EVE-Online
Bulldozer is the action that needs to be done with the chip. As in bulldoze the plans into a landfill.

Shaocaholica
Oct 29, 2002

Fig. 5E
Oh, the 'Sept. 2011' system is the older system? I thought it was the newer one. That's pretty lame.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Shaocaholica posted:

Oh, the 'Sept. 2011' system is the older system? I thought it was the newer one. That's pretty lame.

Yeah, that's how badly Bulldozer performs, at an extraordinary power draw. Twice the watts (hope you've got a beefy power supply), very poor performance scaling. Kind of a nightmare for AMD in the enthusiast arena. The server parts aren't much better, even going against Intel's last-gen hardware (soon to be two generations old).

Shaocaholica
Oct 29, 2002

Fig. 5E
Are Intel's server parts always this far behind? My entire workplace uses Xeons for workstations and servers, and I guess even though my Z800 is new, it's still using last-gen CPUs.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
I think you have that backwards. Intel is dominating in everything but per-socket performance and server-load per-watt performance, and AMD is only competitive there because it crams 12 or 16 cores onto a chip where Intel maxes out at 10.

Per core, Intel parts are faster and more power efficient.
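To put toy numbers on that (these figures are purely illustrative, not benchmark results):

```python
# Toy per-socket throughput: cores * relative per-core speed.
# The per-core figures are made up to illustrate the point, not measured data.
def socket_throughput(cores: int, per_core_speed: float) -> float:
    return cores * per_core_speed

intel = socket_throughput(10, 1.0)   # fewer, faster cores
amd = socket_throughput(16, 0.7)     # more, slower cores
print(intel, amd)  # 10.0 vs 11.2 - AMD only edges ahead by sheer core count
```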

Star War Sex Parrot
Oct 2, 2003

Factory Factory posted:

Per core, Intel parts are faster and more power efficient.
He means that the Xeons are "behind" because they haven't been refreshed to the latest architecture.

Intel usually introduces new architectures in the enterprise market first, but has moved away from that in recent years. I can only assume that's for business reasons.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Shaocaholica posted:

Are Intel's server parts always this far behind? My entire workplace uses Xeons for workstations and servers, and I guess even though my Z800 is new, it's still using last-gen CPUs.

"Behind" is a relative concept and has to be weighed against performance offered by their main competitor, who has ~5% of the market (down from a peak of 25% back when their parts were making GBS threads all over Intel's in every category). Introducing badass new Sandy Bridge server parts would be great, but we've already seen with Sandy Bridge-E that they are having some trouble delivering 8-core Sandy Bridge parts whereas previous-gen server processors have a reliable and profitable 10-core manufacturing process.

Given that Sandy Bridge-E's top end parts right now are shipping with two cores disabled, here's some total speculation. Maybe Sandy Bridge-E isn't just a more-money-than-sense consumer trap. It could also be Intel's opportunity to make some money off of what would otherwise be a process-tweaking, spindown-til-it's-right engineering headache. That'd be a pretty good use of an otherwise silly dick waving enthusiast platform. I haven't paid enough attention to Intel's extreme top end lineup or to their server roadmap to see when they do plan to bring newer processors to the server market.

At the moment, in terms of competition, Intel just doesn't need to do more than what they're doing, because what they have is powerful enough to maintain dominance over AMD. There's really no impetus, no rush for them to end-of-life parts that they can make much more reliably and profitably. It's a monopoly in all but name; don't expect a lot of tooth-and-nail scrounging for performance from Intel when they're currently winning the server market, the desktop market, and even putting in great showings in the low-end market (though hopefully future Llano successors will change that). They're sucking at Atom, but who cares, really; that's not their money maker.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Factory Factory posted:

I think you have that backwards. Intel is dominating in everything but per-socket performance and server-load per-watt performance, and AMD is only competitive there because it crams 12 or 16 cores onto a chip where Intel maxes out at 10.

Per core, Intel parts are faster and more power efficient.

I think all he's getting at is that Xeons tend to trail the desktop chips within a generation: Sandy Bridge hits, Sandy Bridge Xeons don't hit until later, and so on.

Used to be the other way round with AMD.

Zhentar
Sep 28, 2003

Brilliant Master Genius

Star War Sex Parrot posted:

Intel usually introduces new architectures in the enterprise market first, but has moved away from that in recent years. I can only assume that's for business reasons.

You sure about that? Looking back as far as Netburst on Wikipedia, the Xeons look like they've lagged at least 3-6 months behind the desktop parts.

As far as why they lag, I'd guess it mainly comes down to the design/testing/validation for the extra Xeon features & layouts added onto the base architecture. Probably just extra testing & validation in general, for that matter. And given the level of pressure they're getting from AMD, their timelines probably err on the side of safety.

Star War Sex Parrot
Oct 2, 2003

Back to Southern Islands chat:

There are new snapshots of the card, as well as performance numbers from a leaked AMD slide. Take the numbers with a grain of salt, since they're certainly "best case scenario" numbers.



Star War Sex Parrot
Oct 2, 2003

Lost the graph somehow.

1.6x the performance of a GTX 580 is certainly appealing. :holy:

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Star War Sex Parrot posted:

Lost the graph somehow.

1.6x the performance of a GTX 580 is certainly appealing. :holy:



If nVidia can't match that, and the numbers bear out, count me in. I'd frankly love to give AMD some money right now; they've had enough bad news.

ohgodwhat
Aug 6, 2005

That is an awfully misleading graph. Granted, that seems to be par for the course with benchmarks.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

ohgodwhat posted:

That is an awfully misleading graph. Granted, that seems to be par for the course with benchmarks.

In what way? The only thing I can think of is that the GTX 580 is much more limited in high-resolution gaming by its 1.5GB frame buffer, which can skew results dramatically depending on the game and resolution. But these are games we already have a pretty good idea of how they perform at the given settings from current benches, so if AMD is being at all genuine and the numbers match their comparisons, that's an impressive jump in performance for a single-GPU card, and it would be tough for nVidia to beat. I'd expect them to at least match it, though, given that Fermi is a pretty old architecture at this point and Kepler ought to come out swinging unless they've had their heads completely up their asses.

Star War Sex Parrot
Oct 2, 2003

Agreed posted:

In what way?
They made the vertical axis origin non-zero to make the AMD bars tower over the NVIDIA bars. The AMD bar is at times 4x larger than the NVIDIA bar when that really only represents 1.6x faster.
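A quick sketch of how a truncated axis pulls that off; the 0.8 baseline is a guess at roughly where the slide starts its axis, purely for illustration:

```python
# How a non-zero axis origin turns a 1.6x performance gap into a ~4x bar-height gap.
# The 0.8 baseline is a guess at where the slide's axis starts, purely illustrative.
nvidia, amd = 1.0, 1.6     # relative performance, GTX 580 normalized to 1.0
baseline = 0.8             # where the vertical axis begins instead of 0

honest_ratio = amd / nvidia                            # 1.6x faster
visual_ratio = (amd - baseline) / (nvidia - baseline)  # 4.0x taller bar
print(honest_ratio, visual_ratio)
```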

PR departments are poo poo (especially AMD's). What else is new. That's why I said take it with a grain of salt.
