Zhentar
Sep 28, 2003

Brilliant Master Genius

Shaocaholica posted:

So is BD AMD's Netburst? Or is it better/worse than that analogy?

Yes, that seems like a reasonable comparison. AMD has been pretty tight-lipped about architectural details, so it's not clear how apt the analogy is, or how bad things are - AMD has made some Netburst-like design decisions, and it may well suffer from the same fundamental architectural flaws as Netburst, for the same reasons. Or its failings may be much smaller; Piledriver could bring refinements that resolve implementation flaws holding Bulldozer back and put much more competitive performance on the table. It's hard to say; there are arguments supporting both views but not much concrete information available to sway things one way or the other.


Coredump
Dec 1, 2002

HalloKitty posted:

But they say the correct thing right there, that the company was just correcting a mistake in the specs it handed out.

It's not worth a news item on a site like Engadget.

The headline and the first part of the article make it sound like AMD reduced the number of transistors on the chip, rather than just correcting the published transistor count. It's misleading in how they present the information, and good for a laugh for just how misleading it is.

JawnV6
Jul 4, 2004

So hot ...

HalloKitty posted:

But they say the correct thing right there, that the company was just correcting a mistake in the specs it handed out.

It's not worth a news item on a site like Engadget.

The guy who wrote that seriously doesn't understand it was a correction. He quotes AMD's PR on it, which correctly identifies the situation, but his use of words like "the reduction," "depriving," and "loss" reveals a lovely understanding.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Yeah, he misidentifies it as AMD removing 800,000,000 transistors that were previously part of the design without affecting the CPU, which would be impossible. Then he quotes the company's accurate description.

Why AMD claimed 2 billion transistors in the first place when they're only using 1.2 billion is what's weird to me.

Ryokurin
Jul 14, 2001

Wanna Die?

Coredump posted:

The headline and the first part of the article make it sound like AMD reduced the number of transistors on the chip, rather than just correcting the published transistor count. It's misleading in how they present the information, and good for a laugh for just how misleading it is.

A classic definition of linkbait. Unfortunately, most people don't tend to read further. That's what started the whole "AMD is leaving behind x86" business in the first place.

Jabor
Jul 16, 2010

#1 Loser at SpaceChem

Agreed posted:

Why AMD claimed 2 billion transistors in the first place when they're only using 1.2 billion is what's weird to me.

A miscommunication at some point seems likely.

For example, someone in marketing asks an engineer how many transistors there are, he says "one point two billion", but at that exact moment a train goes past and the marketing guy doesn't hear the "one point" part.

Shaocaholica
Oct 29, 2002

Fig. 5E

Jabor posted:

A miscommunication at some point seems likely.

For example, someone in marketing asks an engineer how many transistors there are, he says "one point two billion", but at that exact moment a train goes past and the marketing guy doesn't hear the "one point" part.

That's a possibility, but if that were the case it makes them all look incompetent, because that's exactly what incompetent marketing would do.

movax
Aug 30, 2008

If I recall correctly, most textbook examples of the kind of memory you'd use for cache take something like six transistors per bit.

I'm going 100% off memory at a bar right now; I can check my VLSI textbook when I get home to make sure. Someone like JawnV6 or the other chip design goons could comment in more detail, though - I've only done small mixed-signal designs.

sleepy gary
Jan 11, 2006

movax posted:

If I recall correctly, most textbook examples of the kind of memory you'd use for cache take something like six transistors per bit.

I'm going 100% off memory at a bar right now; I can check my VLSI textbook when I get home to make sure. Someone like JawnV6 or the other chip design goons could comment in more detail, though - I've only done small mixed-signal designs.

I'm not sure why you're bringing this up, but on-die cache is generally SRAM, which is a minimum of 6 transistors per bit. Sometimes more, rarely less. Contrast that with standard DRAM, which is 1 transistor (plus a capacitor) per bit.

Shaocaholica
Oct 29, 2002

Fig. 5E
So Newegg has the '4 module' BD listed as 'Eight-core'. Isn't that a bit misleading, since it doesn't have 8 FPUs? The 4C/8T Intel chips are still listed as 'four-core'. For FPU-intensive work, does the extra hardware in BD really add any performance? How does it fare for something like Folding@home?

Nintendo Kid
Aug 4, 2011

by Smythe

Shaocaholica posted:

For FPU-intensive work, does the extra hardware in BD really add any performance? How does it fare for something like Folding@home?

No, and terrible.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Shaocaholica posted:

So Newegg has the '4 module' BD listed as 'Eight-core'. Isn't that a bit misleading, since it doesn't have 8 FPUs? The 4C/8T Intel chips are still listed as 'four-core'. For FPU-intensive work, does the extra hardware in BD really add any performance? How does it fare for something like Folding@home?

Trying to pin it down as 4-core or 8-core is tough, because there is actually some interesting, distinctive hardware that can do operations in kind of a neat way depending on thread scheduling and what sorts of tasks are being handled. Theoretically, it's a pretty neat processor. If code were written to take advantage of it, we'd (probably) be seeing remarkably competitive processing power for multi-threaded tasks considering AMD's comparatively limited resources. It was supposed to catch up to Westmere/Beckton, but it missed that, remember...

It's a processor that's a bit ahead of its time, too - tuned for multi-threaded tasks and for things that just aren't widely implemented yet. It isn't helped by the fact that Intel manages to leverage a smaller transistor budget, a greatly reduced power draw and thermal envelope, and more conventional threaded execution (even with their current-gen hyperthreading) to outperform it in pretty much everything.

Piledriver could be a really powerful follow-up, depending on what they do with it. Bulldozer's a disappointment, but unless they just give up, there's still a lot of neat stuff going on that might pan out into something competitive and interesting. Or it might not, and Intel wins, and we see the effects of a virtual monopoly. Like Intel not bringing their higher-performing, more efficient Sandy Bridge architecture to servers, because previous-gen CPUs are good enough to embarrass AMD - so why go to the trouble of switching over from a profitable, mature process to a new one that will require more engineering to get up and running?

Dark days for everybody if AMD really pulls out of the race. I want to give them as much hope and credit as possible, but they've been stepping in dog poo poo with each new move lately - it's like watching a clown in a minefield, just totally hosed up.

movax
Aug 30, 2008

DNova posted:

I'm not sure why you're bringing this up, but on-die cache is generally SRAM, which is a minimum of 6 transistors per bit. Sometimes more, rarely less. Contrast that with standard DRAM, which is 1 transistor (plus a capacitor) per bit.

Ah, so I was remembering correctly :woop:. I was pretty sure it was SRAM, but I was impaired and couldn't think straight so I left it to someone else to clarify.

I figured that the massive amount of cache on Bulldozer would have contributed a significant amount to the various transistor count figures bandied about by marketing/engineering. Not that it matters, because 1.2 billion or 2 billion transistors, it still screwed AMD either way.
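
As a rough back-of-the-envelope - a sketch assuming plain 6T cells and round numbers, ignoring tag arrays, decoders, ECC, and redundancy (so not AMD's actual figures) - the cache alone eats a huge chunk of the budget:

code:

#include <stdio.h>

int main(void) {
    /* Round numbers for illustration: an FX-8150 carries roughly
       8 MB of L2 (2 MB per module) plus 8 MB of L3. */
    long long cache_bytes = 16LL * 1024 * 1024;
    long long cache_bits  = cache_bytes * 8;
    long long transistors = cache_bits * 6;   /* 6T SRAM cell per bit */

    /* Prints ~805 million (~0.81 billion) for the data arrays alone. */
    printf("~%lld transistors (%.2f billion)\n",
           transistors, transistors / 1e9);
    return 0;
}

Amusingly, that lands right around the 800 million transistors that "went missing" between the 2 billion and 1.2 billion figures, which makes some kind of counting mix-up look plausible.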

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
AMD has announced their low-end Radeon HD 7000M-series for laptops: It's a lovely re-release of their 40nm Radeon HD 6000M-series, just at different clockspeeds and configurations. nVidia is doing them one better and simply renaming the existing Geforce 500M-series to 600M, with no changes whatsoever.

Good to see AMD and nVidia taking the opportunity to gently caress us once again.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Yeah, I saw that article, and was immediately disappointed.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
That was incredibly disappointing.

evil_bunnY
Apr 2, 2003

Bob Morales posted:

More like their Pentium Pro. Huge die size, slower on existing code but faster on code optimized for it, and geared more towards future code.

Also, the poo poo branch predictor and prediction-miss cycle penalty make it a p hilarious comparison.

Zhentar
Sep 28, 2003

Brilliant Master Genius

Shaocaholica posted:

For FPU-intensive work, does the extra hardware in BD really add any performance?

Theoretically, yes. Even the heaviest FP loads involve a significant amount of integer work - things like pointer math, loop counters, and such. Back in the day, I ran some simulations of a simple superscalar in-order processor, and in a lot of the floating-point benchmarks the FP units spent more time idle than not. Out-of-order execution helps with that a lot, and Hyperthreading helps with it a little, but there's still plenty of room for improvement. Feeding the FPU with two threads on separate integer cores will achieve higher FPU utilization.
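
To make that concrete, here's a minimal C sketch (an illustration of the point, not code from any actual benchmark) - even this "pure" FP loop keeps the integer side busy:

code:

#include <stddef.h>

/* Even the most FP-heavy loop leans on the integer units: the
   induction variable, the address calculation for a[i], and the
   loop-exit compare/branch are all integer work. */
double sum(const double *a, size_t n) {
    double s = 0.0;
    for (size_t i = 0; i < n; i++)   /* integer increment + compare */
        s += a[i];                   /* address math is integer, too */
    return s;
}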

As for how it fares in F@H, I haven't seen any benchmarks, but I can make a decent guess: it'll put up a pretty good fight against the i5-2500K, possibly even beating it. If they put out an F@H client compiled with FMA, it'll almost certainly beat the i5-2500K, perhaps even by a significant margin. But it'll eat up a lot more power in the process, and still suck at most other tasks, so it's still not worth it.
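
For reference, FMA fuses a multiply and an add into a single operation with one rounding step; C99 exposes it as fma(). A generic illustration, not anything from the actual F@H codebase:

code:

#include <math.h>
#include <stddef.h>

/* Dot product via fused multiply-add: one fused operation per
   element instead of a separate multiply and add. */
double dot(const double *a, const double *b, size_t n) {
    double s = 0.0;
    for (size_t i = 0; i < n; i++)
        s = fma(a[i], b[i], s);   /* s = a[i] * b[i] + s, fused */
    return s;
}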

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Alereon posted:

AMD has announced their low-end Radeon HD 7000M-series for laptops: It's a lovely re-release of their 40nm Radeon HD 6000M-series, just at different clockspeeds and configurations. nVidia is doing them one better and simply renaming the existing Geforce 500M-series to 600M, with no changes whatsoever.

Good to see AMD and nVidia taking the opportunity to gently caress us once again.

Isn't this the second time in a row for nVidia? Fermi 4ever!

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Agreed posted:

Isn't this the second time in a row for nVidia? Fermi 4ever!

No no, they've done worse than rebranding the same chip twice.

How about these three:
Geforce 9800 GTX+
Geforce GTS 150
Geforce GTS 250

Or these three:
Geforce 310
Geforce 210
Geforce 405

greasyhands
Oct 28, 2006

Best quality posts,
freshly delivered
Nvidia is just saying 'go gently caress yourself' at this point. Not only is this the 2nd consecutive rebrand of the same product, they're shifting the identifiers around too (the 555M becomes the 635M instead of the 655M, and the 540M becomes the 630M) in an attempt to completely befuddle anyone who doesn't spend their afternoons reading about hardware specs. Maybe they shifted the model numbers down to make room for an improved midrange, but for the 5xx generation they widened the number gap between similar-performance parts, so who knows. These 6xx parts have essentially regressed to the 4xx naming scheme. gently caress them. This rebadge also doesn't bode well for those Kepler low/midrange parts launching anytime soon.

greasyhands fucked around with this message at 18:54 on Dec 7, 2011

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Nvidia is the worse offender of the two by far when it comes to loving you by rebranding old poo poo.

There's not even any need for it. OEMs might demand it, but why? People who are fooled by these numbers don't know a goddamn thing about them anyway, so the numbers may as well stay the same.

The people who do 10 minutes of research know they're being bent over a pole by being resold the same stuff, and make their decisions accordingly. Either way, it doesn't aid brand image in any sense I can see.

Shaocaholica
Oct 29, 2002

Fig. 5E
Is there a nicely formatted Wikipedia article listing all the GPU families and the hosed up renaming? I would think that would be one of the best forms of information against the practice.

movax
Aug 30, 2008

Shaocaholica posted:

Is there a nicely formatted Wikipedia article listing all the GPU families and the hosed up renaming? I would think that would be one of the best forms of information against the practice.

I'm not sure about a cross-family comparison in particular, but each generation's article generally has a pretty good table listing all the specs, plus launch pricing and launch dates, which is a very good place to start.

I gave up trying to understand laptop GPU naming long ago (I have no idea what my MBP's GT330M compares to), but they're plenty guilty of this on the desktop as well. The low-end parts are often based on last-generation stuff and share very little in common with their higher-end buddies.

I wish that nvidia and AMD would just stop bothering, seeing as integrated graphics keep growing in penetration (especially thanks to, you know, shipping *with* the CPU), and I don't see very many use cases where you'd need a low-end discrete GPU over the IGP in your CPU and chipset.

Of course they won't, and people will still pay $100 for a POS discrete card that performs maybe 10% better than integrated graphics but falls into the dead zone of being plenty for regular computing yet useless for playing games.

e: I actually went and looked up some stuff for nvidia in particular. The GeForce article has a table at the bottom that attempts to make sense of part names. Here is a giant spec table of nvidia GPUs. The most important thing to pay attention to there is the code-name, which can give you a relative idea of what generation is at play. And here are Red Team's charts.

movax fucked around with this message at 19:18 on Dec 7, 2011

Civil
Apr 21, 2003

Do you see this? This means "Have a nice day".

Shaocaholica posted:

Is there a nicely formatted Wikipedia article listing all the GPU families and the hosed up renaming? I would think that would be one of the best forms of information against the practice.

Once Tom's updates their graphics card hierarchy, that's probably one of the best places.

http://www.tomshardware.com/reviews/fastest-graphics-card-radeon-geforce,3067-7.html

I hadn't built a system in 4-5 years and was completely out of the loop when buying a new card, and this helped straighten things out for me. Nvidia and AMD certainly aren't interested in laying this out plainly.

That's probably why you'll see a card with a 64-bit interface and 1GB of VRAM sitting next to a card with a 128-bit interface and 512MB. How is Joe Consumer supposed to make sense of that poo poo without scouring forums for weeks prior to purchase?
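
For a sense of scale, a sketch with made-up but era-plausible numbers (real cards vary):

code:

#include <stdio.h>

int main(void) {
    /* Peak memory bandwidth = (bus width / 8) bytes per transfer
       times the effective data rate. 1800 MT/s is a hypothetical,
       era-typical figure for low-end DDR3/GDDR3-class memory. */
    double mts = 1800e6;   /* effective transfers per second */
    printf("64-bit bus,  1 GB VRAM: %.1f GB/s\n", (64 / 8) * mts / 1e9);
    printf("128-bit bus, 512 MB:    %.1f GB/s\n", (128 / 8) * mts / 1e9);
    return 0;   /* 14.4 vs 28.8 GB/s: half the VRAM, twice the bandwidth */
}

The card with half the memory has twice the bandwidth, and bandwidth is usually what starves a low-end card first.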

It would be nice if both companies (and Intel, I guess) would adopt a performance rating system that helped consumers out with basic raw performance. You'll still have DX feature support, outputs, interfaces, and all that to worry about, but it would be a nice goodwill gesture.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Civil posted:

It would be nice if both companies (and Intel, I guess) would adopt a performance rating system that helped consumers out with basic raw performance. You'll still have DX feature support, outputs, interfaces, and all that to worry about, but it would be a nice goodwill gesture.

That basically already exists in the Windows Experience Index, which was clearly designed to fulfill this role - but it never gets used.

Basically, marketing always wins. They want to shift more cards to gullible consumers, and give OEMs a new number to stick on their spec charts every 6 months. That's all there is to it.

movax
Aug 30, 2008

HalloKitty posted:

That basically already exists in the Windows Experience Index, which was clearly designed to fulfill this role - but it never gets used.

Basically, marketing always wins. They want to shift more cards to gullible consumers, and give OEMs a new number to stick on their spec charts every 6 months. That's all there is to it.

3DMark tried, but then came the issue of driver cheating and optimizing for certain benchmarks as well.

Basically, fire all marketing departments into the sun. Impossible demands, promising poo poo we as engineers can't deliver, and then confusing the poo poo out of the customer. :argh:

Shaocaholica
Oct 29, 2002

Fig. 5E

Civil posted:

Once Tom's updates their graphics card hierarchy, that's probably one of the best places.

http://www.tomshardware.com/reviews/fastest-graphics-card-radeon-geforce,3067-7.html


That list is still a bit hard to read. Maybe I should make my own on Wikipedia.

movax
Aug 30, 2008

Shaocaholica posted:

That list is still a bit hard to read. Maybe I should make my own on Wikipedia.

The avalanche of [citation needed] heard around the world...

Fortuitous Bumble
Jan 5, 2007

Shaocaholica posted:

Is there a nicely formatted Wikipedia article listing all the GPU families and the hosed up renaming? I would think that would be one of the best forms of information against the practice.

Not exactly the same, but Wikipedia has this hilarious article where someone tried to go through and explain what every ATI video card suffix meant, from back when they used suffixes instead of changing the numbers all the time. I had no idea there were this many: http://en.wikipedia.org/wiki/ATI_video_card_suffixes

It's filled with these wonderful explanations that make me so thankful that they stopped with this poo poo:

quote:

GT (Gran Turismo)

'GT' cards are among the less powerful and expensive suffixes, though they are only applied to graphics cores designed for the enthusiast/performance segment, resulting in a card made for the mainstream segment.

quote:

RX

'RX' is identical to 'GT'; a company might produce a card and give it the 'RX' suffix, or the 'GT' suffix, and never sell cards using both labels; because of this, they can both be considered effectively the same.

quote:

(no suffix)

Contrary as to what it may imply, the lack of a suffix, for most modern video cards, does NOT indicate that it is the "default" model. Rather, it simply lies roughly in the middle in terms of performance and price.

Edward IV
Jan 15, 2006

At the very least, it's only the low-end GPUs that got rebranded. There's hope that AMD and NVIDIA have something better for their high-end products, although there's no evidence of that yet. Of course, the OEMs will probably use the low-end GPUs in more of their products and save the genuinely new GPUs, if any, for their real high-end and expensive laptops. :sigh:

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

FWIW, at the low resolution that my 15" laptop runs, its 540M seems like a pretty sweet spot for gaming performance without being a nuisance the rest of the time (Optimus is awesome). Not going to be playing The Witcher 2 any time soon, but it'll run StarCraft with a lot of settings turned up.

~Coxy
Dec 9, 2003

R.I.P. Inter-OS Sass - b.2000AD d.2003AD

Fortuitous Bumble posted:

Not exactly the same, but Wikipedia has this hilarious article where someone tried to go through and explain what every ATI video card suffix meant, from back when they used suffixes instead of changing the numbers all the time. I had no idea there were this many: http://en.wikipedia.org/wiki/ATI_video_card_suffixes

It's filled with these wonderful explanations that make me so thankful that they stopped with this poo poo:

If I'm remembering my old Radeons correctly, there was one card with an "LE" model that had a half-width memory bus and another "LE" model that was Mac-only and actually had a slightly improved clockspeed or somesuch. :negative:

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Agreed posted:

FWIW, at the low resolution that my 15" laptop runs, its 540M seems like a pretty sweet spot for gaming performance without being a nuisance the rest of the time (Optimus is awesome). Not going to be playing The Witcher 2 any time soon, but it'll run StarCraft with a lot of settings turned up.

I'd put the cutoff at the GTX 560M, or maybe the GT 555M if you can find the GF106 version. Below that point you only get 4 ROPs and very little memory bandwidth, and while that doesn't really ruin things for you at 768p except in the newest games, it raises the question of why you'd go with a laptop with a dedicated GPU at all for the marginal performance you get.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Alereon posted:

I'd put the cutoff at the GTX 560M, or maybe the GT 555M if you can find the GF106 version. Below that point you only get 4 ROPs and very little memory bandwidth, and while that doesn't really ruin things for you at 768p except in the newest games, it raises the question of why you'd go with a laptop with a dedicated GPU at all for the marginal performance you get.

Dramatically superior performance compared to the previous-gen integrated graphics, especially in terms of minimum framerate (and maximum framerate, and support for DX/OpenGL features that Intel ignores completely), and I don't put a strong emphasis on gaming. The laptop's card was basically free, just kind of a bonus that it's there - it was not a consideration going in, and "gaming laptops" are a poo poo idea from the word go if you ask me. It's just kind of cool that I can load up Steam and play stuff that the integrated graphics couldn't handle worth a drat, at playable framerates. I mean, the hardware isn't anything special by any means, but it's about as powerful as what's in current consoles, and with so many ports that means you can turn a few hoggish features down and actually play games.

Anything with less than a 128-bit memory bus is a complete waste of money, though. If you don't get it for free, you're getting screwed.

Devian666
Aug 20, 2008

Take some advice Chris.

Fun Shoe
One thing I've noticed is that if you don't have a GPU, there are heaps of games that will not run at all - games that would otherwise run alright on even a 540M.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Shaocaholica posted:

That list is still a bit hard to read. Maybe I should make my own on wikipedia.

Good luck creating any new page on Wikipedia; there's an army of people with nothing better to do than stamp down on anything that might be interesting, whilst at the same time adding lengthy prose to articles about toys, child abuse and anime.

I recently created a page, which was short but potentially interesting, and along came Wikipedia's finest; a sample of his user page includes such delights as "this user is pagan" and "this user is a furry".

SRQ
Nov 9, 2009

HalloKitty posted:

That basically already exists in the Windows Experience Index, which was clearly designed to fulfill this role - but it never gets used.

Basically, marketing always wins. They want to shift more cards to gullible consumers, and give OEMs a new number to stick on their spec charts every 6 months. That's all there is to it.

My hard drive rating has stayed the same since 2006, even though I've gone through five.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

SRQ posted:

My hard drive rating has stayed the same since 2006, even though I've gone through five.

Let me guess: 5.9. I'm pretty sure mechanical disks are capped there.
That said, Vista and 7 WEI scores aren't comparable.


SRQ
Nov 9, 2009

Does it work on a wacky logarithmic warp scale or is it just linear?
