  • Locked thread
HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
I don't think I can even bear to read it. Poor AMD.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Killer robot posted:

That's a weird story. I saw AnandTech just announced today that the Bulldozer transistor count is actually 1.2 billion rather than 2 billion as earlier reported. Obviously this doesn't change the performance of the chip itself, but I guess the "2 billion transistors and this is all we get?" arguments do go down a little when they become "1.2 billion transistors and this is all we get?" instead. Or more to the point, turn into "that much die space on that process and..." since they're working with a significantly lower transistor density than earlier estimates suggested.

It makes the engineering sound slightly less bad. It still doesn't change the benchmark results, or the power draw and heat. I guess they're attempting whatever damage control they can right now.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Coredump posted:

Ha ha look how wrong Engadget gets it.

But they say the correct thing right there: the company was just correcting a mistake in the specs it handed out.

It's not worth a news item on a site like Engadget.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Yeah, I saw that article, and was immediately disappointed.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Agreed posted:

Isn't this the second time in a row for nVidia? Fermi 4ever!

No no, they've done worse than rebranding the same chip twice.

How about these three:
Geforce 9800 GTX+
Geforce GTS 150
Geforce GTS 250

Or these three:
Geforce 310
Geforce 210
Geforce 405

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Nvidia is the worse offender of the two by far when it comes to loving you by rebranding old poo poo.

There's not even any need for it. OEMs might demand it, but why? People who are fooled by these numbers don't know a goddamn thing about the hardware anyway, so the numbers may as well stay the same.

The people who do ten minutes of research know they're being bent over a pole by being resold the same stuff, and make their decision accordingly. Either way, it doesn't aid brand image in any sense I can see.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Civil posted:

It would be nice if both companies (and Intel, I guess) would adopt a performance rating system that helped consumers out with basic raw performance. You'll still have DX feature support, outputs, interfaces, and all that to worry about, but it would be a nice goodwill gesture.

That basically already exists in the Windows Experience Index, which was clearly designed to fulfil this role - but it never gets used.

Basically, marketing always wins. They want to shift more cards to gullible consumers, and give OEMs a new number to stick on their spec charts every 6 months. That's all there is to it.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Shaocaholica posted:

That list is still a bit hard to read. Maybe I should make my own on wikipedia.

Good luck creating any new page on Wikipedia; there's an army of people with nothing better to do than stamp down on anything that might be interesting, whilst at the same time adding lengthy prose to articles about toys, child abuse and anime.

I recently created a page, which was short but potentially interesting, and along came Wikipedia's finest; a sample of his user page includes such delights as "this user is pagan" and "this user is a furry".

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

SRQ posted:

My hard drive rating has stayed the same since 2006, even though I've gone through five.

Let me guess: 5.9. I'm pretty sure mechanical disks are capped there.
That said, Vista and 7 WEI scores are not comparable.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

adorai posted:

You should get the same score with the same hardware, the scale just goes higher in 7.

I don't think this is the case.

http://www.winsupersite.com/article/windows-7/windows-7-feature-focus-windows-experience-index posted:


Microsoft has actually changed how it rates hardware since Windows Vista, so identical hardware components could actually receive different subscores on each OS, making Vista-to-Windows 7 comparisons impossible.

http://windows.microsoft.com/en-gb/windows7/achieving-a-perfect-windows-experience-index-score-in-windows-7#section_3 posted:


People running Windows Vista who earned high scores under the 5.9 index were often surprised when their scores went down after they upgraded to Windows 7. The Windows Experience Index is calibrated to measure the latest technology. So a machine rated as fast a few years ago will fall in ranking as new and faster hardware becomes available.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Q_res posted:

On the subject of video card suffix horseshit, I think the XT suffix has to be my hands-down favorite. ATI brings out the XT label for its high-end Radeons, nVidia turns around and slaps XT on its low-end poo poo cards. Such a fantastic rear end in a top hat move, you almost have to admire it.

ATI doesn't actually advertise cards with alphabet soup after the names any more; they made a decision to drop it in recent years.
I see your point, though; in the past that's exactly what happened.

On the topic of Southern Islands, though, look at that thing! 3GB of VRAM. People running Eyefinity setups are probably getting a little too excited at the moment, since there's nothing else to compete with that.

That said, most people probably don't need more than 1.5-2GB at the moment, depending on their monitor resolution and the games they play. Maybe we could have socketable video RAM in the future, since needs differ massively between 3x 2560x1600 and the guy with a single cheap 1920x1080 screen.
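Just to put numbers on how much resolution alone changes memory pressure, here's a back-of-envelope sketch. It only counts raw colour buffers (triple-buffered 32-bit), which is a deliberate simplification; real usage is dominated by textures, so treat every figure as illustrative:

```python
def framebuffer_mb(width, height, screens=1, bytes_per_pixel=4, buffers=3):
    """Raw colour-buffer memory for triple-buffered 32-bit output, in MiB."""
    return width * height * screens * bytes_per_pixel * buffers / 2**20

eyefinity = framebuffer_mb(2560, 1600, screens=3)  # 3x 30" panels
single = framebuffer_mb(1920, 1080)                # one cheap 1080p screen

print(f"3x 2560x1600: {eyefinity:.0f} MiB just for colour buffers")
print(f"1x 1920x1080: {single:.0f} MiB")
```

The Eyefinity setup needs roughly six times the buffer memory of the single screen, and textures and render targets scale on top of that, which is why the 3GB card only really pays off at the silly resolutions.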

HalloKitty fucked around with this message at 10:58 on Dec 16, 2011

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Uh, you could say the K does stand for black, as in CMYK. Officially it's "key", but in reality, black.

Oh never mind, this is a winding path of pointlessness. Of course they stole it from Intel.

Why not just U for Unlocked?

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

KillHour posted:

Honestly, I'm glad they "stole" it like that. It makes it easier for laymen to decipher what the hell all the model numbers mean if the unlocked versions are the same across vendors.

The common person doesn't buy a motherboard that allows overclocking, let alone try overclocking even if they knew what it was.

In reality it makes very little difference, or that's my feeling. It's not important in the slightest, but my point is that an overclocker picking out a chip for that purpose isn't going to get confused by any of the numbers and letters, since they'll do their research anyway.

Everyone else just buys whatever piece of poo poo they happen to see in the store that has a big yellow sign saying that the PC is on a cheap deal.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Spiderdrake posted:

Ultra is extreme, but Unlocked is unextreme!

Joking aside isn't U part of 'UM' or 'ULV' or whatever the low power mobile chips are? I think those have been around for ages, haven't they?

ULV and CULV do indeed signify ultra low voltage and, I think, consumer ultra low voltage (although that doesn't make much sense) on Intel chips.

It was only a suggestion. Maybe they could use an X. X is just loving extreme. Hell, Z is even more extreme. It's right at the end of the alphabet and it still doesn't give a gently caress.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
As long as they all have die shrinks and clock bumps, I'm sure they'll be priced well and still be very competitive. This could be an interesting round.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
^^ Sorry, I think you misunderstood the whole point of it. It's a $1,000 build based around Bulldozer, to see how it fares against a machine of the same cost from September (based on Intel).

It only further cements how bad Bulldozer is. There really is absolutely no place it fits, other than the waste disposal.

Tom's Hardware posted:

PCMark clearly favors the Intel-based system, and even the overclocked 4.5 GHz FX-6100 cannot match the stock Core i5-2500K's results.

I think we should go back to talking about Southern Islands instead of Bulldozer. Bulldozer is far too depressing.

HalloKitty fucked around with this message at 21:15 on Dec 20, 2011

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Factory Factory posted:

I think you have that backwards. Intel is dominating in everything but per-socket, server-load, per-watt performance, and AMD is only competitive there because it crams 12 or 16 cores onto a chip where Intel maxes out at 10.

Per core, Intel parts are faster and more power efficient.

I think all he's getting at is that Xeons tend to trail the desktop chips within a generation: Sandy Bridge hits, Sandy Bridge Xeons don't hit until later, and so on.

It used to be the other way round with AMD.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Agreed posted:

In what way?

Where does it say "times the performance"?

For all we know, it could be 1fps vs 1.6fps!

Or the difference could be between the framerates at the very top of the graph, as opposed to the whole bar. I mean, look, the axis starts at 0.8; I think this graph is most definitely misleading.

Doesn't mean I'm making GBS threads on AMD though, and yes, the GTX 580 is very much limited by its 1.5GB.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Don't get me wrong, I'm eagerly awaiting it. But let's be honest, we need reviews from the likes of AnandTech before making a real call.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
The multi-monitor performance figures are almost certainly spot on. A 3GB card making GBS threads on a 1.5GB card is easy to believe when they're talking about 3x 30-inchers.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
There is a 2GB version, but I agree, it is a limiting factor :(

Star War Sex Parrot posted:

http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review

edit: So it's a paper launch. gently caress that noise. Expected availability on January 9, so I might as well wait to see how the 7950 turns out.

But coming in and making GBS threads on all existing single-GPU cards, even if expected, is very nice.
Idle temps and power consumption are loving awesome.

The review is missing CrossFire and Eyefinity performance, though; I'd like to see the tipping point at which that 3GB comes into its own.

HalloKitty fucked around with this message at 10:43 on Dec 22, 2011

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Yup. This is seriously, seriously impressive.

Legit Reviews posted:

With the new BIOS installed we were able to reach 1165MHz on the core and 1625MHz on the memory with full stability. If we went up to 1170MHz on the core we would see artifacts in the benchmark, so without increasing the core voltage this is the best that we are able to get today. Remember, the Radeon HD 7970 started out with a 925MHz core clock, so this is an increase of 240MHz or 26%.

Seems like it scales like crazy, and like they have headroom to play with on one GPU. Going straight for a 79xx nomenclature almost looks like jumping the gun.

Now if only the ATI guys could teach the AMD guys a thing about chip design; the irony being that every available AMD CPU, no matter how overclocked, is going to bottleneck the ever-loving poo poo out of this card.
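Quick sanity check on the figures in the Legit Reviews quote above (stock and overclocked core clocks taken straight from the quote):

```python
# Core clocks from the Legit Reviews quote.
stock_mhz = 925
oc_mhz = 1165

delta_mhz = oc_mhz - stock_mhz
pct_gain = delta_mhz / stock_mhz * 100
print(f"+{delta_mhz} MHz over stock, a {pct_gain:.0f}% increase")  # +240 MHz, 26%
```

So their "240MHz or 26%" claim checks out, and that's without touching the core voltage.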

HalloKitty fucked around with this message at 20:19 on Dec 22, 2011

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Star War Sex Parrot posted:

It'll be interesting to see if partners go crazy with it and ship $650 factory overclocked versions if the architecture really has that much headroom.

Yeah, this is my thought. It's going to be worth waiting a little while to see the cards with massive coolers and beefy overclocks from the factory that will basically push it into a much higher performance class.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
ZeroCore is seriously awesome. Almost no power consumption when the screen is blanked, and the fan switches off.

Anyone who leaves their machine on for torrenting or anything like that will benefit massively from this.
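Back-of-envelope sketch of what that's worth over a year for an always-on machine. The wattages here are my assumptions for illustration, not measured figures from any review:

```python
# Assumed GPU draw figures (illustrative, not measured).
IDLE_W = 15        # long-idle draw without ZeroCore (assumed)
ZEROCORE_W = 3     # draw with the screen blanked and fan off (assumed)
BLANKED_HOURS = 12 * 365   # screen blanked half of every day, for a year

kwh_saved = (IDLE_W - ZEROCORE_W) * BLANKED_HOURS / 1000
print(f"~{kwh_saved:.0f} kWh saved per year")
```

Not life-changing money for one card, but free savings for doing nothing, and it adds up across a CrossFire setup where the extra cards can power down completely.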

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Holy loving balls. We knew it had headroom and scaled WELL, but a factory card overclocked to 1335MHz from 925MHz? Mother of god, that is going to be fast; as in, likely faster than any dual-GPU single-card solution currently available.

I hope this sheet is close to the truth.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Good thing they've hit retail today on the sites I checked. I spec'd out a machine for a friend last night with everything except the graphics card.

The 7970 completes that picture, and fits in the budget given. Yay for building monstrous machines!

Edit: there's an XFX card out with a 1GHz GPU clock, so this bodes well for Sapphire's crazy sheet.

HalloKitty fucked around with this message at 09:59 on Jan 9, 2012

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Agreed posted:

But the figures for a single 7970 even heavily overclocked aren't past that point. Two 580s still outperforms one 7970 like crazy in games that are well supported.

http://www.anandtech.com/show/5314/xfxs-radeon-hd-7970-black-edition-double-dissipation-the-first-semicustom-7970/6

I could cherry-pick Metro 2033 here and show that to be eerily close to not being the case, even though I of course realise 580s in SLI are faster than a 590.

The point is, the 7970 is going to guzzle a hell of a lot less power.

I never really thought SLI or CrossFire was a good idea, though, and I tend not to recommend it, mainly for power and cost reasons. There's also the galling fact that a few years down the line you have a pile of expensive graphics cards with no use, just so you could gain a few FPS here and there (odd issues with FPS loss and drivers notwithstanding), even though most of your time was spent using the cards on the Aero desktop, which Intel onboard graphics would have sufficed for.

HalloKitty fucked around with this message at 22:29 on Jan 9, 2012

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Bear in mind that in that HardOCP article, the GTX 580 used is factory overclocked to a hefty degree: 772MHz up to 860MHz.

Edit: hah, you edited your post. I preferred the old one, that was clear amazement. :D

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

freeforumuser posted:

Bjorn3D 3870K review

Superbly disappointing overclocking numbers. 1.4625V and can't even break 3.6GHz. I don't want to know the power consumption anymore, it's just so sad.

Ah, Christ, for a minute I poo poo myself. I saw 3xxxK and thought: gently caress, Ivy Bridge sucks, how?
Oh, AMD, copying a little name recognition, are we?

HalloKitty fucked around with this message at 13:26 on Jan 11, 2012

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
I wonder what the results will be. I'm still dubious, seeing as sites have already overclocked the processors and received poor results.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
The only way you could think Bulldozer was awesome is if you somehow completely ignored all available evidence.

Quite a feat.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

text editor posted:

You should have asked him if he lapped the CPU too.

Oh god, now we're talking. I remember when heat spreaders were thought of as just another thing in the way, and people would prise that poo poo off to get direct core contact. Maybe even lap the core.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Oh, that's just glorious. If there was ever a simple example of having more money than sense, that would be it.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Install Gentoo posted:

I did explicitly say he was running XP home and that that is 32 bit only, yes. Also he'd already told the forum that he'd added more RAM recently to speed things up and it was "totally working" even though it of course couldn't be.

I'm not amazed. I knew a guy who bought an Alienware desktop running 32-bit XP with 8GB of RAM, years ago, back when the GeForce 6800 was hot poo poo, if memory serves.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Meaningless speculation mixed in with some traffic-boosting bullshit is the way I see it.

Even if Kepler turns out to be amazing, this is no way to predict it.

HalloKitty fucked around with this message at 23:18 on Jan 19, 2012

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Longinus00 posted:

When's the last time ATI/AMD has been "on top" anyway? The only thing that would be surprising is if Nvidia was able to price match ATI/AMDs top card.

You'd have to define "on top" more carefully.

Technically, ATI has held the single-card crown for quite a while.

The fastest card last generation was the Radeon 6990. (It's still the fastest now, until dual Tahiti hits, though I'd still get a single 7970 rather than a 6990 today.)
Before that, it was the Radeon 5970.
I don't know how you want to treat the other gaps in time, such as the Radeon 4870X2 versus the GeForce 9800 series: the Radeon obviously won initially, but later it had to stand against the GeForce GTX 280 and 295, until the 5xxx series hit.

Basically, AMD has been aggressive about getting pretty fast parts out early, then doubling up to give you the fastest single card. NVIDIA tended to come later with a monolithic design, and had trouble sticking two of them on one card.

This is nowhere more obvious than with the GeForce 590 versus the 6990. The GeForce 590 should be faster; there's no doubt GeForce 580 SLI is faster than a 6990. But it had to be clocked down significantly, evaporating a lot of that performance.
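To put a rough number on that downclock, here's a sketch using the commonly cited core clocks (treat both figures as assumptions; check a spec sheet before quoting them): a stock GTX 580 around 772MHz versus the GTX 590 running each GPU around 607MHz.

```python
# Commonly cited core clocks (assumptions for illustration).
gtx580_mhz = 772   # stock single-card GTX 580
gtx590_mhz = 607   # per-GPU clock on the dual-GPU GTX 590

clock_loss_pct = (gtx580_mhz - gtx590_mhz) / gtx580_mhz * 100
print(f"The 590 runs each GPU roughly {clock_loss_pct:.0f}% below a stock 580")
```

Give up about a fifth of the clock on each GPU and it's no surprise the dual card can't match proper 580 SLI.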

HalloKitty fucked around with this message at 11:17 on Jan 20, 2012

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Viewed that way, NVIDIA has a rosier track record. That was intentional on their part, though.

Agreed on the issues with CrossFire/SLI; more trouble than it's worth, usually.

HalloKitty fucked around with this message at 18:26 on Jan 20, 2012

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

VorpalFish posted:

AMD has been kicking rear end for a long, long time now, at least with regards to graphics.

It's funny, because there's truth to it, yet the Steam survey shows NVIDIA ahead in share by a decent margin.

Maybe a lot of people still think it's the drivers :v:

HalloKitty fucked around with this message at 19:29 on Jan 20, 2012

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

grumperfish posted:

Kyro was a PowerVR part.

My first 3D card was a Voodoo Banshee. It died and got replaced by an abomination that S3 poo poo out. The 3D demo included with the card dragged it down to a slideshow. :effort:

I think XGI attempted to be a thing at one point too with the Volari line. Didn't last long IIRC.

Oh, the Banshee.

I remember looking at that and thinking it was poo poo hot, back when I was at school. 2D AND 3D all in one card?

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Setzer Gabbiani posted:

Pretty much just gently caress the entire Virge family of chipsets in general

I had an S3 Virge. Haha, it sucked.

I actually SOLD it though, so I guess it can't be all bad.

I remember playing Half-Life in software rendering because the card just wouldn't support hardware acceleration in it. What's the point?
