|
I don't think I can even bear to read it. Poor AMD.
|
# ¿ Nov 29, 2011 19:35 |
|
Killer robot posted:That's a weird story. I saw AnandTech just announced today that the Bulldozer transistor count is actually 1.2 billion rather than 2 billion as earlier reported. Obviously this doesn't change the performance of the chip itself, but I guess the "2 billion transistors and this is all we get?" arguments do go down a little when they become "1.2 billion transistors and this is all we get?" instead.

Or more to the point, they turn into "that much die space on that process and..." since they're working with a significantly lower transistor density than was earlier estimated. It makes it sound slightly less badly engineered. Still doesn't change the facts of the benchmarks/power & heat. I guess they're trying any kind of damage control right now.
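For what it's worth, the corrected count changes the density arithmetic quite a bit. A back-of-envelope sketch (the ~315 mm² die area is the commonly reported Orochi figure, not something stated in this thread, so treat it as an assumption):

```python
# Rough transistor-density check for Bulldozer (Orochi die).
# Assumption: ~315 mm^2 die area, the commonly reported figure.
die_area_mm2 = 315

def density_m_per_mm2(transistors: float, area_mm2: float) -> float:
    """Transistor density in millions per mm^2."""
    return transistors / area_mm2 / 1e6

old_estimate = density_m_per_mm2(2.0e9, die_area_mm2)  # from the 2B figure
corrected = density_m_per_mm2(1.2e9, die_area_mm2)     # from the 1.2B figure
print(f"old: {old_estimate:.1f} M/mm^2, corrected: {corrected:.1f} M/mm^2")
```

The corrected figure works out roughly 40% lower, which is the density gap the post is alluding to.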
|
# ¿ Dec 3, 2011 12:58 |
|
Coredump posted:Ha ha look how wrong Engadget gets it.

But they say the correct thing right there: that the company was just correcting a mistake in handed-out specs. It's not worthy of a news item on a site like Engadget.
|
# ¿ Dec 6, 2011 19:02 |
|
Yeah, I saw that article, and was immediately disappointed.
|
# ¿ Dec 7, 2011 10:29 |
|
Agreed posted:Isn't this the second time in a row for nVidia? Fermi 4ever!

No no, they've done worse than rebranding the same chip twice. How about these three: Geforce 9800 GTX+, Geforce GTS 150, Geforce GTS 250. Or these three: Geforce 310, Geforce 210, Geforce 405.
|
# ¿ Dec 7, 2011 18:40 |
|
Nvidia is the worse offender of the two by far when it comes to loving you by rebranding old poo poo. There's not even any need for it. OEMs might demand it, but why? People who are fooled by these numbers don't know a goddamn thing about them anyway, so the numbers may as well stay the same. The people who do 10 minutes of research know they're being bent over a pole by being resold the same stuff, and make their decision accordingly. Either way, it doesn't aid brand image in any sense I can see.
|
# ¿ Dec 7, 2011 18:50 |
|
Civil posted:It would be nice if both companies (and Intel, I guess) would adopt a performance rating system that helped consumers out with basic raw performance. You'll still have DX feature support, outputs, interfaces, and all that to worry about, but it would be a nice goodwill gesture.

That basically already exists in the Windows Experience Index, and it was clearly designed to fulfill this role - but it never gets used. Basically, marketing always wins. They want to shift more cards to gullible consumers, and give OEMs a new number to stick on their spec charts every 6 months. That's all there is to it.
|
# ¿ Dec 7, 2011 19:28 |
|
Shaocaholica posted:That list is still a bit hard to read. Maybe I should make my own on wikipedia.

Good luck creating any new page on Wikipedia; there's an army of people with nothing better to do than stamp down on anything that might be interesting, whilst at the same time adding lengthy prose to articles about toys, child abuse and anime. I recently created a page, which was short but potentially interesting, and along came Wikipedia's finest; a sample of his user page includes such delights as "this user is pagan" and "this user is a furry".
|
# ¿ Dec 8, 2011 10:29 |
|
SRQ posted:My hard drive rating has stayed the same since 2006, even though I've gone through five.

Let me guess: 5.9. I'm pretty sure mechanical disks are capped there. That said, Vista and 7 WEI scores are not comparable.
|
# ¿ Dec 8, 2011 15:48 |
|
adorai posted:You should get the same score with the same hardware, the scale just goes higher in 7.

I don't think this is the case; see:
http://www.winsupersite.com/article/windows-7/windows-7-feature-focus-windows-experience-index
http://windows.microsoft.com/en-gb/windows7/achieving-a-perfect-windows-experience-index-score-in-windows-7#section_3
|
# ¿ Dec 8, 2011 23:48 |
|
Q_res posted:On the subject of Video card suffix horseshit, I think the XT suffix has to be my hands down favorite. ATI brings out the XT label for its high-end Radeons, nVidia turns around and slaps XT on its low-end poo poo cards. Such a fantastic rear end in a top hat move, you almost have to admire it.

ATI doesn't actually advertise the cards with alphabet soup after the names any more; they made a decision not to in recent years. I see your point though, in the past that's exactly what happened. On the topic of Southern Islands, though, look at that thing! 3GB VRAM. People running Eyefinity setups are probably getting a little bit too excited at the moment, since there's nothing else to compete with that. That said, most people at the moment probably don't need more than 1.5-2GB, depending on their monitor resolution and the games they play. Maybe we could have socketable video RAM in future, as needs will differ massively from 3x 2560x1600 to the guy with a single cheap 1920x1080 screen. HalloKitty fucked around with this message at 10:58 on Dec 16, 2011 |
# ¿ Dec 16, 2011 10:37 |
|
Uh, you could say the K does stand for black, as in CMYK: technically "key", but in reality, black. Oh never mind, this is a winding path of pointlessness. Of course they stole it from Intel. Why not just U for Unlocked?
|
# ¿ Dec 17, 2011 19:56 |
|
KillHour posted:Honestly, I'm glad they "stole" it like that. It makes it easier for laymen to decipher what the hell all the model numbers mean if the unlocked versions are the same across vendors.

The common person doesn't buy a motherboard that allows them to overclock, let alone try overclocking even if they knew what it was. In reality it makes very little difference, or that's my feeling. It's not important in the slightest, but my point is that an overclocker, picking out a chip for this purpose, isn't going to get confused by any of the numbers and letters, since they'll do their research anyway. Everyone else just buys whatever piece of poo poo they happen to see in the store that has a big yellow sign saying that the PC is on a cheap deal.
|
# ¿ Dec 17, 2011 20:04 |
|
Spiderdrake posted:Ultra is extreme, but Unlocked is unextreme!

ULV and CULV do indeed signify ultra low voltage and, I think, consumer ultra low voltage (although that doesn't make much sense) on Intel chips. It was only a suggestion. Maybe they could use an X. X is just loving extreme. Hell, Z is even more extreme. It's right at the end and it still doesn't give a gently caress.
|
# ¿ Dec 18, 2011 01:54 |
|
As long as they all have die shrinks and clock bumps, I'm sure they'll be priced well and still be very competitive. This could be an interesting round.
|
# ¿ Dec 19, 2011 23:29 |
|
^^ Sorry, I think you misunderstood the whole point of it. It's a build at $1000 based around Bulldozer to see what happens vs the same cost of machine in September (based on Intel). It only further cements how bad Bulldozer is. There really is absolutely no place in which it can fit, other than the waste disposal.

Tom's Hardware posted:PCMark clearly favors the Intel-based system, and even the overclocked 4.5 GHz FX-6100 cannot match the stock Core i5-2500K's results.

I think we should go back to talking about Southern Islands instead of Bulldozer. Bulldozer is far too depressing. HalloKitty fucked around with this message at 21:15 on Dec 20, 2011 |
# ¿ Dec 20, 2011 21:08 |
|
Factory Factory posted:I think you have that backwards. Intel is dominating in everything but per-socket server-load per-watt performance, and AMD is only competitive there because it crams 12 or 16 cores onto a chip where Intel maxes out at 10.

I think all he's getting at is that Xeons tend to trail the desktop chips by a generation: Sandy Bridge hits, Sandy Bridge Xeons don't hit until later, and so on. It used to be the other way round with AMD.
|
# ¿ Dec 21, 2011 01:30 |
|
Agreed posted:In what way?

Where does it say "times the performance"? For all we know, it could be 1fps vs 1.6fps! Or the difference between the framerates at the very top of the graph, as opposed to the whole bar - I mean look, the axis starts at 0.8. I think this graph is most definitely misleading. Doesn't mean I'm making GBS threads on AMD though, and yes, the GTX 580 is limited at 1.5GB, very much so.
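The complaint about the 0.8 baseline can be made concrete. A quick sketch (the 1.0 vs 1.6 bar heights are hypothetical, chosen to match the relative-performance framing above):

```python
# How a truncated y-axis inflates a lead: bars are drawn from the
# axis baseline, not from zero, so the visual ratio is exaggerated.
baseline = 0.8         # where the chart's y-axis starts
slow, fast = 1.0, 1.6  # hypothetical relative-performance values

actual_ratio = fast / slow
visual_ratio = (fast - baseline) / (slow - baseline)
print(f"actual: {actual_ratio:.1f}x, as drawn: {visual_ratio:.1f}x")
```

A real 1.6x gap looks like a 4x gap once the bottom of the chart is chopped off at 0.8.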
|
# ¿ Dec 21, 2011 10:26 |
|
Don't get me wrong, I'm eagerly awaiting it. But let's be honest, we need reviews from the likes of AnandTech before making a real call.
|
# ¿ Dec 21, 2011 17:04 |
|
The multi-monitor performance is almost certainly spot on. A 3GB card making GBS threads on a 1.5GB card is easy to believe when they're talking about 3x 30 inchers.
|
# ¿ Dec 22, 2011 00:36 |
|
There is a 2GB version, but I agree, it is a limiting factor.

Star War Sex Parrot posted:http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review

But coming in and making GBS threads on all existing single-GPU cards, even if expected, is very nice. Idle temps/power consumption = loving awesome. The review is missing CrossFire and Eyefinity performance though; I'd like to see the tipping point at which that 3GB comes into its own. HalloKitty fucked around with this message at 10:43 on Dec 22, 2011 |
# ¿ Dec 22, 2011 10:03 |
|
Yup. This is seriously, seriously impressive.

Legit Reviews posted:With the new BIOS installed we were able to reach 1165MHz on the core and 1625MHz on the memory with full stability. If we went up to 1170MHz on the core we would see artifacts in the benchmark, so without increasing the core voltage this is the best that we are able to get today. Remember, the Radeon HD 7970 started out with a 925MHz core clock, so this is an increase of 240MHz or 26%.

Seems like it scales like crazy, like going straight for a 79xx nomenclature was almost jumping the gun, and like they have headroom to play with on one GPU. Now if only the ATI guys could teach the AMD guys a thing about chip design; the irony being that every available AMD CPU, no matter how overclocked, is going to bottleneck the ever-loving poo poo out of this card. HalloKitty fucked around with this message at 20:19 on Dec 22, 2011 |
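Just to restate the quoted review's arithmetic (no new figures here, only the numbers Legit Reviews give):

```python
# Overclocking headroom from the Legit Reviews numbers quoted above.
stock_mhz = 925   # Radeon HD 7970 reference core clock
oc_mhz = 1165     # stable overclock reached in the review

delta_mhz = oc_mhz - stock_mhz
pct = 100 * delta_mhz / stock_mhz
print(f"+{delta_mhz} MHz, about {pct:.0f}% over stock")
```

That's the +240 MHz / 26% they cite, without touching the core voltage.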
# ¿ Dec 22, 2011 20:11 |
|
Star War Sex Parrot posted:It'll be interesting to see if partners go crazy with it and ship $650 factory overclocked versions if the architecture really has that much headroom. Yeah, this is my thought. It's going to be worth waiting a little while to see the cards with massive coolers and beefy overclocks from the factory that will basically push it into a much higher performance class.
|
# ¿ Dec 22, 2011 23:19 |
|
ZeroCore is seriously awesome. Almost no power consumption when the screen is blanked, and the fan stops. Anyone who leaves their machine on for torrenting or anything like that will benefit massively from this.
|
# ¿ Dec 23, 2011 12:14 |
|
Holy loving balls. We knew it had headroom and scaled WELL, but a factory one overclocked to 1335MHz from 925? Mother of god that is going to be fast; as in, likely faster than any dual-GPU/single-card solution currently available. I hope this sheet is close to the truth.
|
# ¿ Jan 6, 2012 21:24 |
|
Good thing they've hit today on the sites I checked. I spec'd out a machine for a friend last night with everything except the graphics card; the 7970 completes that picture, and fits in the budget given. Yay for building monstrous machines! Edit: There's an XFX out with a GPU clock of 1GHz, so this bodes well for Sapphire's crazy sheet. HalloKitty fucked around with this message at 09:59 on Jan 9, 2012 |
# ¿ Jan 9, 2012 09:56 |
|
Agreed posted:But the figures for a single 7970 even heavily overclocked aren't past that point. Two 580s still outperform one 7970 like crazy in games that are well supported.

http://www.anandtech.com/show/5314/xfxs-radeon-hd-7970-black-edition-double-dissipation-the-first-semicustom-7970/6

I could cherry-pick Metro 2033 here and show that to be eerily close to not being the case, even though I of course realise SLI 580s are faster than a 590. Point is, the 7970 is going to guzzle a hell of a lot less power. I never really thought SLI or CF was a good idea, though, and I tend not to recommend it, for power & cost reasons mainly, but also the galling fact that a few years down the line you have a pile of expensive graphics cards with no use, just so you could gain a few FPS here and there (odd issues with FPS loss and drivers notwithstanding), even though most of your time was spent using the cards in Aero, which Intel onboard graphics would have sufficed for. HalloKitty fucked around with this message at 22:29 on Jan 9, 2012 |
# ¿ Jan 9, 2012 22:27 |
|
Bear in mind, in that HardOCP article, the GTX 580 used is factory overclocked to a hefty degree: 772MHz to 860MHz. Edit: hah, you edited your post. I preferred the old one, that was clear amazement.
|
# ¿ Jan 9, 2012 23:44 |
|
freeforumuser posted:Bjorn3D 3870K review

Ah, Christ, for a minute I poo poo myself. I saw 3xxxK and thought... gently caress, Ivy Bridge sucks, how? Oh, AMD, copying a little name recognition, are we? HalloKitty fucked around with this message at 13:26 on Jan 11, 2012 |
# ¿ Jan 11, 2012 09:54 |
|
I wonder what the results will be. I'm still dubious, seeing as sites have already overclocked the processors and received poor results.
|
# ¿ Jan 11, 2012 21:29 |
|
The only way you could think Bulldozer was awesome is if you somehow completely ignored all available evidence. Quite a feat.
|
# ¿ Jan 15, 2012 23:03 |
|
text editor posted:You should have asked him if he lapped the CPU too. Oh god, now we're talking. I remember when heat spreaders were thought of as just another thing in the way, and people would prise that poo poo off to get a direct core contact. Maybe even lapping the core.
|
# ¿ Jan 15, 2012 23:38 |
|
Oh, that's just glorious. If there was ever a simple example of having more money than sense, that would be it.
|
# ¿ Jan 18, 2012 11:13 |
|
Install Gentoo posted:I did explicitly say he was running XP Home and that that is 32-bit only, yes. Also he'd already told the forum that he'd added more RAM recently to speed things up and it was "totally working" even though it of course couldn't be.

I'm not amazed. I knew a guy who bought an Alienware desktop running XP 32-bit and 8GB RAM, years ago, when the Geforce 6800 was hot poo poo, if memory serves.
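The reason the extra RAM "couldn't be working" is just address-space arithmetic. A minimal sketch (32-bit client Windows caps usable physical memory at 4 GiB, even on PAE-capable hardware, and device MMIO eats into that window too, so real-world usable RAM lands around 3-3.5 GB):

```python
# 32-bit client Windows tops out at 2^32 bytes of physical memory,
# so with 8 GiB installed, at least half is invisible to the OS.
addressable_gib = 2**32 / 2**30   # the 4 GiB cap
installed_gib = 8
wasted_gib = installed_gib - addressable_gib
print(f"cap: {addressable_gib:.0f} GiB, unusable: at least {wasted_gib:.0f} GiB")
```

So that Alienware was shipping with at least 4 GB of RAM the OS could never see.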
|
# ¿ Jan 18, 2012 19:58 |
|
tijag posted:Not sure what to make of this.

Meaningless speculation mixed in with some traffic-boosting bullshit is the way I see it. Even if Kepler is amazing, this isn't a way to predict it. HalloKitty fucked around with this message at 23:18 on Jan 19, 2012 |
# ¿ Jan 19, 2012 22:58 |
|
Longinus00 posted:When's the last time ATI/AMD has been "on top" anyway? The only thing that would be surprising is if Nvidia was able to price match ATI/AMD's top card.

You'd have to define that more carefully. Technically, ATI has been on top for single cards for quite a while. The fastest card last generation was the Radeon 6990 (it's still the fastest now, until dual Tahiti hits, but I'd get a single 7970 rather than a 6990 today). Before that, it was the Radeon 5970. I don't know how you want to deal with the other gaps in time, such as the Radeon 4870X2 vs the Geforce 9800 series: the Radeon obviously won initially, but later it had to stand against the Geforce GTX 280 and 295, until the 5xxx series hit. Basically, AMD has been aggressive in getting out pretty fast stuff early, then doubling up to give you the fastest single card. NVIDIA tended to come later with a monolithic design, having trouble when sticking two of them on a card. This is nowhere more obvious than with the Geforce 590 vs the 6990: the Geforce 590 should be faster, as there's no doubt Geforce 580 SLI is faster than a 6990, but it had to be clocked down significantly, evaporating a lot of the performance. HalloKitty fucked around with this message at 11:17 on Jan 20, 2012 |
# ¿ Jan 20, 2012 11:12 |
|
Viewed that way, NVIDIA has a rosier track record. This was intentional, though. Agreed on the issues with CrossFire/SLI; more trouble than it's worth, usually. HalloKitty fucked around with this message at 18:26 on Jan 20, 2012 |
# ¿ Jan 20, 2012 18:22 |
|
VorpalFish posted:AMD has been kicking rear end for a long, long time now, at least with regards to graphics.

It's funny, because there's truth to it, yet the Steam survey shows NVIDIA ahead in share by a decent margin. Maybe a lot of people still think it's the drivers. HalloKitty fucked around with this message at 19:29 on Jan 20, 2012 |
# ¿ Jan 20, 2012 19:23 |
|
grumperfish posted:Kyro was a PowerVR part. oh Banshee

I remember looking at that and thinking it was poo poo hot, back when I was at school. 2D AND 3D all in one card?
|
# ¿ Jan 21, 2012 18:05 |
|
Setzer Gabbiani posted:Pretty much just gently caress the entire Virge family of chipsets in general

I had an S3 ViRGE. Haha, it sucked. I actually SOLD it though, so I guess it can't be all bad. I remember playing Half-Life in software rendering because the card just couldn't handle hardware acceleration. What's the point?
|
# ¿ Jan 22, 2012 17:49 |