|
HalloKitty posted:Bear in mind, in that HardOCP article, the GTX 580 used is factory overclocked to a hefty degree. 772 to 860.

Oh, no, I'm still totally sold on the performance, but reading that it showed marked artifacts and instability without the fan pegged (and those god damned fans are loud) cooled me down some - there's no way everyone's going to get that kind of clock. But they don't have to; the performance delta is already ridiculous at any point between "the clock that every review site gets at stock voltage" and the over-volted values. It just seems, I dunno, loving impossible for a card to actually be between 48% and 80% better than a card of the previous generation.

860MHz is a middlin' overclock for the 580; I had that at stock voltage. As I mentioned, Fermi scales well with GPU/shader clocks. Cards tend to hit a GPU/shader wall before a voltage wall, so my max of 925MHz/1850MHz GPU/shaders and 1050MHz GDDR5 at 1.125V is good but not especially so for a 580. Lots of people can get them up to 940-950. But, jesus, the 7970 at the max voltage, what can you do but

movax posted:SemiAccurate had a quiet article that mentioned Kepler slipping further to 2013, or maybe I was drunk and confused "Kepler slipping to 2012" with the year still being 2011, and began to sob helplessly.

If nVidia doesn't show Kepler with lots of "gently caress YOU, ATI! HAHA!" fanfare at CES, it'll almost certainly be a summer release, I'd think. And we might see a generation where ATI wins performance, not just price:performance. Still, Fermi was/is a monstrously powerful architecture, nVidia's been at work on Kepler for longer than ATI has been at work on Southern Islands, they're both using TSMC's 28nm process, surely...

No way in hell I'm going to agonize over graphics cards, but they're not making the process of picking a path very easy. With ATI's sub-7900 cards looking less appealing on specs (and some of them just continuing the tradition of carrying over previous-gen models with a new SKU), it's really all-in or avoid for now, at least, and I would strongly prefer an nVidia card for CUDA-accelerated DSP. But that requires them to come out with one, and it's at war with my inner child over sparkly poo poo.

Agreed fucked around with this message at 00:00 on Jan 10, 2012 |
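For what it's worth, the clock numbers in the post above do hang together - a quick sketch, using only the figures quoted there (nothing independently verified):

```python
# Quick arithmetic check of the clocks quoted above (all MHz; the
# figures are the ones from the post, not independently verified).
stock_clock = 772            # GTX 580 reference core clock
factory_oc = 860             # the factory-overclocked HardOCP card
oc_percent = (factory_oc - stock_clock) / stock_clock * 100
print(round(oc_percent, 1))  # -> 11.4, i.e. a ~11% factory overclock

# Fermi's shader domain runs at exactly twice the core clock, which is
# why a 925MHz core clock pairs with the quoted 1850MHz shader clock.
core = 925
print(core * 2)  # -> 1850
```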
# ? Jan 9, 2012 23:53 |
|
Only tangentially related to AMD, but SemiAccurate reports that Global Foundries' Fab 8 in New York is now up and running at 32nm. It seems likely that this fab is going to be making IBM POWER-based APUs for upcoming next-gen consoles.
|
# ? Jan 10, 2012 04:50 |
|
Bjorn3D 3870K review

Superbly disappointing overclocking numbers: 1.4625V and it can't even break 3.6GHz. I don't even want to know the power consumption anymore; it's just so sad.
|
# ? Jan 11, 2012 08:56 |
|
freeforumuser posted:Bjorn3D 3870K review

Ah, christ, for a minute I poo poo myself. I saw 3xxxK and thought... gently caress, Ivy Bridge sucks, how? Oh, AMD, copying a little name recognition, are we?

HalloKitty fucked around with this message at 13:26 on Jan 11, 2012 |
# ? Jan 11, 2012 09:54 |
|
freeforumuser posted:Bjorn3D 3870K review

God loving drat it. I'm not saying stick a fork in AMD's CPUs or anything, but just god loving drat it. This is lovely, lovely news. Feed it voltage that would definitely be unsafe on Intel's same-sized process (I really don't buy that AMD's processors are that much less prone to electromigration...) and it still won't hit 4GHz? How do you expect to compete? I bet a lot of engineers are hearing something like "Piledriver needs to loving KNOCK IT OUT OF THE PARK, FIX THIS!!!"

Edit: That is a horrible review. It gets beat by K10 processors at similar clocks (which they hit at stock speeds) while it is overclocked to the edge of stability. The most common faint praise: "Here we can see that the overclocked A8-3870K is almost as fast as the Core i3 2100." Real winner there.

gently caress everything I said about them needing to invest more in Llano. They need to make Bulldozer competitive. It wouldn't take -that- much to fix the single-threaded performance, would it? From a microarchitectural perspective it's not total poo poo - not as efficient as it could be, but not unfixable - and even skeptics have to admit that there's some stuff it really does well at. Maybe with Piledriver they can put out a genuinely competitive part on more than just raw super-multithreaded, int-heavy tasks. They have to get the single-threaded performance to compete, because Intel stands up very well on multi-threaded with their higher-end parts while also just dominating them on single-threaded. drat it, AMD, pull up, the plane's gonna crash into the mountain.

Agreed fucked around with this message at 15:01 on Jan 11, 2012 |
# ? Jan 11, 2012 14:24 |
|
HalloKitty posted:Ah, christ, for a minute I poo poo myself. I saw 3xxxK, and thought.. gently caress, Ivy Bridge sucks, how? Same here, hah, took me a second to remember which thread I was in.
|
# ? Jan 11, 2012 16:37 |
|
To be fair, I don't think Bjorn3D overclocked the memory, which is a big source of performance increases for Llanos. I'd like to see what a site like Anandtech is able to pull off with an unlocked processor. But yeah, K10 is still not very efficient, especially without L3. Hopefully Piledriver is at least not humiliatingly bad.
|
# ? Jan 11, 2012 19:13 |
|
Isn't higher clocked memory only relevant to the iGPU performance? It shouldn't do anything to remedy a lackluster CPU.
|
# ? Jan 11, 2012 19:17 |
|
VorpalFish posted:Isn't higher clocked memory only relevant to the iGPU performance? It shouldn't do anything to remedy a lackluster CPU.

While it definitely has large performance gains for the integrated GPU, K10-based chips are just not as memory-efficient as Sandy Bridge, and faster memory does show real-world performance gains. I want to say DDR3-1866 on Llano is roughly equivalent to DDR3-1333 on Sandy Bridge.
|
# ? Jan 11, 2012 19:23 |
|
Eh, regardless, Llano never struck me as making a lot of sense for a desktop platform, where you can drop in lower-midrange discrete graphics so easily and cheaply. Seems like the best niche for it is, and always was, ultra-cheap gaming laptops.
|
# ? Jan 11, 2012 19:30 |
|
I think it's still the best option for lower-end systems where you won't be using a dGPU, and for HTPC applications, but yeah, laptops are where it's most compelling. It sucks that desktop Llanos don't support Turbo Core (except the low-power models), and that laptop Llanos don't support DDR3-1866.
|
# ? Jan 11, 2012 19:38 |
|
The scheduler fix for Windows 7 and Windows Server 2008 is out, for real this time.
|
# ? Jan 11, 2012 21:23 |
|
I wonder what the results will be. I'm still dubious, seeing as sites have already overclocked the processors and received poor results.
|
# ? Jan 11, 2012 21:29 |
|
AMD demoed their upcoming Trinity APUs at CES; they're expected to be available this summer in 17W and 35W models. Trinity contains two Piledriver modules (four cores) and an as-yet-unannounced GPU. Their demo was a laptop driving three monitors: one running a DX11 game (Dirt 3), a second running a GPU-accelerated video transcode, and a third playing full-screen video.
|
# ? Jan 13, 2012 02:48 |
|
They also demoed a single-cable DisplayPort, USB 3.0, and power connector. It'll take a bit longer than Trinity to come out, though.
|
# ? Jan 13, 2012 05:39 |
|
That's pretty slick, though the 90MB/sec (720Mbps) cap on USB throughput may end up being a limiting factor. That said, it will let you hook up a monitor with a built-in USB hub and USB devices without the need for another cable. Most importantly, it's essentially free versus Thunderbolt, since USB 3.0 controllers are already built into modern chipsets. I worry about the shared port causing confusion, however.
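The quoted numbers do at least line up - a quick sketch, where the 90MB/sec cap is the figure from the coverage and 5Gbps is USB 3.0's nominal signaling rate, not real-world throughput:

```python
# Sanity check of the quoted throughput cap (decimal units).
cap_mb_per_s = 90
cap_mbps = cap_mb_per_s * 8
print(cap_mbps)  # -> 720, matching the 720Mbps quoted above

# USB 3.0's nominal SuperSpeed rate is 5 Gbps, so the single-cable
# link would carry well under a fifth of that.
usb3_nominal_mbps = 5000
print(cap_mbps / usb3_nominal_mbps)  # -> 0.144
```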
|
# ? Jan 13, 2012 08:06 |
|
Look out, Ultrabooks.

quote:AMD aiming to undercut Ultrabooks with $500 Trinity ultrathins

This will be huge for AMD if they can keep them between $500-$700. The Intel Ultrabooks we have seen so far aren't much cheaper than the MacBook Airs, which takes most of the appeal away from them.
|
# ? Jan 13, 2012 15:42 |
|
That's a much more winnable war than anything they've got going on in high-performance processors right now. Good news, hope they can bring it down and keep it below Intel's prices.
|
# ? Jan 13, 2012 17:41 |
|
AMD has confirmed that Trinity will be VLIW4, just like the Radeon HD 6900-series. Since the shaders come in blocks of 64, I'm betting Trinity will have 384 shaders in its high-end configuration. This should make things run more efficiently, and give us the image-quality and geometry-throughput benefits that came with the 6900-series cards. They'll still be held back by memory bandwidth, but maybe AMD will debut DDR3-2133 support? It would be cool if VLIW4 found its way to Brazos 2.0 as well. Most of the article is about the Radeon HD 7000M-series, confirming that everything from the 7700M on up will be GCN, while the lower cards will be new steppings and core/clock configurations of the HD 6000M-series.
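The 384-shader guess is just block arithmetic - a sketch of the reasoning; the six-block count is the post's speculation, not anything AMD has confirmed:

```python
# Cayman-style VLIW4 SIMDs are 16 units wide with 4 ALUs per unit,
# which is why shaders come in blocks of 64.
simd_width, alus_per_unit = 16, 4
shaders_per_block = simd_width * alus_per_unit
print(shaders_per_block)  # -> 64

# Six blocks would give the 384-shader top configuration guessed above.
print(shaders_per_block * 6)  # -> 384
```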
|
# ? Jan 15, 2012 08:46 |
|
I met an actual Bulldozer fanboy yesterday. He had a nice watercooling setup going (red flag #1), so I asked what he was running - it was a 6-core Phenom. I mentioned offhandedly that Bulldozer was somewhat of a failure compared to the 2500/2600, and he got really upset and was like "WHAT NO BRO, ITS AWESOME, I NEED THOSE CORES, IVE GOT LIKE 50 WINDOWS OPEN AT ALL TIMES!" At that point I wasn't going to try to discuss it further, but hey, there are AMD customers out there somewhere!
|
# ? Jan 15, 2012 22:46 |
|
The only way you could think Bulldozer was awesome is if you somehow completely ignored all available evidence. Quite a feat.
|
# ? Jan 15, 2012 23:03 |
|
HalloKitty posted:The only way you could think Bulldozer was awesome is if you somehow completely ignored all available evidence. What does this scrub know I mean the FX series is awesome. I need those cor...
|
# ? Jan 15, 2012 23:10 |
|
I have a nice watercooling setup going (but I got a dud 2500K that 'only' does 4.4GHz. I like w/c because it's almost silent at full load)
|
# ? Jan 15, 2012 23:20 |
|
dud root posted:I have a nice watercooling setup going

I got water cooling once. I no longer have it (it sprung a leak; the rug still has a green stain from the fluid). It looked really cool, and that was mostly the point: I was bringing my computer to a dorm where the entire building was programming/IT majors, so people actually saw it and thought it looked cool. Oh god, I spent so much money on making the outside look cool - but I guess people spent even more making their cars look cool in high school, and I avoided that poo poo.
|
# ? Jan 15, 2012 23:25 |
|
pixaal posted:I got water cooling once. I no longer have it (it sprung a leak; the rug still has a green stain from the fluid). It looked really cool.

Yeah, I used to water-cool back in college as well, but once heat-pipe coolers hit the stage I went back to air. My Ultra 120 has been going strong, and I'm confident that adapters will keep coming out for upcoming CPU sockets.
|
# ? Jan 15, 2012 23:26 |
|
movax posted:I met an actual Bulldozer fanboy yesterday You should have asked him if he lapped the CPU too.
|
# ? Jan 15, 2012 23:32 |
|
text editor posted:You should have asked him if he lapped the CPU too. Oh god, now we're talking. I remember when heat spreaders were thought of as just another thing in the way, and people would prise that poo poo off to get a direct core contact. Maybe even lapping the core.
|
# ? Jan 15, 2012 23:38 |
|
HalloKitty posted:Oh god, now we're talking. I remember when heat spreaders were thought of as just another thing in the way, and people would prise that poo poo off to get a direct core contact. Maybe even lapping the core.

I believe AMD Opterons were the most popular for this; IIRC there were people on the DFI forums who were slicing off the heat spreader and replacing it with a goddamn sanded-down penny to fill the gap to the cooler, so there was a continuous copper path from the top of the core to the copper core of the heatsink. AMD 'fanboys' never change.
|
# ? Jan 16, 2012 00:10 |
|
text editor posted:I believe AMD Opterons were the most popular for this, IIRC there were people on DFI forums who were slicing off the heatsink and replacing it with a goddamn sanded down penny to fill the gap to the cooler, so it was a constant copper piece from the top of core to the copper core of the heatsink. Tell me they weren't bright enough to know you need an old penny for that. That's all I need.
|
# ? Jan 16, 2012 00:24 |
|
I think the ones bright enough to know that were using silver dollars instead.
|
# ? Jan 16, 2012 01:01 |
|
movax posted:Yeah I used to water-cool back in college as well, but once heat-pipe coolers hit the stage, I went back to air. My Ultra 120 has been going strong, and I'm confident that adapters will keep coming out for upcoming CPU sockets. They just came out with a new revision of it, so I would imagine.
|
# ? Jan 16, 2012 01:57 |
|
text editor posted:I believe AMD Opterons were the most popular for this, IIRC there were people on DFI forums who were slicing off the heatsink and replacing it with a goddamn sanded down penny to fill the gap to the cooler, so it was a constant copper piece from the top of core to the copper core of the heatsink.
|
# ? Jan 16, 2012 02:29 |
|
Agreed posted:It's worth it if you can afford it, are willing to, and intend to continually upgrade. If it's just part of your cycle, then you can buy in each generation at a nice sweet spot (this generation, two 6950s or two 570s; next generation, who knows) and get dramatically superior performance to a top-end card for the same or a marginally higher outlay. If you go top of the line every year, you're spending a lot of money trying to push the highest numbers.

I'm usually a subscriber to the "high end sucks" newsletter, but if this really does 4W idle, that changes a lot. The power envelope, at least to me, is the main reason the high end sucks. I couldn't care less if it guzzles an insane amount of power when I need it, if during the other 90% of the time, when I don't need it, it sits at 4W - and 4W idle means I don't have to pay the power premium in a couple years when mainstream cards catch up.

I'm building out a computer right now, since my main box died and I'm on an Opteron 170 w/ a Radeon 4800-series card; if this really does sip 4W during the 16 hours or so when it's on and I'm not using it, then it's going onto the spec sheet. Assuming I can even find one, that is.
|
# ? Jan 16, 2012 07:48 |
|
Chuu posted:I'm usually a subscriber to the "high end sucks" newsletter, but if this really does 4W idle, that changes a lot. The power envelope, at least to me, is the main reason the high end sucks.

I've had more time to think about it and look at the results people are getting, and that's changed my opinion a bit, especially as far as giving advice goes. I'll probably be waiting for Kepler if they manage to get it out in a reasonably timely manner, since I need the CUDA, but here's what I'm feeling about the 7970 now without needing to crosspost/repost: One Two

I do agree the very aggressive power gating is extremely cool; hopefully nVidia can bring some real power savings to the table as well, since they've stated that as a big goal for Kepler for some time now.
|
# ? Jan 16, 2012 08:08 |
|
text editor posted:AMD 'fanboys' never change.

Fanboys are retarded - but none so perfect an example as a guy I saw on a forum, who bought a brand-new Intel P4 EE and prised the heatspreader off it, not realising it was soldered on. £900 dead chip.
|
# ? Jan 18, 2012 01:23 |
|
Oh, that's just glorious. If there was ever a simple example of having more money than sense, that would be it.
|
# ? Jan 18, 2012 11:13 |
|
I totally remember that post on Hardforums. The innards of the chip were ripped out and you could see the different metal layers of the chip. It looked like an x-ray version of some of the VLSI diagrams put out in press kits but in a vertical cross section form.
|
# ? Jan 18, 2012 18:43 |
|
wipeout posted:Fanboys are retarded - but none so perfect an example as a guy I saw on a forum, who bought a brand-new Intel P4 EE and prised the heatspreader off it, not realising it was soldered on. £900 dead chip.

Reminds me of this kid who in 2011 was running XP Home (32-bit, of course, since 64-bit doesn't exist for it) on a high-end 6-core AMD CPU with 16GB of RAM and two video cards that needed power leads plugged into them. He didn't have those plugged in, so the cards were struggling by on PCI-Express slot power, and he was bragging about the system. And in a separate forum on that same site, he was asking for help with why his machine wasn't performing as well as it should, while insisting that running a 32-bit, 10-year-old OS made it faster.
|
# ? Jan 18, 2012 18:47 |
|
Install Gentoo posted:And in a separate forum on that same site, he was asking for help with why his machine wasn't performing as well as it should, while insisting that running a 32-bit, 10-year-old OS made it faster.

It was made for computers built 10 years ago. Computers are a gajillion times faster now, so it HAS to be faster than if I were to run Windows 7.
|
# ? Jan 18, 2012 18:53 |
|
Install Gentoo posted:Reminds me of this kid who in 2011 was running XP Home (32-bit, of course, since 64-bit doesn't exist for it) on a high-end 6-core AMD CPU with 16GB of RAM and two video cards that needed power leads plugged into them. He didn't have those plugged in, so the cards were struggling by on PCI-Express slot power, and he was bragging about the system.

A little bit of knowledge is a dangerous thing. Seriously, give someone just a bit of computer knowledge and they think they're kings of everything and do really loving stupid poo poo. At least logically more RAM should always work, and the extra power connectors? Nah man, that's for if you have 2 power supplies for some reason. gently caress it, I can't even figure out the logic in not plugging in a power connector beyond not seeing it. I've forgotten to plug them in before; the cards don't even turn on, and a nice scary red light starts blinking - or at least that's what happened with the X1950.

e: unpronounceable posted:It was made for computers built 10 years ago. Computers are a gajillion times faster now, so it HAS to be faster than if I were to run Windows 7.

I see your logic. I'm going to run Windows 3.1, and if I need DirectX I'll boot into Windows ME.
|
# ? Jan 18, 2012 18:55 |