Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

HalloKitty posted:

Bear in mind, in that HardOCP article, the GTX 580 used is factory overclocked to a hefty degree: 772MHz stock up to 860MHz.

Edit: hah, you edited your post. I preferred the old one, that was clear amazement. :D

Oh, no, I'm still totally :stare: at the performance, but reading that it showed marked artifacts and instability without the fan pegged (and those god damned fans are loud) cooled me down some - there's no way everyone's going to get that kind of clock. But they don't have to, the performance delta is already ridiculous at any point between "the clock that every review site gets at stock voltage" and the over-volted values.

It just seems, I dunno, loving impossible for a card to actually be between 48% and 80% better than a card of the previous generation.

860MHz is a middlin' overclock for the 580; I had that at stock voltage. As I mentioned, Fermi scales well with GPU/shader clocks. Cards tend to hit a GPU/shader wall before a voltage wall, so my max of 925MHz/1850MHz GPU/shaders and 1050MHz GDDR5 at 1.125V is good but not especially so for a 580. Lots of people can get them up to 940-950MHz. But, jesus, the 7970 at max voltage, what can you do but :stare:
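For reference, the arithmetic behind those clocks is trivial; a quick Python sanity check, using the numbers quoted in these posts:

code:

# Sanity check on the overclock figures quoted above.
# Fermi's shader domain runs at a fixed 2x the core (GPU) clock.
stock_core = 772  # MHz, reference GTX 580 core clock
oc_core = 925     # MHz, the overclock from this post

shader_clock = 2 * oc_core  # 1850 MHz, matching the 925/1850 figures
oc_percent = (oc_core - stock_core) / stock_core * 100

print(f"shader clock: {shader_clock} MHz")
print(f"core OC over stock: {oc_percent:.1f}%")  # ~19.8%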

movax posted:

SemiAccurate had a quiet article that mentioned Kepler slipping further, to 2013 - or maybe I was drunk and confused "Kepler slipping to 2012" with the year still being 2011, and began sobbing helplessly.

Also, I think Agreed should :toxx: himself somehow with regards to the 7970, for our entertainment :toot:

If nVidia doesn't show Kepler with lots of "gently caress YOU, ATI! HAHA!" fanfare at CES, it'll almost certainly be a summer release, I'd think. And we might see a generation where ATI wins performance, not just price:performance. Still, Fermi was/is a monstrously powerful architecture, nVidia's been at work on Kepler for longer than ATI has been at work on Southern Islands, they're both using TSMC's 28nm process, surely...

No way in hell I'm going to :toxx: over graphics cards, but they're not making the process of picking a path very easy. With ATI's sub-7900 cards looking less appealing on specs (and some of them just continuing the tradition of carrying over previous gen models with a new SKU), it's really all-in or avoid for now, at least, and I would strongly prefer an nVidia card for CUDA accelerated DSP. But that requires them to come out with one, and is at war with my inner child over sparkly poo poo.

Agreed fucked around with this message at 00:00 on Jan 10, 2012


Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Only tangentially related to AMD, but SemiAccurate reports that Global Foundries' Fab 8 in New York is now up and running at 32nm. It seems likely that this fab is going to be making IBM POWER-based APUs for upcoming next-gen consoles.

freeforumuser
Aug 11, 2007
Bjorn3D 3870K review

Superbly disappointing overclocking numbers. 1.4625V and can't even break 3.6GHz. I don't want to know the power consumption anymore, it's just so sad.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

freeforumuser posted:

Bjorn3D 3870K review

Superbly disappointing overclocking numbers. 1.4625V and can't even break 3.6GHz. I don't want to know the power consumption anymore, it's just so sad.

Ah, christ, for a minute I poo poo myself. I saw 3xxxK, and thought.. gently caress, Ivy Bridge sucks, how?
Oh, AMD, copying a little name recognition are we?

HalloKitty fucked around with this message at 13:26 on Jan 11, 2012

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

freeforumuser posted:

Bjorn3D 3870K review

Superbly disappointing overclocking numbers. 1.4625V and can't even break 3.6GHz. I don't want to know the power consumption anymore, it's just so sad.

God loving drat it.

I'm not saying stick a fork in AMD's CPUs or anything, but just god loving drat it. This is lovely, lovely news. Feed it voltage that would definitely be unsafe on Intel's same-sized process (I really don't buy that AMD's processors are that much less prone to electromigration), and it still won't hit 4GHz? How do you expect to compete?

I bet a lot of engineers are hearing something like "Piledriver needs to loving KNOCK IT OUT OF THE PARK, FIX THIS!!!" :negative:

Edit: That is a horrible review. It gets beaten by K10 processors at similar clocks - clocks those parts hit at stock - while it's overclocked to the edge of stability. The most common faint praise: "Here we can see that the overclocked A8-3870K is almost as fast as the Core i3 2100"

Real winner there. gently caress everything I said about them needing to invest more in Llano; they need to make Bulldozer competitive. It wouldn't take -that- much to fix the single-threaded performance, would it? From a microarchitectural perspective it's not total poo poo - not as efficient as it could be, but not unfixable - and even skeptics have to admit there's some stuff it really does well at. Maybe with Piledriver they can put out a part that's genuinely competitive at more than just raw, heavily multithreaded, integer-heavy tasks. They have to get the single-threaded performance up to compete, because Intel stands up very well on multi-threaded with their higher-end parts while also just dominating on single-threaded. drat it, AMD, pull up, the plane's gonna crash into the mountain.

Agreed fucked around with this message at 15:01 on Jan 11, 2012

movax
Aug 30, 2008

HalloKitty posted:

Ah, christ, for a minute I poo poo myself. I saw 3xxxK, and thought.. gently caress, Ivy Bridge sucks, how?
Oh, AMD, copying a little name recognition are we?

Same here, hah, took me a second to remember which thread I was in.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
To be fair, I don't think Bjorn3D overclocked the memory, which is a big source of performance increases for Llanos. I'd like to see what a site like Anandtech is able to pull off with an unlocked processor. But yeah, K10 is still not very efficient, especially without L3. Hopefully Piledriver is at least not humiliatingly bad.

VorpalFish
Mar 22, 2007
reasonably awesome™

Isn't higher clocked memory only relevant to the iGPU performance? It shouldn't do anything to remedy a lackluster CPU.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

VorpalFish posted:

Isn't higher clocked memory only relevant to the iGPU performance? It shouldn't do anything to remedy a lackluster CPU.

While it definitely brings large performance gains for the integrated GPU, K10-based cores are just not as memory-efficient as Sandy Bridge, and faster memory does show real-world CPU gains too. I want to say DDR3-1866 on Llano is roughly equivalent to DDR3-1333 on Sandy Bridge.
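The raw numbers bear that out; here's a quick Python sketch of peak theoretical bandwidth per 64-bit DDR3 channel (ignoring real-world efficiency differences between the two memory controllers):

code:

# Peak theoretical DDR3 bandwidth: MT/s x 8 bytes per transfer per channel.
for mt_s in (1333, 1866):
    gb_s = mt_s * 8 / 1000
    print(f"DDR3-{mt_s}: {gb_s:.1f} GB/s per channel")
# DDR3-1333: 10.7 GB/s; DDR3-1866: 14.9 GB/s -- about 40% more raw bandwidth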

VorpalFish
Mar 22, 2007
reasonably awesome™

Eh, regardless, Llano never struck me as making a lot of sense for a desktop platform, where you can drop in lower-midrange discrete graphics so easily and cheaply. Seems like the best niche for it is and always was ultra-cheap gaming laptops.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
I think it's still the best option for lower-end systems where you won't be using a dGPU and for HTPC applications, but yeah laptops are where it's very compelling. It sucks that desktop Llanos don't support Turbo Core (except the low-power models), and that laptop Llanos don't support DDR3-1866.

Maxwell Adams
Oct 21, 2000

T E E F S
The scheduler fix for Windows 7 and Windows Server 2008 R2 is out, for real this time.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
I wonder what the results will be. I'm still dubious, seeing as sites have already overclocked the processors and received poor results.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
AMD demoed their upcoming Trinity APUs at CES; they're expected to be available this summer in 17W and 35W models. Trinity contains two Piledriver modules (four cores) and an as-yet-unannounced GPU. Their demo was a laptop driving three monitors: one showing a DX11 game (Dirt 3), the second a GPU-accelerated video transcode, and the third a full-screen video.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
They also demoed a single cable carrying DisplayPort, USB 3.0, and power. It'll take a bit longer than Trinity to come out, though.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
That's pretty slick, though the 90MB/sec (720Mbps) cap on USB throughput may end up being a limiting factor. That said, it will let you hook up a monitor with a built-in USB hub, plus USB devices, without the need for another cable. Most importantly, it's essentially free compared to Thunderbolt, since USB 3.0 controllers are already built into modern chipsets. I worry about the shared port causing confusion, however.
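The unit math on that cap checks out, for what it's worth; a quick Python check, assuming USB 3.0's nominal 5Gbps line rate:

code:

# Converting the quoted cap: 8 bits per byte.
cap_mb_s = 90              # MB/s, the cap mentioned above
cap_mbps = cap_mb_s * 8    # 720 Mbps, as stated
line_rate_mbps = 5000      # USB 3.0 SuperSpeed nominal line rate
print(f"{cap_mbps} Mbps = {cap_mbps / line_rate_mbps:.0%} of the 5 Gbps line rate")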

Bob Morales
Aug 18, 2006


Just wear the fucking mask, Bob

I don't care how many people I probably infected with COVID-19 while refusing to wear a mask, my comfort is far more important than the health and safety of everyone around me!

Look out, Ultrabooks.

quote:

AMD aiming to undercut Ultrabooks with $500 Trinity ultrathins

AMD has been showcasing laptops based on its upcoming Trinity processor at CES this week. The company is hoping to bring thin and light Ultrabook-style machines—though AMD calls them "ultrathins," to avoid Intel's trademarks—to market for as little as $500. This would substantially undercut Intel-powered Ultrabooks, which currently start at $800. Intel hopes to reduce the Ultrabook entry price to $700 by the end of the year.

Each Trinity chip will contain a CPU and a GPU. The CPU will be a second generation Bulldozer core, codenamed Piledriver. The GPU portion will be based on AMD's Southern Islands architecture, which made its debut late in 2011 with the launch of the Radeon HD 7970.

There will be two lines of Trinity chips; low power 17 W ones for ultrathins, and higher power 35 W ones for standard laptops. The ultrathin-oriented chips will have about the same performance as AMD's current Llano A-series chips, but with half the power draw. The high-power chips will have a 25 percent faster CPU and a 50 percent faster GPU.

AMD did not say when the chips would be released, but the company intends to disclose more about its release strategy at its financial analyst meeting in February.

This will be huge for AMD if they can keep them between $500 and $700. The Intel Ultrabooks we've seen so far aren't much cheaper than the MacBook Airs, which takes away most of their appeal.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

That's a much more winnable war than anything they've got going on in high-performance processors right now. Good news, hope they can bring it down and keep it below Intel's prices.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
AMD has confirmed that Trinity will be VLIW4, just like the Radeon HD 6900-series. Since the shaders come in blocks of 64, I'm betting Trinity will have 384 shaders in its high-end configuration. This should make things run more efficiently, and give us the image quality and geometry throughput benefits that came with the 6900-series cards. They'll still be held back by memory bandwidth, but maybe AMD will debut DDR3-2133 support? It would be cool if VLIW4 found its way to Brazos 2.0 as well.
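(The 384 figure is just block arithmetic; a trivial Python check, with the candidate configurations being my own speculation:)

code:

# VLIW4 shaders come in blocks of 64, so plausible configs are multiples of that.
block = 64
print([block * n for n in range(4, 8)])  # [256, 320, 384, 448]; 384 = 6 blocks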

Most of the article is about the Radeon HD 7000M-series, confirming that everything from the 7700M on up will be GCN, while the lower cards will be new steppings and core/clock configurations of the HD 6000M-series.

movax
Aug 30, 2008

I met an actual Bulldozer fanboy yesterday :psyduck:

He had a nice watercooling setup going (red flag #1), so I asked what he was running; it was a 6-core Phenom. I mentioned offhandedly that Bulldozer was somewhat of a failure compared to the 2500/2600, and he got really upset and was like "WHAT NO BRO, ITS AWESOME, I NEED THOSE CORES, IVE GOT LIKE 50 WINDOWS OPEN AT ALL TIMES!"

At that point I wasn't going to try to discuss further, but there are AMD customers out there somewhere!

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
The only way you could think Bulldozer was awesome is if you somehow completely ignored all available evidence.

Quite a feat.

Previa_fun
Nov 10, 2004

HalloKitty posted:

The only way you could think Bulldozer was awesome is if you somehow completely ignored all available evidence.

Quite a feat.

What does this scrub know I mean the FX series is awesome. I need those cor...


:v:

dud root
Mar 30, 2008
I have a nice watercooling setup going :(

(but I got a dud 2500K that 'only' does 4.4GHz. I like w/c because it's almost silent at full load)

pixaal
Jan 8, 2004

All ice cream is now for all beings, no matter how many legs.


dud root posted:

I have a nice watercooling setup going :(

(but I got a dud 2500K that 'only' does 4.4GHz. I like w/c because it's almost silent at full load)

I got water cooling once; I no longer have it (it sprung a leak, and the rug still has a green stain from the fluid). It looked really cool. It was mostly because I was bringing my computer to the dorm, where the entire building was programming/IT majors, so people actually saw it and thought it looked cool. Oh god, I spent so much money on making the outside look cool, but I guess people spent even more making their cars look cool in high school, and I avoided that poo poo.

movax
Aug 30, 2008

pixaal posted:

I got water cooling once; I no longer have it (it sprung a leak, and the rug still has a green stain from the fluid). It looked really cool. It was mostly because I was bringing my computer to the dorm, where the entire building was programming/IT majors, so people actually saw it and thought it looked cool. Oh god, I spent so much money on making the outside look cool, but I guess people spent even more making their cars look cool in high school, and I avoided that poo poo.

Yeah I used to water-cool back in college as well, but once heat-pipe coolers hit the stage, I went back to air. My Ultra 120 has been going strong, and I'm confident that adapters will keep coming out for upcoming CPU sockets.

text editor
Jan 8, 2007

movax posted:

I met an actual Bulldozer fanboy yesterday :psyduck:

He had a nice watercooling setup going (red flag #1), so I asked what he was running; it was a 6-core Phenom. I mentioned offhandedly that Bulldozer was somewhat of a failure compared to the 2500/2600, and he got really upset and was like "WHAT NO BRO, ITS AWESOME, I NEED THOSE CORES, IVE GOT LIKE 50 WINDOWS OPEN AT ALL TIMES!"

At that point I wasn't going to try to discuss further, but there are AMD customers out there somewhere!

You should have asked him if he lapped the CPU too.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

text editor posted:

You should have asked him if he lapped the CPU too.

Oh god, now we're talking. I remember when heat spreaders were thought of as just another thing in the way, and people would prise that poo poo off to get a direct core contact. Maybe even lapping the core.

text editor
Jan 8, 2007

HalloKitty posted:

Oh god, now we're talking. I remember when heat spreaders were thought of as just another thing in the way, and people would prise that poo poo off to get a direct core contact. Maybe even lapping the core.

I believe AMD Opterons were the most popular for this; IIRC there were people on the DFI forums slicing off the heat spreader and replacing it with a goddamn sanded-down penny to fill the gap to the cooler, so there was a continuous piece of copper from the top of the core to the copper core of the heatsink.


AMD 'fanboys' never change.

Killer robot
Sep 6, 2010

I was having the most wonderful dream. I think you were in it!
Pillbug

text editor posted:

I believe AMD Opterons were the most popular for this; IIRC there were people on the DFI forums slicing off the heat spreader and replacing it with a goddamn sanded-down penny to fill the gap to the cooler, so there was a continuous piece of copper from the top of the core to the copper core of the heatsink.


AMD 'fanboys' never change.

Tell me they weren't bright enough to know you need an old penny for that (post-1982 pennies are mostly zinc, not copper). That's all I need.

Zhentar
Sep 28, 2003

Brilliant Master Genius
I think the ones bright enough to know that were using silver dollars instead.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

movax posted:

Yeah I used to water-cool back in college as well, but once heat-pipe coolers hit the stage, I went back to air. My Ultra 120 has been going strong, and I'm confident that adapters will keep coming out for upcoming CPU sockets.

They just came out with a new revision of it, so I would imagine.

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva

text editor posted:

I believe AMD Opterons were the most popular for this; IIRC there were people on the DFI forums slicing off the heat spreader and replacing it with a goddamn sanded-down penny to fill the gap to the cooler, so there was a continuous piece of copper from the top of the core to the copper core of the heatsink.
This was because, at the time, instead of soldering the core to the IHS, AMD decided to use low-grade TIM. Why they did this, I do not know.

Chuu
Sep 11, 2004

Grimey Drawer

Agreed posted:

It's worth it if you can afford to, are willing to, and intend to continually upgrade. If it's just part of your cycle, then you can buy in each generation at a nice sweet spot (this generation, two 6950s or two 570s; next generation, who knows) and get dramatically superior performance to a single top-end card for the same or a marginally higher outlay. If you go top of the line every year, you're spending a lot of money trying to push the highest numbers.

Dogen gives me very well deserved poo poo for weighing the pros and cons of adding a second 580 to my setup, because he plays the same games I do and is perfectly happy with the performance - but I'm not, and so it looks like this generation I've locked myself into either adding that second card or continuing to be unusually bothered by framerate dips when settings are maxed, despite an OC to 925MHz (which scores well in 3DMark11 and other synthetics that aren't heavily weighted to favor ATI's parallelized calculation). My minimum desire is no FPS below 30, ever - minimum 45 or 60 ideally, but 30 is the bottom number, and one overclocked 580 won't deliver that in modern games at 1080p. I mean, it will in games like Space Pirates and Zombies, but I play a lot of S.T.A.L.K.E.R. with extra shaders and texture overhauls, Metro 2033 (and its upcoming sequel), Crysis 2, and other games that actually can stress even this expensive poo poo of a card.

But the figures for a single 7970, even heavily overclocked, aren't past that point. Two 580s still outperform one 7970 like crazy in games that are well supported. In games with mediocre SLI support, scaling can be as low as 40% - still better than the OC'd 7970 does in most games - but on average two-card SLI scales to around a 90% increase. In Metro 2033 it's almost a 100% increase; very little overhead in that game. Other games, I could definitely live with 80-90% scaling. Because there is no way, NO WAY, that a single card from either company is going to improve framerates by 80% for what it would cost to add another 580.

It's dumb, the high end sucks. But if I were building a computer right now I'd be clicking everywhere to try to find a 7970. Sigh. Idiot me. Shiny things pretty things.
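Quick sketch of what those scaling percentages work out to, in Python, with a made-up 40 FPS single-card baseline (an example number, not a benchmark result):

code:

# Effective framerate under the SLI scaling figures from that quote.
def sli_fps(single_card_fps, scaling):
    # scaling = fraction of the second card's performance actually realized
    return single_card_fps * (1 + scaling)

base = 40.0  # FPS, made-up baseline
for scaling in (0.40, 0.90, 1.00):  # poor, typical, and best-case scaling
    print(f"{scaling:.0%} scaling: {sli_fps(base, scaling):.0f} FPS")
# 40% -> 56 FPS, 90% -> 76 FPS, 100% -> 80 FPS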

I'm usually a subscriber to the "high end sucks" newsletter, but if this really does idle at 4W, that changes a lot. The power envelope, at least to me, is the main reason the high end sucks. I couldn't care less if it guzzles an insane amount of power when I need it, as long as it sits at 4W during the other 90% of the time when I don't - and 4W idle means I don't have to pay the power premium in a couple of years when mainstream cards catch up.
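To put rough numbers on that idle figure, a quick Python estimate, assuming a $0.12/kWh electricity rate (adjust for your own utility) and the 16 idle hours a day I mention below:

code:

# Yearly cost of a 4W idle draw at 16 idle hours per day.
idle_watts = 4
hours_per_day = 16
rate_usd_per_kwh = 0.12  # assumed rate

kwh_per_year = idle_watts * hours_per_day * 365 / 1000
print(f"{kwh_per_year:.1f} kWh/year, about ${kwh_per_year * rate_usd_per_kwh:.2f}/year")
# ~23.4 kWh/year, roughly $2.80/year at that rate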

I'm building out a computer right now, since my main box died and I'm on an Opteron 170 with a Radeon 4800-series card; if this really does sip 4W during the 16 hours or so a day when it's on and I'm not using it, then it's going onto the spec sheet.

Assuming I can even find one, that is.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Chuu posted:

I'm usually a subscriber to the "high end sucks" newsletter, but if this really does idle at 4W, that changes a lot. The power envelope, at least to me, is the main reason the high end sucks. I couldn't care less if it guzzles an insane amount of power when I need it, as long as it sits at 4W during the other 90% of the time when I don't - and 4W idle means I don't have to pay the power premium in a couple of years when mainstream cards catch up.

I'm building out a computer right now, since my main box died and I'm on an Opteron 170 with a Radeon 4800-series card; if this really does sip 4W during the 16 hours or so a day when it's on and I'm not using it, then it's going onto the spec sheet.

Assuming I can even find one, that is.

I've had more time to think about it and look at the results people are getting, and that's changed my opinion a bit, especially as far as giving advice goes - I'll probably be waiting for Kepler if they manage to get it out in a reasonably timely manner, since I need the CUDA, but here's what I'm feeling about the 7970 now without needing to crosspost/repost:

One
Two

I do agree the very aggressive power gating is extremely cool; hopefully nVidia can bring some real power savings to the table as well, since they've been stating that as a big goal for Kepler for some time now.

GRINDCORE MEGGIDO
Feb 28, 1985


text editor posted:

AMD 'fanboys' never change.

Fanboys are retarded - but never so perfectly as a guy I saw on a forum, who bought a brand-new Intel P4 EE and prised the heat spreader off it, not realising it was soldered on. £900 dead chip.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Oh, that's just glorious. If there was ever a simple example of having more money than sense, that would be it.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
I totally remember that post on Hardforums. The innards of the chip were ripped out and you could see its different metal layers. It looked like an X-ray version of some of the VLSI diagrams put out in press kits, but in vertical cross-section.

Nintendo Kid
Aug 4, 2011

by Smythe

wipeout posted:

Fanboys are retarded - but never so perfectly as a guy I saw on a forum, who bought a brand-new Intel P4 EE and prised the heat spreader off it, not realising it was soldered on. £900 dead chip.

Reminds me of this kid who, in 2011, was running XP Home (32-bit, of course, since no 64-bit version of it exists) on a high-end 6-core AMD CPU with 16GB of RAM and two video cards that needed supplemental power leads plugged in. He didn't have those plugged in, so the cards were struggling by on PCI-Express slot power alone, and he was bragging about the system.

And in a separate forum on that same site, he was asking for help figuring out why his machine wasn't performing as well as it should, while insisting that running a 32-bit, 10-year-old OS made it faster.

unpronounceable
Apr 4, 2010

You mean we still have another game to go through?!
Fallen Rib

Install Gentoo posted:

And in a separate forum on that same site, he was asking for help figuring out why his machine wasn't performing as well as it should, while insisting that running a 32-bit, 10-year-old OS made it faster.

It was made for computers built 10 years ago. Computers are a gajillion times faster now, so it HAS to be faster than if I were to run Windows 7 :downs:.


pixaal
Jan 8, 2004

All ice cream is now for all beings, no matter how many legs.


Install Gentoo posted:

Reminds me of this kid who, in 2011, was running XP Home (32-bit, of course, since no 64-bit version of it exists) on a high-end 6-core AMD CPU with 16GB of RAM and two video cards that needed supplemental power leads plugged in. He didn't have those plugged in, so the cards were struggling by on PCI-Express slot power alone, and he was bragging about the system.

And in a separate forum on that same site, he was asking for help figuring out why his machine wasn't performing as well as it should, while insisting that running a 32-bit, 10-year-old OS made it faster.

A little bit of knowledge is a dangerous thing. Seriously, give someone just a bit of computer knowledge and they think they're kings of everything and do really loving stupid poo poo. At least "more RAM should always help" follows some logic, but the extra power connectors? Nah man, that's for if you have 2 power supplies for some reason. gently caress it, I can't even figure out the logic in not plugging in a power connector, beyond just not seeing it.

I've forgotten to plug them in before; the cards don't even turn on, and a nice scary red light starts blinking. Or at least that's what happened with my X1950.

e:

unpronounceable posted:

It was made for computers built 10 years ago. Computers are a gajillion times faster now, so it HAS to be faster than if I were to run Windows 7 :downs:.

I see your logic. I'm going to run Windows 3.1, and if I need DirectX I'll boot into Windows ME.
