  • Locked thread
Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

I am happy that they're pursuing K10.5, because to my eyes that's really their only good path forward. I'm also happy they're biting the bullet and getting 'em out cheap (relatively speaking), because that will keep things interesting.

Allllright!

Edit: Pretty :rolleyes: that they're naming it so blatantly close to Intel's much higher-performing parts, but whatever. Seeing AMD willing to go forward with the much more efficient Llano products that fill the role previously filled by EOL'd K10 processors is a good sign. Any news as to whether these chips have the architectural flaw that kept K10 processors from going above 4.0GHz stably on 64-bit operating systems? AMD has said, basically, that you can more or less expect easy overclocking up to 3.8GHz on any of the unlocked chips, and apparently some dude in Japan killed a chip pushing it to 5GHz+, but I'm curious to see what these more conventional 4-core (real 4-core, not module-based, just cores is cores is cores) designs can hit for enthusiasts and how they perform relative to Intel's hardware in the price range. The graphics side of it is pretty crap for desktop use outside of HTPC or whatever, but at least you can expect good driver support from AMD with their integrated graphics.

Agreed fucked around with this message at 13:08 on Jan 3, 2012

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
I worry that this product really won't be competitive with Bulldozer; architectural inefficiencies all kind of melt away when Bulldozer has a 1GHz clockspeed advantage. The real rub for AMD is that for the money you spend to get an overclock-capable board and cooler (none of their CPUs are getting far on stock cooling), you could afford to buy an i5 2400 (3400 in a few months) that would crush AMD's offerings in most applications. AMD's killer app is their excellent integrated graphics, and that's not really something most overclockers are going to take advantage of. That said, it would be interesting to see what kind of graphics powerhouses these become when you put a top performance air cooler on them, crank the memory up past 2.2GHz, and overclock the hell out of the CPU and GPU. I expect we'll see them pummeling everything from a 6570 GDDR3 on down.

Bonus Edit: The AMD Radeon HD 8000-series is codenamed "Canary Islands."

Alereon fucked around with this message at 14:03 on Jan 3, 2012

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Even with RAM cheap, fast RAM isn't cheap... And it appears to be pretty bandwidth-limited in that regard, too, with not much going on if you just overclock the GPU itself. But that could be a bug on the hardware side of things; hard to tell.

I want to know about its overclocking for the same reason you do; overclocking Bulldozer is a painful process that requires tremendous wattage and extraordinary cooling. Obviously that's not ideal :v: If this new architecture can meet Bulldozer's "normal" overclocks of between 4.0GHz and 4.5GHz then all of a sudden the clock advantage is gone and it's just down to which performs better under real world loads.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
I'd really expect to see similar power scaling to Bulldozer when overclocking, since they're both using the same GF 32nm process. There is some (comparatively) cheap high-speed memory available: this G.Skill 16GB (4x4GB) 1866MHz kit is $97.50 with a coupon code, which isn't bad at all for RAM that hit 2200MHz in Anandtech's Llano overclocking tests. That said, that's a lot of effort to get performance competitive with a stock Intel chip out of a CPU meant to go into the cheapest machines. It's absolutely awesome in those machines, but trying to get it to pull double-duty in a gaming machine is hoping for too much, I think.

Star War Sex Parrot
Oct 2, 2003

I really wish the 7970 would show up on Newegg already. It would certainly help my indecisiveness if I could just impulse-buy one, instead of wondering if Kepler will show up at CES and dash my 7970 dreams.

Gunjin
Apr 27, 2004

Om nom nom
AMD getting sued by laptop fabricator Quanta for allegedly supplying defective products:

http://www.bloomberg.com/news/2012-01-04/quanta-sues-amd-over-chips-for-nec-notebook-computers.html

Probably not what investors and the market want to hear regarding AMD right now.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Good lord. AnandTech has some news about lower-end HD 7000 SKUs.

quote:

So much of what we take for granted with retail cards – well defined specifications and formal product announcements through press releases – simply don’t happen in the OEM market. Instead the OEM market is ambiguous on its best days and secretive at its worst
...
The following is a list of some of the important attributes and major features being introduced with Southern Islands. None of which will be available with the Turks based 7670.
  • TSMC’s 28nm HKMG process
  • Graphics Core Next architecture
  • PCI-Express 3.0
  • Direct3D 11.1
  • Partially Resident Texturing
  • Fast HDMI
  • Video Codec Engine (fixed function H.264 encoder)
  • DDM Audio
  • ZeroCore Power
  • Anisotropic Filtering Quality Improvements
...
The Turks based 6570 is back as the 7570. The Caicos based 6450 is back as the 7470 and the 7450 (depending on the type of RAM used). And absurdly enough, the 2 year old Cedar based 5450 is back as the 7350. The last one is particularly notable as Cedar is from the Evergreen family, not Northern Islands. So it lacks all the features Northern Islands brought, including DisplayPort 1.2 support, improved anisotropic filtering, UVD3, MLAA, and the improved tessellation unit; all of this being on top of all of the differences between Northern Islands and Southern Islands.
Why is AMD doing this poo poo?

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

nVidia and ATI both get up to some heinous bullshit when it comes to the low-end and mobile SKUs, do they not? And "why" is probably "because they get away with it every time since the market for those cards is squarely aimed at people who have no idea what they're missing out on."

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
AMD doesn't have access to the 28nm manufacturing capacity to produce low-end chips, so I had at least hoped they would make up new SKUs with new clockspeeds/configurations, but nope, straight rename. The worst part is that there's just no reason for anything south of the 7570 to even exist; they just don't bring anything to the table.

Autarch Kade
May 6, 2008
Not sure if this has been posted yet, but apparently at least Sapphire is going to produce 6GB 7970s of the Eyefinity6 variety.

Sapphire Document Image

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Holy loving balls. We knew it had headroom and scaled WELL, but a factory one overclocked to 1335MHz from 925? Mother of god, that is going to be fast; as in, likely faster than any dual-GPU/single-card solution currently available.

I hope this sheet is close to the truth.
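
Quick back-of-the-envelope check on that jump (a rough sketch, using the 925MHz reference clock and the 1335MHz figure from the sheet):

code:
# Rough percentage increase of the rumored factory clock over the
# reference HD 7970 core clock (figures from the Sapphire sheet above).
reference_mhz = 925
rumored_mhz = 1335

increase_pct = (rumored_mhz - reference_mhz) / reference_mhz * 100
print(f"Factory overclock: +{increase_pct:.1f}%")  # roughly +44%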

movax
Aug 30, 2008

Agreed posted:

nVidia and ATI both get up to some heinous bullshit when it comes to the low-end and mobile SKUs, do they not? And "why" is probably "because they get away with it every time since the market for those cards is squarely aimed at people who have no idea what they're missing out on."

The low-end GPU business is being cannibalized (and rightfully so) by integrated GPUs. There are very few reasons to get something lower than a midrange card these days...you either need the GPU (games) or you don't (integrated will handle Aero, DXVA, etc).

I remember my GeForce 7300 being some kind of GeForce 6100 or something equally retarded. All this does is confuse buyers and show the shareholders "look how many products/segments we serve!"

We're all smart enough to know not to buy anything below 7xx0 or whatnot, but the average consumer isn't :(

Longinus00
Dec 29, 2005
Ur-Quan
How is this even news? NVidia started it before ATI and they've both been doing it ever since. You guys going to rage out every year when it happens again?

Star War Sex Parrot
Oct 2, 2003

Autarch Kade posted:

Not sure if this has been posted yet, but apparently at least Sapphire is going to produce 6GB 7970s of the Eyefinity6 variety.

Sapphire Document Image
I'm still trying to wrap my head around a 44% factory overclock. :drat:

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Longinus00 posted:

How is this even news? NVidia started it before ATI and they've both been doing it ever since. You guys going to rage out every year when it happens again?
AMD only did it once, for the 5700/6700-series. Other card models have been similar, but straight renaming cards is unusual for AMD.

Fatal
Jul 29, 2004

I'm gunna kill you BITCH!!!
I'm bummed there's no mention of 4x display port/1DVI/1HDMI or something similar. Only one 6 display card is kinda lame with the power that the 7970 has going for it.

I guess there's hope that they include active DisplayPort adapters for those of us with 6 monitors that aren't all DisplayPort...

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Fatal posted:

I'm bummed there's no mention of 4x display port/1DVI/1HDMI or something similar. Only one 6 display card is kinda lame with the power that the 7970 has going for it.
There were supposed to have been cheap DisplayPort hubs available for quite some time now, but they just haven't been making it to market. Once those are available you can connect up to four monitors to one DisplayPort port, much like how USB works. From Anandtech's 7970 review:

Anandtech posted:

Speaking about Eyefinity, one issue that comes up time and time again is Multi Stream Transport (MST) hubs. We were introduced to MST hubs back with the launch of the 6800 series, which allowed a single DP 1.2 port to drive up to 4 monitors by taking advantage of the high bandwidth of DP1.2 and embedding transport streams for several monitors into the signal. The purpose of MST hubs was so that users could use several monitors with a regular Radeon card, rather than needing an exotic all-DisplayPort “Eyefinity edition” card as they need now.

But as many of you have asked me about, several deadlines for MST hubs have come and gone, including the latest deadline which was supposed to be by the end of this year. As with active DP adaptors this is largely out of AMD’s hands since they don’t produce the hardware, but they have been continuing to prod their partners on the issue. The latest deadline from AMD isn’t rosy – summer of 2012 – but they seem more confident of this deadline than deadlines in the past. Not that another half-year wait will be of any comfort for users who have been looking for MST hubs for the better part of the year, but at least it provides some idea on when to expect them.

Autarch Kade
May 6, 2008
I'm curious how that Eyefinity6 edition card will overclock using AMD's slider/tools. I doubt I'd be getting a 1335MHz clock speed from stock, though. However, what I'm really wondering is: will there be anything different about the six mini-DisplayPort card that will hamper overclocking? Or did they just leave the clock speeds at stock on it to reduce competition among their own cards?

Also, is there a need for six gigs of video memory versus going with the higher clocked three gig card when it comes to using three monitors?

Fatal
Jul 29, 2004

I'm gunna kill you BITCH!!!

Autarch Kade posted:

Also, is there a need for six gigs of video memory versus going with the higher clocked three gig card when it comes to using three monitors?

I would think the 6GB/6DP version is designed around using 6 monitors, since it's a pretty niche market. If you have 1-4 monitors you go with any of the other options; for 5-6+ (with an MST hub, I guess) you go 6 DP.

shodanjr_gr
Nov 20, 2007
Is there some sort of documentation out there on how PRTs are exposed programmatically?

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

shodanjr_gr posted:

Is there some sort of documentation out there on how PRTs are exposed programmatically?
It'll be exposed via an OpenGL extension in future driver releases; we don't know what their DirectX plans are. There probably won't be any documentation available until the card is released or the supporting drivers are out.

shodanjr_gr
Nov 20, 2007

Alereon posted:

It'll be exposed via an OpenGL extension in future driver releases; we don't know what their DirectX plans are. There probably won't be any documentation available until the card is released or the supporting drivers are out.

It'd be interesting to see if it's some sort of multi-pass method or something. To my understanding, virtual texturing is usually driven by a pre-pass/readback step that helps determine which tiles of the texture need to be paged into the GPU...
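
Not AMD's actual PRT mechanism (that presumably arrives with the OpenGL extension mentioned above), but here's a minimal sketch of the pre-pass/readback idea behind software virtual texturing: the render pre-pass records which (u, v, mip) samples it touched, and the host diffs the tiles those map to against what's already resident to decide what to page in. The tile size and function names are made up for illustration.

code:
# Hypothetical sketch of the feedback/paging step in software virtual texturing.
# A render pre-pass would normally write the (u, v, mip) values it sampled into a
# small feedback buffer; here that list is taken as input and the missing tiles
# are computed on the host.

TILE_TEXELS = 128  # assumed tile/page size in texels per side

def tiles_to_page_in(feedback, texture_size, resident):
    """feedback: iterable of (u, v, mip) samples with u, v in [0, 1).
    texture_size: base-mip resolution in texels (assumed square).
    resident: set of (tile_x, tile_y, mip) already in GPU memory.
    Returns the set of tiles that still need to be uploaded."""
    needed = set()
    for u, v, mip in feedback:
        mip_size = max(texture_size >> mip, 1)            # texels at this mip level
        tiles_per_side = max(mip_size // TILE_TEXELS, 1)
        tx = min(int(u * tiles_per_side), tiles_per_side - 1)
        ty = min(int(v * tiles_per_side), tiles_per_side - 1)
        needed.add((tx, ty, mip))
    return needed - resident

# Example: one sample hits an already-resident tile, two hit tiles we must page in.
resident_tiles = {(0, 0, 0)}
samples = [(0.01, 0.01, 0), (0.9, 0.9, 0), (0.5, 0.5, 3)]
print(tiles_to_page_in(samples, texture_size=8192, resident=resident_tiles))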

Star War Sex Parrot
Oct 2, 2003

7970s are on Newegg, almost all out of stock already. :(

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Good thing they've shown up today on the sites I checked. I spec'd out a machine for a friend last night with everything except the graphics card.

7970 completes that picture, and fits in the budget given. Yay for building monstrous machines!

Edit: There's an XFX out with a GPU clock of 1GHz, so this bodes well for Sapphire's crazy sheet.

HalloKitty fucked around with this message at 09:59 on Jan 9, 2012

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Anandtech has a review of the XFX Radeon HD 7970 Black Edition Double Dissipation card, which uses a custom dual-fan cooler on a slightly pre-overclocked Radeon HD 7970. Anandtech did some overclocking on the card and managed to hit 1125MHz on the GPU at stock voltage and 1575MHz on the memory. All indications are that these things will be BEASTS when we get custom PCBs with more VRM phases and higher-performing aftermarket coolers. The Arctic Cooling Accelero Xtreme 7970 comes out January 30th.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
While I was reading that article, I was also struck that there seems to be 1:1 percentage scaling on GPU/RAM speeds and FPS. As in, a 10% overclock gives a 10% performance increase. The Radeon 6850 scales very well (about 10% overclock for 9% performance), but not that well.
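
Putting rough numbers on that observation (a sketch, using the article's approximate figures rather than anything re-measured):

code:
# Ratio of performance gain to clock gain: 1.0 means perfect 1:1 scaling.
def scaling_efficiency(clock_gain_pct, perf_gain_pct):
    return perf_gain_pct / clock_gain_pct

print(scaling_efficiency(10, 10))  # HD 7970 per the article: ~1.0
print(scaling_efficiency(10, 9))   # HD 6850: ~0.9, very good but not 1:1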

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Factory Factory posted:

While I was reading that article, I was also struck that there seems to be 1:1 percentage scaling on GPU/RAM speeds and FPS. As in, a 10% overclock gives a 10% performance increase. The Radeon 6850 scales very well (about 10% overclock for 9% performance), but not that well.

GTX 580's scaling is pretty close, but not that good.

Here's some ridiculous :tinfoil: for you - we're ruining next year's cards, because next year's cards were supposed to just be much more highly clocked versions of this year's cards. :tinfoil:

I can get about an 80MHz overclock at stock voltage on my 580. 200MHz seems to be what pretty much every review site settles on as the stock-voltage OC for these things.

Iff... IF AND ONLY IF... Kepler makes these cards look kind of slow, I'll not go down the lonesome, stupid road of 2x580. Because while someone looking to buy a high end card now is in a good position with the 7970, someone who bought a high end card last generation is looking at a realistic 25%ish increase on average, rising to maybe 35% or so overclocked. That's great for the new buyer, but poo poo for someone who can spend that same amount of money on an SLI setup.

Uh, new buyers, consider that if you turn into a gibbering man-ape when you can't turn graphics options all the way up, buying a very high end graphics card is the opposite of awesome; it actually kind of loving sucks, because it's a racket. There will always be that beautiful poo poo just out of reach, framerate dips, and the only way you can avoid it is to play games only 3-4 years after their launch, really, unless you want to spend $1000 slamming two of these monsters together :smithicide:

Wedesdo
Jun 15, 2001
I FUCKING WASTED 10 HOURS AND $40 TODAY. FUCK YOU FATE AND/OR FORTUNE AND/OR PROBABILITY AND/OR HEISENBURG UNCERTAINTY PRINCIPLE.

^^^
That's really true. I bought two Radeon 5870s a year and a half after they came out (December 2010) for $300 total and they're still running strong.

Buying top-of-the-line is almost never worth it.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

It's worth it if you can afford it, are willing to, and intend to continually upgrade. If it's just part of your cycle, then you can buy in each generation at a nice sweet spot (this generation, two 6950s or two 570s; next generation, who knows) and get dramatically superior performance to a top-end card for the same or a marginally higher outlay. If you go top of the line every year, you're spending a lot of money trying to push the highest numbers.

Dogen gives me very well deserved poo poo for weighing the pros and cons of adding a second 580 to my setup, because he plays the same games I do and is perfectly happy with the performance - but I'm not, and so it looks like this generation I've locked myself into continuing to be unusually bothered by framerate dips when settings are maxed, despite an OC to 925MHz (which scores well in 3DMark11 and other synthetics that aren't heavily weighted to favor ATI's parallelized calculation). My minimum desire is no FPS below 30, ever - I'd prefer a minimum of 45, or 60 ideally, but 30 is the bottom number, and one overclocked 580 won't deliver that in modern games at 1080p. I mean, it will in games like Space Pirates and Zombies, but I play a lot of S.T.A.L.K.E.R. with extra shaders and texture overhauls, Metro 2033 (and its upcoming sequel), Crysis 2, and other games that actually can stress even this expensive poo poo of a card.

But the figures for a single 7970, even heavily overclocked, aren't past that point. Two 580s still outperform one 7970 like crazy in games that are well supported. In games with mediocre SLI support, scaling can be as low as 40%, which is still better than the OC'd 7970 does in most games, but on average two-card SLI scales to around a 90% increase. In Metro 2033 it's almost a 100% increase; very little overhead in that game. In other games I could definitely live with 80%-90% scaling. Because there is no way, NO WAY, that a single card from either company is going to improve framerates by 80% for the amount it would cost to add another 580.

It's dumb, the high end sucks. But if I were building a computer right now I'd be clicking everywhere to try to find a 7970. Sigh. Idiot me. Shiny things pretty things.
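
The rough arithmetic behind that complaint (a sketch; the 25-35% single-card uplift and 40-90% SLI scaling figures are the estimates from this post, not new benchmarks):

code:
# Framerate relative to a single stock GTX 580 (normalized to 1.0),
# using the ballpark uplift figures discussed above.
options = {
    "7970, stock":             1.25,  # ~25% faster on average
    "7970, overclocked":       1.35,  # ~35% with a good OC
    "second 580, poor SLI":    1.40,  # ~40% scaling in badly supported games
    "second 580, typical SLI": 1.90,  # ~90% scaling on average
}
for name, perf in options.items():
    print(f"{name}: {perf:.2f}x a single 580")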

Star War Sex Parrot
Oct 2, 2003

Cross-post from the parts-picking:

Sapphire and VisionTek 7970s are on Amazon right now. I just snagged one with next-day delivery. :dance:

I can't wait to try to cram it into a mini-ITX box.

movax
Aug 30, 2008

Agreed posted:

It's worth it if you can afford it, are willing to, and intend to continually upgrade. If it's just part of your cycle, then you can buy in each generation at a nice sweet spot (this generation, two 6950s or two 570s; next generation, who knows) and get dramatically superior performance to a top-end card for the same or a marginally higher outlay. If you go top of the line every year, you're spending a lot of money trying to push the highest numbers.

Dogen gives me very well deserved poo poo for weighing the pros and cons of adding a second 580 to my setup, because he plays the same games I do and is perfectly happy with the performance - but I'm not, and so it looks like this generation I've locked myself into continuing to be unusually bothered by framerate dips when settings are maxed, despite an OC to 925MHz (which scores well in 3DMark11 and other synthetics that aren't heavily weighted to favor ATI's parallelized calculation). My minimum desire is no FPS below 30, ever - I'd prefer a minimum of 45, or 60 ideally, but 30 is the bottom number, and one overclocked 580 won't deliver that in modern games at 1080p. I mean, it will in games like Space Pirates and Zombies, but I play a lot of S.T.A.L.K.E.R. with extra shaders and texture overhauls, Metro 2033 (and its upcoming sequel), Crysis 2, and other games that actually can stress even this expensive poo poo of a card.

But the figures for a single 7970, even heavily overclocked, aren't past that point. Two 580s still outperform one 7970 like crazy in games that are well supported. In games with mediocre SLI support, scaling can be as low as 40%, which is still better than the OC'd 7970 does in most games, but on average two-card SLI scales to around a 90% increase. In Metro 2033 it's almost a 100% increase; very little overhead in that game. In other games I could definitely live with 80%-90% scaling. Because there is no way, NO WAY, that a single card from either company is going to improve framerates by 80% for the amount it would cost to add another 580.

It's dumb, the high end sucks. But if I were building a computer right now I'd be clicking everywhere to try to find a 7970. Sigh. Idiot me. Shiny things pretty things.

I'm glad I don't have your illness when it comes to shiny things (I game at 2560x1600). GTX 460 still going strong :patriot:

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

movax posted:

I'm glad I don't have your illness when it comes to shiny things (I game at 2560x1600). GTX 460 still going strong :patriot:

Amen, and more power to you, it's ridiculous. I'm a grown man, I know better. If I didn't have the ability to shuffle the money from selling some music gear into treating myself here I wouldn't even be entertaining it, but it's GPU season and games are really, really good lately, so drat it, I want a new/another video card :downs:

Fatal
Jul 29, 2004

I'm gunna kill you BITCH!!!
Nothing past 4-monitor support yet, which is kinda lame. Hopefully the 6DP model shows up soonish...

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Agreed posted:

But the figures for a single 7970, even heavily overclocked, aren't past that point. Two 580s still outperform one 7970 like crazy in games that are well supported.

http://www.anandtech.com/show/5314/xfxs-radeon-hd-7970-black-edition-double-dissipation-the-first-semicustom-7970/6

I could cherry pick Metro: 2033 here and show that to be eerily close to not being the case, even though I of course realise SLI 580s are faster than a 590.

Point is, the 7970 is going to guzzle a hell of a lot less power.

I never really thought SLI or CF was a good idea, though, and I tend not to recommend it, for power & cost reasons mainly, but also the galling fact that a few years down the line you have a pile of expensive graphics cards with no use, just so you could gain a few FPS here and there (odd issues with FPS loss and drivers notwithstanding), even though most of your time was spent using the cards in aero, which Intel onboard would have sufficed for.

HalloKitty fucked around with this message at 22:29 on Jan 9, 2012

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

HalloKitty posted:

http://www.anandtech.com/show/5314/xfxs-radeon-hd-7970-black-edition-double-dissipation-the-first-semicustom-7970/6

I could cherry pick Metro: 2033 here and show that to be eerily close to not being the case, even though I of course realise SLI 580s are faster than a 590.

Point is, the 7970 is going to guzzle a hell of a lot less power.

I never really thought SLI or CF was a good idea, though, and I tend not to recommend it, for power & cost reasons mainly, but also the galling fact that a few years down the line you have a pile of expensive graphics cards with no use, just so you could gain a few FPS here and there (odd issues with FPS loss and drivers notwithstanding), even though most of your time was spent using the cards in aero, which Intel onboard would have sufficed for.

I've already run the numbers to the greatest extent we can, the following post is just one in an ongoing series of me trying to find a reason to go with a 2012 graphics card. I'm linking this one because 1. it was today, and 2. it deals with Metro 2033 more specifically.

http://forums.somethingawful.com/showthread.php?threadid=3458091&userid=0&perpage=40&pagenumber=13#post399379970

The 590 is just such a pile of poo poo that it doesn't bear comparison; a chart which shows 590 performance rather than 580 SLI performance isn't conveying much. GF110 scales well with clock: the default clock is 772MHz on a GTX 580, while the GTX 590 gets two GF110 chips at a paltry 630MHz apiece... hand-picked to run at lower power, since it's stupidly putting all that in one slot. If you open up the GTX 580 SLI performance capability it goes way the hell beyond anything a 590 could hope to do, and the 580 also happens to overclock well - my single GTX 580 puts out numbers very similar to the non-overclocked ATI 7970, because I've got it pretty substantially overclocked. Overclocking a 590 is a great way to kill an $800 card. They're not comparable, really: a roughly 150MHz underclock per core hamstrings the performance of the 590, and its 1.5GB of effective VRAM (mirrored across both GPUs) makes its performance in extremely high resolution setups poorly competitive with last-gen ATI hardware that has a 0.5GB VRAM advantage (not to mention scaling that works really well beyond just two cards).

:sigh:

Agreed fucked around with this message at 23:01 on Jan 9, 2012

Star War Sex Parrot
Oct 2, 2003

[H]ardOCP (I know, I know) just posted an article looking at 7970 overclocking.

edit: they hit 1260 on the core and 6.9GHz memory on a reference board with some voltage tweaking. That seems insane to me.

Star War Sex Parrot fucked around with this message at 23:24 on Jan 9, 2012

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Star War Sex Parrot posted:

[H]ardOCP (I know, I know) just posted an article looking at 7970 overclocking.

edit: they hit 1260 on the core and 6.9GHz memory on a reference board with some voltage tweaking. That seems insane to me.

Edit: I just don't know. I have no clue what I'm doing for graphics this year. gently caress making a decision right now; 'til Kepler comes out I'm not committing a penny, let alone half a grand. But that is some great performance, both at stock and especially with the raised voltage (which really seems to me like the edge of stability). nVidia better have their boots on for this one.

Agreed fucked around with this message at 23:41 on Jan 9, 2012

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
CES is in progress, and no sign of Kepler so far. I think we were speculating that that would mean Kepler is not gonna trounce Southern Islands or anything.

Also, want to trade two 6850s for your 580 and break the hell-cycle that is high-end upgrading? :v:

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Bear in mind, in that HardOCP article, the GTX 580 used is factory overclocked to a hefty degree: 772MHz to 860MHz.

Edit: hah, you edited your post. I preferred the old one; that was clear amazement. :D

movax
Aug 30, 2008

Factory Factory posted:

CES is in progress, and no sign of Kepler so far. I think we were speculating that that would mean Kepler is not gonna trounce Southern Islands or anything.

Also, want to trade two 6850s for your 580 and break the hell-cycle that is high-end upgrading? :v:

SemiAccurate had a quiet article that mentioned Kepler slipping further to 2013, or maybe I was drunk and confused "Kepler slipping to 2012" with the year still being 2011, and began to sob helplessly.

Also, I think Agreed should :toxx: himself somehow with regard to the 7970, for our entertainment :toot:
