Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.
Glimpse at Trinity, running a DX11 game on 3 monitors:

http://www.tomshardware.com/news/AMD-Trinity-Piledriver-VCE-Demo,15009.html

tijag
Aug 6, 2002
It's the 19th. Availability of the 78xx cards is basically zero right now.

Seems like if it were anything other than a logistics problem [perhaps shipments to retailers were delayed?] I would have expected some sort of announcement postponing the launch.

I suppose it could be TSMC problems that Charlie claims exist.

Really suspicious that there are none available though.

pixaal
Jan 8, 2004

All ice cream is now for all beings, no matter how many legs.


tijag posted:

It's the 19th. Availability of the 78xx cards is basically zero right now.

Seems like if it were anything other than a logistics problem [perhaps shipments to retailers were delayed?] I would have expected some sort of announcement postponing the launch.

I suppose it could be TSMC problems that Charlie claims exist.

Really suspicious that there are none available though.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814102984&Tpk=7850%20HD%20Sapphire

It's in stock right now

pixaal fucked around with this message at 23:16 on Mar 19, 2012

tijag
Aug 6, 2002
This is a pitiful launch if 1 card from 1 vendor is all the 'stock' that can be managed.

tijag
Aug 6, 2002
Images from Tom's Hardware review of the 680 leaked, and it appears as if nvidia has finally out-engineered AMD.

The die size is still in question, but all the rumors indicate that it's about 40-60mm2 smaller than Tahiti [79xx], yet it tends to be 15% or so faster than Tahiti and consumes less power while doing so.

This is great news to me, because if nvidia prices this GPU at around $500, then AMD will have to lower the 79xx considerably. That *should* make prices cascade and make the midrange parts cheaper.

I can at least hope.

Crackbone
May 23, 2003

Vlaada is my co-pilot.

tijag posted:

Images from Tom's Hardware review of the 680 leaked, and it appears as if nvidia has finally out-engineered AMD.

The die size is still in question, but all the rumors indicate that it's about 40-60mm2 smaller than Tahiti [79xx], yet it tends to be 15% or so faster than Tahiti and consumes less power while doing so.

This is great news to me, because if nvidia prices this GPU at around $500, then AMD will have to lower the 79xx considerably. That *should* make prices cascade and make the midrange parts cheaper.

I can at least hope.

Link?

tijag
Aug 6, 2002

Crackbone posted:

Link?

http://imgur.com/a/RCDqK

Obviously Tom's is probably the least reputable benchmarking site, but it seems that the 680 is going to be a very good performer.

Crackbone
May 23, 2003

Vlaada is my co-pilot.

tijag posted:

http://imgur.com/a/RCDqK

Obviously Tom's is probably the least reputable benchmarking site, but it seems that the 680 is going to be a very good performer.

Assuming those are real, looks good. Of course, it all depends on retail pricing, and for most people what Nvidia does for the mid-range market.

tijag
Aug 6, 2002

Crackbone posted:

Assuming those are real, looks good. Of course, it all depends on retail pricing, and for most people what Nvidia does for the mid-range market.

Yep. Apparently they were up on the real website for a while, and someone was smart enough to download them all before they were pulled down. I think they are real.

In recent generations AMD has had a lot more performance per mm2 and generally more performance per watt. So the fear was that nvidia had to postpone the GK100 [big chip] and wouldn't be able to compete with AMD without having their large monolithic die. Rumors of a 300 - 320mm2 GK104 led me to believe that they would be competing with Pitcairn, so Nvidia really has made a big turnaround here. This reminds me of the HD4870, which surprised everyone, including nvidia.
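
To put the rumored numbers in rough perspective, here's a back-of-the-envelope sketch; the ~360mm2 Tahiti figure is just what the "40-60mm2 smaller" and "300-320mm2" rumors imply together, not a confirmed spec:

code:
# Back-of-the-envelope performance-per-area comparison, using only the
# rumored figures quoted earlier in the thread (nothing here is confirmed).
tahiti_area = 360.0    # mm^2, implied by "40-60mm2 smaller" + "300-320mm2"
gk104_area = 310.0     # mm^2, midpoint of the rumored 300-320mm2 range
relative_perf = 1.15   # "15% or so faster than Tahiti"

perf_per_area = relative_perf / (gk104_area / tahiti_area)
print(f"GK104 perf/mm^2 vs Tahiti: {perf_per_area:.2f}x")  # ~1.34x

If those rumors hold, GK104 would be getting roughly a third more performance out of each square millimeter than Tahiti, which is why it reads as such a turnaround.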

I know nvidia won't be pricing this chip at those kinds of low prices, but at the very least it should force AMD to sell GPUs for less than they are now. I suppose it's possible that nvidia could sell the GK104 for like, $600, and AMD wouldn't have to move prices, but I think it's more likely to launch at $500, and then force AMD to drop the price of the 7970 down quite a bit.

In the midrange I think the 7870 is a great looking chip, but it's about $50 too much, otherwise I would have ordered it today. Maybe the cascade of price cuts that GK104 *should* cause will trickle down to the 7870. :D

Gwaihir
Dec 8, 2009
Hair Elf
Wasn't the 680 supposedly the equivalent of the situation we had with the 480/580, in that the 480 was a sorta rushed-to-market chip that didn't meet the original shader core count/whatever due to production issues? Which were then fixed and revised into the 580, which was the same architecture but with the outrageous power usage fixed and the original core count intact.

Or am I just getting all the usual random video card FUD mixed up since this cycle has been so long? I swore they were planning on releasing the midrange card first this time instead of starting with the halo high end part.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
Looks like the album has been pulled too, though I can still see the thumbs. This generation is shaping up to be pretty interesting on both sides if we can put any confidence in those leaked shots.

tijag
Aug 6, 2002

Gwaihir posted:

Wasn't the 680 supposedly the equivalent of the situation we had with the 480/580, in that the 480 was a sorta rushed-to-market chip that didn't meet the original shader core count/whatever due to production issues? Which were then fixed and revised into the 580, which was the same architecture but with the outrageous power usage fixed and the original core count intact.

Or am I just getting all the usual random video card FUD mixed up since this cycle has been so long? I swore they were planning on releasing the midrange card first this time instead of starting with the halo high end part.

Yes, the 680 is the GK104 part. This is 300 - 320mm2 by the best estimates from the 'die shots' we've seen. The larger GK100/GK110/GK112 [whatever they call it] won't be out until August at the earliest.

Technically this part is the upgrade to the GF114 part, but its performance is at least equal to Tahiti's, perhaps even 10 - 20% better.

Of course I think AMD will release either a super-clocked version, a 'performance driver', or perhaps both to compete with the 680, and the reality is AMD will still have the higher-priced card.

The other factor here is availability. No matter how fast these things are, if they are always out of stock they won't impact current pricing much, if at all.

tijag
Aug 6, 2002
http://forums.overclockers.co.uk/showpost.php?p=21514515&postcount=61

He's not giving details and seems to be basing his opinion purely on synthetic benchmarks... however, this is looking very promising. If it's selling for $500, then that's going to force competition.

Gwaihir
Dec 8, 2009
Hair Elf
Mmm, looks good. I've got an EVGA GTX 480 now (snagged on some outrageous Dell deal where they were selling for $360 somehow right after launch), but 2560*1600 is still a pretty stressful resolution even with a 100 MHz OC on the card. If the 680 looks this good, I'll be pretty happy to wait till Q3 and pick up the latest and greatest, since I'm not really interested in any SLI shenanigans. I have a Spitfire cooler on the card now so it's not outrageously loud or obnoxiously hot, at least.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
The new beta of afterburner has Kepler support for whatever that is worth.

movax
Aug 30, 2008

tijag posted:

http://forums.overclockers.co.uk/showpost.php?p=21514515&postcount=61

He's not giving details and seems to be basing his opinion purely on synthetic benchmarks... however, this is looking very promising. If it's selling for $500, then that's going to force competition.

I can attest to some nice jumps in CUDA performance, but I have not tried any games...

That overclockers.co.uk guy said the AIB partners are already stocked up for launch though? A clean three months ahead of Q3?

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

movax posted:

I can attest to some nice jumps in CUDA performance, but I have not tried any games...

That overclockers.co.uk guy said the AIB partners are already stocked up for launch though? A clean three months ahead of Q3?

Wouldn't be that surprising if nVidia just straight up learned their lesson with the Fermi arc, would it? How many times does a company with a lot of money make the same mistake if they're worth a drat?

Also, if the numbers don't lie, I'm in for the top performer when the dust has settled. I was considering dropping CUDA and hoping for broader GPGPU, but there's no reason to do that if nVidia can bring a competitive product with all the trimmings in addition to GPGPU with API penetration into the stuff I'm actually doing. Won't exactly be bummed about being able to keep PhysX, either. nVidia trades a lot on its perks, but they're good ones to have.

If the guy saying that it overclocks well is accurate, that's going to be even tougher on ATI's top lineup, because the 7970 is pretty much clock-for-clock with the GTX 580 - clock a 580 up to 7970 stock clocks and they bench almost identically and perform in games within the margin of error you'd expect for brand support differences. The 7970 can take off from there and leave the 580 behind, and has a major upper hand in high resolution gaming, of course, but the 580 is hard to count out despite being just the realized version of an idea that's not so new. If Kepler offers a similar performance leap on nVidia's side, that would put it clock for clock ahead of ATI, and extracting more performance out of it would be asymmetrically better unless the cards turn out to be poor overclockers for some reason (something I doubt in principle, if someone's willing to pay, anyway).

I just really got into nvidiaInspector and forcing flags that aren't shown, and it's amazing how good-looking even an older, kludgy game like Mass Effect 1 is (did a replay of 1 and 2 for 3 :shepface:) with a proper AA setup. UE3 games are the worst for compatibility - the deferred rendering thing is a right pain in the rear end to try to make work with AA flags - but the 580 at the overclock I've got just packs enough horsepower that the relatively low-poly and low-res textures in a given scene don't stress it, so I can run it with various kinds of AA and get a much better looking game. Sparse grid SSAA for transparency is the poo poo on old games like that, but even with the SG prefix and the higher performance that comes with it, it still takes a lot of power to do any kind of supersampling; it's just gonna be inefficient compared to the elegant AA solutions available now and going forward.

I wonder what kind of image quality improvements Kepler's going to bring? Fermi already has outstanding IQ, in my opinion; I mean, ATI basically copied nVidia's AF for the 5000 to 6000 generational change, and the image quality in edge cases looked a lot better for it.

But what does concern me is that ATI seems to be having serious supply issues - there's not really any way for that to be a "good problem to have," since all the demand in the world won't help if you can't put product in people's hands, and, again, assuming that the info leak holds water, ATI ought to be making use of this time to get as much market penetration as possible. Being the performance king doesn't matter when you can't actually get people committed through a purchase because you're out of products.

If nVidia hits the market at a competitive-enough price it'll be a slaughter, if for no other reason than nVidia holds more "that's the good one" headspace than ATI, with the problematic perception that they've got better drivers and the "way it's meant to be played" pound-it-into-your-skull advertising. I don't feel like ATI, especially given what's up with AMD generally right now, can afford delays; when Kepler hits, if any of the info we've got is correct, it's going to hit ATI straight in the books.

Agreed fucked around with this message at 07:43 on Mar 21, 2012

tijag
Aug 6, 2002
The GTX 680 is not a good GPGPU performer. Unless it's driver problems, it's getting beaten by the 7870 from what I have read.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

tijag posted:

The GTX 680 is not a good GPGPU performer. Unless it's driver problems, it's getting beaten by the 7870 from what I have read.

Well, I'm not cracking passwords, I'm using CUDA DSP, and...

movax posted:

I can attest to some nice jumps in CUDA performance, but I have not tried any games...

... API penetration is a motherfucker. nVidia got there first. Raw compute performance and parallelism don't count for as much as "it actually does something with your software" if those are the options.

Chuu
Sep 11, 2004

Grimey Drawer
I missed the leaked screenshots, but does that mean the major hardware sites have their review samples? Any idea when the NDA officially drops?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
According to completely unsourced rumors, in hours.

We probably need a GPU thread.

roadhead
Dec 25, 2001

Newegg messed up and sent me an email shouting about ordering it RIGHT NOW. And I followed the links to nothing. So.... Soon?


http://www.newegg.com/GTX680/?nm_mc=EMCCP-032212-NV&cm_mmc=EMCCP-032212-NV-_-GeForceGTX680-_-banner-_-SHOPNOW


Goes to a promo page but the links don't work, yet.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
http://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review

Get your fresh hot review, right now

tl;dr - the GTX 680 is, with a few exceptions, the fastest single card right now, in stock form. The gap often isn't huge, but from the data, that's an easy conclusion.
A curious moment in the review is the fact that compute performance is WORSE than the 580's, while 7xxx series compute is considerably better. A little odd, but there you go.

It looks like an extremely compelling card, although of course most people don't buy straight into the high end - and AMD will now have to drop prices. Sorry AMD, but at least you had a few months of being able to charge a lot for the best cards.

That doesn't mean the AMD cards are bad by any means, but they will be stuck at slightly inappropriate price points given Kepler; we'll still have to see what smaller Kepler dies have to offer, though.

That said, a few things: Southern Islands overclocks like a motherfucker with no extra voltage. I think it's time for more SKUs, or for AMD partners to release higher factory-clocked cards. On the NVIDIA side, I get the impression they're pushing GPU clocks very high from stock (with GPU Boost), so if there's less headroom to be had, AMD could basically provide the same performance. We'll see how that pans out.
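
Just to spell out the headroom argument with a toy example (the percentages are made up for illustration, and linear clock-to-performance scaling is a simplification):

code:
# Toy illustration of the overclocking-headroom argument. The numbers are
# made-up round figures, and linear clock scaling is an approximation.
stock_gap  = 0.90   # card A is 10% behind card B at stock
a_headroom = 0.20   # card A has ~20% clock headroom with no extra voltage
b_headroom = 0.05   # card B already boosts near its limit, ~5% left

a_oc = stock_gap * (1 + a_headroom)  # 0.90 * 1.20 = 1.08
b_oc = 1.00 * (1 + b_headroom)       # 1.05
print(a_oc > b_oc)                   # True: slower-at-stock card ends up ahead

That's the whole point: if AMD's parts really do have a lot more untapped clock speed, the stock gap doesn't say much about where both cards land once overclocked.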

HalloKitty fucked around with this message at 14:41 on Mar 22, 2012

tijag
Aug 6, 2002
I ordered one. I've never bought a top tier card. This is a pretty compelling one, with really nice features like Adaptive V-Sync and TXAA. It's generally faster than AMD's fastest, while occasionally being slightly slower.

It's massive overkill for me @ 2048x1152, but it will be the first time I can play games with the graphics settings set to max all the time.

I'm pretty stoked.

First Nvidia GPU I've bought since the Riva 128. Well, I had an 8800 GT in a Dell for a while too, I think.

movax
Aug 30, 2008

HalloKitty posted:

tl;dr - the GTX 680 is, with a few exceptions, the fastest single card right now, in stock form. The gap often isn't huge, but from the data, that's an easy conclusion.
A curious moment in the review is the fact that compute performance is WORSE than the 580's, while 7xxx series compute is considerably better. A little odd, but there you go.

Yeah, I dunno what's up with their compute benchmarks. Our software is running faster on the 680 than it was on the 580, but I've also had this entire year so far to optimize/re-write, so I guess that helps.

I'm a little worried about the memory clocks, though; at those speeds, poor engineering and design of the PCB with regard to signal integrity could severely curtail any memory overclocking. I think I'll shoot for eVGA as my vendor of choice (lifetime warranty!).

Anyways, digging the single-card performance in the games I play (has it really been five years since DX10 GPUs launched?). And it can break 30FPS in balls-out Metro (i.e. ridiculous settings)! Now to still wait and see if I want to upgrade from my 460 just yet...

Gwaihir
Dec 8, 2009
Hair Elf
With how these benches turned out, performance on the GK110 should be really exceptional. Not to mention the inevitable dual chip 690 or whatever they end up calling it. With the much lower power envelope compared to the 580, they should be able to do an actual decent job of it, instead of the 590 which had to sacrifice so much clockspeed to keep the thermals in check.

tijag
Aug 6, 2002

Gwaihir posted:

With how these benches turned out, performance on the GK110 should be really exceptional. Not to mention the inevitable dual chip 690 or whatever they end up calling it. With the much lower power envelope compared to the 580, they should be able to do an actual decent job of it, instead of the 590 which had to sacrifice so much clockspeed to keep the thermals in check.

The GK110, or whatever it will be called, is likely going to be a very GPGPU-focused chip. No doubt it will have better gaming performance, but I suspect a lot of the die-size increase will go toward increasing DP (double-precision) performance.

zer0spunk
Nov 6, 2000

devil never even lived
Totally sold out from every manufacturer. Gah!

I'm interested in this from a CUDA/CS6 perspective. Premiere uses CUDA for a few different things (except encoding). I would have gone AMD/ATI already if it wasn't for that.

Would it be smarter to wait on a GK110 part for this purpose? I plan on pairing the GPU with a higher-end CPU to handle the encoding side of things.

I've been looking at Virtu too, but I don't know much about it other than that it would enable QuickSync without disabling a discrete card:
http://www.lucidlogix.com/product-virtu-gpu.html

zer0spunk fucked around with this message at 18:34 on Mar 22, 2012

kuddles
Jul 16, 2006

Like a fist wrapped in blood...
The most surprising thing about these GTX 680 reviews coming out is the power consumption/acoustics side of things. Nvidia finally seems to be getting those under control and even beating AMD to some degree.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Why the drop from the wider memory bus to the narrower one? Weird choice; they're packing fast GDDR5, and yet I've got 1.5x the memory bus width on my 580 that the 680 has. But the price is right, and it'll force competition from ATI. Might wait around to see what partners do with it before picking one up, or might wait on GK110; the overclocked 580 is holding steady for the time being, and that's a great price for the performance it offers. Wait and see is my plan, with the eventual goal of purchasing.

Rashomon
Jun 21, 2006

This machine kills fascists

kuddles posted:

The most surprising thing about these GTX 680 reviews coming out is the power consumption/acoustics side of things. Nvidia finally seems to be getting those under control and even beating AMD to some degree.

Indeed. From Anandtech:

Ryan Smith posted:

Since Cypress and Fermi NVIDIA and AMD have effectively swapped positions. It’s now AMD who has produced a higher TDP video card that is strong in both compute and gaming, while NVIDIA has produced the lower TDP part that is similar to the Radeon HD 5870 right down to the display outputs. In some sense it’s a reaction by both companies to what they think the other did well in the last generation, but it’s also evidence of the fact that AMD and NVIDIA’s architectures are slowly becoming more similar.

The 680 kills; I was thinking of not doing a rebuild for a while but now I am thinking I might want to go full out, even get a 2560x1440 monitor and all, when Ivy Bridge comes around. Then again, I guess it depends on what games I want to play -- if the big game for me this year is going to be Diablo 3, then that might be just crazy and I can stick with my 560 Ti.

rscott
Dec 10, 2009
It's cheaper to use higher-speed memory with a narrower bus than it is to use a wide bus - less die space goes to the memory controller and all. That being said, the GTX 680 has exactly the same memory bandwidth as the 580, which is interesting to me because that's basically a bet on nVidia's part that games will be shader-limited in the future and not bandwidth-limited.
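
Quick sanity check on the "same bandwidth" point (the bus widths are the widely reported 384-bit and 256-bit; the effective GDDR5 data rates are rounded approximations, not exact spec-sheet numbers):

code:
# Rough memory bandwidth comparison. Bus widths are the widely reported
# 384-bit (GTX 580) and 256-bit (GTX 680); the ~4 and ~6 Gbps effective
# GDDR5 data rates are rounded approximations.
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits * data_rate_gbps / 8  # bits -> bytes

print(bandwidth_gb_s(384, 4.0))  # GTX 580: ~192 GB/s
print(bandwidth_gb_s(256, 6.0))  # GTX 680: ~192 GB/s

So the narrower bus is simply compensated by roughly 1.5x the data rate, which is why the totals land in about the same place.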

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Agreed posted:

Why the drop from the wider memory bus to the narrower one? Weird choice; they're packing fast GDDR5, and yet I've got 1.5x the memory bus width on my 580 that the 680 has. But the price is right, and it'll force competition from ATI. Might wait around to see what partners do with it before picking one up, or might wait on GK110; the overclocked 580 is holding steady for the time being, and that's a great price for the performance it offers. Wait and see is my plan, with the eventual goal of purchasing.

According to AnandTech, there's an even-higher-end single chip Kepler forthcoming. The 680 GTX is the current single-chip king, but it won't be the single-chip king of this generation.

tijag
Aug 6, 2002

Factory Factory posted:

According to AnandTech, there's an even-higher-end single chip Kepler forthcoming. The 680 GTX is the current single-chip king, but it won't be the single-chip king of this generation.

I think by the time big Kepler launches, GCN 2.0 will be out.

Crackbone
May 23, 2003

Vlaada is my co-pilot.

Factory Factory posted:

According to AnandTech, there's an even-higher-end single chip Kepler forthcoming. The 680 GTX is the current single-chip king, but it won't be the single-chip king of this generation.

Oh god, can't wait for the naming conventions this gen.

GTX 680ti? 683? 680DoublePlusGood?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Maybe it'll tie into Intel's trend of pumping up the last two digits of Core i processor model numbers, and two years from now both will be dipping into hexadecimal digits since they used up 99 already.

Get your Alienware AlienAnus with i7-5AAA A-core CPU @ AAA GHz with two nVidia GeForce GTX AAA GTX in SLI! With 0xAAA CUDA cores! And 0xAAAAAA KB of RAM!

We call it, "AAAAAAAAAAAAAAAAAAAAAA!"

Crackbone
May 23, 2003

Vlaada is my co-pilot.

Factory Factory posted:

Maybe it'll tie into Intel's trend of pumping up the last two digits of Core i processor model numbers, and two years from now both will be dipping into hexadecimal digits since they used up 99 already.

Get your Alienware AlienAnus with i7-5AAA A-core CPU @ AAA GHz with two nVidia GeForce GTX AAA GTX in SLI! With 0xAAA CUDA cores! And 0xAAAAAA KB of RAM!

We call it, "AAAAAAAAAAAAAAAAAAAAAA!"

Hexadecimal model numbering would rule.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Crackbone posted:

GTX 680ti?

Serious talk: if I were guessing, this would be my guess. They've gone pretty far out of their way to establish and promote the Ti branding.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
NVIDIA Geforce GTX 680 Ti 2048-Core Special Black Edition

Crackbone
May 23, 2003

Vlaada is my co-pilot.

Actually, they could use 685, like they did back with the 285. The x95 suffix is usually reserved for the dual-GPU, single-card setup. But then again, who knows with Nvidia; they seem to make poo poo up as they go along.
