Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
As much as nVidia cards have their advantages, I don't think it's fair to say that Rage never worked right on AMD cards. The launch experience sucked, but cutting-edge games tend to take a driver revision or two before everything catches up (and id didn't spend nearly enough time testing or polishing Rage, which John Carmack admits). As long as you use updated drivers and install the profiles, things work well.

MixMasterMalaria
Jul 26, 2007
How is the value on the GTX 660? Mercury engine playback support for Premiere CS6 would be nice, and it seems to be quite a bit faster than the 7850, but I'm not seeing it recommended here.

MixMasterMalaria fucked around with this message at 15:23 on Sep 16, 2012

The Illusive Man
Mar 27, 2008

~savior of yoomanity~

MixMasterMalaria posted:

How is the value on the GTX 660? Mercury engine playback support for Premiere CS6 would be nice, and it seems to be quite a bit faster than the 7850, but I'm not seeing it recommended here.

Decent, but I think people are hesitant to recommend it because currently, for a little more you can get a 7870 that comes with a copy of Sleeping Dogs.

Rigged Death Trap
Feb 13, 2012

BEEP BEEP BEEP BEEP

Space Racist posted:

Decent, but I think people are hesitant to recommend it because currently, for a little more you can get a 7870 that comes with a copy of Sleeping Dogs.

So it's Sleeping Dogs vs. Borderlands 2.

[e] Oh wait, no. That comes with the higher tiers.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

MixMasterMalaria posted:

How is the value on the GTX 660? Mercury engine playback support for Premiere CS6 would be nice, and it seems to be quite a bit faster than the 7850, but I'm not seeing it recommended here.
Isn't Mercury only supported on Fermi (GTX 400/500) cards? I thought we didn't get support on Kepler until the GTX 700-series.

MixMasterMalaria
Jul 26, 2007

Alereon posted:

Isn't Mercury only supported on Fermi (GTX 400/500) cards? I thought we didn't get support on Kepler until the GTX 700-series.

My understanding was that it works fine with a simple hack, but I'm not positive. The Adobe forums would seem to confirm this.

MixMasterMalaria fucked around with this message at 22:05 on Sep 16, 2012

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.
This is a rumor, but here are supposedly leaked specs of the Radeon 8870 and 8850:

http://videocardz.com/34981/amd-radeon-hd-8870-and-hd-8850-specifiation-leaked

Seems good and believable to me, seeing as how the 7870 was close to the 6970 in performance and the 8870 will be close to the 7970.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Endymion FRS MK1 posted:

This is a rumor, but here are supposedly leaked specs of the Radeon 8870 and 8850:

http://videocardz.com/34981/amd-radeon-hd-8870-and-hd-8850-specifiation-leaked

Seems good and believable to me, seeing as how the 7870 was close to the 6970 in performance and the 8870 will be close to the 7970.

Pretty sure the 7870 is faster than the 6970, but I get your point: the mid-range Radeon targets the high-end performance of the previous generation.

Still, 8000 series already? Seems like it's time to clean up the branding mess the 7000 series became with the different sub-versions of cards. Things could be getting very interesting.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
I was doing some back-of-the-envelope math to figure out the core configuration of the supposed 8870 using this table on Wikipedia, and I'm getting roughly double a 7850: 2048 shaders, 128 texture units, but still 32 ROPs. A 256-bit memory bus using 6GHz GDDR5, just like the GTX 680. The FLOP numbers that page gives seem about 9% low by my math, but I guess that's not that bad. The transistor count is down 1B from Tahiti, though I suppose they could have shaved that much off through the reduced memory controller count, smaller caches, and simpler shader clusters with less DP throughput. Cramming that into 160W at 1050MHz is a neat trick though, unless that's an "average" TDP and the max is actually higher.

Edit: Of course this could always be a complete fake.
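For anyone who wants to check that math, here's the arithmetic as a quick sketch. The inputs are just the rumored numbers from the link above, so treat them as placeholders:

```python
# Back-of-the-envelope GCN numbers for the rumored 8870 configuration.
# Inputs are the leaked/rumored specs, not confirmed figures.
shaders = 2048      # rumored shader count
core_mhz = 1050     # rumored core clock
bus_bits = 256      # memory bus width
mem_gbps = 6.0      # effective GDDR5 data rate per pin

# Each shader can do one fused multiply-add (2 FLOPs) per clock.
gflops = shaders * 2 * core_mhz / 1000
# Bandwidth = bus width in bytes * effective data rate per pin.
bandwidth_gbs = (bus_bits / 8) * mem_gbps

print(f"{gflops:.1f} GFLOPS single-precision")      # 4300.8
print(f"{bandwidth_gbs:.0f} GB/s memory bandwidth") # 192
```

That 4.3 TFLOPS figure is what makes the ~9% discrepancy on the linked page easy to spot.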

Alereon fucked around with this message at 00:42 on Sep 17, 2012

The Illusive Man
Mar 27, 2008

~savior of yoomanity~
If AMD really can get cards out with those specs in ~6 months, that'll really gently caress nVidia over (nVidia having just launched the GTX 660). I'm curious how exactly they would respond to that.

ReWinter
Nov 23, 2008

Perpetually Perturbed
I've got a 660 Ti (the EVGA SC+ 3GB) coming in tomorrow to replace the 550 Ti I'm currently running, looking to enjoy that Borderlands 2 PhysX. I'm not totally clueless when it comes to cards, but the population in this thread is much better informed: what sort of (ballpark) PSU draw would those two cards together pull if I kept the 550 Ti in for PhysX, and is the 550 Ti even worth keeping in that role?
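A rough ballpark using the published board powers (150W for the 660 Ti, 116W for the 550 Ti); the rest-of-system figure below is an assumption, so adjust for your own build:

```python
# Rough worst-case power budget for a GTX 660 Ti (rendering) plus a
# GTX 550 Ti (dedicated PhysX). TDPs are the published board power
# figures; real draw under games is usually lower than both maxed out.
gpu_tdp_w = {"GTX 660 Ti": 150, "GTX 550 Ti": 116}
rest_of_system_w = 200  # CPU, motherboard, drives, fans (assumed ballpark)

total_w = sum(gpu_tdp_w.values()) + rest_of_system_w
print(f"~{total_w} W worst-case load")  # ~466 W
print(f"with ~25% headroom, aim for a quality PSU of {total_w * 1.25:.0f} W or more")
```

In practice a good 550-650W unit covers that pairing comfortably.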

GRINDCORE MEGGIDO
Feb 28, 1985


Glen Goobersmooches posted:

Mine was OCed (the CCC cap is really low for this series, unfortunately), but the gap in power between the two models is bigger than can be bridged by overclocking alone.

Sapphire Trixx is good, no low limit if the card can be controlled by it.
I thought clocking a 7850 even with CCC got it very close to a 7870?

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

wipeout posted:

Sapphire Trixx is good, no low limit if the card can be controlled by it.
I thought clocking a 7850 even with CCC got it very close to a 7870?

Yeah, there are definitely benchmarks out there showing 7850s clocked high enough to exceed a stock 7870's speed, or come very close to it.

HalloKitty fucked around with this message at 09:04 on Sep 17, 2012

Meow Tse-tung
Oct 11, 2004

No one cat should have all that power
I know basically nothing about video card brand quality and just want something that will play new games for the foreseeable future (3-4 years). Does this look like a solid buy for the price range? It seemed to have the best specs in the $250 range, and I have a Gigabyte motherboard, so I figured less chance of weird mobo conflicts. I'm definitely open to advice on better buys, but I'm ordering tomorrow morning, so I figured I'd run this by the pros.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814125418

also considering

http://www.newegg.com/Product/Product.aspx?Item=N82E16814150605

But honestly have no clue what to get.

Meow Tse-tung fucked around with this message at 09:36 on Sep 17, 2012

unpronounceable
Apr 4, 2010

You mean we still have another game to go through?!
Fallen Rib

Meow Tse-tung posted:

I know basically nothing about video card brand quality and just want something that will play new games for the foreseeable future (3-4 years). Does this look like a solid buy for the price range? It seemed to have the best specs in the $250 range, and I have a Gigabyte motherboard, so I figured less chance of weird mobo conflicts. I'm definitely open to advice on better buys, but I'm ordering tomorrow morning, so I figured I'd run this by the pros.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814125418

also considering

http://www.newegg.com/Product/Product.aspx?Item=N82E16814150605

But honestly have no clue what to get.

Having different brand mobos and GPUs is a non-issue. Gigabyte isn't recommended around here because their cards have wonky problems with power delivery; XFX isn't because they strip components from the GPU reference design, which makes the cards less robust. As for brands, we recommend Sapphire, Asus, MSI, and EVGA. Personally I'd choose this Sapphire over either of the ones you chose.

Unless you want to spend closer to $300, a 7870 is the best value you can get.

Wozbo
Jul 5, 2010
Isn't that no longer true with at least the latest Nvidia generation, since they all had to follow a certain baseline spec to the letter? I'm getting a Gigabyte 670 (the one with three fans) because it looks to perform pretty awesomely.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Wozbo posted:

Isn't that no longer true with at least the latest Nvidia generation, since they all had to follow a certain baseline spec to the letter? I'm getting a Gigabyte 670 (the one with three fans) because it looks to perform pretty awesomely.

I thought it was the exact opposite with the newest NVIDIA generation: you couldn't really tell what clocks the cards were boosting to, so there was more room for variation than ever before. But I could be way off with that thought.

HalloKitty fucked around with this message at 18:36 on Sep 17, 2012

Wozbo
Jul 5, 2010
That's not what I'm getting at. I'm getting at the fact that you always have to meet at LEAST <bar>, and then the card can boost up based on what else the maker put in there (better cooling, etc., but it always has to be better than reference). For example, I'm fairly sure the Gigabyte 670 is actually based on a reference 680 board and meets that baseline (meaning a pretty good boost).

At least, that's my understanding. I guess I'll make a E/N post when I get mine and it runs like dog poo poo.

zer0spunk
Nov 6, 2000

devil never even lived

Alereon posted:

Isn't Mercury only supported on Fermi (GTX 400/500) cards? I thought we didn't get support on Kepler until the GTX 700-series.

Support for the 680 was just added in an update. I re-ran an encode I did right before the switch from software to hardware: software-only (i7 3770K OC'd to 4.3GHz), a 60-minute video encoded to 480p from a mix of 1080i HDV and 1080p h.264 footage took 15 minutes; with hardware (GTX 680), 7 minutes.
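For scale, those times work out like this (just arithmetic on the numbers above):

```python
# Scale check on the Premiere Mercury Engine encode times reported above.
source_min, software_min, hardware_min = 60, 15, 7  # 60-minute source clip

speedup = software_min / hardware_min
print(f"{speedup:.2f}x faster with hardware MPE")  # 2.14x
print(f"software: {source_min / software_min:.1f}x real-time, "
      f"hardware: {source_min / hardware_min:.1f}x real-time")  # 4.0x vs 8.6x
```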

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

zer0spunk posted:

Support for the 680 was just added in an update. I re-ran an encode I did right before the switch from software to hardware: software-only (i7 3770K OC'd to 4.3GHz), a 60-minute video encoded to 480p from a mix of 1080i HDV and 1080p h.264 footage took 15 minutes; with hardware (GTX 680), 7 minutes.

Do you happen to have similar numbers for a Fermi card?

zer0spunk
Nov 6, 2000

devil never even lived

Factory Factory posted:

Do you happen to have similar numbers for a Fermi card?

I don't, but I'd imagine it's close, since the compute on the 6 series isn't what it could have been; the trade-off is how much hotter the 5 series runs. I also haven't run a test of QuickSync CPU encoding vs. GPU AME encoding for the same h.264 output, but I'd imagine they're very close. Everything I've read suggests quality takes a hit with QuickSync encodes at the same bit rates though, so why bother?

Hoping the top-end 7 series in a year isn't as compute-neutered, but I can't see anyone complaining about current GPU acceleration in CS6.

MixMasterMalaria
Jul 26, 2007

zer0spunk posted:

Support for the 680 was just added in an update. I re-ran an encode I did right before the switch from software to hardware: software-only (i7 3770K OC'd to 4.3GHz), a 60-minute video encoded to 480p from a mix of 1080i HDV and 1080p h.264 footage took 15 minutes; with hardware (GTX 680), 7 minutes.

Did they add support for the lower end cards? 680 is waaaayyyy outside budget.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

MixMasterMalaria posted:

Did they add support for the lower end cards? 680 is waaaayyyy outside budget.

Get a cheap GTX 570; it should do about as well on compute, maybe a touch better, for half the price or less.

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.

Wozbo posted:

That's not what I'm getting at. I'm getting at the fact that you always have to meet at LEAST <bar> and then the card could boost up based on what else you may have put in there (better cooling etc, but always has to be better than reference). For example I'm fairly sure the 670 from gigabyte is actually based off of a reference 680 board, and meets that baseline (meaning pretty good boost).

At least, that's my understanding. I guess I'll make a E/N post when I get mine and it runs like dog poo poo.
I got the Gigabyte 670 Windforce 3, and it's been performing fantastically. I've been running it at 1060MHz core (it can go higher, but I don't have the need to push it harder atm), and GPU-Z tells me the boost ratchets up to 1250MHz during the most demanding games, so it ain't no gimp boost.

Not to mention that the VRAM on my card clocks up like nuts; I've been able to max what the sliders allow (1502MHz -> 1915MHz). In addition, it and the Asus DirectCU seem to have the most effective and quietest cooling setups in all the 670 showdowns I've read. Go figure.

Incidentally, Borderlands 2 has been running super nicely with PhysX on Medium. I'm gonna test out some gameplay on High, and I'm still in the intro areas, but I was dreading major slowdown with it enabled and am pleasantly surprised thus far. BL2 has no in-game MSAA options though, just post-AA.

TheRationalRedditor fucked around with this message at 08:14 on Sep 18, 2012

greatapoc
Apr 4, 2005
I asked this in the system build thread earlier but it's probably better asked here.

I'm struggling to decide which GPU to upgrade to. At the moment I have a 6850 and I game at 2560x1440, which brings this card to its knees sometimes. The CPU is an i7 860, which I'll either upgrade to a 3770K or keep until the next generation. I also do a lot of work in Photoshop/Lightroom and video editing.

I'm torn between the 670 and the 7970 GHz right now. Both can be had for pretty much the same price here in Australia, although the only GHz I can find is Gigabyte, whereas there's much more choice in 670s. I'm not averse to overclocking either, so if there's something lower that will work better I'm fine with that too, as long as it's good at 2560x1440.

Are the GHz editions just higher-binned 7970s? The price of a 7970 GHz here is $50 less than an ASUS TOP 7970.

Mayne
Mar 22, 2008

To crooked eyes truth may wear a wry face.

zer0spunk posted:

Support for the 680 was just added in an update. I re-ran an encode I did right before the switch from software to hardware: software-only (i7 3770K OC'd to 4.3GHz), a 60-minute video encoded to 480p from a mix of 1080i HDV and 1080p h.264 footage took 15 minutes; with hardware (GTX 680), 7 minutes.

Yeah, but the quality of GPU encodes can't compare at all with software encodes from a decent encoder like x264.

Wozbo
Jul 5, 2010

Glen Goobersmooches posted:

I got the Gigabyte 670 Windforce 3, and it's been performing fantastically. I've been running it at 1060MHz core (it can go higher, but I don't have the need to push it harder atm), and GPU-Z tells me the boost ratchets up to 1250MHz during the most demanding games, so it ain't no gimp boost.

Not to mention that the VRAM on my card clocks up like nuts; I've been able to max what the sliders allow (1502MHz -> 1915MHz). In addition, it and the Asus DirectCU seem to have the most effective and quietest cooling setups in all the 670 showdowns I've read. Go figure.

Incidentally, Borderlands 2 has been running super nicely with PhysX on Medium. I'm gonna test out some gameplay on High, and I'm still in the intro areas, but I was dreading major slowdown with it enabled and am pleasantly surprised thus far. BL2 has no in-game MSAA options though, just post-AA.

This is what I like to hear. If I don't dork around with the settings will it still boost that high if it is able to?

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

People in the BL2 thread in Games are reporting great results with even a 560 Ti doing both graphics and PhysX; nVidia must have optimized the poo poo out of this game. If this is the sort of PhysX we can get, in general, going forward, then all this talk about a coprocessor is probably just unnecessary. Everything is a particle in BL2 with PhysX on high. It's crazy.

Incessant Excess
Aug 15, 2005

Cause of glitch:
Pretentiousness

greatapoc posted:

I'm not averse to overclocking either, so if there's something lower that will work better I'm fine with that too, as long as it's good at 2560x1440.

I'm using a GTX670 at 2560x1440 and it works like a charm with everything I've thrown at it.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

LooKMaN posted:

Yeah, but the quality of GPU encodes can't compare at all with software encodes from a decent encoder like x264.

You're thinking of the Nvidia encoder (which is poo poo), the AMD encoder (which is no faster than x86 at similar quality), or QuickSync (which is designed as a good-enough but very fast encoder for tablets/phones and such).

zer0spunk is talking about Adobe's Mercury Engine, a premier (:v:) media software company's custom GPU encoder. It's not only fast, it's archival/production quality, and it's even better than a software-based encode at render-time tasks like color grading and certain other post filters.

Wozbo
Jul 5, 2010
I'm wondering if Nvidia finally stopped juggling resources for PhysX and just dedicates a flat amount to it, plus some leeway for larger workloads. Seems like it would make the most sense for now.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Wozbo posted:

I'm wondering if Nvidia finally stopped juggling resources for PhysX and just dedicates a flat amount to it, plus some leeway for larger workloads. Seems like it would make the most sense for now.

That's not really possible, since switching between compute and graphics loads requires a GPU context switch, which requires a CPU context switch and major re-jiggering of the GPU.

That is, on Fermi and prior. Kepler can spawn GPU threads from other GPU threads, which could allow lower overhead for PhysX loads. But that wouldn't explain why a 560 Ti does so well.

Maybe them fiziks just easy.

Wozbo
Jul 5, 2010
Heh, maybe they found something akin to the magic number for inverse roots.

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.

Wozbo posted:

This is what I like to hear. If I don't dork around with the settings will it still boost that high if it is able to?
Well, from what I've read, unless you've got a model where the boost is messed with in a way that reacts badly with overclocking (Tom's Hardware found this in some makers' 660 Ti models), it's generally a fixed amount from an estimated pool of 2-3 values per brand offering, based on the binning of the GPU. The model I have seems to push +180MHz at max output (the modifier doesn't change with overclocking; it just exceeds the default core by that much under load), which is one of the better values I saw documented.

Agreed posted:

People in the BL2 thread in Games are reporting great results with even a 560 Ti doing both graphics and PhysX; nVidia must have optimized the poo poo out of this game. If this is the sort of PhysX we can get, in general, going forward, then all this talk about a coprocessor is probably just unnecessary. Everything is a particle in BL2 with PhysX on high. It's crazy.
It's pretty stellar. No slowdown whatsoever with a junkyard full of bolts n' debris. Outhouse sludge in all its fetid, globular glory!

TheRationalRedditor fucked around with this message at 15:56 on Sep 18, 2012

zer0spunk
Nov 6, 2000

devil never even lived

MixMasterMalaria posted:

Did they add support for the lower end cards? 680 is waaaayyyy outside budget.

http://www.adobe.com/products/premiere/mercury-playback-engine.html

This lists the supported cards for MPE in hardware mode... The 570/580 are on there; the 680 isn't, but the update dropped a week or two ago, so I'd imagine it'll be listed at some point soon now that it's officially supported.

You can "hack" in support for an unlisted card quite easily, actually (basic text editing of a config file), but it could either work beautifully or be insanely crash-prone. The changes in compute between Kepler and Fermi meant that Adobe had to actually go in and code support, so the "hack" was buggy until they did.

So basically, if an nVidia card has been out for 6 months when a new version of CS comes out, there should definitely be CUDA support. If a new card drops around the same time, expect a few months' delay before official support arrives in a patch, even though Adobe is a partner.

My big concern is how support for new cards works with older versions. Will CS5 and 5.5 get 680 support too, now that CS6 has it? If not, it means that when I swap out for the next 7 series in a year or whatever, I'll have to buy CS6.5 or 7 to get support, which is pretty terrible for consumers. I suppose we'll find out with CS5.5/5...

On the plus side, Adobe did crank out 680 support fast (~1 month) for After Effects... kind of tells you AE is more important to them than Premiere (and I'm sure Photoshop does better than both).
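For anyone curious, the "hack" really is just text editing: you add your card's name to Premiere's GPU whitelist (commonly reported as cuda_supported_cards.txt in the Premiere Pro install folder; the exact path varies by version and OS, so treat the names here as assumptions, and back the file up first). A sketch of the idea:

```python
# Sketch: add a GPU to Premiere's CUDA whitelist file. File name and
# approach are from the widely-reported community hack; the real install
# path varies, so this demos against a temp copy instead.
from pathlib import Path
import tempfile

def whitelist_gpu(whitelist: Path, gpu_name: str) -> None:
    """Append gpu_name to the whitelist file if it's not already listed."""
    cards = whitelist.read_text().splitlines() if whitelist.exists() else []
    if gpu_name not in cards:
        cards.append(gpu_name)
        whitelist.write_text("\n".join(cards) + "\n")

with tempfile.TemporaryDirectory() as d:
    f = Path(d) / "cuda_supported_cards.txt"
    f.write_text("GeForce GTX 570\nGeForce GTX 580\n")
    whitelist_gpu(f, "GeForce GTX 680")
    print(f.read_text().splitlines())
    # ['GeForce GTX 570', 'GeForce GTX 580', 'GeForce GTX 680']
```

The GPU name has to match what the driver reports exactly, which is part of why the hack is hit-or-miss.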

Ludicrous Gibs!
Jan 21, 2002

I'm not lost, but I don't know where I am.
Ramrod XTreme

Agreed posted:

People in the BL2 thread in Games are reporting great results with even a 560 Ti doing both graphics and PhysX; nVidia must have optimized the poo poo out of this game. If this is the sort of PhysX we can get, in general, going forward, then all this talk about a coprocessor is probably just unnecessary. Everything is a particle in BL2 with PhysX on high. It's crazy.

For me, it's silky smooth on my 560 Ti up until a firefight with 3 or more enemies begins. Then my framerate dips below 30. And this is in singleplayer; I'd hate to think what it'll look like with 3 other people blasting holes in the ground and making sludge pools everywhere.

But god drat is it ever pretty. In the first boss fight, chunks of debris were flying everywhere, and when my Siren partner used Stasis, they all got sucked into a swirling vortex around the target. What's the cheapest I could get a coprocessor card for? ;)

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
More worthwhile is to get a 600-series card and use the 560 Ti for coprocessing :getin:

Also, I can't help but feel that I have just given advice which makes sense in a very narrow context but is ultimately ridiculous.

ScienceAndMusic
Feb 16, 2012

CANNOT STOP SHITPOSTING FOR FIVE MINUTES
So if I wanted to get a new nVidia card, is there a recommended card that everyone agrees is the best price to performance sweet spot?

Berk Berkly
Apr 9, 2009

by zen death robot

ScienceAndMusic posted:

So if I wanted to get a new nVidia card, is there a recommended card that everyone agrees is the best price to performance sweet spot?

As of right now? No higher than a GTX 670 and no lower than the GTX 660, with the only card in between being the 660 Ti. The 680 is far too close to the 670 in performance to justify the extra $100 it costs, and the 690 is just a doubled-up 680. It's a matter of how much you want to spend.

Just don't pay a premium for GTX 660/660 Tis with more than 2GB of VRAM.

Berk Berkly fucked around with this message at 20:21 on Sep 18, 2012

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Factory Factory posted:

More worthwhile is to get a 600-series card and use the 560 Ti for coprocessing :getin:

Also, I can't help but feel that I have just given advice which makes sense in a very narrow context but is ultimately ridiculous.

Yep. That said, I've gone in and gussied it up like loving crazy in the .ini files and it looks gorgeous rendering on the 680... And I don't ever encounter slowdown, ever, with the overkill 580. It just crunches through it and takes a nap at the same time, I reckon.

So, :getin: indeed.
