Guni
Mar 11, 2010

slidebite posted:

Just to make sure I understand the present nvidia timeline, the next gen of GPUs (the successor to Kepler) won't be out until sometime in 2014, right?

AFAIK, that's correct. Same goes for AMD - both are rumored to come out early 2014.


Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
I was installing the new nvidia betas and the installer slideshow indicated a bundle is coming with Metro Last Light

Klyith
Aug 3, 2007

GBS Pledge Week

Dogen posted:

I was installing the new nvidia betas and the installer slideshow indicated a bundle is coming with Metro Last Light
It's already in effect, with GTX 660s and up, so starting at $200. Of course, the AMD bundle is now Bioshock, Tomb Raider, and Far Cry Blood Dragon. Between a 660 and a 7870 the performance is pretty much a wash as well.

Metro is definitely an improvement for nvidia though. That poo poo-tier F2P credit was the main reason I bought my first AMD/ATI a few months ago, after a 10 year run of nvidias (not a fanboy thing, I just never happened to buy a video card in the generations that ATI had the upper hand).

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.
Catalyst 13.4 WHQL is live. No release notes yet, drivers have 7990 support. Ran Heaven benchmark on 13.3b3, then on these. No significant FPS increase, but it felt smoother, as unscientific as that sounds. Only using it as an interim until 13.5 betas hit hopefully tomorrow with the 7990's launch.

Edit: Was also having fun seeing how far I could push my 7950 last night. Ended up on 1150/1625 @ 1.25v. Didn't do any proper (i.e. OCCT) testing, just ran it through a complete Heaven run. Messing around more or less.

Endymion FRS MK1 fucked around with this message at 17:57 on Apr 23, 2013

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

Klyith posted:

It's already in effect, with GTX 660s and up, so starting at $200. Of course, the AMD bundle is now Bioshock, Tomb Raider, and Far Cry Blood Dragon. Between a 660 and a 7870 the performance is pretty much a wash as well.

Metro is definitely an improvement for nvidia though. That poo poo-tier F2P credit was the main reason I bought my first AMD/ATI a few months ago, after a 10 year run of nvidias (not a fanboy thing, I just never happened to buy a video card in the generations that ATI had the upper hand).

I know a lot of people in the last month or two have said the same thing. Maybe they figured out it was costing them some sales, finally.

The Illusive Man
Mar 27, 2008

~savior of yoomanity~

Dogen posted:

I know a lot of people in the last month or two have said the same thing. Maybe they figured out it was costing them some sales, finally.

Nothing justifies the cost of a $200-$300 card like ~$135 worth of free, new AAA games. Of course, I'm curious how significantly this promo is eating into AMD's profit margins.

Klyith
Aug 3, 2007

GBS Pledge Week

Space Racist posted:

Nothing justifies the cost of a $200-$300 card like ~$135 worth of free, new AAA games. Of course, I'm curious how significantly this promo is eating into AMD's profit margins.
I think it's actually helped their revenue -- Q1 2013 was a $150 million loss, but that's better than expected and way better than some previous quarters. They're actually better off selling video chips at a small loss than selling nothing at all. They've had to eat giant penalty losses on their fab contracts because they were below promised demand.

Now that we know that both the PS4 and MS consoles will use AMD chips, I think their strategy for the past year has been to keep their head just barely above water. Margins on the new console chips will be sweet gently caress all, but it improves their revenue and solves that fab capacity problem. They're still in a dicey position.

forbidden dialectics
Jul 26, 2005





Just imagine if AMD had purchased nVidia like they originally planned (according to Hector Ruiz' book) and Jen-Hsun Huang took over as CEO of all of AMD like he demanded (which is why they didn't buy nVidia). We might be looking at an AMD that actually is competitive and profitable. We can dream \/:shobon:\/.

Wistful of Dollars
Aug 25, 2009

The reviews of the 7990 are coming out and it sounds rather underwhelming. If you're going to be a kook who wants to spend that much on graphics, then get a Titan or two 680s/7970s.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Catalyst 13.4 WHQL and 13.5 Beta have been released. I installed 13.5 Beta, and other than a BSOD right after the install, it seems to have improved CF microstutter in Far Cry 3, which had been annoying me.

E: Except now my second monitor, a Dell U2410 attached via DP, doesn't work?

Factory Factory fucked around with this message at 01:12 on Apr 25, 2013

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

Miffler posted:

The reviews of the 7990 are coming out and it sounds rather underwhelming. If you're going to be a kook who wants to spend that much on graphics then get a titan or two 680s/7970s.

After everything I've read about the frame latency issues of AMD cards, and my personal experience with the unique kind of hell it is waiting for AMD to release driver updates so that new games don't have horrific bugs, I would personally never buy an AMD card again. I upgraded from my 5970 to a GTX 680 when the 5970 died and I haven't looked back; I'd say I get 50% more performance, and it lets me run anything I want on max at 2560x1440.

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.

The Lord Bude posted:

After everything I've read about the frame latency issues of AMD cards, and my personal experience with the unique kind of hell it is waiting for AMD to release driver updates so that new games don't have horrific bugs, I would personally never buy an AMD card again. I upgraded from my 5970 to a GTX 680 when the 5970 died and I haven't looked back; I'd say I get 50% more performance, and it lets me run anything I want on max at 2560x1440.

Good news! AMD is actually listening.

http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-AMD-Improves-CrossFire-Prototype-Driver

New prototype driver to be released in June-ish. Download all three videos they show; the difference is incredible. I never realized you Crossfire folks endured that stuttery mess. I'll let this graph speak for itself:

madjdmyo
Jan 10, 2007

Factory Factory posted:

Catalyst 13.4 WHQL and 13.5 Beta have been released. I installed 13.5 Beta, and other than a BSOD right after the install, it seems to have improved CF microstutter in Far Cry 3, which had been annoying me.

E: Except now my second monitor, a Dell U2410 attached via DP, doesn't work?

I have a single U3011 attached via DP and the 13.5 beta just gave me no signal halfway through installing the drivers. Had to use the 13.4.

kuddles
Jul 16, 2006

Like a fist wrapped in blood...
So my current PC is playing games just fine, but I'm getting the itch. I'm thinking of building a new PC from scratch, with the added intent of moving up to either a 27" or 30" monitor at the same time. I can't decide if I should build a new PC once Haswell comes about (and probably the 700 series of video cards), or hold off another year when both Nvidia and AMD supposedly move to new architectures and we have a better idea of what the next generation of games requires. Anyone in the same boat?

slidebite
Nov 6, 2005

Good egg
:colbert:

I have a question about video memory.

Last gen cards were generally 1GB+ (thinking nvidia 5xx series) and now 2-4GB is becoming the norm.

Where is this memory put to use? Do "mainstream" resolutions like 1080p even use it? Or is it only once all the bells and whistles (high FSAA and AF, HDR/lighting, etc.) are turned on that games tap into it?

I know in Arma 2 the video options actually let you set the amount of video memory with a slider... so do games like that use all of it?

slidebite fucked around with this message at 16:12 on Apr 25, 2013

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

slidebite posted:

Where is this memory put to use? Do "mainstream" resolutions like 1080p even use it? Or once all the bells and whistles, high FSAA and AF, HDR/lighting, etc...do they tap into it?
Yes, we're reaching the end of the line for 1GB cards being viable at 1080p, which is why all cards intended for 1080p gaming now have at least 2GB of RAM. Beyond 1080p, more than 2GB is recommended, which is why AMD cards have 3GB of RAM (nVidia would have to jump to 4GB, which is too expensive to justify on GTX 680s). Video memory usage in games was artificially limited by consoles for some time, but the PS4 has 8GB of unified system and video RAM, so that's not going to be a factor anymore.

DaNzA
Sep 11, 2001

:D
Grimey Drawer
Yeah, look at a game like Bioshock Infinite and you can see how sharp the textures can be nowadays compared to the old blurry stuff, which is pretty much just a mess as soon as you get up close.

Klyith
Aug 3, 2007

GBS Pledge Week
I don't think that 1080p inherently needs more memory. Older video cards were completely capable of 1600x1200, which is approximately the same number of output pixels as 1080p.

It's the high-resolution textures, and multiple textures per surface, that are eating most of the memory. Plus lots of pixel shaders need memory buffers. Also, the trend in games toward open-world design and large, diverse levels means keeping a much larger working set in memory.

e:

Jago posted:

Frame buffers and AA are still not free. Look at any benchmarks with a bunch of cards and watch them scale as resolution is increased and decrease.
You have to watch for the difference between memory size and memory bandwidth. Most of the performance hits from increased resolution are not due to running low on memory capacity; they're from increased bandwidth requirements. Multisampling AA is a perfect example: it doesn't need much of anything in the way of additional capacity, but it needs multiple memory hits for each edge pixel the AA is being applied to.

Framebuffers at this point have been so outstripped by memory growth that they might as well be free. Triple buffered 1080p 32bit + zbuffer is like 50 mb. If you're supersampling it balloons up but even then the bandwidth requirements kill you first.
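To put rough numbers on that, here's a back-of-the-envelope sketch. The ~50 MB above is a loose estimate; none of these constants come from the thread, and real drivers pad and tile allocations, so treat the output as order-of-magnitude only.

```python
# Back-of-the-envelope framebuffer arithmetic for 1920x1080. The
# figures are illustrative; real drivers pad and tile allocations.
WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 4                 # 32-bit RGBA color
MB = 1024 * 1024

color_buffer = WIDTH * HEIGHT * BYTES_PER_PIXEL   # one color buffer
z_buffer = WIDTH * HEIGHT * 4                     # 24-bit depth + 8-bit stencil
triple_buffered = 3 * color_buffer + z_buffer

print(f"one color buffer:  {color_buffer / MB:.1f} MB")     # ~7.9 MB
print(f"triple buffer + z: {triple_buffered / MB:.1f} MB")  # ~31.6 MB

# 4x supersampling renders at double the width and height, so every
# working buffer quadruples -- and bandwidth balloons along with it.
ssaa_working = triple_buffered * 4
print(f"4x SSAA buffers:   {ssaa_working / MB:.1f} MB")     # ~126.6 MB
```

Even the supersampled case is a rounding error next to a 2GB card, which is the point: textures and shader buffers, not the framebuffer, are where the capacity goes.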

Klyith fucked around with this message at 00:07 on Apr 26, 2013

LRADIKAL
Jun 10, 2001

Fun Shoe
Frame buffers and AA are still not free. Look at any benchmarks with a bunch of cards and watch them scale as resolution is increased and decrease.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Well, I think it's implied that when we talk about video memory requirements we're talking about the games of today and the foreseeable future, which really do have significant video memory needs. The thing that finally pushed me into buying a new video card to replace my Radeon HD 4850 512MB was that I could not play Alan Wake at 1080p even at the lowest settings; it simply could not fit into 512MB. You have to turn the settings down to play it at 1080p with 1GB of VRAM, and I recall it using over 1.5GB on my GTX 670 at highest in-game settings, but without any obscene antialiasing (16X CSAA is pretty low-impact). Alan Wake is not a brand new or horribly unoptimized game (though they didn't set the requirements as low as many titles would have), and newer games are going to have higher requirements.

I try to stress the importance of having enough VRAM because you do NOT get a graceful decline in performance from running low. Performance drops into the teens and twenties and you get long pauses as the card swaps to and from system memory. Essentially you stop having an enjoyably playable experience the instant you run out of VRAM, so you want to make sure that never happens with the games/settings you want to play.
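That cliff can be sketched with a toy model. Every constant below is invented purely for illustration (the real penalty depends on the game, the bus, and the driver); the point is only the shape of the falloff.

```python
# Toy model of the VRAM cliff: frame time is steady until the working
# set exceeds VRAM, then overflow assets stream over PCIe and frame
# time balloons. Every constant here is illustrative, not measured.
def frame_time_ms(working_set_mb, vram_mb,
                  base_ms=16.7, pcie_penalty_ms_per_mb=0.5):
    overflow_mb = max(0, working_set_mb - vram_mb)
    return base_ms + overflow_mb * pcie_penalty_ms_per_mb

# fps holds near 60 until the working set passes 1024 MB, then
# collapses into the teens -- no graceful decline.
for ws in (800, 1000, 1100, 1200):
    fps = 1000 / frame_time_ms(ws, vram_mb=1024)
    print(f"{ws} MB working set: {fps:.0f} fps")
```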

Alereon fucked around with this message at 02:47 on Apr 26, 2013

slidebite
Nov 6, 2005

Good egg
:colbert:

Thank you for your answers.

brb, gonna buy a GTX 670 4GB :v:

seriously, I am. It's on sale for the same price as a 2GB

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.

Alereon posted:

I try to stress the importance of having enough VRAM because you do NOT get a graceful decline in performance from running low. Performance drops into the teens and twenties and you get long pauses as the card swaps to and from system memory. Essentially you stop having an enjoyably playable experience the instant you run out of VRAM, so you want to make sure that never happens with the games/settings you want to play.
Just to clarify: VRAM in SLI setups doesn't get pooled together even with identical cards, right? Why is that?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
It's because data has to be duplicated between cards. Every texture on Card A must also be in memory on Card B, etc. The actual SLI/CF link is very slow and doesn't do much besides shuffle the frame buffer around.
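A sketch of the arithmetic (the card names and sizes are just examples, not anything from the posts above):

```python
# Why SLI/CrossFire VRAM doesn't pool: with alternate frame rendering,
# every GPU needs its own copy of each texture, so usable capacity is
# the per-card size, not the sum.
def effective_vram(cards_mb):
    """Usable VRAM for a mirrored multi-GPU setup: the smallest card,
    since all assets are duplicated on every GPU."""
    return min(cards_mb)

print(effective_vram([2048, 2048]))  # two 2GB cards in SLI -> 2048, not 4096
print(effective_vram([3072, 3072]))  # a 'single' 7990, 6GB on the box -> 3072 usable
```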

The Illusive Man
Mar 27, 2008

~savior of yoomanity~

Factory Factory posted:

It's because data has to be duplicated between cards. Every texture on Card A must also be in memory on Card B, etc. The actual SLI/CF link is very slow and doesn't do much besides shuffle the frame buffer around.

Just for curiosity's sake, is this the same on cards like the 7990 and 690?

I.e., even though the 7990 is a 'single' card with 6 GB of VRAM, is only 3 GB effectively usable?

The Illusive Man fucked around with this message at 05:02 on Apr 26, 2013

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Yes.

spouse
Nov 10, 2008

When our turn comes, we shall not make excuses for the terror.


slidebite posted:

I have a question about video memory.

Last gen cards were generally 1GB+ (thinking nvidia 5xx series) and now 2-4GB is becoming the norm.

Where is this memory put to use? Do "mainstream" resolutions like 1080p even use it? Or once all the bells and whistles, high FSAA and AF, HDR/lighting, etc...do they tap into it?

I know in Arma2 the video options actually let you slide scale the amount of video memory you have... so do games like that use all of it?

On a single screen, I see very little difference between my 7870 XT (2GB) and my 7950 (3GB). Running three-screen Eyefinity, there is a tremendous increase in framerate (17fps to 40fps) going from the 7870 XT to the 7950, especially in Far Cry 3. So I don't think 1080p and below usually use more than 2GB, but there is certainly a use for more, and who doesn't love a little breathing room?

Guni
Mar 11, 2010

Space Racist posted:

Just for curiosity's sake, is this the same on cards like the 7990 and 690?

I.e., even though the 7990 is a 'single' card with 6 GB of VRAM, is only 3 GB effectively usable?


What's even the point in getting 7990s and the like when they're really just two 7970s stuck together? Wouldn't two standalone 7970s offer the exact same performance and experience as a 7990?

Guni fucked around with this message at 09:31 on Apr 26, 2013

spouse
Nov 10, 2008

When our turn comes, we shall not make excuses for the terror.


Guni posted:

Yes

What's even the point in getting 7990s and the like when they're really just two 7970s stuck together? Wouldn't two standalone 7970s offer the exact same performance and experience as a 7990?

There are more 9's instead of 7's, so no.

iuvian
Dec 27, 2003
darwin'd



Guni posted:

Yes

What's even the point in getting 7990s and the like when they're really just two 7970s stuck together? Wouldn't two standalone 7970s offer the exact same performance and experience as a 7990?

Multi-card setups in general are dumb as heck; the appeal of a single-card solution is fewer noise/power/heat issues. Although I still had to deal with driver issues and SLI profiles even with a dual-GPU card in a single slot. Never again.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Get two, do quad CF/SLI with an mATX board.

SlayVus
Jul 10, 2009
Grimey Drawer

iuvian posted:

Multi-card setups in general are dumb as heck, the appeal of a single card solution is less noise/power/heat issues. Although I still had to deal with driver issues and sli profiles even with a dual gpu single slot card. Never again.

Yeah, the biggest draw of a single card with dual GPUs is that it gets you increased GPU performance in a smaller area. You can put two 7990s into just two PCI-E slots and get Quad-CFX, whereas, I believe, you can only do Tri-CFX with single-GPU cards. Same thing for SLI: you can get Tri-SLI with single-GPU cards, but dual-GPU cards allow you to go Quad-SLI.

However, as far as I am aware, the performance gain from a 2nd to a 3rd card was something like 40-50% with Titan. Then imagine if you were to do quad-SLI with Titan: you would probably only get something in the range of a 20-30% increase in performance for another $1000. Edit: I don't know how effective Tri-SLI is on Titan now, but when it first came out it was less than a 30% increase in performance when going from SLI to Tri-SLI, whereas going from a single card to an SLI setup with Titan was only about an 80% increase in performance.

SlayVus fucked around with this message at 13:32 on Apr 26, 2013

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.

Factory Factory posted:

It's because data has to be duplicated between cards. Every texture on Card A must also be in memory on Card B, etc. The actual SLI/CF link is very slow and doesn't do much besides shuffle the frame buffer around.
Argh, that sucks.

Mad_Lion
Jul 14, 2005

Re: Dual GPU cards.

The only one I ever owned was the Voodoo 5500 back in the day, and I don't remember anything about SLI profiles. It just worked, and you got about double the performance of a single 4500 card. Since NVidia bought 3DFX (including the SLI technology), why is it that NVidia cards require SLI profiles and tweaking now? Is it just that games have gotten more complicated, or what?

SlayVus
Jul 10, 2009
Grimey Drawer

Mad_Lion posted:

Re: Dual GPU cards.

The only one I ever owned was the Voodoo 5500 back in the day, and I don't remember anything about SLI profiles. It just worked, and you got about double the performance of a single 4500 card. Since NVidia bought 3DFX (including the SLI technology), why is it that NVidia cards require SLI profiles and tweaking now? Is it just that games have gotten more complicated, or what?

Some games, because of the way they are coded, can't actually run in SLI; they get worse performance than a single GPU. Even a game like that will have an SLI profile, one that disables SLI. There are also different ways of dividing up the frame rendering. I think AMD has more of them than Nvidia does, though.

These are the basic ones

  • GPUs take turns rendering frames: GPU 1 renders frame 1 while GPU 2 renders frame 2. Alternate frame rendering (AFR).
  • GPUs render half the screen each, split vertically: GPU 1 renders the left side of frame 1 while GPU 2 renders the right side. Scissor.
  • GPUs render half the screen each, split horizontally: GPU 1 renders the top half of frame 1 while GPU 2 renders the bottom half. Scissor.
  • GPUs render the screen in checkerboard fashion: the screen is split into a checkerboard and each GPU renders every other tile. Checkerboard.
  • GPUs render alternating scan-lines of the frame: GPU 1 renders every even-numbered scan-line, GPU 2 every odd-numbered one. This was the original rendering technique of the 3dfx Voodoo2, which Nvidia bought; Scan-Line Interleave now stands for Scalable Link Interface.
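The splitting schemes above can be sketched as toy functions for a two-GPU setup (the resolutions and frame counts are made up purely for illustration):

```python
# Toy sketch of multi-GPU work splitting for two GPUs.
def alternate_frame_rendering(frame_numbers):
    # AFR: GPU 0 takes even frames, GPU 1 takes odd frames.
    return {f: f % 2 for f in frame_numbers}

def scissor_split(width):
    # Vertical scissor: GPU 0 renders the left half, GPU 1 the right.
    return {"gpu0": (0, width // 2), "gpu1": (width // 2, width)}

def scanline_interleave(height):
    # Original 3dfx SLI: even scan-lines to GPU 0, odd ones to GPU 1.
    return {y: y % 2 for y in range(height)}

print(alternate_frame_rendering(range(4)))  # {0: 0, 1: 1, 2: 0, 3: 1}
print(scissor_split(1920))                  # {'gpu0': (0, 960), 'gpu1': (960, 1920)}
```

AFR is what makes the VRAM mirroring discussed earlier unavoidable: each GPU renders whole frames, so each needs the whole scene resident.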

SlayVus fucked around with this message at 17:28 on Apr 26, 2013

Klyith
Aug 3, 2007

GBS Pledge Week

Mad_Lion posted:

Is it just that games have gotten more complicated, or what?
It's not exactly the games that are more complicated, it's the entire video rendering pipeline. Back in the Voodoo days, nothing in the pipeline was context dependent -- no pixel really cared about the state of its neighbor. The GPU acquired triangles (already transformed by the CPU), applied textures and maps to them, applied lights, and rendered them. Nothing about the process cared where the next pixel in line was being taken care of.

Now, huge amounts of the GPU's work are higher-level programs that don't react well to being split across scan lines. However you split them up, every border is a division across which the GPUs have to pass data to stay synced. Which is a problem, because the fewer borders you have, the worse the load balancing is.

Mad_Lion
Jul 14, 2005

Thanks guys, that clears it up. IIRC, the original SLI was literally scan-line-interleave indeed. I'm planning a gigantic computer upgrade by this time next year, and it will be the entire system this time. Intel Q6600 and a Radeon HD 5850 will give way to a Haswell I7 and at least a 7950. I still like AMD/Radeon video cards. I've never had problems with them, starting with the Radeon 8500 I got after the Voodoo 5500, through the Radeon 9700, and onto now. I did have an Nvidia 8800 GT in there, but it was the best thing at the time. I've also owned a 4870 1GB.

Anandtech's newest review of the 7990 makes it look pretty good vs. the 690. http://www.anandtech.com/show/6915/amd-radeon-hd-7990-review-7990-gets-official

It uses more power and makes a bit more noise, but other than that, it wins in enough games to make it a totally viable choice, especially considering that it has 3GB of ram per GPU vs. 2GB. In addition, if you care about compute, it owns the 680.

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.
It'll stutter like a bitch until the prototype drivers end up finished though. As much as I love ATI/AMD, had Radeons and a Rage all my life, minus a GTX260, Crossfire is broken beyond all belief.

Klyith
Aug 3, 2007

GBS Pledge Week
e: ^^^ My understanding is that if you enable vsync it's relatively fine and always has been. I'd think the main benefit of an SLI/CF setup is so you can enable vsync and never drop below 60 frames.

Mad_Lion posted:

Anandtech's newest review of the 7990 makes it look pretty good vs. the 690. http://www.anandtech.com/show/6915/amd-radeon-hd-7990-review-7990-gets-official

It uses more power and makes a bit more noise, but other than that, it wins in enough games to make it a totally viable choice, especially considering that it has 3GB of ram per GPU vs. 2GB. In addition, if you care about compute, it owns the 680.
Your comparisons are not very good (the thing to put it up against is dual 680s), and your upgrade strategy isn't very good (spending huge amounts on bonkers top-of-the-line parts is less effective than spending sane amounts more frequently, unless someone else is paying for your computer).

But this isn't the thread for upgrade questions / talk.

Jan
Feb 27, 2008

The disruptive powers of excessive national fecundity may have played a greater part in bursting the bonds of convention than either the power of ideas or the errors of autocracy.

iuvian posted:

Multi-card setups in general are dumb as heck, the appeal of a single card solution is less noise/power/heat issues.

For a while, a pair of market ATIs in Crossfire were a quieter and less hot solution than a single Fermi card.

Klyith posted:

e: ^^^ My understanding is that if you enable vsync it's relatively fine and always has been. I'd think the main benefit of an SLI/CF setup is so you can enable vsync and never drop below 60 frames.

Yeah, Vsync pretty much eliminates the microstuttering from AFR, and Dynamic Vsync lets you at least keep a tolerable performance if your frame rate goes below 60.

But whether AFR actually helps a game engine or not is pretty much a crapshoot.


Mad_Lion
Jul 14, 2005

I disagree that putting the dual GPU Radeon card up against the dual GPU NVidia card is a bad comparison. It's a DIRECT comparison, IMO. Geforce 690 vs. Radeon 7990 is similar, two SLI'd or Crossfired GPUs on the same card. The Radeon does pretty well in this comparison. I mis-typed on the compute comparison, I meant to say that the current Radeons outperform the current Geforces in compute, in general. A 7990 beats a 690 in compute, no question.

As for my upgrade path, well, I used to upgrade more often, but the Q6600 with a 5850 has lasted quite a while. I'm one of those that goes big and then waits, sue me. I7 and a high end video card will last me a while.
