|
slidebite posted:Just to make sure I understand the present nvidia timeline, the next gen (successor to Kepler) of GPUs won't be out until sometime in 2014, right? AFAIK, that's correct. Same goes for AMD - both are rumored to come out in early 2014.
|
# ? Apr 23, 2013 08:45 |
|
|
|
I was installing the new nvidia betas and the installer slideshow indicated a bundle is coming with Metro Last Light
|
# ? Apr 23, 2013 13:26 |
|
Dogen posted:I was installing the new nvidia betas and the installer slideshow indicated a bundle is coming with Metro Last Light Metro is definitely an improvement for nvidia though. That poo poo-tier F2P credit was the main reason I bought my first AMD/ATI a few months ago, after a 10 year run of nvidias (not a fanboy thing, I just never happened to buy a video card in the generations that ATI had the upper hand).
|
# ? Apr 23, 2013 14:00 |
|
Catalyst 13.4 WHQL is live. No release notes yet, drivers have 7990 support. Ran Heaven benchmark on 13.3b3, then on these. No significant FPS increase, but it felt smoother, as unscientific as that sounds. Only using it as an interim until 13.5 betas hit hopefully tomorrow with the 7990's launch. Edit: Was also having fun seeing how far I could push my 7950 last night. Ended up on 1150/1625 @ 1.25v. Didn't do any proper (i.e. OCCT) testing, just ran it through a complete Heaven run. Messing around more or less. Endymion FRS MK1 fucked around with this message at 17:57 on Apr 23, 2013 |
# ? Apr 23, 2013 17:55 |
|
Klyith posted:It's already in effect, with GTX 660s and up, so starting at $200. Of course, the AMD bundle is now Bioshock, Tomb Raider, and Far Cry Blood Dragon. Between a 660 and a 7870 the performance is pretty much a wash as well. I know a lot of people in the last month or two have said the same thing. Maybe they figured out it was costing them some sales, finally.
|
# ? Apr 23, 2013 23:53 |
|
Dogen posted:I know a lot of people in the last month or two have said the same thing. Maybe they figured out it was costing them some sales, finally. Nothing justifies the cost of a $200-$300 card like ~$135 worth of free, new AAA games. Of course, I'm curious how significantly this promo is eating into AMD's profit margins.
|
# ? Apr 24, 2013 04:56 |
|
Space Racist posted:Nothing justifies the cost of a $200-$300 card like ~$135 worth of free, new AAA games. Of course, I'm curious how significantly this promo is eating into AMD's profit margins. Now that we know that both PS4 and MS consoles will use AMD chips, I think their strategy for the past year was to keep their head just barely enough above water to stave off drowning. Margins on the new console chips will be sweet gently caress all, but it improves their revenue and solves that fab capacity problem. They're still in a dicey position.
|
# ? Apr 24, 2013 06:35 |
|
Just imagine if AMD had purchased nVidia like they originally planned (according to Hector Ruiz' book) and Jen-Hsun Huang took over as CEO of all of AMD like he demanded (which is why they didn't buy nVidia). We might be looking at an AMD that actually is competitive and profitable. We can dream \/\/.
|
# ? Apr 24, 2013 20:10 |
|
The reviews of the 7990 are coming out and it sounds rather underwhelming. If you're going to be a kook who wants to spend that much on graphics, then get a Titan or two 680s/7970s.
|
# ? Apr 25, 2013 00:56 |
|
Catalyst 13.4 WHQL and 13.5 Beta have been released. I installed 13.5 Beta, and other than a BSOD right after the install, it seems to have improved CF microstutter in Far Cry 3, which had been annoying me. E: Except now my second monitor, a Dell U2410 attached via DP, doesn't work? Factory Factory fucked around with this message at 01:12 on Apr 25, 2013 |
# ? Apr 25, 2013 01:04 |
|
Miffler posted:The reviews of the 7990 are coming out and it sounds rather underwhelming. If you're going to be a kook who wants to spend that much on graphics, then get a Titan or two 680s/7970s. After everything I've read about the frame latency issues of AMD cards, and my personal experiences with the unique kind of hell it is waiting for AMD to release driver updates so that new games don't have horrific bugs, I would personally never buy an AMD card again. I upgraded from my 5970 to a GTX 680 when the 5970 died and I haven't looked back. I'd say I get 50% more performance, and it lets me run anything I want on max at 2560x1440.
|
# ? Apr 25, 2013 03:58 |
|
The Lord Bude posted:After everything I've read about the frame latency issues of AMD cards, and my personal experiences with the unique kind of hell it is waiting for AMD to release driver updates so that new games don't have horrific bugs, I would personally never buy an AMD card again. I upgraded from my 5970 to a GTX 680 when the 5970 died and I haven't looked back. I'd say I get 50% more performance, and it lets me run anything I want on max at 2560x1440. Good news! AMD is actually listening. http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-AMD-Improves-CrossFire-Prototype-Driver A new prototype driver is due around June. Download all three videos they show; the difference is incredible. I never realized you Crossfire folks endured that stuttery mess. The frame-time graphs in that article speak for themselves.
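The frame-rating approach PCPer uses boils down to looking at individual frame delivery times instead of the FPS average, which is exactly the number that hides AFR microstutter. A toy Python sketch of the idea (made-up timestamps, not PCPer's actual capture methodology):

```python
def frame_times_ms(timestamps_ms):
    """Deltas between consecutive frame-presentation timestamps."""
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

def avg_fps(timestamps_ms):
    """Average FPS over the whole capture."""
    total_s = (timestamps_ms[-1] - timestamps_ms[0]) / 1000.0
    return (len(timestamps_ms) - 1) / total_s

# Two runs with identical average FPS: one smooth, one stuttery AFR-style.
smooth = [i * 20.0 for i in range(7)]                 # steady 20 ms per frame
stutter = [0.0, 5.0, 40.0, 45.0, 80.0, 85.0, 120.0]   # alternating 5/35 ms

print(avg_fps(smooth), avg_fps(stutter))   # both 50.0 fps
print(max(frame_times_ms(smooth)))         # 20.0 ms worst frame
print(max(frame_times_ms(stutter)))        # 35.0 ms worst frame
```

Both runs average 50 FPS, but the stuttery one delivers half its frames after a 35 ms gap, which is what you perceive as judder even though an FPS counter says everything is fine.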
|
# ? Apr 25, 2013 06:04 |
|
Factory Factory posted:Catalyst 13.4 WHQL and 13.5 Beta have been released. I installed 13.5 Beta, and other than a BSOD right after the install, it seems to have improved CF microstutter in Far Cry 3, which had been annoying me. I have a single U3011 attached via DP and the 13.5 beta just gave me no signal halfway through installing the drivers. Had to use the 13.4.
|
# ? Apr 25, 2013 06:40 |
|
So my current PC is playing games just fine, but I'm getting the itch. I'm thinking of building a new PC from scratch, with the added intent of moving up to either a 27" or 30" monitor at the same time. I can't decide if I should build a new PC once Haswell comes about (and probably the 700 series of video cards), or hold off another year when both Nvidia and AMD supposedly move to new architectures and we have a better idea of what the next generation of games requires. Anyone in the same boat?
|
# ? Apr 25, 2013 15:27 |
|
I have a question about video memory. Last gen cards were generally 1GB+ (thinking of the nvidia 5xx series) and now 2-4GB is becoming the norm. Where is this memory put to use? Do "mainstream" resolutions like 1080p even use it? Or do all the bells and whistles (high FSAA and AF, HDR/lighting, etc.) tap into it? I know in Arma 2 the video options actually let you slide-scale the amount of video memory you have... so do games like that use all of it? slidebite fucked around with this message at 16:12 on Apr 25, 2013 |
# ? Apr 25, 2013 15:48 |
|
slidebite posted:Where is this memory put to use? Do "mainstream" resolutions like 1080p even use it? Or do all the bells and whistles (high FSAA and AF, HDR/lighting, etc.) tap into it?
|
# ? Apr 25, 2013 17:09 |
|
Yeah, look at a game like Bioshock Infinite and you can see how sharp textures can be nowadays compared to the old blurry stuff that turns into a mess as soon as you get up close.
|
# ? Apr 25, 2013 23:11 |
|
I don't think that 1080p inherently needs more memory. Older video cards used to be completely capable of 1600x1200, which is approximately the same number of output pixels as 1080p. It's the high-resolution textures, and multiple textures per surface, that are eating most of the memory. Plus lots of pixel shaders need memory buffers. Also, the trend in games toward things like open worlds and large, diverse levels means keeping a much larger working set in memory. e: Jago posted:Frame buffers and AA are still not free. Look at any benchmarks with a bunch of cards and watch how they scale as resolution is increased. Framebuffers at this point have been so outstripped by memory growth that they might as well be free. Triple-buffered 1080p 32-bit + z-buffer is like 50 MB. If you're supersampling it balloons up, but even then the bandwidth requirements kill you first. Klyith fucked around with this message at 00:07 on Apr 26, 2013 |
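That back-of-the-envelope framebuffer number is easy to check. A quick sketch, assuming 4 bytes per pixel for both color and depth/stencil (real drivers add padding and auxiliary buffers on top, which is roughly where a ~50 MB figure lands):

```python
def buffer_mb(width, height, bytes_per_pixel=4):
    """Size of one screen-sized buffer in MiB."""
    return width * height * bytes_per_pixel / 2**20

# Triple-buffered 1080p color plus one depth/stencil buffer:
color = 3 * buffer_mb(1920, 1080)   # ~23.7 MiB
depth = buffer_mb(1920, 1080)       # ~7.9 MiB
total = color + depth               # ~31.6 MiB
print(round(total, 1))              # 31.6
```

So raw buffers are only ~32 MiB out of a 2GB card. With 4x supersampling every buffer quadruples (~127 MiB total), and as the post says, at that point memory bandwidth becomes the real limit before capacity does.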
# ? Apr 25, 2013 23:32 |
|
Frame buffers and AA are still not free. Look at any benchmarks with a bunch of cards and watch how they scale as resolution is increased.
|
# ? Apr 25, 2013 23:48 |
|
Well, I think it's implied when we talk about video memory requirements that we're talking about the games of today and the foreseeable future, which really do have significant video memory needs. The thing that finally pushed me into buying a new video card to replace my Radeon HD 4850 512MB was that I could not play Alan Wake at 1080p at the lowest settings; it simply could not fit into 512MB. You have to turn the settings down to play it at 1080p with 1GB of VRAM, and I recall it using over 1.5GB on my GTX 670 at highest in-game settings, but without any obscene antialiasing (16X CSAA is pretty low-impact). Alan Wake is not a brand new or horribly unoptimized game (though they didn't set the requirements as low as many titles would have), and newer games are going to have higher requirements. I try to stress the importance of having enough VRAM because you do NOT get a graceful decline in performance from running low. Performance drops into the teens and twenties and you get long pauses as the card swaps to and from system memory. Essentially you stop having an enjoyably playable experience the instant you run out of VRAM, so you want to make sure that never happens with the games/settings you want to play. Alereon fucked around with this message at 02:47 on Apr 26, 2013 |
# ? Apr 26, 2013 02:45 |
|
Thank you for your answers. brb, gonna buy a GTX 670 4GB. Seriously, I am. It's on sale for the same price as a 2GB.
|
# ? Apr 26, 2013 02:50 |
|
Alereon posted:I try to stress the importance of having enough VRAM because you do NOT get a graceful decline in performance from running low. Performance drops into the teens and twenties and you get long pauses as the card swaps to and from system memory. Essentially you stop having an enjoyably playable experience the instant you run out of VRAM, so you want to make sure that never happens with the games/settings you want to play.
|
# ? Apr 26, 2013 04:20 |
|
It's because data has to be duplicated between cards. Every texture on Card A must also be in memory on Card B, etc. The actual SLI/CF link is very slow and doesn't do much besides shuffle the frame buffer around.
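In other words, VRAM in AFR-style SLI/CrossFire mirrors rather than pools. A one-liner sketch of the accounting (the function name is just illustrative):

```python
def effective_vram_gb(per_gpu_vram_gb, num_gpus):
    """Each GPU must hold a full copy of the textures and buffers,
    so usable VRAM is one card's worth, NOT the sum across GPUs."""
    return per_gpu_vram_gb  # not per_gpu_vram_gb * num_gpus

# Two 3GB GPUs on one board (e.g. a 7990 advertised as "6GB"):
print(effective_vram_gb(3, 2))  # 3
```

The marketing number on the box is the sum; the number that decides whether your textures fit is one GPU's share.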
|
# ? Apr 26, 2013 04:50 |
|
Factory Factory posted:It's because data has to be duplicated between cards. Every texture on Card A must also be in memory on Card B, etc. The actual SLI/CF link is very slow and doesn't do much besides shuffle the frame buffer around. Just for curiosity's sake, is this the same on cards like the 7990 and 690? I.e., even though the 7990 is a 'single' card with 6 GB of VRAM, is only 3 GB effectively usable? The Illusive Man fucked around with this message at 05:02 on Apr 26, 2013 |
# ? Apr 26, 2013 05:00 |
|
Yes.
|
# ? Apr 26, 2013 05:10 |
|
slidebite posted:I have a question about video memory. On a single screen, I see very little difference between my 7870 XT (2GB) and my 7950 (3GB). Running three-screen Eyefinity, there is a tremendous (17fps > 40fps) increase in framerate over the 7870 XT, especially in Far Cry 3. So, I don't think 1080p and below usually use more than 2GB, but there is certainly a use for more, and who doesn't love a little breathing room?
|
# ? Apr 26, 2013 08:47 |
|
Space Racist posted:Just for curiosity's sake, is this the same on cards like the 7990 and 690? What's even the point in getting 7990's and the like when they're really just two 7970's stuck together? Wouldn't two standalone 7970's offer the exact same performance and experience as a 7990? Guni fucked around with this message at 09:31 on Apr 26, 2013 |
# ? Apr 26, 2013 09:17 |
|
Guni posted:Wouldn't two standalone 7970's offer the exact same performance and experience as a 7990? There are more 9's instead of 7's, so no.
|
# ? Apr 26, 2013 09:21 |
|
Guni posted:What's even the point in getting 7990's and the like when they're really just two 7970's stuck together? Multi-card setups in general are dumb as heck, the appeal of a single card solution is less noise/power/heat issues. Although I still had to deal with driver issues and SLI profiles even with a dual-GPU single-slot card. Never again.
|
# ? Apr 26, 2013 09:30 |
|
Get two, do quad CF/SLI with an mATX board.
|
# ? Apr 26, 2013 09:39 |
|
iuvian posted:Multi-card setups in general are dumb as heck, the appeal of a single card solution is less noise/power/heat issues. Although I still had to deal with driver issues and SLI profiles even with a dual-GPU single-slot card. Never again. Yeah, the biggest draw of single-card, dual-GPU solutions is that they give you more GPU performance in a smaller area. You can put two 7990s into just two PCI-E slots and get Quad-CFX, whereas, I believe, you can only do Tri-CFX with single-GPU cards. Same thing for SLI: you can get Tri-SLI with single-GPU cards, but dual-GPU cards allow you to go Quad-SLI. However, as far as I am aware, the performance gain from a 2nd to a 3rd card was something like 40-50% with Titan. Then imagine if you were to do Quad-SLI with Titan: you would probably only get something in the range of a 20-30% increase in performance for another $1000. Edit: I don't know how effective Tri-SLI is on Titan now, but when it first came out it was less than a 30% increase in performance when going from SLI to Tri-SLI, whereas going from a single card to an SLI setup with Titan was only about an 80% increase in performance. SlayVus fucked around with this message at 13:32 on Apr 26, 2013 |
# ? Apr 26, 2013 13:26 |
|
Factory Factory posted:It's because data has to be duplicated between cards. Every texture on Card A must also be in memory on Card B, etc. The actual SLI/CF link is very slow and doesn't do much besides shuffle the frame buffer around.
|
# ? Apr 26, 2013 16:13 |
|
Re: Dual GPU cards. The only one I ever owned was the Voodoo 5500 back in the day, and I don't remember anything about SLI profiles. It just worked, and you got about double the performance of a single 4500 card. Since NVidia bought 3DFX (including the SLI technology), why is it that NVidia cards require SLI profiles and tweaking now? Is it just that games have gotten more complicated, or what?
|
# ? Apr 26, 2013 16:16 |
|
Mad_Lion posted:Re: Dual GPU cards. Some games, because of the way they are coded, can't actually run in SLI; they get worse performance than a single GPU. Even a game like that will have an SLI profile, which will disable SLI. There are also different ways of rendering the frames; I think AMD has more modes than Nvidia does. The basic ones are Alternate Frame Rendering (AFR), where the GPUs take turns rendering whole frames, and Split Frame Rendering (SFR), where each frame is divided up between the GPUs.
SlayVus fucked around with this message at 17:28 on Apr 26, 2013 |
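A toy Python sketch of the two classic approaches, Alternate Frame Rendering and Split Frame Rendering (purely illustrative, no relation to any real driver's scheduler):

```python
def afr_assignment(num_frames, num_gpus):
    """Alternate Frame Rendering: whole frames round-robin across GPUs."""
    return [frame % num_gpus for frame in range(num_frames)]

def sfr_assignment(height, num_gpus):
    """Split Frame Rendering: each GPU takes a horizontal band of one frame.
    Returns (start_row, end_row) per GPU; the last GPU absorbs the remainder."""
    band = height // num_gpus
    return [(gpu * band, height if gpu == num_gpus - 1 else (gpu + 1) * band)
            for gpu in range(num_gpus)]

print(afr_assignment(6, 2))     # [0, 1, 0, 1, 0, 1]
print(sfr_assignment(1080, 2))  # [(0, 540), (540, 1080)]
```

AFR is what makes frame pacing matter (frames finish out of rhythm), while SFR's band borders are exactly where the GPUs have to exchange data to stay in sync.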
# ? Apr 26, 2013 17:25 |
|
Mad_Lion posted:Is it just that games have gotten more complicated, or what? Now, huge amounts of the GPU work is done by higher-level shader programs that don't react well to being split across scan lines. When you split them up, every border is a division across which the GPUs have to pass data to stay synced. Which is a problem, because the fewer borders you have, the worse the load balancing is.
|
# ? Apr 27, 2013 00:35 |
|
Thanks guys, that clears it up. IIRC, the original SLI was literally scan-line interleave, indeed. I'm planning a gigantic computer upgrade by this time next year, and it will be the entire system this time. An Intel Q6600 and a Radeon HD 5850 will give way to a Haswell i7 and at least a 7950. I still like AMD/Radeon video cards. I've never had problems with them, starting with the Radeon 8500 I got after the Voodoo 5500, through the Radeon 9700, and on to now. I did have an Nvidia 8800 GT in there, but it was the best thing at the time. I've also owned a 4870 1GB. Anandtech's newest review of the 7990 makes it look pretty good vs. the 690. http://www.anandtech.com/show/6915/amd-radeon-hd-7990-review-7990-gets-official It uses more power and makes a bit more noise, but other than that, it wins in enough games to make it a totally viable choice, especially considering that it has 3GB of RAM per GPU vs. 2GB. In addition, if you care about compute, it owns the 680.
|
# ? Apr 27, 2013 04:40 |
|
It'll stutter like a bitch until the prototype drivers end up finished though. As much as I love ATI/AMD, had Radeons and a Rage all my life, minus a GTX260, Crossfire is broken beyond all belief.
|
# ? Apr 27, 2013 05:00 |
|
e: ^^^ My understanding is that if you enable vsync it's relatively fine and always has been. I'd think the main benefit of an SLI/CF setup is that you can enable vsync and never drop below 60 frames.Mad_Lion posted:Anandtech's newest review of the 7990 makes it look pretty good vs. the 690. http://www.anandtech.com/show/6915/amd-radeon-hd-7990-review-7990-gets-official But this isn't the thread for upgrade questions / talk.
|
# ? Apr 27, 2013 05:00 |
|
iuvian posted:Multi-card setups in general are dumb as heck, the appeal of a single card solution is less noise/power/heat issues. For a while, a pair of market ATIs in Crossfire were a quieter and less hot solution than a single Fermi card. Klyith posted:e: ^^^ My understanding is that if you enable vsync it's relatively fine and always has been. I'd think the main benefit of an SLI/CF setup is that you can enable vsync and never drop below 60 frames. Yeah, Vsync pretty much eliminates the microstuttering from AFR, and Dynamic Vsync lets you at least keep tolerable performance if your frame rate goes below 60. But whether AFR actually helps a game engine or not is pretty much a crapshoot.
|
# ? Apr 27, 2013 05:15 |
|
|
|
I disagree that putting the dual-GPU Radeon card up against the dual-GPU Nvidia card is a bad comparison. It's a DIRECT comparison, IMO. Geforce 690 vs. Radeon 7990 is similar: two SLI'd or Crossfired GPUs on the same card. The Radeon does pretty well in this comparison. I mistyped on the compute comparison; I meant to say that the current Radeons outperform the current Geforces in compute, in general. A 7990 beats a 690 in compute, no question. As for my upgrade path, well, I used to upgrade more often, but the Q6600 with a 5850 has lasted quite a while. I'm one of those who goes big and then waits, sue me. An i7 and a high-end video card will last me a while.
|
# ? Apr 27, 2013 05:16 |