Freakazoid_
Jul 5, 2013


Buglord

Gonkish posted:

Yeah, I was thinking about it earlier and was wondering what 4gb really gets me right now.

From what I read in the pc building thread last month, there's some concern about the influence of the ps4's 8gb of GDDR5. A few goons believe it's a big step up in graphics, to the point where we will all need new cards in the next year or two.

I'm not sure if that's actually something to fret over or even if it can be future proofed, but it's the only reason to get more than 2gb right now.

Parker Lewis
Jan 4, 2006

Can't Lose


Agreed posted:

I dunno, my 780 and 650Ti combo do not have this problem. At all. I can make so much gunk fly you'd think I were doing some horrible monster-bukkake game and yet it runs a solid 60 vsync'd, or just takes the gently caress off to crazy-town FPS-wise if uncapped.

Hm, I'm at a loss then. What CPU are you using? I have an i5-4670k at 4.2GHz, maybe an i7 helps out with PhysX stuff somehow? Or maybe the 780 is just that much better than a 760 for 1080p?

beejay
Apr 7, 2002

Freakazoid_ posted:

From what I read in the pc building thread last month, there's some concern about the influence of the ps4's 8gb of GDDR5. A few goons believe it's a big step up in graphics, to the point where we will all need new cards in the next year or two.

I'm not sure if that's actually something to fret over or even if it can be future proofed, but it's the only reason to get more than 2gb right now.

I think you may have misinterpreted some conversation in that thread, because the prevailing thought in the parts-picking thread is that for the next couple of years games are going to be made cross-platform, so it won't matter. Two years or so would be the point you'd want to start thinking about an upgrade anyway if you want to stay on the cutting edge, and it's been that way for a while.

Basically, if you are getting a 780 or possibly a 770, it's worthwhile to think about going 4GB "just in case," but if you are using a 760 or something for 1080p, 2GB is going to be fine for the foreseeable future.

edit: This kind of conversation probably belongs in the parts picking thread anyway.

beejay fucked around with this message at 15:11 on Jul 24, 2013

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Some of you might remember, around when the GeForce Titan was released, that Nvidia did a demo of a realistic face renderer that pushed the Titan to its limit and, in return, got the best looking simulated real-time bald guy ever.

Watch this video:

https://www.youtube.com/watch?v=Vx0t-WJFXzo

That's 1 SMX of Kepler on an ARM SoC drawing about 3W of power. The engineering samples of Nvidia's Logan SoC are in, and Nvidia loves the poo poo out of them. Peak FLOPS is more than 5x the iPad 4's GPU, and faster than a PS3. This is about what you could expect from a top-end Logan tablet. Plus, being Kepler, it's a true unified shader with support for not only OpenGL, but DirectX 11, OpenCL, and CUDA.

Nvidia: Wizards.
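
For anyone wondering where figures like "faster than a PS3" come from: peak FLOPS is just cores, times 2 ops per cycle (a fused multiply-add), times clock. Here's a back-of-envelope Python sketch; 192 CUDA cores per SMX is Kepler's published layout, but the mobile clock below is purely a guess, since Logan's final clocks weren't announced:

```python
# Back-of-envelope peak-FLOPS estimate for a single Kepler SMX.
# 192 CUDA cores per SMX is Kepler's published layout; the clock
# below is a placeholder guess, NOT a confirmed Logan spec.
def peak_gflops(cuda_cores, clock_ghz, ops_per_core_per_cycle=2):
    """An FMA counts as 2 floating-point ops per core per cycle."""
    return cuda_cores * ops_per_core_per_cycle * clock_ghz

smx_cores = 192           # one Kepler SMX
assumed_clock_ghz = 0.6   # hypothetical low-power mobile clock
print(peak_gflops(smx_cores, assumed_clock_ghz))  # roughly 230 GFLOPS
```

That lands in the same ballpark as the usual PS3 GPU figures, which is presumably the comparison being made.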

Factory Factory fucked around with this message at 15:19 on Jul 24, 2013

Wistful of Dollars
Aug 25, 2009

I wouldn't be surprised if Nvidia's engineers have made a pact with an Elder God that lives in the basement.

Ramadu
Aug 25, 2004

2015 NFL MVP


So my friend just spent a stupid amount of money on a lot more computer than he needs to watch porn and game on and wants to know what games right now can push the graphics the most. Any suggestions? I thought maybe the new metro?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Ramadu posted:

So my friend just spent a stupid amount of money on a lot more computer than he needs to watch porn and game on and wants to know what games right now can push the graphics the most. Any suggestions? I thought maybe the new metro?

The new Metro is actually better optimized than the first one, but they're both pretty good. Bioshock Infinite can look really pretty. Crysis 3 sticks with the Crysis tradition and will make computers cry. Witcher 2 and RIFT are a bit older but still really beefy. STALKER, too. Sleeping Dogs is really nice but not as demanding as these other titles, technically speaking.

Gonkish
May 19, 2004

Factory Factory posted:

Nvidia: Wizards.

That is loving amazing.

VorpalFish
Mar 22, 2007
reasonably awesometm

Welp, add me to the list of people having problems with the 320 nvidia drivers. Installed it last night, instant crashes! Back to 314.22 for me.

Un-l337-Pork
Sep 9, 2001

Oooh yeah...


Ramadu posted:

So my friend just spent a stupid amount of money on a lot more computer than he needs to watch porn and game on and wants to know what games right now can push the graphics the most. Any suggestions? I thought maybe the new metro?

I've been impressed with Tomb Raider pretty much maxed out on my 780. Looks very pretty. The game isn't as bad as I thought it would be (basically Uncharted: Lara Croft) -- worth the $12 or whatever I paid during the Steam sale. Oh, and GTA IV with settings maxed (might have to fiddle with draw distance) at a steady, high framerate is pretty incredible.

Yudo
May 15, 2003

I splurged on Steam's summer discounts and bought Metro: Last Light. On my 7970 the game is 10 FPS choppy at any setting with the release 13.4 drivers; the beta is much, much better. Is there some setting Metro does not like that I can somehow disable?

MagusDraco
Nov 11, 2011

even speedwagon was trolled

Yudo posted:

I splurged on Steam's summer discounts and bought Metro: Last Light. On my 7970 the game is 10 FPS choppy at any setting with the release 13.4 drivers; the beta is much, much better. Is there some setting Metro does not like that I can somehow disable?

Make sure Advanced Physx (or physics however they have it in the menu) is turned off since that's Nvidia gpus only. If you have ssaa on, you could turn that off too.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

This was much more true of 2033, but their implementation of ADOF is really, really GPU heavy and is usually worth about 20FPS on high end hardware. I reckon they optimized it for Last Light but there's only so much fat you can trim off of it and I bet it's still a pretty substantial performance hitter.

real_scud
Sep 5, 2002

One of these days these elbows are gonna walk all over you

havenwaters posted:

Make sure Advanced Physx (or physics however they have it in the menu) is turned off since that's Nvidia gpus only. If you have ssaa on, you could turn that off too.
The PhysX stuff I believe is defaulted on, so make sure to get that quick; I had to re-install and got bit by it. As soon as I turned it off, FPS went back up to normal for my 7970 and sits around 50 or so with almost everything else turned up.

beejay
Apr 7, 2002

I've played Metro: last light on High on a 7870 with no problems, didn't touch any specific settings. I use the 13.6 beta drivers. I haven't made it very far into the game at all but so far it's nice and smooth and looks great.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

If the benchmark is any indicator, the engine will happily chug along with your processor doing the really fancy physX which is hilariously slideshow any time actual things are happening, so yeah, turn it off stat if it's got that going on (even for nVidia owners; that is still kind of a premium setting and can make perfectly good cards - that could handle the rendering of that sucker - weep at trying to juggle compute and render).

Animal
Apr 8, 2003

ASUS 760 Mini announced. Only $250, this might be the 760 to get.

Packstand
Sep 22, 2012

Factory Factory posted:

Some of you might remember, around when the GeForce Titan was released, that Nvidia did a demo of a realistic face renderer that pushed the Titan to its limit and, in return, got the best looking simulated real-time bald guy ever.

Watch this video:

https://www.youtube.com/watch?v=Vx0t-WJFXzo

That's 1 SMX of Kepler on an ARM SoC drawing about 3W of power. The engineering samples of Nvidia's Logan SoC are in, and Nvidia loves the poo poo out of them. Peak FLOPS is more than 5x the iPad 4's GPU, and faster than a PS3. This is about what you could expect from a top-end Logan tablet. Plus, being Kepler, it's a true unified shader with support for not only OpenGL, but DirectX 11, OpenCL, and CUDA.

Nvidia: Wizards.

This is remarkable... crazy to think we'll be playing current gen tech on mobile devices pretty soon.

Packstand
Sep 22, 2012
I just got an EVGA 770 GTX, someone tell me why it was a huge mistake.

E820h
Mar 30, 2013

Packstand posted:

I just got an EVGA 770 GTX, someone tell me why it was a huge mistake.

Was it 2GB or 4GB? I don't think it's a mistake if you want super shiny gaming at 1080p+ :confused:

Stumpus Maximus
Dec 15, 2007

Half sunk, a shattered visage lies, whose frown,
And wrinkled lip, and sneer of cold command,
Tell that its sculptor well those passions read
Which yet survive, stamped on these lifeless things,
The hand that mocked them and the heart that fed.

Animal posted:

ASUS 760 Mini announced. Only $250, this might be the 760 to get.

I saw this almost a month ago. Still no word as to when it's going to be released. Where'd you get that $250 figure? I bought a 670 mini (and then subsequently returned it because it was DOA) on release, and that carried a price premium over regular 670s.

Edit: Looks like they lowered the price to put the 670 mini in line with other 670s. Don't preorder GPUs, kids.

Stumpus Maximus fucked around with this message at 21:35 on Jul 25, 2013

Animal
Apr 8, 2003

I got the price out of my rear end (actually, from another forum, and somehow thought I read it from the information page). It's more likely going to be around $300, I guess.

knox_harrington
Feb 18, 2011

Running no point.

E820h posted:

Was it 2GB or 4GB? I don't think it's a mistake if you want super shiny gaming at 1080p+ :confused:

I dunno. This subforum (here and the upgrading thread) was until recently an oasis of moderation on the internet but in the past few weeks has started to tend towards overkill.

(GPU overkill I mean)

knox_harrington fucked around with this message at 22:30 on Jul 25, 2013

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
I wouldn't argue with you, given that Nvidia has gone overkill by releasing the GeForce Titan and GTX 780 and people have bought them. But at the same time, when I first built my current system I went by the recommendations at the time and ended up with a Radeon 6850 for 1920x1200, and it was disappointing. It played everything I already had great, don't get me wrong, but it was extremely rapidly outclassed by the uptake of DX11 graphics options. Basically as soon as the GeForce 560 Ti came out, the recommendation was bumped up to that card, for twice the price and a sizeable performance leap.

Since then, DirectX 11 games have moved from "eye candy" tier to mainstream, and they ask for power to look nice. You can play them looking not-nice, of course, but things like tessellation are no longer mostly-theoretical image quality enhancements.



[image: Unigine Heaven DX9 vs. DX11 tessellation comparison] (from LegitReviews)

And boy howdy do DirectX 11 features make a difference. That's from Unigine Heaven and it pretty much shows the difference between DirectX 9 and DirectX 11 in a nutshell. Those are the same models and textures; the difference is in tessellation allowing those textures to enable true displacement mapping.
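
As a toy illustration of what tessellation-driven displacement actually does: with only the coarse vertices, a height map can only be faked in shading (bump mapping), but once the surface is subdivided there are enough vertices to push out as real geometry. A minimal 1-D Python sketch, with a made-up height function:

```python
# Toy 1-D picture of tessellation + displacement mapping: the coarse
# mesh has too few vertices to express the height map as geometry,
# so subdivide first, then push each new vertex out by the map.
def tessellate(p0, p1, levels):
    """Evenly subdivide the segment [p0, p1] into 2**levels pieces."""
    n = 2 ** levels
    return [p0 + (p1 - p0) * i / n for i in range(n + 1)]

def displace(xs, height):
    """Turn flat positions into (x, height) points: real bumps."""
    return [(x, height(x)) for x in xs]

bumpy = lambda x: 0.1 * ((x * 10) % 1)  # made-up height map
coarse = displace(tessellate(0.0, 1.0, 0), bumpy)  # 2 verts: can't show bumps
fine = displace(tessellate(0.0, 1.0, 4), bumpy)    # 17 verts: bumps appear
```

Same "texture," wildly different silhouette, which is exactly the Heaven screenshot in miniature.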

The thing is, these effects are still really intensive. It's just that today's top-end video cards pretty much are as powerful as the pair of GeForce 470s previously needed to make these effects run well. Below a 670/760, you're in this weird equilibrium where you can get really nice framerates, or you can get really nice image quality, but you can't quite get both at the same time, not at the 1920x1080 mark. It'd be swell if you could get both, right? And you can, just need to get a slightly more powerful card, which is available and doesn't involve a multi-card setup.

When Maxwell comes and delivers its supposed 20-30% performance increase, the price/performance point will likely shift down a product tier or so.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

knox_harrington posted:

I dunno. This subforum (here and the upgrading thread) was until recently an oasis of moderation on the internet but in the past few weeks has started to tend towards overkill.

(GPU overkill I mean)

The 560Ti launched at $300 and stayed there for a long time and everyone thought it was awesome. (It was! At the time!)

The GTX 760 is about $250 and offers a far superior value prospect this generation compared to the 560Ti vs. the $500 GTX 580, so I don't know what to tell you. High performance graphics cards have always been pricy.

You're free to get a GTX 650Ti Boost or a Radeon 7850 if you want to adhere heavily to the "price" part of "price:performance," but features that used to be "for the future" are here now that it's the future and hardware supports them well. So it goes. So it has always been. Remember the GTX 280's launch price of $600+ from all vendors, too? Big powerful architectures aren't cheap but they deliver the goods. It's just up to you whether the goods they deliver are valuable or not.

knox_harrington
Feb 18, 2011

Running no point.

Agreed posted:

The 560Ti launched at $300 and stayed there for a long time and everyone thought it was awesome. (It was! At the time!)

The GTX 760 is about $250 and offers a far superior value prospect this generation compared to the 560Ti vs. the $500 GTX 580, so I don't know what to tell you. High performance graphics cards have always been pricy.

You're free to get a GTX 650Ti Boost or a Radeon 7850 if you want to adhere heavily to the "price" part of "price:performance," but features that used to be "for the future" are here now that it's the future and hardware supports them well. So it goes. So it has always been. Remember the GTX 280's launch price of $600+ from all vendors, too? Big powerful architectures aren't cheap but they deliver the goods. It's just up to you whether the goods they deliver are valuable or not.

Haha, no, I don't really disagree with that at all, but I'm sure a 760 will completely max out any game currently available at 1080p. Certainly my decidedly mid-range card (7870 XT) comfortably runs Metro LL at very high, with very high tessellation; a 760 would be absolutely perfect, and a 770 a borderline waste of $50 or whatever the extra is, for that resolution.

e: I am a dick and thought he was talking about 1080p for some reason :confused:

knox_harrington fucked around with this message at 23:42 on Jul 25, 2013

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

If a 760 would literally max everything, I'd be using a 760, promise. But a 680 wouldn't do it, so I'm riding the 780 train at 1080p and for most games it's massive overkill - but for those games where the 680 choked (the triple-A, super showy games with features meant to bring your system to its knees, dig), the 780 pulls through. I'm able to run Crysis 3 with a minimum FPS of 30 even in those segments that would get the 680 into sub-20 FPS, for example, and I can take advantage of some of the really graphically complicated things like Sleepy Doggins Ultra AA, which has a huge performance hit but looks amazing. And I can force 2xSSAA transparency AA (forget Sparse Grid, it's more efficient but a total pain in the rear end to get running in a lot of games when I no longer have to, y'know?) and address shimmering and the other sorts of image quality issues that have largely been ignored in pursuit of FPS and more overt shiny poo poo, if you dig it.

If I didn't use a PhysX card I'd probably be running two 760s in SLI, because THAT'S nVidia's "$500 card" this generation - just not labeled as such, hah - it's a price:performance anomaly that is neck and neck in some areas and just leaves a 780 behind for a lot less, and it kicks the 770 in the teeth.

I'm still waiting to see what AMD's refresh looks like. Unless they've been hiding an 8-billion transistor monster up their sleeve (which, clearly, no, haha), they've got a 4.1 billion transistor architecture on their hands which they need to try to wring even more performance out of if they're going to even try to answer the high end. With their driver team vexed to come up with a solution to their inopportunely timed outing of trading scaling for coherency and thus more than acceptable levels of microstutter in Crossfire setups, I feel like apart from die-hard AMD users they really are in a position of needing to embarrass the GTX 770 in order to compete in the enthusiast bracket; as for the rest, a price landslide might be in order. They've been doing it anyway, or at least merchants have.

Of course this is speculative. It just seems that they've already done an in-generation refresh of Tahiti taking its overclocking headroom that wowed everyone out the gate and making it official with the GHz edition cards. So where do you go from "maxed?" nVidia is in the same position with their GTX 770, basically being what the better overclocked GTX 680s were right out the gate. How AMD answers that will be meaningful but I think the price:performance tiers will be where the fight is. As usual.

WHERE MY HAT IS AT
Jan 7, 2011

Animal posted:

ASUS 760 Mini announced. Only $250, this might be the 760 to get.

That's awesome, would be great for a SFF build. :swoon:

Gonkish
May 19, 2004

Man, that 760 SLI setup is really, really tempting. I'd shoot for it, but currently the 2GB 760s are around $250, so I'm looking at $500 versus the $400 for a single 770. Granted, the dual 760s would outperform, but I'm wary about the power concerns (heat, thankfully, doesn't seem to be much of an issue with this case). I also feel like it's easier to replace a single card versus two cards when the time comes.

I do like that SLI actually scales well now. I have always thought of it as more of a bragging rights thing, but now it seems to actually work in the consumer's favor.

Jan
Feb 27, 2008

The disruptive powers of excessive national fecundity may have played a greater part in bursting the bonds of convention than either the power of ideas or the errors of autocracy.

Factory Factory posted:

And boy howdy do DirectX 11 features make a difference. That's from Unigine Heaven and it pretty much shows the difference between DirectX 9 and DirectX 11 in a nutshell. Those are the same models and textures; the difference is in tessellation allowing those textures to enable true displacement mapping.

To be fair, the geometry on the DX9 version is pretty drat primitive (heh, get it?). They could easily have added some more detail without really causing a considerable performance hit. Games have rarely been vertex bound since everyone made the switch to deferred shaded engines.

Of course, this is just one still frame and the rest of the scene might be a lot more vertex heavy, but that particular frame kind of looks like a joke.

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.

Gonkish posted:

Man, that 760 SLI setup is really, really tempting. I'd shoot for it, but currently the 2GB 760s are around $250, so I'm looking at $500 versus the $400 for a single 770. Granted, the dual 760s would outperform, but I'm wary about the power concerns (heat, thankfully, doesn't seem to be much of an issue with this case). I also feel like it's easier to replace a single card versus two cards when the time comes.

I do like that SLI actually scales well now. I have always thought of it as more of a bragging rights thing, but now it seems to actually work in the consumer's favor.
Getting 2GB 760s for SLI would definitely shoot you in the foot within the next 2 years.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Well... yeah, Unigine Heaven pretty much might as well just be a tech demo for tessellation. This forum post has some more DX9/10/11 comparisons, and the dragon statue is a little more practical. But the "before" is more like CS:Source (DX7/8) than more-current DX9/10 releases.

[H] has an image quality article on Metro: Last Light that's more practical, and applied to more complex geometry, the tessellation isn't as obvious. However, it does take a LOT of the "video-gamey"-ness out of the edges of curved surfaces, the number one place where bump-mapping falls flat on its rear end.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Jan posted:

To be fair, the geometry on the DX9 version is pretty drat primitive (heh, get it?). They could easily have added some more detail without really causing a considerable performance hit. Games have rarely been vertex bound since everyone made the switch to deferred shaded engines.

Of course, this is just one still frame and the rest of the scene might be a lot more vertex heavy, but that particular frame kind of looks like a joke.

It's a demo specifically intended to show off (a lot of) tessellation; the level of extrapolation from the basic geometry is incredible compared to most games. But real-world tessellation is a lot more interesting and wow-factor than flat models, too. Beats the hell out of parallax mapping, anyway. Nothing like sneaking up on a corner and hitting that sharp edge to kill the illusion of depth :v:


TheRationalRedditor posted:

Getting 2GB 760s for SLI would definitely shoot you in the foot within the next 2 years.

Definitely? That seems like a little bit more than we can accurately claim at this point. Are you stating that because of the consoles and their high overall memory count? We aren't entirely sure what that means, vis a vis PC gaming. Some AMD APUs in the future will be able to use GDDR5, because hey why the hell not they've totally got that down now - but if the basis for your speculation that two 760s in SLI is a bad decision is because you think that the 2GB memory limitation is going to be a significantly limiting factor I think that's questionable. We just don't know.

Otherwise, if I've misinterpreted you, I agree - the rule persists: don't buy for two years from now, buy for today. Don't get anything more than you want TODAY, and don't plan on availability of SKUs later on in some hare-brained scheme to future proof. Can't do it, just sock the money away and buy the next hot-poo poo Maxwell price:performance card and win the pretty poo poo game that way.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

knox_harrington posted:

I dunno. This subforum (here and the upgrading thread) was until recently an oasis of moderation on the internet but in the past few weeks has started to tend towards overkill.

(GPU overkill I mean)

No, the upgrading thread used to be an oasis of people clamouring to take a giant steaming poo poo on anyone who dared to poke their head up and ask for advice on building a PC on a far more flexible budget that would give them maximum performance.

There are many people - most people, in fact - who have a really tight budget and want advice on wringing the greatest possible value out of that budget; but the upgrading thread doesn't just exist for those people. Not everyone has a really tight budget, and some people are happy to pay more for their gaming hobby. Some people are more interested in performance than they are in value. A PC that doesn't do what you wanted it to do is a waste of money, no matter how good the price:performance ratio was.

I don't believe in flushing money down the toilet by buying 32gigs of l33t fence ram when there would be literally no reason to do so, or buying a 1500watt power supply - those are the sorts of mistakes the thread needs to be correcting.

I feel like the upgrading thread is finally respecting people with all levels of budget. If you want advice on getting the best possible value for your $800, we can help. If you want to spend a couple of grand on a high performance gaming rig, we can show you how to best spend that money to eke out the best performance there.

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.

Agreed posted:

Definitely? That seems like a little bit more than we can accurately claim at this point. Are you stating that because of the consoles and their high overall memory count? We aren't entirely sure what that means, vis a vis PC gaming. Some AMD APUs in the future will be able to use GDDR5, because hey why the hell not they've totally got that down now - but if the basis for your speculation that two 760s in SLI is a bad decision is because you think that the 2GB memory limitation is going to be a significantly limiting factor I think that's questionable. We just don't know.
My thinking is that if the 2x 760 setup outstrips the power of the mighty 780, one is sensibly going to be playing in the realm of 1440/1600p and expecting top-shelf performance for years to come from their investment, and I foresee games adopting more and more absurd eye-candy modes (settings analogous to The Witcher 2's ubersampling, to a lesser extent of course) that, as a power-hungry GPU enthusiast, you'd want to conquer with all available bells and whistles while still maintaining a smooth FPS under all circumstances. Given how many muscled render engines are seriously testing 2GB of VRAM @ 1440p at this point in time, is it really unlikely that it's a factor that stands to be rubbed up against uncomfortably within 2 years?

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

TheRationalRedditor posted:

My thinking is that if the 2x 760 setup outstrips the power of the mighty 780, one is sensibly going to be playing in the realm of 1440/1600p and expecting top-shelf performance for years to come from their investment, and I foresee games adopting more and more absurd eye-candy modes (settings analogous to The Witcher 2's ubersampling, to a lesser extent of course) that, as a power-hungry GPU enthusiast, you'd want to conquer with all available bells and whistles while still maintaining a smooth FPS under all circumstances. Given how many muscled render engines are seriously testing 2GB of VRAM @ 1440p at this point in time, is it really unlikely that it's a factor that stands to be rubbed up against uncomfortably within 2 years?

Believe it or not, the answer is one part "we really don't know," and one part "even if it does, swapping is surprisingly not the limiting factor that we might think of it as conventionally." I can dig around for some benchmarks tomorrow if you'd like some comparisons, but even in minimum FPS (where you'd expect to see it just EAT poo poo compared to a single card of similar power but more VRAM) it actually keeps up remarkably well. Partly because of the way SLI gives a bit of a boost to texture addressing but also just because swapping is a bit of a bogeyman when we're talking about 2GB of very fast GDDR5 compared to their LAST memory controller which couldn't handle more than, what, 4GHz GDDR5?

Fermi had issues with its memory architecture all the way from loadout to on-die resource utilization, and you can see comparisons of the GTX 590 and the Radeon 6990 for a quick glance at "well THAT'S poo poo" high resolution gaming really being memory bandwidth and allocation space limited, but things have changed a lot and Kepler has an excellent memory controller, including being first to ship 1750MHz (7GHz effective clock) GDDR5 modules, in the GTX 770 (and before that, overclocking like crazy - within TDP limits, of course - in the GTX 680 and GTX 670... and little brothers in the Kepler family, too).
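
For reference, the clock-to-bandwidth math there is straightforward: GDDR5 transfers 4 bits per pin per base-clock cycle, so a 1750 MHz base clock is the quoted "7 GHz" (really 7 GT/s) effective rate, and multiplying by the bus width gives bandwidth. A quick Python sketch, using the GTX 770's 256-bit bus:

```python
# GDDR5 moves 4 bits per pin per base-clock cycle, which is why a
# 1750 MHz base clock gets quoted as "7 GHz effective" (7 GT/s).
def gddr5_bandwidth_gbs(base_clock_mhz, bus_width_bits):
    effective_gtps = base_clock_mhz * 4 / 1000   # quad data rate
    return effective_gtps * bus_width_bits / 8   # bits -> bytes

print(gddr5_bandwidth_gbs(1750, 256))  # GTX 770: 224.0 GB/s
```

Plugging in the GTX 680's 6 GT/s modules on the same bus gives 192 GB/s, which matches the spec sheets.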

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
I think you may be underestimating the impact of running low on video memory. The thing that kills performance is that the GPU has to wait for the texture to be uploaded over the PCI-Express bus, which means a bottleneck of 8GB/sec (for SLI/Crossfire) rather than ~200GB/sec. The way this manifests in gameplay is a short pause when an enemy comes on-screen as the textures are swapped in, which is exactly the time you do NOT want to have hitching. This means that average FPS numbers won't tell the full story, since you're seeing dips into the teens or single-digits, and those drops are happening when it is most noticeable and distracting (panning, fast motion, new things coming on screen).

Modern history has shown a rather predictable increase in VRAM demands from games, and this is taking into account the moderating effect of the low, fixed VRAM amounts on last-gen consoles. Games are currently using more than 1GB of VRAM at 1080p for high settings, and you can close in on 2GB in some titles if you turn up antialiasing. This kind of lifespan is fine when we're talking about a GTX 760 or lower-end card (though even there I'd go 4GB if I expected a long upgrade cycle), but if you're dropping $400 on a GTX 770 that will have the performance to hold up for quite a few years, it just doesn't make sense to me to not pay the extra 10-15% to double the card's useful lifespan. The fact that next-gen consoles have ridiculous amounts of VRAM and both DX11.1/11.2 and OpenGL 4.4 are all about finding cool new ways to use VRAM may also indicate that demands will grow faster than before, but you don't need this as a reason to want 4GB on high-end cards.
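
Rough numbers on why that hitching is so visible, using the 8GB/sec and ~200GB/sec figures above (the 256 MB working set is just an illustrative guess):

```python
# Why VRAM overflow shows up as hitching: the same texture upload
# that is invisible at VRAM speed costs whole frames over PCIe.
# Bandwidth figures are from the post above; the 256 MB working
# set is only an illustrative guess.
def transfer_ms(megabytes, gb_per_s):
    return megabytes / 1024 / gb_per_s * 1000

textures_mb = 256
print(transfer_ms(textures_mb, 200))  # ~1.3 ms from VRAM: unnoticeable
print(transfer_ms(textures_mb, 8))    # 31.25 ms over PCIe: dropped frames
```

Against a 16.7 ms frame budget at 60 FPS, that PCIe fetch is about two whole dropped frames, which is exactly the "short pause when an enemy comes on-screen" described above.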

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Two small points and one big one. I hope none of this comes across as sass, I'm just tired as hell :)

First, texture limitations at 2GB are a stretch. Are there games that do that at higher resolutions? Sure, plan accordingly in the build. At the render target of 1080p (which is where the next gen consoles are going to sit for their entire lifetime, according to available information), 2GB shouldn't be hitting the disk very often.

Second, people upgrading to a 770 right now and pairing 4GB of VRAM with it are going to be pretty disappointed when the GPU can't keep up with the textures, but maybe they can turn down some of the aforementioned AA or whatever (assuming the VRAM is utilized as such, which I can't just grant you, because...)

Big point: we don't really know dick about the consoles, y'know? We have a lot of speculation as to how the shared resources will be utilized, but unless you've got an NDA you're willing to break to let us know more, the best we've got is a PS4 down-to-the-metal exploration of what deep ISA access might allow developers to do courtesy of Timothy Lottes. It's not on his blog anymore, I guess he scrubbed it with all the other cool poo poo he had on there when he switched jobs rather suddenly but I saved a copy :3:. There are going to be a lot of things drawing on that shared pool of GDDR5 and we don't know WHAT it will mean. Probably for a couple years, the best speculation is just "really good computer-like graphics!" and then as they get better at working with the custom silicon and really dig into the resource utilization and can keep coherence despite disparate demands on the dies (:haw: alliteration! :downs:), then we'll start seeing some neat poo poo.

Right now, everyone is just gearing up. Nobody really knows anything. We can speculate, but ... that's as far as it goes until these things hit shelves. Developers will have some time to figure out how to get the most out of the hardware, and while that should be easier than it might have been in the past (CELL, don't get me started... awesome, completely impractical, miracle it worked as well as it did) it's still not going to be a picnic by a long shot. And we simply do not know what the relationship will be in terms of proportional performance demands from consoles to computer.

Someone extremely risk averse could get a Titan and hope it holds the line with its 6GB of fast GDDR5 and monolithic core, but most everyone else will be better served saving their money on a card or cards that do what they want right now, without an eye toward the future, because the future is less certain right now than at any point in the last, poo poo, nearly ten years. The whole console cycle. GPU makers aren't asleep at the wheel and they'll be keeping up, of course. The question is where will that place consumers? Standard upgrade cycle? Expectation of durable-goods like life cycle? Up to the individual.

knox_harrington
Feb 18, 2011

Running no point.

The Lord Bude posted:

No, the upgrading thread used to be an oasis of people clamouring to take a giant steaming poo poo on anyone who dared to poke their head up and ask for advice on building a PC on a far more flexible budget that would give them maximum performance.

There are many people - most people, in fact - who have a really tight budget and want advice on wringing the greatest possible value out of that budget; but the upgrading thread doesn't just exist for those people. Not everyone has a really tight budget, some people are happy to pay more on their gaming hobby. Some people are more interested in performance, than they are in value. A PC that doesn't do what you wanted it to do is a waste of money, no matter how good the price:performance ratio was.

I don't believe in flushing money down the toilet by buying 32gigs of l33t fence ram when there would be literally no reason to do so, or buying a 1500watt power supply - those are the sorts of mistakes the thread needs to be correcting.

I feel like the upgrading thread is finally respecting people with all levels of budget. If you want advice on getting the best possible value for your $800, we can help. If you want to spend a couple of grand on a high performance gaming rig, we can show you how to best spend that money to eek out the best performance there.

It should be balanced, though. I'm completely in favour of people spending big on systems if they want to, and I'm pleased both that the high-end kit exists and that people are willing to pay for it. But if you are at 1080p there is no real reason for getting a 780, except that you "just want it" or have a 120Hz display. It's just e-peen / future proofing, and people should continue to be steered towards resolution-appropriate GPUs. IMO.


Wistful of Dollars
Aug 25, 2009

It's not 'future-proofing', it's 'currently awesome-atizing'.

Also, AMD's crossfire fix is supposedly being released next week. I hope they've managed to do a good job; they might actually manage to sell an extra half-dozen 7990s if it works.
