veedubfreak
Apr 2, 2005

by Smythe

Shaocaholica posted:

Extra displays and resolution increases should represent a minor blip in vram usage when you're in the 2GB-8GB range. Not counting AA of course, which can affect things but is kinda pointless at higher resolutions. Adding a 1440p display with triple buffering is only ~50MB more vram. And if the game needs some deferred buffers then maybe ~100MB. That's really nothing. Games are going to gobble up vram because the current gen consoles have it to spare, not because people want to game on multiple displays and/or higher resolutions.

If games are already crossing the 3GB and 4GB threshold why are you advocating people skimp on vram for the very little cost it is to go one step higher? Plus resale value will probably be way better after you're done with the card ~18-24 months from now when no one will want to buy a 2GB or 3GB card.

Because the 280x is not a strong enough card for more than 3gb of memory to matter. The card will run out of processing power long before the vram becomes the bottleneck.


Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

veedubfreak posted:

Because the 280x is not a strong enough card for more than 3gb of memory to matter. The card will run out of processing power long before the vram becomes the bottleneck.

What's the perf difference going from, say, high to ultra textures if vram isn't an issue?

Ignoarints
Nov 26, 2010

Shaocaholica posted:

Extra displays and resolution increases should represent a minor blip in vram usage when you're in the 2GB-8GB range. Not counting AA of course, which can affect things but is kinda pointless at higher resolutions. Adding a 1440p display with triple buffering is only ~50MB more vram. And if the game needs some deferred buffers then maybe ~100MB. That's really nothing. Games are going to gobble up vram because the current gen consoles have it to spare, not because people want to game on multiple displays and/or higher resolutions.

If games are already crossing the 3GB and 4GB threshold why are you advocating people skimp on vram for the very little cost it is to go one step higher? Plus resale value will probably be way better after you're done with the card ~18-24 months from now when no one will want to buy a 2GB or 3GB card.

I'd only not recommend it if it cost more and the card can't do it anyway. Which has always been the case in this price range... until this week, for MSI in particular. Which is cool.

The problem before was it could actually put you closer to the next tier in cards. Like a 4gb 760 could cost close to a 2gb 770 - and that's a dumb decision. Or a 4gb 770 could cost like exactly the same as a r9 290 recently.

I always have AA turned up as high as I can go, even at 1440p. The difference is very noticeable to me. If I remember right, Afterburner reported 2.4 GB of vram usage at ultra settings and just 2x AA in BF4. Supersampling was a good way to use like 3.5 GB of vram, as a test of sorts. Of course it was beyond unplayable and no amount of vram was going to make it playable.
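(A rough sanity check on the buffer numbers being thrown around here, as a back-of-the-envelope Python sketch only: it assumes uncompressed 32-bit color and ignores driver overhead, so treat the figures as ballpark.)

BYTES_PER_PIXEL = 4  # RGBA8, 32-bit color

def swapchain_mb(width, height, buffers=3, supersample=1):
    """VRAM for the display buffers alone, in MB."""
    pixels = (width * supersample) * (height * supersample)
    return pixels * BYTES_PER_PIXEL * buffers / 1024 ** 2

print(swapchain_mb(2560, 1440))                 # ~42 MB: the quoted "~50MB" triple-buffered 1440p figure
print(swapchain_mb(2560, 1440, supersample=2))  # ~169 MB with 2x2 supersampling, before any game data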

Shaocaholica
Oct 29, 2002

Fig. 5E

veedubfreak posted:

Because the 280x is not a strong enough card for more than 3gb of memory to matter. The card will run out of processing power long before the vram becomes the bottleneck.

Proof? Technically there are artificial scenarios which could stress this kind of bottleneck, but that's really a problem for developers to solve, not for consumers to pick at. Also, if memory bandwidth is an issue above 3GB, then how do you explain the PS4 and XBO, which have substantially less memory bandwidth than the 280x and much more memory to boot?

Ignoarints
Nov 26, 2010

Shaocaholica posted:

Proof? Technically there are artificial scenarios which could stress this kind of bottleneck, but that's really a problem for developers to solve, not for consumers to pick at. Also, if memory bandwidth is an issue above 3GB, then how do you explain the PS4 and XBO, which have substantially less memory bandwidth than the 280x and much more memory to boot?

Admittedly I don't look into AMD cards much for this sort of stuff, because they always tend to have a better memory situation overall, but in the 280x's case in particular it doesn't pull away from a 2GB 770 with less memory bandwidth even when the game is requiring more than 2GB of vram. If I had to guess, by the time you are able to use 3GB of vram the card will literally be crawling at slide show fps levels. This is in contrast to a 780/290 comparison where the 290 has enough power to make use of its vram + bus bandwidth and it will pull away in gaming benchmarks during tests that emphasize memory stressing. However, even though they released a 780 with double the vram, that didn't make any difference for the 780.

I'm not going to pretend that what I know about this is concrete either. I know it must be far more complicated. This is just based on testing done by others over the last year, and I was only initially interested because it seems counterintuitive at first. But really we need better cards before spending more on vram at these price ranges.

Ignoarints fucked around with this message at 16:01 on May 29, 2014

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Ignoarints posted:

However, even though they released a 780 with double the vram, that didn't make any difference for the 780.

Games are just now coming out though that are using 3+ gb of vram so the difference may not be apparent right now.

Shaocaholica
Oct 29, 2002

Fig. 5E

Ignoarints posted:

...but in the 280x's case in particular it doesn't pull away from a 2GB 770 with less memory bandwidth even when the game is requiring more than 2GB of vram. If I had to guess, by the time you are able to use 3GB of vram the card will literally be crawling at slide show fps levels.

It shouldn't be like that though. If everything else stays the same (resolution, shader complexity, polycount, etc) but only the textures get bigger, you should see minimal impact on performance as vram usage goes up until you hit the bandwidth limit. Like I said before, that's a developer problem at that point, but it's something they'll have to contend with on current gen consoles, so if they can make it work on them it should work on a measly 280x with 6GB. Once you hit the vram limit there should be a considerable performance hit (slide show fps) regardless of the GPU power. Something else must be happening if a more powerful GPU is outperforming a slightly lesser GPU while being -actually- over the vram limit. Same way a CPU that runs out of main memory should in no way be able to keep up with a slightly lesser CPU that's not.
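(A rough Python sketch of that texture-scaling point, with illustrative numbers only: uncompressed RGBA8 textures with full mip chains. Only the storage grows with resolution; the per-pixel shading work stays roughly the same until bandwidth or capacity becomes the wall.)

def texture_mb(size, bytes_per_texel=4, mips=True):
    # A full mip chain adds roughly one third on top of the base level.
    base = size * size * bytes_per_texel
    return base * (4 / 3 if mips else 1) / 1024 ** 2

print(texture_mb(2048))  # ~21 MB per 2k texture
print(texture_mb(4096))  # ~85 MB per 4k texture: 4x the VRAM for the same draw calls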

Ignoarints
Nov 26, 2010

Shaocaholica posted:

It shouldn't be like that though. If everything else stays the same (resolution, shader complexity, polycount, etc) but only the textures get bigger, you should see minimal impact on performance as vram usage goes up until you hit the bandwidth limit. Like I said before, that's a developer problem at that point, but it's something they'll have to contend with on current gen consoles, so if they can make it work on them it should work on a measly 280x with 6GB. Once you hit the vram limit there should be a considerable performance hit (slide show fps) regardless of the GPU power. Something else must be happening if a more powerful GPU is outperforming a slightly lesser GPU while being -actually- over the vram limit. Same way a CPU that runs out of main memory should in no way be able to keep up with a slightly lesser CPU that's not.

I don't know why. I just have assumed it's because the 280x GPU itself is running out of steam. I haven't looked too much into them, although I should since the pricing has been reasonable for at least a month now.

edit: wait, is the 280x a more powerful gpu? I always assumed it wasn't, compared to a 770.

Don Lapre posted:

Games are just now coming out though that are using 3+ gb of vram so the difference may not be apparent right now.

I am eagerly awaiting results; since the 780 is a more capable card, I would expect it to show a difference. But it doesn't bode well that the 290 and the 780 (6GB or 3GB) can be comparable at lower resolutions, while the 290 pulls away at higher resolutions (memory-intensive tests, AA, etc). Maybe a 384-bit bus just isn't enough for over 3 GB.

Also kind of wary of Watch Dogs benchmarks at the moment. There are some graphical issues right now at the very least. As for the first Forbes article, while the point it was trying to make might be based in truth, the benchmarks used to justify it seem like bullshit. In fact AMD cards do seem to do slightly better, if anything.

Ignoarints fucked around with this message at 16:55 on May 29, 2014

Shaocaholica
Oct 29, 2002

Fig. 5E

Ignoarints posted:

edit: wait, is the 280x a more powerful gpu? I always assumed it wasn't, compared to a 770.

I was speaking more generally. Referring to any scenario with a greater and lesser GPU.

veedubfreak
Apr 2, 2005

by Smythe

Shaocaholica posted:

I was speaking more generally. Referring to any scenario with a greater and lesser GPU.

Most companies do the research to figure out how much RAM the card will need depending on the processing power of the chip itself. Most of the time they get it right. Also, don't forget that just because a card has 6GB and says it is using 4.5GB, that doesn't mean a 3GB card would be limited. There has been a lot of testing of Titans vs 780ti cards and the lack of memory just isn't an issue because the card doesn't have enough power to actually put all that memory to use. Adding memory to cards has been a way to fleece people who don't research stuff for a while now.

Shaocaholica
Oct 29, 2002

Fig. 5E

veedubfreak posted:

Adding memory to cards has been a way to fleece people who don't research stuff for a while now.

I agree this was mostly true right up until developers started working on current gen consoles that have theoretically 8GB of vram. It's been what, since last summer that current gen specs were public? I think card makers will probably still fleece people but the numbers will be different based on a much higher minimum requirement. I honestly think 6GB will be mainstream (relatively high, but based on console specs here), 8GB will be enthusiast, and more than 8GB will be niche use and/or 'fleece/gimmick' amounts. Not forever though.

veedubfreak posted:

Most companies do the research to figure out how much RAM the card will need depending on the processing power of the chip itself. Most of the time they get it right.

Disagree here. I don't think the driving factor is processing power. I think the driving factor is cost and margins and what typical games used (past tense). A card maker wants people to buy their product, so it has to have at least a minimum of vram for popular games but also not so much that it's not affordable or competitive. I think the 'processing' limit of vram is much, much higher than the cost restrictions, so what we have seen in the past are cost restrictions and, as you said, fleecing people and/or targeting niche users.

veedubfreak posted:

Also, don't forget that just because a card has 6GB and says it is using 4.5GB, that doesn't mean a 3GB card would be limited. There has been a lot of testing of Titans vs 780ti cards and the lack of memory just isn't an issue because the card doesn't have enough power to actually put all that memory to use.

I do agree that those tools that tell you vram usage should not be taken at face value. There's probably data residing in vram that isn't render-critical, so going overboard with such data may not incur a huge performance penalty. However, I think it would be foolish to count on this ambiguity to skimp on vram when the industry is trending up. The industry is always trending up in all aspects, not just GPU vram. It's just that current gen consoles have given developers a huge push in the last 24 months (I'm counting pre-release dev time here). It's really just a natural growth spurt, which is a good thing because now games can be even more flexible with larger environments, textures, models, etc.

edit: I do feel bad for 680 and even 780 owners because those were spec'd based on previous gen games but you can't blame anyone for that. There's always someone losing out during these transition periods.

Shaocaholica fucked around with this message at 17:56 on May 29, 2014

Shaocaholica
Oct 29, 2002

Fig. 5E
Just for kicks, if you look at the vram disparity between consoles and PC GPUs at the end of the last console gen:

PS3/Xbox: 384MB (upper-end vram use)
2012/2013 GPUs: 2048MB

That's roughly a 5x overage on the PC side. If that trend holds and we mark current gen consoles as having 6GB of upper-end vram use, then by 2020 PC GPUs should have in excess of 32GB of vram in the mainstream segment.
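(The same extrapolation spelled out in Python, purely illustrative and only as good as the assumed numbers:)

console_last_gen_mb = 384     # claimed upper-end VRAM use on PS3/Xbox
pc_2013_mb = 2048             # typical 2012/2013 PC GPU
factor = pc_2013_mb / console_last_gen_mb       # ~5.3x overage on the PC side

current_gen_console_gb = 6    # assumed upper-end use on PS4/XBO
print(round(current_gen_console_gb * factor))   # ~32 GB of mainstream VRAM by ~2020 if the trend held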

Seamonster
Apr 30, 2007

IMMER SIEGREICH
By that time people will be driving multiple 4K displays the same way they drive multiple 1080p ones now, so your "trend" doesn't seem too farfetched, but remember that memory bandwidth will also increase as time goes on, something that can be just as important as total VRAM.

Ignoarints
Nov 26, 2010
I really, really, really believe GPU processing power is the limiting factor for development. It is almost certainly the actual limitation on most cards. How the memory interfaces with the GPU can probably be rolled into that.

However, console ports and console limitations are definitely influencing development a lot of the time. But keep in mind that they are still making "ports" of current games to Xbox 360s, which have a whopping 512MB of GDDR3 shared between the CPU and GPU. It's not particularly hard to turn something down to accommodate that. However, consoles do limit development for sure. I just don't believe vram quantity does (more than many other factors).

missed some posts but still.

It's almost moot; vram will be increasing along with everything that needs to support it. However, it will always be limited by the GPU in my opinion: assuming they give the card enough ram, any more won't help.

Ignoarints fucked around with this message at 18:35 on May 29, 2014

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Shaocaholica posted:

PS3/Xbox: 384MB (upper-end vram use)

Xbox possibly, but the PS3's RAM wasn't unified.

Shaocaholica
Oct 29, 2002

Fig. 5E

HalloKitty posted:

Xbox possibly, but the PS3's RAM wasn't unified.

Devs figured out late in the PS3's life cycle how to hijack main memory for GPU data. Sure, it probably wasn't optimal, but it was done on a lot of titles towards the end.

Shaocaholica fucked around with this message at 18:30 on May 29, 2014

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Feeling sorry for people using a GK110 card right now is kinda silly. Especially if they bought early. This is an unusually long-lived architecture; Kepler looks like it'll be about three years old by the time any sort of genuine successor is released. Even the move from Fermi 400-series to Fermi 500-series included some architectural improvements and process adjustments to let them turn the whole thing on and stay within their thermal envelope; the only interesting stuff with the 700-series from nVidia is "wow we have a really badass memory controller, let's crank it up gently caress it 7GHz VRAM!" - even the 770 is just taking already known 680 capabilities and sending them your way from the factory (the math on their respective power usage, when a comparable 680 SKU is told to use its maximum power draw, comes out roughly even for the chip itself when you look at a 770). Given that the 680/670 was somewhat strongly bandwidth limited, sufficiently so that you could outrun the card's bandwidth capabilities with your core OC and get lower performance than with less core and faster memory OC, stepping up to 7GHz effective GDDR5 from the factory was a cool move on nVidia's part, but not exactly revolutionary, y'know?

Granted consumer availability of GK110 has been a newer thing, but nVidia is still using parts they taped out during Kepler's launch to combat next-gen designs by AMD and it's working fine. They'll both likely be updating around the same time, from what I can tell, with nVidia launching their next generation and AMD launching a refresh (that hopefully takes care of stock thermals since that's their "big issue" that has mostly gone away with the wide availability of vendor-available aftermarket coolers).

Feelin' sorry bout videogames :D

Shaocaholica
Oct 29, 2002

Fig. 5E

Seamonster posted:

...but remember that memory bandwidth will also increase as time goes on, something that can be just as important as total VRAM.


Ignoarints posted:

I really, really, really believe GPU processing power is the limiting factor for development. It is almost certainly the actual limitation on most cards. How the memory interfaces with the GPU can probably be rolled into that.

...I just don't believe vram quantity does (more than many other factors)

Don't get me wrong, I think everything trends together, but vram quantity is a hard limit. If you run out of GPU processing power or memory bandwidth you just get less fps overall. If you run out of vram for critical data you get a huge performance hit (maybe even at some random point in a game). Developers should have some min spec where they can count on that not happening. That min spec is typically defined by consoles since they are usually bound by lower costs, and it's just a bit of a transition right now because they leap-frogged mainstream GPU spec.

Also, even if GPU processing power lags for whatever R&D reason, you can still use more vram. It's not like they are hard-linked. Just like how CPU power and main memory quantity are not hard-linked.

For instance, facial animation can be stored on the GPU (in a variety of ways I'm not going to get into). The longer the animation, the more vram, but the processing power required to process that animation is a static value that doesn't change when the animation is longer.
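(Purely illustrative Python sketch with invented numbers: GPU-resident animation data grows with clip length while the per-frame evaluation cost stays flat.)

BONES = 120
FLOATS_PER_KEY = 8      # e.g. position plus rotation quaternion per bone
KEYS_PER_SECOND = 30
BYTES_PER_FLOAT = 4

def clip_vram_mb(seconds):
    return BONES * FLOATS_PER_KEY * BYTES_PER_FLOAT * KEYS_PER_SECOND * seconds / 1024 ** 2

print(clip_vram_mb(10))  # ~1.1 MB for a 10 s clip
print(clip_vram_mb(60))  # ~6.6 MB for a 60 s clip: 6x the VRAM, same per-frame work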

Again, I think all things need to trend up, not picking favorites here. Just calling it like it is.

Shaocaholica fucked around with this message at 18:40 on May 29, 2014

Ham Sandwiches
Jul 7, 2000

I pulled the trigger on a 780ti partly because of the many assurances I read that 3gb of VRAM would be plenty for 1-2 years of 1080p gaming with everything set to ultra. :) 6 months in and that's already not the case :toot:, though the card itself is amazing.

In the case of Watch Dogs it really is a shame. The game looks and runs nicely when you're on foot and the framerate stays tolerable. However, when driving, the texture loading issues present themselves and the game stutters pretty heavily. My performance is very similar to what HardOCP got during their testing.

I think it's fair to warn people, before they buy a $700 MSRP card, that there are serious concerns it won't be able to run many games with ultra textures in the near future (and some it already can't), and that the 3gb of vram may be a very real limitation.

Ignoarints
Nov 26, 2010

Shaocaholica posted:

Don't get me wrong, I think everything trends together, but vram quantity is a hard limit. If you run out of GPU processing power or memory bandwidth you just get less fps overall. If you run out of vram for critical data you get a huge performance hit (maybe even at some random point in a game). Developers should have some min spec where they can count on that not happening. That min spec is typically defined by consoles since they are usually bound by lower costs, and it's just a bit of a transition right now because they leap-frogged mainstream GPU spec.

Also, even if GPU processing power lags for whatever R&D reason, you can still use more vram. It's not like they are hard-linked. Just like how CPU power and main memory quantity are not hard-linked.

For instance, facial animation can be stored on the GPU (in a variety of ways I'm not going to get into). The longer the animation, the more vram, but the processing power required to process that animation is a static value that doesn't change when the animation is longer.

Again, I think all things need to trend up, not picking favorites here. Just calling it like it is.

I get what you're saying. But more vram simply won't help on the cards we're talking about. It's been tried. Once we talk about next gen, we completely align. This is starting to branch out, but really this started with whether more vram is better on a 770 or 280x and stuff like that. I'm holding out a tiny bit on some off chance that perhaps it's going to be used in a different way than before with new games, but I think the chances are seriously "no" still.


Rakthar posted:

I pulled the trigger on a 780ti partly because of the many assurances I read that 3gb of VRAM would be plenty for 1-2 years of 1080p gaming with everything set to ultra. :) 6 months in and that's already not the case :toot:, though the card itself is amazing.

In the case of Watch Dogs it really is a shame. The game looks and runs nicely when you're on foot and the framerate stays tolerable. However, when driving, the texture loading issues present themselves and the game stutters pretty heavily. My performance is very similar to what HardOCP got during their testing.

I think it's fair to warn people, before they buy a $700 MSRP card, that there are serious concerns it won't be able to run many games with ultra textures in the near future (and some it already can't), and that the 3gb of vram may be a very real limitation.



not a vram problem in your case

Ignoarints fucked around with this message at 19:10 on May 29, 2014

Ham Sandwiches
Jul 7, 2000

Ignoarints posted:




not a vram problem

That's great, so if it's not a VRAM problem how come HardOCP sees the 780ti start dropping FPS once they put textures to ultra, and see it stop dropping FPS when they put it back to high?

http://hardocp.com/article/2014/05/27/watch_dogs_amd_nvidia_gpu_performance_preview/3

To me, that reads exactly like a vram limitation, so maybe you can help me understand why it's not? Also average FPS is not really a great measure of FPS drops and stutters, wasn't that why benchmarks went into frame pacing to begin with?

Shaocaholica
Oct 29, 2002

Fig. 5E

Ignoarints posted:

... this started with whether more vram is better on a 770 or 280x and stuff like that. I'm holding out a tiny bit on some off chance that perhaps it's going to be used in a different way than before with new games, but I think the chances are seriously "no" still.

I think the last round of mid and high end cards could get a lot more longevity from more vram at what are now very minimal cost differences. Especially if future games have separate settings for textures, models and shaders. But I'll defer to the impending scrutiny of the community on these currently offending games and ones on the horizon. If I had to make a call right now on a GPU, though, I'd bet on more vram.

Shaocaholica fucked around with this message at 19:22 on May 29, 2014

Ignoarints
Nov 26, 2010

Rakthar posted:

That's great, so if it's not a VRAM problem how come HardOCP sees the 780ti start dropping FPS once they put textures to ultra, and see it stop dropping FPS when they put it back to high?

http://hardocp.com/article/2014/05/27/watch_dogs_amd_nvidia_gpu_performance_preview/3

To me, that reads exactly like a vram limitation, so maybe you can help me understand why it's not? Also average FPS is not really a great measure of FPS drops and stutters, wasn't that why benchmarks went into frame pacing to begin with?

You're right, it looks like poo poo. I've been avoiding in depth stuff on it since the first day because of the wildly different benchmarks reported. If there is any card that can see a vram limitation it'll be that one since it has a fast gpu.

Edit: on second thought, you're getting those problems at 1080p?

Ignoarints fucked around with this message at 19:22 on May 29, 2014

Ignoarints
Nov 26, 2010

Shaocaholica posted:

I think the last round of mid and high end cards could get a lot more longevity from more vram at what are now very minimal cost differences. Especially if future games have separate settings for textures, models and shaders. But I'll defer to the impending scrutiny of the community on these currently offending games and ones on the horizon. If I had to make a call right now on a GPU, though, I'd bet on more vram.

We just totally disagree then but that's fine. Like I said before, it's no bad thing, and at worst it's a small waste of money. Hopefully there will be a trend of cheaper higher vram cards like we've seen this week.

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!
What is kind of impressive is that, remembering Vendor A/B/C from earlier, nVidia was involved in coding this and it's ultimately very optimised for nV cards; usually you see a much wider performance gap when this is the case.

Ignoarints
Nov 26, 2010

deimos posted:

What is kind of impressive is that, remembering Vendor A/B/C from earlier, nVidia was involved in coding this and it's ultimately very optimised for nV cards; usually you see a much wider performance gap when this is the case.

Man, I don't have a lot of gaming friends, truly. Most are stoked about their PS4's. Yet somehow, that release day, I was getting texts from multiple people who don't even have a PC that can play this game at low settings, bitching about how nvidia is going to ruin the world because they kept AMD out of the development. Of course it turns out AMD cards are just fine if not better across the board. I have a feeling there is going to be some major patching planned for this game though.

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

Ignoarints posted:

Man, I don't have a lot of gaming friends, truly. Most are stoked about their PS4's. Yet somehow, that release day, I was getting texts from multiple people who don't even have a PC that can play this game at low settings, bitching about how nvidia is going to ruin the world because they kept AMD out of the development. Of course it turns out AMD cards are just fine if not better across the board. I have a feeling there is going to be some major patching planned for this game though.

Honestly it plays well on AMD, but only if you compare original MSRPs: 290s are competing in FPS with 770s at 1080p with "High" textures. GameWorks is a bane on PC gaming and a total dick move.

edit: err, apparently the newest driver actually bridges the performance gap between red and green.

deimos fucked around with this message at 19:47 on May 29, 2014

Shaocaholica
Oct 29, 2002

Fig. 5E
I'm running 470s in SLI and I'm really, really tempted to pick up that cheap 280x 6GB. And I don't even game much anymore, although that's more of a catch-22 what with getting older. Sigh.

fletcher
Jun 27, 2003

ken park is my favorite movie

Cybernetic Crumb

Shaocaholica posted:

I'm running 470s in SLI and I'm really, really tempted to pick up that cheap 280x 6GB. And I don't even game much anymore, although that's more of a catch-22 what with getting older. Sigh.

What type of games do you like? I feel like there's a ton of great games out there right now.

Ignoarints
Nov 26, 2010

deimos posted:

Honestly it plays well on AMD, but only if you compare original MSRPs: 290s are competing in FPS with 770s at 1080p with "High" textures. GameWorks is a bane on PC gaming and a total dick move.

edit: err, apparently the newest driver actually bridges the performance gap between red and green.

Was that the reason? Because other benchmarks released literally hours later pretty much disproved all of the insane things like a 770 being as good as a 290 in that Forbes article. I didn't know there was a driver that day for AMD.

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

Ignoarints posted:

Was that the reason? Because other benchmarks released literally hours later pretty much disproved all of the insane things like a 770 being as good as a 290 in that Forbes article. I didn't know there was a driver that day for AMD.

There was; AMD released a beta driver on release day.

Shaocaholica
Oct 29, 2002

Fig. 5E

fletcher posted:

What type of games do you like? I feel like there's a ton of great games out there right now.

Yeah, that's the reason I'm thinking about getting a new GPU. I like most action games, so I'll probably just get the gamut.

Pimpmust
Oct 1, 2008

From all the glitch videos I've seen of Watch____dogs it might just be sort of a shitheap of a game :ms:

Ignoarints
Nov 26, 2010
^^^ It's possible. I'm giving it a chance if they patch it quickly.

deimos posted:

There was; AMD released a beta driver on release day.

Alright, that makes much, much more sense then.

Herr Tog
Jun 18, 2011

Grimey Drawer
edit: poo poo son, y'all are right.

edit 2: I guess I am just trying to figure out differences between numbers better. Sorry

Herr Tog fucked around with this message at 20:28 on May 29, 2014

LRADIKAL
Jun 10, 2001

Fun Shoe
this is not the parts picking thread.

beejay
Apr 7, 2002

What the :psyduck:

Edit: In other news, my 270X is playing Watch Dogs so well that I am actually hesitant to install that beta driver. I kind of hate this whole mess around this game because these are the kinds of situations that turn people off from PC gaming.

Ignoarints
Nov 26, 2010
My 770s ran it just fine at 1440p... it wasn't the kind of FPS I'd like to see for an online FPS, but it was completely and totally acceptable at "high" settings. At ultra it was too lovely, which isn't surprising now. Another thing: I was definitely blowing my vram to the moon and I wasn't experiencing anything like what HardOCP graphed. I'm pretty familiar with that situation too. I know it's for a different setup, but if there is one thing a single 780ti should always be better at than my former 770s, it is avoiding those crazy pitfall frame drops from memory bottlenecks. I guess I'll see tomorrow when I get a 780ti.



Herr Tog posted:

edit: poo poo son, y'all are right.

edit 2: I guess I am just trying to figure out differences between numbers better. Sorry

Just post it in the parts picking thread. There will be plenty of people who know about those cards. I do not, though.

veedubfreak
Apr 2, 2005

by Smythe
I feel like I need to buy Watch____________Dogs just to find out how it plays on my completely over the top machine. God knows it's not getting a good workout from D3. Also, playing spanned 7680x1440 in D3 sucks.


Ignoarints
Nov 26, 2010

veedubfreak posted:

I feel like I need to buy Watch____________Dogs just to find out how it plays on my completely over the top machine. God knows it's not getting a good workout from D3. Also, playing spanned 7680x1440 in D3 sucks.

If Agreed doesn't want to buy my code, I can sell it to you for whatever they go for on eBay if you want. If anything, I'm glad I have Watch Dogs just to have something that truly wrecks my setup.

(which is about $40 when auctions finish I believe)

Ignoarints fucked around with this message at 21:48 on May 29, 2014
