Carecat
Apr 27, 2004

Buglord

Factory Factory posted:

3GB of RAM doesn't actually cost that much on the bill of materials. Video card markups on VRAM are almost as bad as Apple markups on iPad flash storage.

I'm concerned we might all suffer, as 2GB of VRAM is still the standard outside the top-end cards, but from what I googled the new consoles can offer about 3GB of VRAM, so a pretty expensive card you bought today won't let you use the highest texture settings.

Shaocaholica
Oct 29, 2002

Fig. 5E

Carecat posted:

...from what I googled the new consoles can offer about 3GB of VRAM...

Technically a game developer could use much more than 3GB. There's no hard limit at 3GB at the hardware or software level.

Sports games, racing games and fighting games are a few genres where I think you can get away with very little game data and very very large VRAM data.

Shaocaholica fucked around with this message at 01:20 on Jun 22, 2014

Ignoarints
Nov 26, 2010

Jan posted:

Uh. No. That would involve having actual people work on the PC version.

Most likely what happened is once the game got postponed, all the people that had been shunted from other projects to wrap up Watch Dogs got put back on their original projects, and no one bothered replacing them.

I didn't look into it. But if people are getting originally-expected visual improvements from an .ini file... and the kind of visuals that are basically impossible on a console at that, well, things got a little more weird with this whole situation at the very least.

Shaocaholica
Oct 29, 2002

Fig. 5E

Ignoarints posted:

I didn't look into it. But if people are getting originally-expected visual improvements from an .ini file... and the kind of visuals that are basically impossible on a console at that, well, things got a little more weird with this whole situation at the very least.

Well they never would have had anything close to final hardware when they did their 2012 E3 demo.

"Those could range from performance issues, to difficulty in reading the environment in order to appreciate the gameplay, to potentially making the game less enjoyable or even unstable."

Pffft. "reading the environment" pretty much just means the depth of field was not designed right for the gameplay. You can simply turn that off. Some of the screenshots show DOF that isn't realistic at all. The rest of the statement is just the usual boilerplate disclaimers. The game could well be just as (un)enjoyable and just as (un)stable without the mod.

Shaocaholica fucked around with this message at 01:38 on Jun 22, 2014

teh_Broseph
Oct 21, 2010

THE LAST METROID IS IN
CATTIVITY. THE GALAXY
IS AT PEACE...
Lipstick Apathy

Carecat posted:

I'm concerned we might all suffer, as 2GB of VRAM is still the standard outside the top-end cards, but from what I googled the new consoles can offer about 3GB of VRAM, so a pretty expensive card you bought today won't let you use the highest texture settings.

Can we please stop having a post every page or two hinting that 4gb cards are needed for futureproofing with nothing to back it up besides 'well I guess that one console port that renders all the textures twice..' Yes, more vram will help at some point. No, we aren't there yet.

Ignoarints
Nov 26, 2010

teh_Broseph posted:

Can we please stop having a post every page or two hinting that 4gb cards are needed for futureproofing with nothing to back it up besides 'well I guess that one console port that renders all the textures twice..' Yes, more vram will help at some point. No, we aren't there yet.

And it's a total waste of money on this generation *screams so loudly busts a vein* !!!!!!!1

Rime
Nov 2, 2011

by Games Forum

teh_Broseph posted:

Can we please stop having a post every page or two hinting that 4gb cards are needed for futureproofing with nothing to back it up besides 'well I guess that one console port that renders all the textures twice..' Yes, more vram will help at some point. No, we aren't there yet.

You've never installed texture mods in a game like Skyrim, or tried to run Space Engine at LOD 2. :allears:

Pimpmust
Oct 1, 2008

The higher speed of modern cards compared to the consoles... shouldn't 2010-ish cards already alleviate any RAM issues? Shouldn't be a problem to process 50% more textures if the card is more than twice as fast, or so I figure.


lovely ports (and oh boy will there be lovely ports) aside.
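
Rough sketch of why capacity and speed end up being separate problems, though (all numbers below are made-up ballpark figures, not measurements):

code:
# Ballpark sketch: a faster GPU doesn't shrink the texture footprint.
# All figures are assumptions for illustration only.

console_texture_budget_gb = 3.0   # assumed texture residency a console title targets
pc_vram_gb = 2.0                  # a typical 2GB card
gpu_speed_ratio = 2.0             # "the card is more than twice as fast"

overflow_gb = max(0.0, console_texture_budget_gb - pc_vram_gb)

print(f"Textures that don't fit in VRAM: {overflow_gb:.1f} GB")
print(f"GPU speed advantage: {gpu_speed_ratio:.0f}x (doesn't shrink the overflow at all)")

# The overflow has to live in system RAM and stream over PCIe whenever it's
# needed, which is where the hit comes from -- regardless of how fast the core is.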

Shaocaholica
Oct 29, 2002

Fig. 5E

teh_Broseph posted:

Yes, more vram will help at some point. No, we aren't there yet.

So you think that game devs will limit themselves to 2GB of VRAM on current gen consoles for the sake of a PC culture that is clinging onto 2GB parts? Or are you fine with PC ports getting watered down for the same reason? This isn't about finding -examples-, it's logically going to happen given the disparity people are clinging onto.

Ignoarints
Nov 26, 2010

Shaocaholica posted:

So you think that game devs will limit themselves to 2GB of VRAM on current gen consoles for the sake of a PC culture that is clinging onto 2GB parts? Or are you fine with PC ports getting watered down for the same reason? This isn't about finding -examples-, it's logically going to happen given the disparity people are clinging onto.

There is no possible way a console can use 4, 5, 8 gb of vram and still hope to produce any kind of playable game. It simply can't compute that fast enough. The same goes for like 98% of the current PC cards as well.

I do not believe game devs limit development based on VRAM quantity. Game devs will develop around basic limitations of the console as a whole. And if there is a way to use heavy VRAM on a console with lower GPU requirements than normal, nothing will stop them from doing so. For a PC with 2GB, it would simply require lowering what would likely be one setting in that case.

PC LOAD LETTER
May 23, 2005
WTF?!

Pimpmust posted:

Shouldn't be a problem to process 50% more textures if the card is more than twice as fast, or so I figure.
If the card runs out of VRAM, doesn't it have to page the textures from system RAM or the hard drive (ewww) over the PCIe bus? No matter how fast the GPU is, that will likely cause a big performance hit, since you're looking at a worst-case scenario of much lower bandwidth and much higher latency to stream stuff with.
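
Back-of-the-envelope numbers for that cliff (bandwidth figures are rough theoretical peaks, just for scale):

code:
# Rough comparison of on-card memory bandwidth vs. streaming over PCIe.
# Bandwidth numbers are approximate theoretical peaks, used only for scale.

gddr5_bandwidth_gb_s = 224.0   # roughly a GTX 770-class card
pcie3_x16_gb_s = 15.75         # PCIe 3.0 x16 theoretical peak, one direction
fps = 60.0
frame_time_s = 1.0 / fps

vram_gb_per_frame = gddr5_bandwidth_gb_s * frame_time_s
pcie_gb_per_frame = pcie3_x16_gb_s * frame_time_s

print(f"Data reachable from VRAM per 60fps frame: ~{vram_gb_per_frame:.2f} GB")
print(f"Data reachable over PCIe per 60fps frame: ~{pcie_gb_per_frame:.2f} GB")
print(f"Ratio: ~{vram_gb_per_frame / pcie_gb_per_frame:.0f}x")

# And that ignores latency: a texture fetched on demand over PCIe stalls the
# frame it's needed in, which is what shows up as hitching.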

Shaocaholica
Oct 29, 2002

Fig. 5E

Ignoarints posted:

There is no possible way a console can use 4, 5, 8 gb of vram and still hope to produce any kind of playable game. It simply can't compute that fast enough. The same goes for like 98% of the current PC cards as well.

That's only the case if the GPU tries to access too much VRAM per frame. You can in theory use an infinite amount of VRAM and still have playability, so long as you're not accessing beyond that threshold in any given frame.
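
A rough sketch of that threshold idea, with assumed numbers: what a frame can actually touch is bounded by memory bandwidth, while resident data can sit well beyond it:

code:
# The GPU can only touch so many bytes per frame, bounded by memory bandwidth.
# Data can stay resident in VRAM far beyond that, as long as any single frame
# only reads a slice of it. Numbers below are illustrative assumptions.

bandwidth_gb_s = 320.0          # assume roughly a 290X-class card's peak bandwidth
fps = 60.0
per_frame_touch_budget_gb = bandwidth_gb_s / fps

resident_vram_gb = 6.0          # hypothetical: lots of textures kept resident

print(f"Max data touchable in one frame: ~{per_frame_touch_budget_gb:.1f} GB")
print(f"Resident data: {resident_vram_gb} GB -- fine, as long as each frame's "
      f"working set stays under that ~{per_frame_touch_budget_gb:.1f} GB budget")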

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Whoa people, 2GB of VRAM isn't very much at all, so let's think about this for a bit.

First, running out of VRAM loving sucks. You page from system RAM over PCIe and your framerates instantly become low, stuttery, and barely playable at best. Worse, lag from texture loading happens when you least want it to, when enemies or objects first appear on screen. Unlike running low on shader throughput, you can't just tweak some barely-noticeable settings to get performance back: lowering texture quality makes your game look like blurry poo poo. Annoyingly, running low on VRAM also often means getting dropped back to your desktop with a prompt to change the color scheme to Windows Basic.

Second, buying a card without enough VRAM doesn't typically give you a graceful transition. Games are still being produced for the Xbox 360 and PS3 with <512MB total RAM, yet it was a couple years back that games stopped being playable on cards with 512MB of RAM at 1080p regardless of how low you turned the texture settings.

Third, 2GB is barely enough TODAY. Granted, I have a 1440p primary display for gaming and a secondary 1080p monitor for apps, but I don't play anything VRAM-intensive, yet I run into paging if I forget to close GPU-accelerated apps like my web browsers. Maybe you don't have a 1440p monitor, but they are pretty cheap and getting cheaper; you'll want one eventually, right? The games we are playing today are hamstrung by the need to run on last-gen consoles; when that chain is broken, you can't think devs are just going to keep on like they always have.

Maybe if you only have a 1080p monitor and no plans to upgrade and don't mind turning down texture settings, then a 2GB card may well last a while for you. I don't think that's most people reading this thread, though, and most readers with 2GB cards (except the crappiest) are going to retire the card because of the VRAM before anything else. I know that the available models and their pricing right now can lead to some unfortunate choices between getting a card without enough VRAM or overpaying for one that has enough, but it doesn't help matters to just bury your head in the sand and pretend that 2GB oughtta be enough for anybody.
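
For a sense of scale on the resolution point, here's a rough sketch of how screen-sized buffers grow from 1080p to 1440p. The buffer layout and byte counts are assumptions for illustration, not from any particular game:

code:
# Render-target memory scales linearly with pixel count, before textures are
# even considered. The per-pixel byte cost below is a hypothetical
# deferred-rendering layout, purely for illustration.

def render_target_mb(width, height, bytes_per_pixel_total):
    return width * height * bytes_per_pixel_total / (1024 ** 2)

# Assumed per-pixel cost: color + depth + a few G-buffer targets + an HDR buffer.
bytes_per_pixel = 4 + 4 + 3 * 4 + 8   # = 28 bytes/pixel, an assumption

for name, w, h in [("1080p", 1920, 1080), ("1440p", 2560, 1440)]:
    print(f"{name}: ~{render_target_mb(w, h, bytes_per_pixel):.0f} MB of render targets")

# 1440p has ~78% more pixels than 1080p, so every screen-sized buffer (and
# every GPU-accelerated desktop app's surfaces) grows by the same factor.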

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Alereon posted:

The games we are playing today are hamstrung by the need to run on last-gen consoles; when that chain is broken, you can't think devs are just going to keep on like they always have.
While true, you also need to remember that it'll likely be another 1-2 years or so before devs really get with it on the PS4/XBone and start pushing boundaries. Until then we'll largely see stuff that's either been in development for awhile already (and isn't in danger of substantially exceeding 2GB on a 1080p screen with equitable textures to consoles), or we'll see rush-to-market stuff, of which 90% will be using similar techniques and whatnot as last-gen devs (and so not really stressing VRAM), while the other 10% will be poorly optimized crap like WatchDogs, which may see a legitimate benefit from more VRAM, but will still run like the poorly optimized crap it is.

So, on that front, sure, you will likely run out of VRAM with a 2GB card in another 18-24 months, but you're buying a <$200 card, right? That's not really a "buy once, use for years" sort of investment from the beginning, unless you're already planning on playing back-catalogs of games, rather than newly released AAA ones, in which case you'll be just fine.

"Oh, but DrDork! I'm not buying a $200 budget card! I'm buying a $300 770!" Well then you are goddamned loving retarded. If you're dumb enough to turn down a substantially faster card, with 4 Gee-Bees of your apparently paranoia-inducing VRAM, for less dollar bills than a 770, just because you love nVidia that much, I cannot help you.

Rime
Nov 2, 2011

by Games Forum

DrDork posted:

"Oh, but DrDork! I'm not buying a $200 budget card! I'm buying a $300 770!" Well then you are goddamned loving retarded. If you're dumb enough to turn down a substantially faster card, with 4 Gee-Bees of your apparently paranoia-inducing VRAM, for less dollar bills than a 770, just because you love nVidia that much, I cannot help you.

You joke, but I have co-workers who buy all the marketing BS that Nvidia puts out hook, line, and sinker. I was talking on Friday about how, if AMD pulls off their Heterogeneous System Architecture plans, they will be the only sane hardware solution for gaming. Heck, the fact that they're driving all consoles this generation already means they have a one-up over Nvidia for optimization. Coworkers were having none of it, just kept tossing buzzwords like "Shadowplay" and "Titan Black" back at me, basically going "la-la-la-la".

And these were grown men pushing 35.

Ignoarints
Nov 26, 2010
I'm still going to say buying a 4gb 760 over a 2gb 770 is stupid every single time, and that's really the last major comparison this is ever going to affect. When the next gen rolls through this is all just going to go away, at least there's that.

Dark Solux
Dec 8, 2004

Old School Saturn God

Ignoarints posted:

I'm still going to say buying a 4gb 760 over a 2gb 770 is stupid every single time, and that's really the last major comparison this is ever going to affect. When the next gen rolls through this is all just going to go away, at least there's that.

Makes me glad I sprung for a 3GB 780 instead of a 4GB 770 :toot:

Shaocaholica
Oct 29, 2002

Fig. 5E

Ignoarints posted:

I'm still going to say buying a 4gb 760 over a 2gb 770 is stupid every single time, and that's really the last major comparison this is ever going to affect. When the next gen rolls through this is all just going to go away, at least there's that.

I don't think anyone here was defending that. Most of the scenarios are the same GPU with different vram options.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
We missed some news!

Apparently the 290X isn't a fully-enabled part. We're gonna get a fully-enabled part. That's four more CUs than a 290X, about the same difference as 290 to 290X. ROP count stays the same, so it's all in the extra shaders and texture mappers.

Nvidia quietly canceled doing server-focused ARM chips, so there won't be a server version of the Tegra K1 or anything like that. Still, the GPU architecture could be licensed by a third party to make something similar. But what Nvidia DID do is add compatibility to run a Tesla GPU off an ARM CPU. Previously it was just x86 and POWER (e.g. PowerPC). So if someone does make that ARM server chip, they can use either GPU IP blocks or a whole dang GPU.

And a little tidbit of interest: Intel asked AMD for early access to the Mantle API specs. AMD said no, and Intel said "Yeah well we believe in DirectX 12 anyway."

Wistful of Dollars
Aug 25, 2009

Factory Factory posted:

We missed some news!

Apparently the 290X isn't a fully-enabled part. We're gonna get a fully-enabled part. That's four more CUs than a 290X, about the same difference as 290 to 290X. ROP count stays the same, so it's all in the extra shaders and texture mappers.


I will be deeply amused if there are early 290x's (or 290s) that can be flashed to the 295x (or whatever they'll call it).

Ignoarints
Nov 26, 2010

Shaocaholica posted:

I don't think anyone here was defending that. Most of the scenarios are the same GPU with different vram options.

Lol, to pick up a week later: I meant that is the only comparison this really affects (i.e. "I want a 760, but I want 4GB because I read a Watch Dogs review, but then that costs as much as a 2GB 770, which one should I get~!"). Most other situations the answer is more obvious. A distant second is a 6GB vs. 3GB 780, but I think the cost difference is like $20.

dont be mean to me
May 2, 2007

I'm interplanetary, bitch
Let's go to Mars


DrDork posted:

"Oh, but DrDork! I'm not buying a $200 budget card! I'm buying a $300 770!" Well then you are goddamned loving retarded. If you're dumb enough to turn down a substantially faster card, with 4 Gee-Bees of your apparently paranoia-inducing VRAM, for less dollar bills than a 770, just because you love nVidia that much, I cannot help you.

1) I'll grant you that Amazon Warehouse has a couple of decent options and that this time it isn't all BitCoin Lottery, but the fraction of 'enough to justify not buying new' discounts that are either ambiguous or clearly BS is distressingly high.

2) Even if Greenlight, alternative operating system compatibility, fitting the card onto your computer's power supply, and [what passes for development rigor in consumer GPUs] don't matter: Electricity costs and local temperatures and HVAC conditions are very real concerns, which can make even a deliriously good AMD card end up not worth it in some places. It's almost like there's room for multiple GPU companies with different design objectives to exist, and that not everything is about money or brand loyalty!

3) Even before the medical slur, your post comes across as being made in bad faith.

Spuckuk
Aug 11, 2009

Being a bastard works



So, my 560ti really needs replacing.

Given that I'm in no hurry to move beyond 1080p gaming, is a R9 280x my best bet?

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Spuckuk posted:

So, my 560ti really needs replacing.

Given that I'm in no hurry to move beyond 1080p gaming, is a R9 280x my best bet?

All you could possibly need for 1920×1080, and good value.

Sir Unimaginative posted:

2) Even if Greenlight, alternative operating system compatibility, fitting the card onto your computer's power supply, and [what passes for development rigor in consumer GPUs] don't matter: Electricity costs and local temperatures and HVAC conditions are very real concerns, which can make even a deliriously good AMD card end up not worth it in some places. It's almost like there's room for multiple GPU companies with different design objectives to exist, and that not everything is about money or brand loyalty!

Even if there are other points to be made, when NVIDIA cards don't represent the best bang for buck, most people choose NVIDIA for one of two reasons: Linux or loyalty. The inverse also applies, of course: when AMD doesn't represent the best value, people may go on brand loyalty.

Obviously when something is also the best performance for the money, you don't need any further justifications.

The power/temp issue comes and goes in cycles: the GTX 480 was an incredibly hot and loud card, for example. The Radeon 6990 was loud, but the 7990 was not, etc.

HalloKitty fucked around with this message at 09:59 on Jun 26, 2014

Spuckuk
Aug 11, 2009

Being a bastard works



HalloKitty posted:

All you could possibly need for 1920×1080, and good value.


Excellent news, thanks.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Spuckuk posted:

Excellent news, thanks.

To be more specific, here's a comparison with your old card.

A lot of 280Xs with nice coolers come with boosted clocks, so they perform like a 7970 GHz Edition or better anyway.

Edit: that said, this isn't really the thread for asking about upgrades; the main parts-picking thread is for that.

HalloKitty fucked around with this message at 18:10 on Jun 26, 2014

Fame Douglas
Nov 20, 2013

by Fluffdaddy

HalloKitty posted:

Linux or loyalty.

Or adaptive VSync and quieter/cooler cards.

veedubfreak
Apr 2, 2005

by Smythe
To throw an extra kink in the endless circle of gpu vram, correct me if I'm wrong here, but don't you pretty much have to have a 64bit OS to use a card with shittons of memory anyway? I might be mistaken on this mainly because it has been nearly a decade since I used a 32bit OS.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

veedubfreak posted:

To throw an extra kink in the endless circle of gpu vram, correct me if I'm wrong here, but don't you pretty much have to have a 64bit OS to use a card with shittons of memory anyway? I might be mistaken on this mainly because it has been nearly a decade since I used a 32bit OS.

Everyone should have a 64-bit OS at this point.
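
Rough sketch of the "why", with the usual caveats (the aperture sizes below are assumptions; the classic symptom was 4GB of RAM showing up as ~3.25GB usable on 32-bit Windows):

code:
# On a 32-bit OS (without PAE remapping), everything the CPU addresses -- RAM
# plus device MMIO windows, including the GPU's apertures -- has to fit in
# 4 GiB. Aperture sizes below are assumptions for illustration.

total_address_space_gib = 2 ** 32 / 2 ** 30     # 4 GiB
gpu_aperture_gib = 0.5                          # assumed GPU BAR/aperture reservations
other_mmio_gib = 0.25                           # assumed chipset/other devices

usable_ram_gib = total_address_space_gib - gpu_aperture_gib - other_mmio_gib
print(f"Address space: {total_address_space_gib:.0f} GiB")
print(f"System RAM actually addressable: ~{usable_ram_gib:.2f} GiB")

# The card's full VRAM isn't mapped 1:1 into that space, so a big-VRAM card
# can still work on 32-bit -- but you lose usable RAM to the reservations, and
# each 32-bit game is capped by its own small address space anyway.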

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!
So, is there any word on when SLI will support high resolution displays? Anything above 1600p seems to disable SLI from what I've read. My U2412M died and I am looking to Ignoarints it up.

warcake
Apr 10, 2010
I've had 3 AMD cards die on me since last October, so when I sent my last 290 back I got an EVGA 780. I had flickering screen problems with a 280X that were only fixed with a BIOS update, and random cases of my computer going into an unwakeable sleep with the 290.

I've paid the same money for the 780 as for a 290; I know it's got less VRAM, but it's been rock solid so far and runs way cooler.

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

warcake posted:

and random cases of my computer going into an unwakeable sleep with the 290.

Do you know how I solved this with my 290? Changing the power supply. I have no idea what exactly it was, but my guess is that the "core sleep" functionality on the R9 messes with the PSU voltages when the card powers back up, and that keeps it from waking.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

deimos posted:

Do you know how I solved this with my 290? Changing the power supply. I have no idea what exactly it was, but my guess is that the "core sleep" functionality on the R9 messes with the PSU voltages when the card powers back up, and that keeps it from waking.

I've never had a problem like this (but then I have an old card that doesn't support ZeroCore), but there's an option in Afterburner that disables this mode.

HalloKitty fucked around with this message at 19:14 on Jun 26, 2014

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Fame Douglas posted:

Or adaptive VSync and quieter/cooler cards.

I'm currently using two R9 280X, and I used to use two GTX 770, and across 10 games I play regularly, SLI was stuttering with one, and the 280's are micro-stuttering with 5 or so, including Blood Dragon, Crysis 2, Path of Exile, etc. Maybe the R9 290 series is better since they don't require a cable, but for me Crossfire is a hot mess.

And as someone who churns through Open Box cards at Microcenter, I've had about ten Nvidia cards that were all perfectly fine, and about ten AMD cards with most of them artifacting or being DOA.

I'm really quite impartial since I appreciate AMD's raw value as much as Nvidia's "premium" approach to build quality and drivers, with the price premium to match. Neither company is strictly better.

Ignoarints
Nov 26, 2010

deimos posted:

So, is there any word on when SLI will support high resolution displays? Anything above 1600p seems to disable SLI from what I've read. My U2412M died and I am looking to Ignoarints it up.

Do you mean 4K? I heard there were some driver issues for using above 30Hz initially, but that has been fixed (I believe it was an actual bug with DP and Nvidia drivers). Flickering and blue and green poo poo on the screen can be fixed with a new SLI bridge. This is something even I've seen at 1440p, but apparently it's more evident at 4K-level demand on the cards for whatever reason.

Now, how well an SLI setup does at 4K is a different story. It's pretty much case-by-case as far as I know right now.

veedubfreak
Apr 2, 2005

by Smythe

Zero VGS posted:

I'm currently using two R9 280X, and I used to use two GTX 770, and across 10 games I play regularly, SLI was stuttering with one, and the 280's are micro-stuttering with 5 or so, including Blood Dragon, Crysis 2, Path of Exile, etc. Maybe the R9 290 series is better since they don't require a cable, but for me Crossfire is a hot mess.

And as someone who churns through Open Box cards at Microcenter, I've had about ten Nvidia cards that were all perfectly fine, and about ten AMD cards with most of them artifacting or being DOA.

I'm really quite impartial since I appreciate AMD's raw value as much as Nvidia's "premium" approach to build quality and drivers, with the price premium to match. Neither company is strictly better.

Xfire is only truly fixed in the 290s. The 280 is still basically just a 7970 with a new paint job. I've been running triple 290x since the week they came out and I have no stuttering.

Blame Pyrrhus
May 6, 2003

Me reaping: Well this fucking sucks. What the fuck.
Pillbug
Is it just me, or is SLI support getting really lovely lately?

Titanfall (a Source engine game that somehow runs like total poo poo) doesn't include SLI support. ESO didn't support it for a few months (not that it really needed it on my system, since I'm using a 4770k and the engine seems to be very CPU bound?). And I just bought GRID: Autosport, and I had to shoehorn SLI support by loading up nvidia inspector and copying the SLI bits or whatever the poo poo it's called from GRID 2, but now I get micro-stuttering. I hate having to do this kind of stuff.

Historically I've always used higher-res-than-typical monitors (1920x1200 since back in 2004, 2560x1600 since about 2010), and used SLI to keep decent framerates with the fillrate demands that the higher-res monitors required. So I've been stuck with SLI for a while.

In the last 4-5 years SLI became basically a non-issue; it always worked and I always recommended it. But nowadays I'm seeing it omitted or just problematic in more and more games.

I'm done with it now, swapping my aging 570s out for the best 780 or 780ti I can afford ASAP, but I can't decide if I've just been purchasing the wrong games lately, or if SLI really is becoming more trouble than it's worth.

warcake
Apr 10, 2010

Linux Nazi posted:

Is it just me, or is SLI support getting really lovely lately?

Titanfall (a Source engine game that somehow runs like total poo poo) doesn't include SLI support. ESO didn't support it for a few months (not that it really needed it on my system, since I'm using a 4770k and the engine seems to be very CPU bound?). And I just bought GRID: Autosport, and I had to shoehorn SLI support by loading up nvidia inspector and copying the SLI bits or whatever the poo poo it's called from GRID 2, but now I get micro-stuttering. I hate having to do this kind of stuff.

Historically I've always used higher-res-than-typical monitors (1920x1200 since back in 2004, 2560x1600 since about 2010), and used SLI to keep decent framerates with the fillrate demands that the higher-res monitors required. So I've been stuck with SLI for a while.

In the last 4-5 years SLI became basically a non-issue; it always worked and I always recommended it. But nowadays I'm seeing it omitted or just problematic in more and more games.

I'm done with it now, swapping my aging 570s out for the best 780 or 780ti I can afford ASAP, but I can't decide if I've just been purchasing the wrong games lately, or if SLI really is becoming more trouble than it's worth.

On the VRAM front, does having 2 cards in SLI double the amount of VRAM you have, or is it more of a series thing? I have no idea how it works.

EoRaptor
Sep 13, 2003

by Fluffdaddy

warcake posted:

On the VRAM front, does having 2 cards in SLI double the amount of VRAM you have, or is it more of a series thing? I have no idea how it works.

Each card must have identical memory contents, so the total memory available for textures (etc.) doesn't increase. No, it provides no additional memory.
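
Put another way (a trivial sketch, assuming classic alternate-frame rendering where each card mirrors the other's contents):

code:
# With classic SLI/CrossFire alternate-frame rendering, each GPU holds its own
# full copy of textures and render data, so effective VRAM is the per-card
# amount, not the sum.

def effective_vram_gb(cards_gb):
    # Mirrored contents: the smallest card is the ceiling for everyone.
    return min(cards_gb)

print(effective_vram_gb([2, 2]))   # two 2GB cards in SLI -> still 2 GB usable
print(effective_vram_gb([4, 4]))   # two 4GB cards        -> 4 GB usable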

Ignoarints
Nov 26, 2010

Linux Nazi posted:

Is it just me, or is SLI support getting really lovely lately?

Titanfall (a Source engine game that somehow runs like total poo poo) doesn't include SLI support. ESO didn't support it for a few months (not that it really needed it on my system, since I'm using a 4770k and the engine seems to be very CPU bound?). And I just bought GRID: Autosport, and I had to shoehorn SLI support by loading up nvidia inspector and copying the SLI bits or whatever the poo poo it's called from GRID 2, but now I get micro-stuttering. I hate having to do this kind of stuff.

Historically I've always used higher-res-than-typical monitors (1920x1200 since back in 2004, 2560x1600 since about 2010), and used SLI to keep decent framerates with the fillrate demands that the higher-res monitors required. So I've been stuck with SLI for a while.

In the last 4-5 years SLI became basically a non-issue; it always worked and I always recommended it. But nowadays I'm seeing it omitted or just problematic in more and more games.

I'm done with it now, swapping my aging 570s out for the best 780 or 780ti I can afford ASAP, but I can't decide if I've just been purchasing the wrong games lately, or if SLI really is becoming more trouble than it's worth.

So far the only game I wanted it for and couldn't have it for was Titanfall. It was supposed to be supported, then supposed to be added very quickly afterwards... then it just went away. I'm guessing it just couldn't be done for whatever reason.

I had 770 SLI and it was my most powerful setup and I don't regret it, but a single card is nice sometimes even if it isn't as good. My opinion would probably change if everything I wanted to play didn't support SLI, though. If you don't mind putting up the money, a single card is always ideal as long as the performance is where you want it. A 780 Ti will surely be better than a 570 SLI setup, I believe.

As for VRAM in SLI, I only had 2GB 770 cards in SLI. That's probably one of the worst-case scenarios for GPU power vs. VRAM availability, and I had no real issues of any kind even blasting it at 1440p, using over 2GB of VRAM on both cards. The only situation where I could get it to just melt down and fail was with supersampling, where I assumed VRAM was the first thing to fall, but performance with SSAA at 1440p is simply unplayable anyway with that setup. It personally confirms my heavy doubts about 4GB 770s, but since it wasn't a test or controlled study I don't usually cite it.
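
The supersampling meltdown lines up with the arithmetic: SSAA multiplies the internal resolution, so every screen-sized buffer (and the fill cost) multiplies with it. A rough sketch assuming 2x2 SSAA at 1440p:

code:
# Supersampling renders internally at a multiple of the output resolution,
# then downsamples. 2x2 SSAA = 4x the pixels, so roughly 4x the memory for
# every screen-sized buffer and 4x the fill/shading work. Assumed numbers.

out_w, out_h = 2560, 1440
ssaa_factor = 2                       # 2x per axis

internal_pixels = (out_w * ssaa_factor) * (out_h * ssaa_factor)
output_pixels = out_w * out_h

print(f"Internal resolution: {out_w * ssaa_factor}x{out_h * ssaa_factor}")
print(f"Pixel multiplier: {internal_pixels / output_pixels:.0f}x")

# On a 2GB card that multiplier hits render targets *and* leaves less room for
# textures, which is why SSAA is usually the first thing to blow past VRAM.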
