dont be mean to me
May 2, 2007

I'm interplanetary, bitch
Let's go to Mars


Not all 750 Tis run on just bus power.

Also that computer you're going to put it in looks like a monument to poor choices.


Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

Shaocaholica posted:

I really don't think that's true. I work with a lot of post-production people in LA and I haven't heard of a single project finished at 4K. Shot at 4K+, yes, but not finished at 4K. I mean, why would a studio pay for that when they're already usually over budget and there are zero 4K distribution channels? There might be a theater here or there (I'm counting on two hands here) that has 4K projection, but that's not enough to justify carrying 4K all the way through production. And I would question any studio claiming their film was 'finished at 4K'. Why? Because one of the studios I work with closely already claims that for footage that's upscaled, since doing a controlled upscale from 2K to 4K during the final stages of production is cheap and can technically be called 'finishing at 4K' as far as the press is concerned. Does it look good? Yes. Better than a real-time scaler built into a TV or player? Yes. But is it native 4K? No.

All of the Alamo Drafthouses in Austin have been 4K for a while now; I can definitely tell the difference between that and the theater near my house that just has DLP. Also, supposedly as of 2012 all AMCs have 4K projectors.

Dilbert As FUCK
Sep 8, 2007

by Cowcaster
Pillbug

Sir Unimaginative posted:

Not all 750 Tis run on just bus power.

Also that computer you're going to put it in looks like a monument to poor choices.

I don't really care about maxing it out at 1920x1080+ with full AA/AF. I bought this a while back and mostly do VMware, not gaming. If I can do medium at 2x AA/2x AF and 1920x1080, that's fine; I've been gaming off the integrated card for a while.

It's a prebuilt so meh.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Dilbert As gently caress posted:

I don't really care about maxing it out at 1920x1080+ with full AA/AF. I bought this a while back and mostly do VMware, not gaming. If I can do medium at 2x AA/2x AF and 1920x1080, that's fine; I've been gaming off the integrated card for a while.

Watch Dogs could be a system killer. I'd wait until it comes out before you buy anything.

Hace
Feb 13, 2012

<<Mobius 1, Engage.>>
Yeah, for whatever reason it has very demanding CPU requirements. Sometimes that stuff is overblown prior to release, but it doesn't hurt to wait and see.

Dilbert As FUCK
Sep 8, 2007

by Cowcaster
Pillbug
Well, I'm not buying a new system, and without upgrading to the 7-series APUs I don't really see anything on the roadmap better than the 750 Ti, unless I'm missing something.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Dilbert As gently caress posted:

Well, I'm not buying a new system, and without upgrading to the 7-series APUs I don't really see anything on the roadmap better than the 750 Ti, unless I'm missing something.

Your janky CPU will either run Watch Dogs or it won't, regardless of whether you have a 750 Ti or not. If you buy a 750 Ti now and your CPU isn't good enough, you'll have wasted the money on the 750 Ti. Wait to find out whether Watch Dogs really needs as much CPU as early requirements suggest.

Aexo
May 16, 2007
Don't ask, I don't know how to pronounce my name either.

Sir Unimaginative posted:

It SHOULD be fine now. It is for me, both on 335 and 337. nVidia drivers are extremely messy, though - not even DDU undoes everything; you may actually have to reinstall Windows to a bare drive or something (do I even need to remind people to have complete, current, known-good backups anymore? Of course I do).

Thanks! I'll give it a shot tomorrow and report back.

Rastor
Jun 2, 2001

Shaocaholica posted:

I really don't think thats true. I work with a lot of post production people in LA and I haven't heard of a single project finished at 4k. Shot at 4k+ yes but not finished at 4k. I mean, why would a studio pay for that when they're already usually over budget and there are zero 4k distribution channels. There might be a theater here or there (I'm counting on 2 hands here) that have 4k projection but thats not enough to justify carrying 4k all the way through production. And I would question any studio claiming their film was 'finished at 4k'.
Here is an example that Light Iron worked on: The Girl with the Dragon Tattoo.

Dogen posted:

All of the Alamo Drafthouses in Austin have been 4K for a while now, I can definitely tell the difference between that and the theater I go to near my house that just has DLP. Also supposedly as of 2012 all AMCs have 4K projectors.
Yeah, it's definitely not 'a theater here or there'.

right arm
Oct 30, 2011

would a 780ti be a significant leap over a 780 for 2560x1440 gaming or should I just bite the bullet and SLI another 780

money really isn't much of an issue

Shaocaholica
Oct 29, 2002

Fig. 5E

Rastor posted:

Here is an example that Light Iron worked on: The Girl with the Dragon Tattoo.


GDT is more Grey's Anatomy than Avengers. A heavy-VFX film is burdened by a lot more than the storage and I/O constraints of a movie that's mostly talking heads (GDT). That's why I still think it's cost-prohibitive to do something like Avengers 2, Star Wars VII, etc. at 4K. I'll eat my hat if any Disney big-budget VFX-heavy movie (or any VFX-heavy movie) is done start to finish at 4K in the next 5 years. I'm not counting Dark Knight, as I think the VFX load in a Marvel movie or Star Wars is significantly greater.

I'm surprised they decoded all their raw data to 10-bit DPX instead of EXR, which has much better compression and is floating-point rather than integer. Probably their DI system didn't support it.

Edit: Don't get me wrong, I think 4K is a must for the big screen. I just don't think the cost-benefit is there yet for the vast majority of movies that count. At least from the perspective of the people who handle the money.
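To put back-of-the-envelope numbers on the storage side (assuming DCI 4K at 4096x2160, the common 10-bit RGB DPX packing of 3x10 bits in a 32-bit word, and a guessed 2:1 lossless ratio for EXR - actual ratios are content-dependent):

```python
# Rough storage math for a 4K DI. All figures are approximations.
width, height = 4096, 2160
bytes_per_pixel_dpx = 4            # 10/10/10-bit RGB packed into 32 bits
frames = 90 * 60 * 24              # 90-minute feature at 24 fps

dpx_frame = width * height * bytes_per_pixel_dpx
print(f"DPX frame:   {dpx_frame / 2**20:.1f} MiB")           # ~33.8 MiB
print(f"DPX feature: {dpx_frame * frames / 2**40:.2f} TiB")  # ~4.2 TiB

# Half-float EXR is 6 bytes/pixel raw; lossless PIZ/ZIP compression
# often lands somewhere around 2:1, depending heavily on grain.
exr_frame_raw = width * height * 6
for ratio in (1.0, 2.0):
    print(f"EXR @ {ratio:.0f}:1 -> {exr_frame_raw / ratio / 2**20:.1f} MiB/frame")
```

And that's per finished frame - carrying 4K through a VFX pipeline multiplies it across plates, layers, and versions.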

Shaocaholica fucked around with this message at 06:06 on May 10, 2014

BurritoJustice
Oct 9, 2012

right arm posted:

would a 780ti be a significant leap over a 780 for 2560x1440 gaming or should I just bite the bullet and SLI another 780

money really isn't much of an issue

25% faster for 35% more money. You decide.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

right arm posted:

would a 780ti be a significant leap over a 780 for 2560x1440 gaming or should I just bite the bullet and SLI another 780

money really isn't much of an issue
A 780 Ti, 290X, or overclocked 290 will all do very well for 1440p gaming unless you start really getting crazy (you don't need 16x FSAA at 1440p. Really. Stop it). Any of those will be a better idea than SLI (though of course 2x 780s would be faster--just needlessly so for 1440p).

A plain 780 will still do well at 1440p; you just might not be able to turn everything up to max and still get a solid 60 FPS in every game.

Ignoarints
Nov 26, 2010

DrDork posted:

A 780 Ti, 290X, or overclocked 290 will all do very well for 1440p gaming unless you start really getting crazy (you don't need 16x FSAA at 1440p. Really. Stop it). Any of those will be a better idea than SLI (though of course 2x 780s would be faster--just needlessly so for 1440p).

A plain 780 will still do well at 1440p; you just might not be able to turn everything up to max and still get a solid 60 FPS in every game.

Okay but if money isn't an issue...

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Ignoarints posted:

Okay but if money isn't an issue...
Then gently caress your 780Ti weak-sauce and get a 295X2. poo poo, get two of them, if you really just want to be excessive.

SlayVus
Jul 10, 2009
Grimey Drawer
What's the release schedule for the 6GB 780s/Tis? Tri-SLI 6GB Tis would rock.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

SlayVus posted:

What's the release schedule for the 6GB 780s/Tis? Tri-SLI 6GB Tis would rock.
No idea for the 780 Tis, but EVGA has a 6GB 780 out for $600. 8GB Vapor-X 290Xs are supposed to start hitting shelves in the very near future at around $800, though, if you really feel the need for more VRAM than you know what to do with.

Aexo
May 16, 2007
Don't ask, I don't know how to pronounce my name either.

Aexo posted:

Thanks! I'll give it a shot tomorrow and report back.

No lockups at all today! Woo!

double nine
Aug 8, 2013

As someone who doesn't follow the hardware market at all: are there releases scheduled in the near future that will drive down the price of current mid-tier (~250€) graphics cards? I'm looking for an upgrade, but if it pays to wait a month (for example) to get a better deal, I'd be kicking myself for not taking it...

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

double nine posted:

As someone who doesn't follow the hardware market at all: are there releases scheduled in the near future that will drive down the price of current mid-tier (~250€) graphics cards? I'm looking for an upgrade, but if it pays to wait a month (for example) to get a better deal, I'd be kicking myself for not taking it...

Nothing in the next month at least; TSMC's inability to push out 20nm GPUs has pretty much screwed the GPU market for the time being. Looks like you're getting a GTX 770 or R9 280X.

Ignoarints
Nov 26, 2010

DrDork posted:

Then gently caress your 780Ti weak-sauce and get a 295X2. poo poo, get two of them, if you really just want to be excessive.

Exactly, but that would be excessive :v: A 780 Ti is a reasonable expense, but since he already has a 780 and is willing to SLI, I'd wholeheartedly recommend getting another one if $500 is no issue. 780 SLI tears up a 780 Ti at 1440p and is easily worth the downsides in this case, imo.

Things change at higher resolutions, but 780 SLI scales very well here (almost 50% over a 780 Ti in many games), with a few exceptions. For right arm, I'd look into the games you actually play just to check that they scale well, but that's most likely what I'd do in your shoes for the money.

However (!)

- If you're just looking for a little bump, the 780 Ti easily provides that and will be economical if you sell your 780.
- Even if you're looking for more than a little bump, consider the PSU cost if you don't already have one that can handle two 780s.

Of course, if you really had money to blow, I'd go 780 Ti SLI. Then you might have a chance at max supersampling, which is the prettiest thing imo.
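To make that arithmetic concrete, here's a tiny sketch comparing the options relative to a single 780. The ~25% Ti edge and the SLI scaling range are this thread's ballpark claims, not benchmarks - swap in real numbers for the games you play:

```python
# Relative performance sketch; a single GTX 780 = 1.00x.
# Figures are rough claims from this thread, not measurements.
baseline = 1.00
options = {
    "780 Ti (+25%)":        baseline * 1.25,
    "780 SLI, 50% scaling": baseline * 1.50,   # a mediocre SLI profile
    "780 SLI, 85% scaling": baseline * 1.85,   # a good SLI profile
}
for name, perf in options.items():
    print(f"{name:22s} {perf:.2f}x  ({perf / 1.25:.0%} of a 780 Ti)")
```

With a good profile, 780 SLI lands near 1.85x - roughly 48% ahead of a 780 Ti - but with a bad or missing profile it can fall back to single-card speed, which is the whole gamble.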

Shaocaholica
Oct 29, 2002

Fig. 5E
Re: VRAM

When did the mainstream ~$250 GPUs switch from 1GB to 2GB? Any guess when the switch from 2GB to 4GB will happen (as in a 4GB minimum config)? Seems like we've been at 2GB for a while now in the mainstream lines.
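As a crude illustration of why 2GB starts to pinch as resolution and AA climb, here's some render-target-only math (assuming RGBA8 color plus D24S8 depth per sample; textures, shadow maps, and driver overhead all come on top, so treat these as lower bounds):

```python
# Bytes for color + depth/stencil render targets at common
# resolutions and MSAA levels. Render targets only.
def target_mib(w, h, msaa, bytes_per_sample=4 + 4):  # RGBA8 + D24S8
    return w * h * msaa * bytes_per_sample / 2**20

resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
for name, (w, h) in resolutions.items():
    row = "  ".join(f"{m}x: {target_mib(w, h, m):5.0f} MiB" for m in (1, 2, 4, 8))
    print(f"{name:5s} {row}")
```

Deferred renderers with several G-buffer targets multiply the color-target cost again, which is where high-resolution, high-AA setups start eating VRAM.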

Shaocaholica fucked around with this message at 18:17 on May 11, 2014

Incessant Excess
Aug 15, 2005

Cause of glitch:
Pretentiousness
I can't imagine that the mid-range cards this fall will still be sporting 2GB of VRAM; I think the switch to 4GB is imminent.

Ignoarints
Nov 26, 2010

Shaocaholica posted:

Re: VRAM

When did the mainstream ~$250 GPUs switch from 1GB to 2GB? Any guess when the switch from 2GB to 4GB will happen (as in a 4GB minimum config)? Seems like we've been at 2GB for a while now in the mainstream lines.

Since the 500 series on nvidia's side, I believe.

It will likely increase, I'm sure, but I have no reason to think they're going to jump to 4GB for the ~$250 range (for nvidia). VRAM is expensive, and $250 nvidia cards simply can't really use more than 2GB as it is before performance gets so lovely it doesn't matter. But I can see an increase to 3GB and (hopefully) a 384-bit (320?) memory bus. Memory handling and memory-intensive graphics features are nvidia's biggest weakness in mid-range cards, and with the new wave of cheaper 4K monitors we're seeing, 1440p monitors may get pushed into more reasonable price brackets, which would in turn push nvidia to improve memory characteristics across its mid-range.

But who knows really.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Ignoarints posted:

Since the 500 series on nvidia's side, I believe.

It will likely increase, I'm sure, but I have no reason to think they're going to jump to 4GB for the ~$250 range (for nvidia). VRAM is expensive, and $250 nvidia cards simply can't really use more than 2GB as it is before performance gets so lovely it doesn't matter. But I can see an increase to 3GB and (hopefully) a 384-bit (320?) memory bus. Memory handling and memory-intensive graphics features are nvidia's biggest weakness in mid-range cards, and with the new wave of cheaper 4K monitors we're seeing, 1440p monitors may get pushed into more reasonable price brackets, which would in turn push nvidia to improve memory characteristics across its mid-range.

But who knows really.

GTX 500 series cards actually mostly had 1.25GB of VRAM, since they were on a 320-bit bus. I think the first mainstream 2GB VRAM card from nVidia's side was the GTX 660. The Radeon HD 7850 was probably the equivalent card on the AMD side.

I think it's likely, as Ignoarints said, that nVidia moves the 384-bit bus from its high-end cards down to the mid-range. From the benchmarks I've seen, memory bandwidth matters more than raw VRAM size; 2GB of VRAM is never the bottleneck at 1080p except possibly in heavily modded Skyrim. The R9 280X vs. GTX 770 is a good test bench for this: both cards perform very similarly overall, but the R9 280X has much higher memory bandwidth. The GTX 770 tends to perform relatively better at 1080p, while the R9 280X performs relatively better at 1440p and above. If VRAM were the issue, we'd see a ton of games where the GTX 770 has huge frametime-variance issues compared to the R9 280X, which from what I've seen hasn't happened yet, at least at resolutions you'd use a GTX 770 at.
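The bandwidth gap is easy to put numbers on from the published specs (peak theoretical figures; sustained throughput is lower):

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective
# GDDR5 data rate per pin. Spec-sheet values for both cards.
cards = {
    "GTX 770": (256, 7.0),   # 256-bit bus, 7 Gbps GDDR5
    "R9 280X": (384, 6.0),   # 384-bit bus, 6 Gbps GDDR5
}
for name, (bus_bits, gbps) in cards.items():
    print(f"{name}: {bus_bits / 8 * gbps:.0f} GB/s")   # 224 vs 288
```

That's roughly 29% more peak bandwidth for the 280X, which lines up with it pulling ahead as resolution and AA go up.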

Ignoarints
Nov 26, 2010
Whoops, I meant the 600s, sorry. I was actually under the impression that most 500 series cards had just 1.0GB, but I was out of the game when those were out. With the higher memory bus, though, it seemed like they actually ran out of memory before overall performance fell, which was apparent in SLI setups; I attribute some of the dislike of SLI to those cards for that reason (along with all the earlier problems). They seemed a little unbalanced in this way, where today's 2GB/256-bit cards seem much better balanced - but they are still fairly modest compared to equivalent AMD cards.

Like Beautiful Ninja said, any time you see two comparable nvidia and AMD cards, the AMD ones with better memory can really pull away on the things that use it (resolution, AA, etc.). The 770 vs. 280X is the best comparison I know of, especially since there's a 4GB version of the 770 that has been tested thoroughly. The 4GB 770 shows virtually no improvement over the 2GB 770 until you reach resolutions or settings so high that the games are unplayable, whereas the 280X, with its better bandwidth and memory, shows immediate, playable improvements as you increase resolution and AA.

I actually like AMD cards' memory focus quite a bit better, because I like AA a lot and I try to push 1440p as hard as I can now, but I prefer nvidia's driver support (just recently; this may change), its DirectX focus vs. Mantle, and SLI. Frankly, if they made CrossFire just as good I might switch, and they're supposed to be working on it (beyond the 290 fix). This is all just impressions, though.

Ignoarints fucked around with this message at 19:16 on May 11, 2014

right arm
Oct 30, 2011

Ignoarints posted:

Exactly, but that would be excessive :v: A 780 Ti is a reasonable expense, but since he already has a 780 and is willing to SLI, I'd wholeheartedly recommend getting another one if $500 is no issue. 780 SLI tears up a 780 Ti at 1440p and is easily worth the downsides in this case, imo.

Things change at higher resolutions, but 780 SLI scales very well here (almost 50% over a 780 Ti in many games), with a few exceptions. For right arm, I'd look into the games you actually play just to check that they scale well, but that's most likely what I'd do in your shoes for the money.

However (!)

- If you're just looking for a little bump, the 780 Ti easily provides that and will be economical if you sell your 780.
- Even if you're looking for more than a little bump, consider the PSU cost if you don't already have one that can handle two 780s.

Of course, if you really had money to blow, I'd go 780 Ti SLI. Then you might have a chance at max supersampling, which is the prettiest thing imo.

:tipshat:

probably just going to get another 780 unless I sell this one to a buddy who is currently building a pc

veedubfreak
Apr 2, 2005

by Smythe

right arm posted:

:tipshat:

probably just going to get another 780 unless I sell this one to a buddy who is currently building a pc

Things to remember when doing multiple cards.

1) The game has to support SLI/CrossFire. Right now there are a lot of games that don't. Edge: 780 Ti
2) You need a PSU big enough to support the cards. Edge: 780 Ti
3) You can most likely pick up a used 780 for under 300 bucks if you look hard enough. Edge: SLI
4) You'd have to sell the 780, likely at a hefty loss, to buy the 780 Ti. Edge: SLI
5) When you go to upgrade next time around, you have two cards to unload instead of one. Edge: 780 Ti

Now take this with a grain of salt, as I am currently dealing with the issues that come with having multiple 290s because I'm just a glutton for punishment. But if you can get a decent price for your 780, buying a single Ti will give you the best overall performance compared to SLI. You could also pick up a 290X, which easily trades blows with a single 780 Ti, for $200-300 less now that the coin bubble has burst.

Personally, I'm of the mindset to always buy the best card you can afford and then look at adding more cards later when their price goes down. In your case I would go with a 2nd 780 if you can find one cheap.

Also, I recently made a couple of changes. Can anyone spot them?

veedubfreak fucked around with this message at 19:55 on May 12, 2014

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!
New PSU and you finally put your cathodes in?

veedubfreak
Apr 2, 2005

by Smythe
New PSU, and I went to parallel tubing on the video cards and a new CPU block. I was able to back my pumps down to 4 and maintain temps on the video cards. One day I'll locate my soldering iron (or buy a new one) and actually hook up the lights.

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

veedubfreak posted:

New PSU, and I went to parallel tubing on the video cards and a new CPU block. I was able to back my pumps down to 4 and maintain temps on the video cards. One day I'll locate my soldering iron (or buy a new one) and actually hook up the lights.

I have a really really really really old Hakko (or Weller or something) station (digital readout, analogue temp control) that I thought I had lost, yours for shipping if you want. I replaced it with a Hakko 888 when I couldn't find it.

veedubfreak
Apr 2, 2005

by Smythe

deimos posted:

I have a really really really really old Hakko (or Weller or something) station (digital readout, analogue temp control) that I thought I had lost, yours for shipping if you want. I replaced it with a Hakko 888 when I couldn't find it.

I have two controllers in the machine already. One is a 50W analog unit that I use to power the fans on the radiators, and the second is only 30W per channel, so I have it hooked up to the single radiator fan and the water temp monitor. It kicks that fan on when water temps hit 35, which is usually enough to bump temps back down. Thanks though. I was hoping to replace the analog controller, but three fans per channel actually melted the glue on the heatsink of channel 1 ><

Ignoarints
Nov 26, 2010

veedubfreak posted:

Things to remember when doing multiple cards.

1) The game has to support SLI/CrossFire. Right now there are a lot of games that don't. Edge: 780 Ti
2) You need a PSU big enough to support the cards. Edge: 780 Ti
3) You can most likely pick up a used 780 for under 300 bucks if you look hard enough. Edge: SLI
4) You'd have to sell the 780, likely at a hefty loss, to buy the 780 Ti. Edge: SLI
5) When you go to upgrade next time around, you have two cards to unload instead of one. Edge: 780 Ti

Now take this with a grain of salt, as I am currently dealing with the issues that come with having multiple 290s because I'm just a glutton for punishment. But if you can get a decent price for your 780, buying a single Ti will give you the best overall performance compared to SLI. You could also pick up a 290X, which easily trades blows with a single 780 Ti, for $200-300 less now that the coin bubble has burst.

Personally, I'm of the mindset to always buy the best card you can afford and then look at adding more cards later when their price goes down. In your case I would go with a 2nd 780 if you can find one cheap.


What were the games you wanted to play that didn't support multiple GPUs again? I remember it was like every game you wanted to play at the time.

I just got Titanfall for cheap and am annoyed that it still doesn't support SLI, although I can tell it barely needs it. I should have listened when people said 'lovely port'; I feel like I'm playing Halo 1 as far as matchmaking and menus go. So far that's the only game for me that hasn't used SLI, at least. I also got Crysis 3 (just bought someone's Origin account) and that really whips my GPUs. I only really wanted it because it's always in benchmarks.

It got my top card to 94 fuckin' degrees. I never put that backwards fan in because I'm currently distracted by my other, hugely more expensive hobby (moneypit car), but that really motivated me to do something about it. I remember not too long ago the H55 Quiets were on sale for $35; now they cost more than H60s :wtc:

Does anybody have a recommendation for a secondary AIO for one GPU? I just want to drop it 20 degrees, and given its situation pretty much anything should be able to do that. It's just that the only bracket I'm aware of adapts Asetek-style coolers. I guess one that could handle both video cards would be worth the expense.

I do strongly, strongly recommend right arm get a 780 in this particular case, because 780 SLI benchmarks so well over a Ti in a lot of games at 1440p. Wouldn't want to buy the PSU for that, though.

Ignoarints fucked around with this message at 22:04 on May 12, 2014

deptstoremook
Jan 12, 2004
my mom got scared and said "you're moving with your Aunt and Uncle in Bel-Air!"
It was a toss-up between posting this question here or in the monitor thread, but this seemed slightly better.

I have a Radeon HD 5770. This video card has 2 DVI ports, 1 HDMI port, and 1 DisplayPort.

I have three screens, two of which are monitors and the third of which is a TV:
  • Monitor A is on DVI
  • Monitor B is on DVI through a VGA adapter
  • TV is on HDMI through a stereo receiver.

The HD 5770 only supports two simultaneous outputs unless I get some kind of DisplayPort adapter. On my last install of Windows 7, turning on the TV resulted in: TV primary, Monitor A extended (secondary), Monitor B disabled. Turning off the TV reverted to Monitor A primary, Monitor B extended (secondary), TV disabled. This was the ideal behavior, because the TV is for XBMC, video games, et cetera.

This was a pretty neat trick, and I think I somehow took advantage of Windows trying to be "helpful" by enabling/disabling monitors when it gets an HDMI signal. After getting a new hard drive and doing a clean Win7 install, though, I of course can't reproduce the same conditions.

At the moment, if all three displays are plugged in and the TV is on, I get TV primary, Monitor A secondary, Monitor B disabled, like I want. However, if I turn off the TV, all three displays are disabled. I've tried booting with quite a few configurations of plugs and display settings, and screwed around with AMD Catalyst a bit. It seems like there's a logic-puzzle element here that I can't quite figure out.

I bet other people have or would like the same setup, so I'd appreciate any thoughts you all may have. Thanks!
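Not a fix, but one way to take the guesswork out of diagnosing it: a small Python sketch (using ctypes over the real Win32 EnumDisplayDevices API) that prints which outputs Windows currently considers attached and primary. Running it with the TV on, then off, should show exactly what Windows thinks changed:

```python
# Diagnostic: list Windows display devices and their state flags.
# Windows-only; uses the Win32 EnumDisplayDevices API via ctypes.
import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32

class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

ATTACHED = 0x1  # DISPLAY_DEVICE_ATTACHED_TO_DESKTOP
PRIMARY = 0x4   # DISPLAY_DEVICE_PRIMARY_DEVICE

i = 0
while True:
    dd = DISPLAY_DEVICEW()
    dd.cb = ctypes.sizeof(dd)
    if not user32.EnumDisplayDevicesW(None, i, ctypes.byref(dd), 0):
        break
    state = [s for flag, s in ((ATTACHED, "active"), (PRIMARY, "primary"))
             if dd.StateFlags & flag]
    print(f"{dd.DeviceName}: {dd.DeviceString} [{', '.join(state) or 'inactive'}]")
    i += 1
```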

veedubfreak
Apr 2, 2005

by Smythe

Ignoarints posted:

What were the games you wanted to play that didn't support multiple GPUs again? I remember it was like every game you wanted to play at the time.

I just got Titanfall for cheap and am annoyed that it still doesn't support SLI, although I can tell it barely needs it. I should have listened when people said 'lovely port'; I feel like I'm playing Halo 1 as far as matchmaking and menus go. So far that's the only game for me that hasn't used SLI, at least. I also got Crysis 3 (just bought someone's Origin account) and that really whips my GPUs. I only really wanted it because it's always in benchmarks.

It got my top card to 94 fuckin' degrees. I never put that backwards fan in because I'm currently distracted by my other, hugely more expensive hobby (moneypit car), but that really motivated me to do something about it. I remember not too long ago the H55 Quiets were on sale for $35; now they cost more than H60s :wtc:

Does anybody have a recommendation for a secondary AIO for one GPU? I just want to drop it 20 degrees, and given its situation pretty much anything should be able to do that. It's just that the only bracket I'm aware of adapts Asetek-style coolers. I guess one that could handle both video cards would be worth the expense.

I do strongly, strongly recommend right arm get a 780 in this particular case, because 780 SLI benchmarks so well over a Ti in a lot of games at 1440p. Wouldn't want to buy the PSU for that, though.

Currently, it's still MW:O; gently caress that game for continuing to be entertaining despite :pgi: Titanfall I haven't played in a few weeks. D3 doesn't need multicard support since I only play it on one monitor. I did manage to find an old code that worked for Crysis 3; I should get around to perusing that. I still have Far Cry 3 sitting unused too. What's funny is that all the AMD codes from last year still work. Oh, right, I should see if Thief supports CrossFire too. Too many games, so little ability to give a gently caress :( My attention span has gotten so short these days that about the only thing I seem to play is MW:O while getting completely hammered and bullshitting with random people on TS, since no goons play anymore.

Animal
Apr 8, 2003

All I play is World of Tanks, which only supports one GPU, so I will be selling one of my 780s if anyone is interested.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Animal posted:

All I play is World of Tanks, which only supports one GPU, so I will be selling one of my 780s if anyone is interested.

Which 780 is it?

Animal
Apr 8, 2003

One is an EVGA and the other a Gigabyte; both are reference models and look identical, with slightly different stock clocks, but they boost to the same speed, at least under SLI. I'll make up my mind tomorrow when I get back home from work and then post images.

Ignoarints
Nov 26, 2010
I just got this really bad urge to sell my 770s and buy a 780 Ti so that I can SLI it later. :ughh:

My brain is automatically justifying it like

"okay you like nvidia a bit more, but this 1440p monitor is stressing 770 memory bus even in SLI, so obviously the only answer is 780ti"

"yeah even though 770 sli is almost always better, single cards play better when they're at their limit so maybe it will be ok for a while"

"maybe this way I won't have to watercool a top card, that's savings!"

"super sampling rules, you want it"

I hate myself


Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
VR-Zone is reporting that nVidia's dual-GPU GTX Titan Z has been canceled, likely because nVidia was not willing to launch it at a price that would be competitive with the $1500 Radeon R9 295X2. This makes some sense given that nVidia was targeting $3000 for the GTX Titan Z.
