Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

movax posted:

Agreed - I totally did get your PM but got carried away fixing my DNS and hosting, which I discovered were broken when the OP lacked any images whatsoever :downs:

Updated OP and thread title, sorry for the delay :angel:

I forgot about that too! Oh sweet as poo poo I'm in the OP, I am an official star! Woohooo :c00lbert:

Animal
Apr 8, 2003

The new Nvidia drivers give me no display on Win8 with a GeForce 780. Time for the odyssey of getting into Win8 safe mode and uninstalling them.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

MondayHotDog posted:

I should have given more details. Gaming is only a very small part of what the machine will be used for. Its main purpose is just web browsing & digital painting. But it's going to be my only computer, and I spend a lot of time on it, and people say the most significant upgrade you can buy is an SSD. If anything, I was considering keeping the SSD and going with an i3 instead of the i5.

About the video card: there are only 3 modern games I'm concerned with - Skyrim, Deus Ex: HR, and Civ 5. All three can run at 30 FPS at decent settings with a $100 card. There are no other new games or upcoming releases that interest me. I mostly play older games, and I'm not going to spend $150 more than I have to. Now, when Fallout 4 comes out I'll be upgrading to a proper card, but that probably won't be for a while. That's why I said "for the time being".
Well that makes things a bit different. Under those circumstances, I retract my advice.

I will just say, though, that Civ 5 is very demanding on the CPU and very highly threaded, so I'd keep the i5 if you can afford it.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Animal posted:

The new Nvidia drivers give me no display on Win8 with a GeForce 780. Time for the odyssey of getting into Win8 safe mode and uninstalling them.

How long did you give them to install? This branch seems to take for loving ever to install with GTX 780s. Has since the 320. I thought it wasn't installing correctly until one time I finally just got distracted and left it on the black screen, with my monitor running through its input options, for long enough for it to do whatever it was doing, and came back to a freshly installed driver and a request to restart. It took way, way longer than any pre-320 drivers, ever.

Monday_
Feb 18, 2006

Worked-up silent dork without sex ability seeks oblivion and demise.
The Great Twist

The Lord Bude posted:

Well that makes things a bit different. Under those circumstances, I retract my advice.

I will just say, though, that Civ 5 is very demanding on the CPU and very highly threaded, so I'd keep the i5 if you can afford it.

My fault for not providing enough info in my first post. And yeah, I'm afraid of the i3 for both Civ and the fact that Fallout 4 will be developed primarily for PS4/XBone. But I'll be going with that and whatever GPU can handle it well whenever it's released. Hopefully the nVidia 800 series.

Animal
Apr 8, 2003

Agreed posted:

How long did you give them to install? This branch seems to take for loving ever to install with GTX 780s. Has since the 320. I thought it wasn't installing correctly until one time I finally just got distracted and left it on the black screen, with my monitor running through its input options, for long enough for it to do whatever it was doing, and came back to a freshly installed driver and a request to restart. It took way, way longer than any pre-320 drivers, ever.

About three minutes... I can try again, but if it's an actual problem then I've got to go through the hassle of getting into Windows 8 safe mode again. On the other hand, 320.49 takes 30 seconds at the most.

-edit-
Another guy on HardOCP is having the same problem; it seems the driver can't recognize Catleap 1440p monitors. Weird.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

In a few minutes I'll give it a shot and see if it has issues with installing on my setup, with a Samsung 1080p green monitor with a red bezel, the name of which I cannot remember :saddowns:

Went fine, like the 314 drivers and before. Close Precision, install, didn't even worry about a Clean Install this time - a few minutes and a couple flashes later, no issues at all installing. Sorry no help this go 'round :ohdear:

Agreed fucked around with this message at 07:14 on Jul 19, 2013

td4guy
Jun 13, 2005

I always hated that guy.

movax posted:

Updated OP and thread title, sorry for the delay :angel:
This is a small thing, but the OP still lists the GTX 680 as the fastest single-GPU card on the market in a couple of spots.

Animal
Apr 8, 2003

Agreed posted:

In a few minutes I'll give it a shot and see if it has issues with installing on my setup, with a Samsung 1080p green monitor with a red bezel, the name of which I cannot remember :saddowns:

Went fine, like the 314 drivers and before. Close Precision, install, didn't even worry about a Clean Install this time - a few minutes and a couple flashes later, no issues at all installing. Sorry no help this go 'round :ohdear:

The drivers don't work with Korean monitors. Plenty of people have confirmed it. They are racist drivers.

Aphrodite
Jun 27, 2006

Nvidia just pushing the stereotype that Asian = Bad driver.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
That's weird. Some kind of HDCP thing...?

Aphrodite posted:

Nvidia just pushing the stereotype that Asian = Bad driver.

:golfclap:

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Animal posted:

The drivers don't work with Korean monitors. Plenty of people have confirmed it. They are racist drivers.

Welp. There's one for the AMD vs. nVidia driver war. HOPE YOU DON'T HAVE A GOOD CHEAP MONITOR lol

Aphrodite posted:

Nvidia just pushing the stereotype that Asian = Bad driver.

:bravo:

Rosoboronexport
Jun 14, 2006

Get in the bath, baby!
Ramrod XTreme
I'm not sure if this should be in HoTS or here, but does anyone have experience with the HD5650 Mobility + HD4200 IGP driver combo?

My wife has an HP consumer laptop with a Turion P520, an HD5650 Mobility, and an HD4200 through PowerXpress v2 (the one where you choose the GPU manually from a menu, or automatically when changing between battery and AC power). Now, as AMD switched the HD4xxx series to legacy drivers, there has been no unified driver package since Catalyst 12.4, which is currently installed on the laptop. Therefore, if I go through the autodetect route it will perform a new driver install but will not detect either GPU after reboot.

Is it possible to just update the individual drivers through Windows Device Manager, or is that asking for more trouble (OS is Win 7 x64)? The integrated GPU cannot be disabled in the BIOS.

Some fellow on the Guru3D forums reported that he had found a 13.1 "reloaded" driver package which seemed to work for him. Does anyone else have experience with mixing current and legacy drivers (comedy option: wait for the HD5xxx series to move to legacy drivers)?

The combo is quite powerful for a 3-year-old media laptop; even Crysis 3 is close to playable at native 1366x768 resolution on low details. I'm testing Bioshock Infinite next. Also, cleaning the cooling system helped immensely: while the fan noise changed in pitch, temps dropped by close to 15 Celsius.

MagusDraco
Nov 11, 2011

even speedwagon was trolled

Rosoboronexport posted:

I'm not sure if this should be in HoTS or here, but does anyone have experience with the HD5650 Mobility + HD4200 IGP driver combo?

My wife has an HP consumer laptop with a Turion P520, an HD5650 Mobility, and an HD4200 through PowerXpress v2 (the one where you choose the GPU manually from a menu, or automatically when changing between battery and AC power). Now, as AMD switched the HD4xxx series to legacy drivers, there has been no unified driver package since Catalyst 12.4, which is currently installed on the laptop. Therefore, if I go through the autodetect route it will perform a new driver install but will not detect either GPU after reboot.

Is it possible to just update the individual drivers through Windows Device Manager, or is that asking for more trouble (OS is Win 7 x64)? The integrated GPU cannot be disabled in the BIOS.

Some fellow on the Guru3D forums reported that he had found a 13.1 "reloaded" driver package which seemed to work for him. Does anyone else have experience with mixing current and legacy drivers (comedy option: wait for the HD5xxx series to move to legacy drivers)?

The combo is quite powerful for a 3-year-old media laptop; even Crysis 3 is close to playable at native 1366x768 resolution on low details. I'm testing Bioshock Infinite next. Also, cleaning the cooling system helped immensely: while the fan noise changed in pitch, temps dropped by close to 15 Celsius.

I have an HP Envy 14 with an HD 5650 Mobility and whatever iGPU is built into an i5-520M. Swapping is manual, like with your computer.

The only way I can update drivers is manually: go "latest HP official drivers" >>> manual download of ATI 11.2 >>> manual download of something in ATI 12.x like 12.4 >>> manual download of ATI 13.4.

The iGPU cannot be updated on my computer beyond the latest HP official drivers.

e: If I don't use ATI 11.2 my computer crashes when updating the drivers because well...HP might have done some weird things with this laptop. The initial set of drivers wouldn't let you update to even the next set of official drivers without killing the display. You'd have to just wait ten minutes or so and then reboot to get the screen back (the drivers would be updated if you waited long enough).

e the 2nd: A manual update for 13.4 is over here: http://www.guru3d.com/files_details/amd_catalyst_13_4_whql_download.html - go pick the Mobility Win8/7/Vista 64-bit driver.

MagusDraco fucked around with this message at 19:03 on Jul 20, 2013

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva
Reminder for anyone using MSI Afterburner: Beta version 3.10 is expiring today (7/21/13) so grab the new version here:

http://event.msi.com/vga/afterburner/download.htm

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Precision X also just updated from 4.2.0 to 4.2.1, for anyone using that. Nab it - there are updates to the RivaTuner backend that supports the monitoring and screen capture functions, which probably fix things we didn't really know were broken :v:

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

Agreed posted:

Precision X also just updated from 4.2.0 to 4.2.1, for anyone using that. Nab it - there are updates to the RivaTuner backend that supports the monitoring and screen capture functions, which probably fix things we didn't really know were broken :v:

Does the update fix Precision X's total inability to not crash at least once a day? That's why I stopped using it.

SomekindaZombie
Sep 16, 2002
Of course this is anecdotal, but I've never had Precision X crash on me with version 3 or 4.

fookolt
Mar 13, 2012

Where there is power
There is resistance

SomekindaZombie posted:

Of course this is anecdotal, but I've never had Precision X crash on me with version 3 or 4.

Hah, I hadn't either, until I literally came back to my computer after a day out and it said EVGA Precision X had crashed.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

SomekindaZombie posted:

Of course this is anecdotal, but I've never had Precision X crash on me with version 3 or 4.

This was last November, when I got my 680.

Rosoboronexport
Jun 14, 2006

Get in the bath, baby!
Ramrod XTreme

havenwaters posted:

I have an HP Envy 14 with an HD 5650 Mobility and whatever iGPU is built into an i5-520M. Swapping is manual, like with your computer.

Your situation is a bit easier than mine, as your integrated GPU uses Intel drivers and isn't AMD as well (like in my case).

The situation is solved (sorta): The method for at least this laptop was as follows:
1. Get both the legacy and normal drivers and extract them to two different directories.
2. Install the HD5650 driver manually from Device Manager -> update driver.
3. Install the legacy driver manually for the HD4200 (and for some reason I had to use the "specify my own driver" option as it had some certificate issues).
4. Start the CCC installation from 13.4 Folder\packages\binaries\setup.exe (the setup.exe in the main dir won't work) and choose custom installation. Untick the video driver from the installed components and install the latest CCC and other applications.
(5. Optional: if after all this you get a kdbsync.exe crash on startup, you need to disable OpenCL by uninstalling APP SDK & Runtime from the Catalyst package manager. I'm not sure if OpenCL is broken in that driver pack anyway.)

I guess you can skip step 3 if you only update the HD5650 driver, but now everything else except OpenCL works.

Edit: according to some bitcoin forums this might be due to poorly uninstalled CCC drivers/settings. If I have the time later I might try to reinstall everything from scratch.

Rosoboronexport fucked around with this message at 12:45 on Jul 21, 2013
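
Not part of Rosoboronexport's steps, but a quick way to sanity-check a mixed legacy/current install like the one above is to ask Windows which driver version each adapter actually ended up with. A minimal sketch, assuming Python and the built-in wmic tool are available on the machine; the adapter names and versions it prints will obviously differ per laptop:

code:
# Minimal sketch: list every display adapter Windows knows about, with the
# driver version it ended up with after the manual install. Assumes the
# built-in wmic tool is present (it normally is on Win 7 x64).
import subprocess

def list_display_adapters():
    out = subprocess.check_output(
        ["wmic", "path", "Win32_VideoController",
         "get", "Name,DriverVersion", "/format:list"],
        universal_newlines=True,
    )
    adapters, current = [], {}
    for line in out.splitlines():
        line = line.strip()
        if not line:                      # blank line separates adapters
            if current:
                adapters.append(current)
                current = {}
            continue
        key, _, value = line.partition("=")
        current[key] = value
    if current:
        adapters.append(current)
    return adapters

if __name__ == "__main__":
    for gpu in list_display_adapters():
        # prints something like "Name: driver version" for each GPU (names vary per machine)
        print("%s: driver %s" % (gpu.get("Name"), gpu.get("DriverVersion")))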

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
This is a weeeeeird bug I'm getting with Nvidia's 326.19 drivers. It seems like the frame timer is running at half speed. As in, all Windows Aero animations are running at half speed, and games which increment frame counters for sim timing (like Kerbal Space Program) run at half speed.

I hope a reboot fixes this. E: It does.
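
For anyone wondering why a misbehaving frame timer shows up as half-speed gameplay rather than just a lower frame rate: a sim that advances a fixed amount per rendered frame inherits whatever the frame pacing does, while a delta-time sim doesn't. A toy sketch to illustrate the difference (purely illustrative, not how KSP or Aero is actually implemented):

code:
# Toy illustration of why fixed-step-per-frame sims run at half speed when the
# frame timer / frame rate halves, while delta-time-based sims do not.
def fixed_step_sim(frame_count, step_per_frame=1.0 / 60.0):
    """Advance the simulation a fixed amount every rendered frame."""
    return frame_count * step_per_frame      # sim time depends only on frame count

def delta_time_sim(frame_times):
    """Advance the simulation by the measured time between frames."""
    return sum(frame_times)                  # sim time tracks wall-clock time

one_second_at_60fps = [1.0 / 60.0] * 60
one_second_at_30fps = [1.0 / 30.0] * 30      # frame timer effectively halved

print(fixed_step_sim(len(one_second_at_60fps)))   # ~1.0 -> normal speed
print(fixed_step_sim(len(one_second_at_30fps)))   # ~0.5 -> everything at half speed
print(delta_time_sim(one_second_at_30fps))        # ~1.0 -> speed unaffected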

One Eye Open
Sep 19, 2006
Am I awake?

Rosoboronexport posted:

Your situation is a bit easier than mine, as your integrated GPU uses Intel drivers and isn't AMD as well (like in my case).

The situation is solved (sorta): The method for at least this laptop was as follows:
1. Get both the legacy and normal drivers and extract them to two different directories.
2. Install the HD5650 driver manually from Device Manager -> update driver.
3. Install the legacy driver manually for the HD4200 (and for some reason I had to use the "specify my own driver" option as it had some certificate issues).
4. Start the CCC installation from 13.4 Folder\packages\binaries\setup.exe (the setup.exe in the main dir won't work) and choose custom installation. Untick the video driver from the installed components and install the latest CCC and other applications.
(5. Optional: if after all this you get a kdbsync.exe crash on startup, you need to disable OpenCL by uninstalling APP SDK & Runtime from the Catalyst package manager. I'm not sure if OpenCL is broken in that driver pack anyway.)

I guess you can skip step 3 if you only update the HD5650 driver, but now everything else except OpenCL works.

Edit: according to some bitcoin forums this might be due to poorly uninstalled CCC drivers/settings. If I have the time later I might try to reinstall everything from scratch.

I've been in a similar situation with my HD4250 chipset and an HD6470 discrete in my HP laptop. Unfortunately any update after 11.10 (manual or automatic, clean or upgrade) either bluescreens or locks the HD6470 to 100 MHz core with 150 MHz memory (as opposed to the 800/900 it usually runs at), and none of the fixes on the web do anything about it. Also I need OpenCL, so I can't uninstall it. 'Tis annoying.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

The Lord Bude posted:

Does the update fix Precision X's total inability to not crash at least once a day? That's why I stopped using it.

Symptomatic of a current driver bug. Precision X crashes; MSI's Afterburner rides it out (?). Tends to happen between the 24- and 36-hour mark of uptime.

Rosoboronexport
Jun 14, 2006

Get in the bath, baby!
Ramrod XTreme

One Eye Open posted:

I've been in a similar situation with my HD4250 chipset and an HD6470 discrete in my HP laptop. Unfortunately any update after 11.10 (manual or automatic, clean or upgrade) either bluescreens or locks the HD6470 to 100 MHz core with 150 MHz memory (as opposed to the 800/900 it usually runs at), and none of the fixes on the web do anything about it. Also I need OpenCL, so I can't uninstall it. 'Tis annoying.

Well, the 12.4 Mobility is the latest driver that worked for this laptop without any hassle, as it contained both HD4- and HD5-series drivers. You could try uninstalling the current drivers and then running the AMD Catalyst Uninstall Utility: http://sites.amd.com/us/game/downloads/Pages/catalyst-uninstall-utility.aspx

When the HD6470 is locked to a low frequency, have you checked the power-saving settings? If you put the video card into power-saving mode, it will only clock very low (this HP laptop will go to 300/300 instead of 550/800).

One Eye Open
Sep 19, 2006
Am I awake?

Rosoboronexport posted:

Well, the 12.4 Mobility is the latest driver that worked for this laptop without any hassle, as it contained both HD4- and HD5-series drivers. You could try uninstalling the current drivers and then running the AMD Catalyst Uninstall Utility: http://sites.amd.com/us/game/downloads/Pages/catalyst-uninstall-utility.aspx

When the HD6470 is locked to a low frequency, have you checked the power-saving settings? If you put the video card into power-saving mode, it will only clock very low (this HP laptop will go to 300/300 instead of 550/800).

I just installed that, and it went down to 100/150 MHz again, so I toggled PowerPlay off and on (as I did on previous occasions), since that's the common problem with some of the 6-series according to troubleshooting forums - it locks into the low-power mode whether PowerPlay is on or off. As a last resort before reinstalling 11.10, I rebooted and tried toggling PowerPlay again, but this time it worked! Thank you so much :hfive:


Edit: And now it's locked back down to 100/150 again :( I'd better go back to the old drivers.

One Eye Open fucked around with this message at 00:22 on Jul 23, 2013

Momonari kun
Apr 6, 2002
Yes, you needed video.
Kind of a random question, but my graphics card is really old (Radeon HD2600 Pro, yeah, it's old) so I can't test this feature: I've heard that newer cards have scaling built in that looks nicer than the scaling in most LCDs. How much better is it, and are there any screenshot comparisons out there? The reason I'm asking is that I'm deciding between a 27-inch (1440p) and a 24-inch (1080p) monitor, but I don't want to spend a whole lot of money on a graphics card (thinking the GeForce 760). I mostly play older games, but if some newer ones look decent with scaling (1080->1440), then that would be all I care about.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Way better - please take my word for it, as my iPhone's camera is incapable of capturing the difference and you really need to see the whole picture rather than just trying to get a feel for it in a screenshot.

Most LCDs just have obligatory nearest-neighbor scaling built in, which sucks and looks awful. GPUs do it with proper filtering, nearly for free, and it looks great.
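
If you want to preview the difference without the hardware, you can fake it in software: upscale a 1080p screenshot to 1440p with nearest-neighbor (roughly what a cheap monitor scaler does) and again with a proper resampling filter (closer to what the GPU scaler does). A rough sketch using Pillow, which is my choice here rather than anything from the thread; real scaler quality still varies by monitor and driver:

code:
# Rough preview of monitor-style vs GPU-style upscaling from 1080p to 1440p.
# Uses Pillow (an assumption on my part); the input filename is hypothetical.
from PIL import Image

src = Image.open("screenshot_1920x1080.png")   # any 1080p game screenshot
target = (2560, 1440)

src.resize(target, Image.NEAREST).save("scaled_nearest.png")   # blocky, shimmery edges
src.resize(target, Image.LANCZOS).save("scaled_lanczos.png")   # much smoother result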

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Quadro K6000 announcement. Now witness the firepower of this fully armed and operational battle-GPU. 15 SMX GK110 (2880 shaders) running at about 900 MHz with full FP64 performance and 12 GB of 6 GHz ECC GDDR5.

In a 225W TDP.
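
To put those specs in rough numbers - a quick back-of-the-envelope sketch; the 384-bit bus width and the 1/3-rate FP64 are my assumptions about GK110, not figures from the announcement:

code:
# Back-of-the-envelope throughput for the K6000 from the specs above.
shaders    = 2880       # 15 SMX x 192 CUDA cores
core_clock = 0.9e9      # ~900 MHz
mem_clock  = 6e9        # 6 GHz effective GDDR5
bus_width  = 384        # bits (assumed for GK110)

fp32_tflops   = shaders * 2 * core_clock / 1e12   # 2 FLOPs/core/clock (FMA)
fp64_tflops   = fp32_tflops / 3                   # assumed 1/3 rate on GK110
bandwidth_gbs = mem_clock * bus_width / 8 / 1e9

print("FP32: %.1f TFLOPS" % fp32_tflops)          # ~5.2
print("FP64: %.1f TFLOPS" % fp64_tflops)          # ~1.7
print("Bandwidth: %.0f GB/s" % bandwidth_gbs)     # ~288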

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Okay now, what? It has 12GB of ECC GDDR5 :psyduck: It has the most transistors of any GPU out there - how does it somehow fit inside a thermal envelope that makes a GTX 770 work? I don't understand this at all; they're wizards in real life.

Edit: Real talk, all the 780s having really really low ASIC scores kinda makes sense now. EVERYTHING would have to be super high leakage compared to that magnificent bastard of a card.

Agreed fucked around with this message at 17:09 on Jul 23, 2013

Parker Lewis
Jan 4, 2006

Can't Lose


PhysX chat:

I wasn't happy with the performance of Borderlands 2 on a GTX 760 with PhysX on high (it was often dropping from a solid 60 FPS down to the 30s in heavy combat, especially multiplayer) so I started looking into getting a cheap second video card to use as a dedicated PhysX card.

I settled on a used Galaxy GT 640 for $40 on eBay because it seemed to have the most CUDA cores in that price range and didn't require me to run any extra PCI-E power cables or anything.

I don't have anything other than MSI Afterburner's FPS overlay to go on, but adding a dedicated PhysX card did make the FPS drops less noticeable, although it did not eliminate them completely as I'd hoped.

Preliminary research indicates that there are probably just problems with the way PhysX is implemented in Borderlands 2, as people with PC specs far beyond mine seem to still have the same problem, and there might be some sort of CPU-bound issue as well.

I haven't had a chance to check out other PhysX games like Metro or Arkham City yet. In theory the 760 should probably be able to handle everything on its own but I sort of like the idea of having a card dedicated to PhysX (for the few games that can use it) and for $40 I guess I'll hang onto the GT 640.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
I think the recommended PhysX card is the 650 Ti, if I am remembering right.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Dogen posted:

I think the recommended PhysX card is the 650 Ti, if I am remembering right.

Performance analogous to a 560Ti in graphics, better performance in PhysX. Yep. Runs quiet and cool, too. No performance loss with the highest possible PhysX settings (.ini adjusted), using a GTX 780 to render and the 650Ti as a headless coprocessor.

I had thought about a 650 when it was on sale for $80, but after that sale ended I opted for a (very slightly used, not even registered yet) EVGA 650Ti for superior performance. It overclocks well and doesn't use the boost stuff, so it's kind of a trip down memory lane there; it's about as low as I would recommend going for "I want all the PhysX!", and it didn't reduce my FPS in games or my score in PhysX benchmarks compared to using a GTX 580 as the PhysX card. For reference, the GTX 580 is ludicrously overpowered for simple CUDA workloads like PhysX and that was just a waste of a good card. Hope Flute is enjoying it for its real purpose :)

FluidMark might run faster if I unlock emitters and run twenty or thirty of them at once on a 580 over a 650Ti, but no game has that absurd level of resource utilization for the coprocessor. FluidMark is the Furmark of PhysX.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
The problem with the GT 640 as a PhysX card is that it can be hard to know what you're getting, especially used. Depending on the model, GT 640s ranged from ~400-800 GFLOPS, the GTX 650 Ti is about 1400, and the GTX 650 Ti Boost about 1500 (but with more ROPs and memory bandwidth). I don't know if we ever determined how much compute performance was required for PhysX in modern games, but it seems wise to leave some breathing room. Additionally, a card with more capabilities (particularly a GTX 650 Ti Boost) can be used as a gaming card in a pinch.
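
For reference, those GFLOPS figures fall straight out of CUDA cores x 2 FLOPs per clock (fused multiply-add) x shader clock. A quick sketch; the core counts and clocks below are my assumptions for the common retail variants, so treat the exact numbers as illustrative rather than gospel:

code:
# Peak single-precision throughput = CUDA cores * 2 FLOPs/clock * shader clock.
# Core counts and clocks are assumed values for common variants, not from the post.
cards = {
    "GT 640 (OEM, Fermi GF116)": (144, 1.44),   # Fermi shaders run at the hot clock
    "GT 640 (GK107, GDDR5)":     (384, 0.90),
    "GTX 650 Ti":                (768, 0.925),
    "GTX 650 Ti Boost":          (768, 0.98),
}

for name, (cores, clock_ghz) in cards.items():
    gflops = cores * 2 * clock_ghz
    print("%-28s ~%4.0f GFLOPS" % (name, gflops))
# prints roughly 415, 691, 1421 and 1505 - i.e. the ~400-800 vs ~1400/~1500 split above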

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Alereon posted:

The problem with the GT 640 as a PhysX card is that it can be hard to know what you're getting, especially used. Depending on the model, GT 640s ranged from ~400-800 GFLOPS, the GTX 650 Ti is about 1400, and the GTX 650 Ti Boost about 1500 (but with more ROPs and memory bandwidth). I don't know if we ever determined how much compute performance was required for PhysX in modern games, but it seems wise to leave some breathing room. Additionally, a card with more capabilities (particularly a GTX 650 Ti Boost) can be used as a gaming card in a pinch.

I nabbed a free 20% or so overclock out of the core and upped the VRAM baseclock by over 200MHz on the 650Ti; while it lacks the memory bandwidth of the Boost and won't perform like a true price:performance gaming card, it will outrun a 560Ti and that's still good enough at 1080p that if my 780 ever shits the bed I won't be hosed 'til I get it fixed :)

I can link some benches if you're interested.

Gonkish
May 19, 2004

All this talk about using the 560 Ti as a PhysX card is making me jealous. I'm stuck with one until next month when I can finally get my hands on a 770.

Now, though, I have one final question, and it's a doozy: 2GB, or 4GB? I'm at 1080p. The 770 is already overkill, but, frankly, I like overkill (to an extent, I'm not going towards the 780 :P). 4GB is, obviously, a bit more "future-proof", but it comes with a premium. The big thing is that, for example, EVGA's ACX cooler 2gb 770s are only about $30 cheaper than Gigabyte's 4GB 770 (w/ Windforce 3x). I can shoot for MSI's 2gb Gaming edition Twin Frozr thing, for $409. ASUS's 2gb 770 is $405.

Really, my question is, outside of a heavily modified Skyrim, do I really need 4GB of VRAM at 1080p? I'm reasoning that I'll probably be replacing this card in 3 or so years anyway, so... well... the hell do I do? GOONS! HELP ME! I AM INDECISIVE!

Gonkish fucked around with this message at 05:07 on Jul 24, 2013

Parker Lewis
Jan 4, 2006

Can't Lose


Tonight I took out the GT 640 and put in my old 660 Ti that I still haven't gotten around to selling yet to try that out as the PhysX card.

The combination of the GTX 760 as the main card and the 660 Ti as the PhysX card wasn't noticeably better than GTX 760 + GT 640 (or really even just the 760 on its own).

It also doesn't make any sense to me to pair a 760 with a second card that costs more than the $40 I spent on the 640 because I'd quickly start approaching the price where I could have just gotten a 770 and enjoyed better performance in all my games, not just the handful of PhysX ones.

My conclusion is that it's not worth bothering with a second card to dedicate to PhysX, particularly for Borderlands 2 which seems to have issues that prevent it from maintaining 60 FPS in heavy PhysX scenes regardless of hardware.

Parker Lewis fucked around with this message at 05:12 on Jul 24, 2013

Parker Lewis
Jan 4, 2006

Can't Lose


Gonkish posted:

All this talk about using the 560 Ti as a PhysX card is making me jealous. I'm stuck with one until next month when I can finally get my hands on a 770.

Now, though, I have one final question, and it's a doozy: 2GB, or 4GB? I'm at 1080p. The 770 is already overkill, but, frankly, I like overkill (to an extent, I'm not going towards the 780 :P). 4GB is, obviously, a bit more "future-proof", but it comes with a premium. The big thing is that, for example, EVGA's ACX cooler 2gb 770s are only about $30 cheaper than Gigabyte's 4GB 770 (w/ Windforce 3x). I can shoot for MSI's 2gb Gaming edition Twin Frozr thing, for $409. ASUS's 2gb 770 is $405.

Really, my question is, outside of a heavily modified Skyrim, do I really need 4GB of VRAM at 1080p? I'm reasoning that I'll probably be replacing this card in 3 or so years anyway, so... well... the hell do I do? GOONS! HELP ME! I AM INDECISIVE!

For 1080p I'd personally go for one of the 2GB $400 cards, and I've had good luck with the MSI Twin Frozr series over the past few years. 4GB smells too much like "futureproofing" to me and I'd rather get the 2GB card now and trade it in towards a 4GB card in a year or two when I actually need the extra memory (plus that future card will be faster than the 770).
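
Rough arithmetic on why 2GB goes a long way at 1080p: the framebuffers and render targets themselves are small, so most VRAM pressure comes from texture resolution - which is exactly what heavy Skyrim modding inflates. A sketch with an assumed (and fairly generous) set of render targets, not any particular game's actual setup:

code:
# Rough VRAM arithmetic at 1080p. Buffer counts are an assumed, generous
# deferred-rendering setup, not any specific game's numbers.
width, height = 1920, 1080
bytes_per_pixel = 4          # RGBA8 / D24S8

render_targets = 8           # back buffers + G-buffer + post-FX targets (assumed)
msaa = 4                     # pretend every target is 4x multisampled (assumed)

rt_mb = width * height * bytes_per_pixel * render_targets * msaa / 1024.0**2
print("Render targets: ~%.0f MB" % rt_mb)                                  # ~253 MB
print("Left over from 2 GB for textures/geometry: ~%.0f MB" % (2048 - rt_mb))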

Gonkish
May 19, 2004

Parker Lewis posted:

For 1080p I'd personally go for one of the 2GB $400 cards, and I've had good luck with the MSI Twin Frozr series over the past few years. 4GB smells too much like "futureproofing" to me and I'd rather get the 2GB card now and trade it in towards a 4GB card in a year or two when I actually need the extra memory (plus that future card will be faster than the 770).

Yeah, I was thinking about it earlier and was wondering what 4gb really gets me right now.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Parker Lewis posted:

My conclusion is that it's not worth bothering with a second card to dedicate to PhysX, particularly for Borderlands 2 which seems to have issues that prevent it from maintaining 60 FPS in heavy PhysX scenes regardless of hardware.

I dunno, my 780 and 650Ti combo does not have this problem. At all. I can make so much gunk fly you'd think I were doing some horrible monster-bukkake game and yet it runs a solid 60 vsync'd, or just takes the gently caress off to crazy-town FPS-wise if uncapped.
