|
movax posted:Agreed - I totally did get your PM, but got carried away fixing my DNS and hosting, which I discovered was broken when the OP lacked any images whatsoever. I forgot about that too! Oh sweet as poo poo I'm in the OP, I am an official star! Woohooo
|
# ? Jul 19, 2013 02:28 |
|
The new nvidia drivers give me no display on Win8 with a GeForce 780. Time to embark on the odyssey of getting into Win8 safe mode and uninstalling them.
|
# ? Jul 19, 2013 03:39 |
|
MondayHotDog posted:I should have given more details. Gaming is only a very small part of what the machine will be used for. Its main purpose is just web browsing & digital painting. But it's going to be my only computer, and I spend a lot of time on it, and people say the most significant upgrade you can buy is an SSD. If anything, I was considering keeping the SSD and going with an i3 instead of the i5. I will just say, though, that Civ 5 is very demanding on the CPU and very highly threaded; I'd keep the i5 if you can afford it.
|
# ? Jul 19, 2013 03:46 |
|
Animal posted:The new nvidia drivers give me no display on Win8 with a GeForce 780. Time to embark on the odyssey of getting into Win8 safe mode and uninstalling them. How long did you give them to install? This branch seems to take for loving ever to install with GTX 780s. Has since the 320. I thought it wasn't installing correctly until one time I finally just got distracted and left it on the black screen, as my monitor ran through its input options, for long enough for it to do whatever it was doing, and came back to a freshly installed driver and a request to restart. It took way, way longer than any pre-320 drivers, ever.
|
# ? Jul 19, 2013 03:49 |
|
The Lord Bude posted:Well that makes things a bit different. Under those circumstances, I retract my advice. My fault for not providing enough info in my first post. And yeah, I'm afraid of the i3 for both Civ and the fact that Fallout 4 will be developed primarily for PS4/XBone. But I'll be going with that and whatever GPU can handle it well whenever it's released. Hopefully the nVidia 800 series.
|
# ? Jul 19, 2013 03:56 |
|
Agreed posted:How long did you give them to install? This branch seems to take for loving ever to install with GTX 780s. Has since the 320. I thought it wasn't installing correctly until one time I finally just got distracted and left it on the black screen as my monitor ran through its input options for long enough for it to do whatever it was doing and came back to a freshly installed driver and a request to restart. It took way, way longer than any pre-320 drivers, ever. About three minutes... I can try again, but if it's an actual problem then I've got to go through the hassle of getting into Windows 8 safe mode again. On the other hand, 320.49 takes 30 seconds at the most. -edit- Another guy on HardOCP is having the same problem; it seems it can't recognize Catleap 1440p monitors. Weird.
|
# ? Jul 19, 2013 04:08 |
|
In a few minutes I'll give it a shot and see if it has issues with installing on my setup, with a Samsung 1080p green monitor with a red bezel, the name of which I cannot remember. Went fine, like the 314 drivers and before. Close Precision, install, didn't even worry about a Clean Install this time - a few minutes and a couple flashes later, no issues at all installing. Sorry, no help this go 'round. Agreed fucked around with this message at 07:14 on Jul 19, 2013 |
# ? Jul 19, 2013 06:49 |
|
movax posted:Updated OP and thread title sorry for the delay
|
# ? Jul 19, 2013 12:29 |
|
Agreed posted:In a few minutes I'll give it a shot and see if it has issues with installing on my setup, with a Samsung 1080p green monitor with a red bezel, the name of which I cannot remember The drivers don't work with Korean monitors. Plenty of people have confirmed it. They are racist drivers.
|
# ? Jul 19, 2013 16:05 |
|
Nvidia just pushing the stereotype that Asian = Bad driver.
|
# ? Jul 19, 2013 16:46 |
|
Aphrodite posted:Nvidia just pushing the stereotype that Asian = Bad driver. That's weird. Some kind of HDCP thing...?
|
# ? Jul 19, 2013 17:00 |
|
Animal posted:The drivers don't work with Korean monitors. Plenty of people have confirmed it. They are racist drivers. Welp. There's one for the AMD vs. nVidia driver war. HOPE YOU DON'T HAVE A GOOD CHEAP MONITOR Aphrodite posted:Nvidia just pushing the stereotype that Asian = Bad driver. lol
|
# ? Jul 19, 2013 17:06 |
|
I'm not sure if this should be in HoTS or here, but does anyone have experience with the HD5650 Mobility + HD4200 IGP driver combo? My wife has an HP consumer laptop with a Turion P520, an HD5650 Mobility and an HD4200 through PowerXpress v2 (the one where you choose the GPU manually from the menu, or automatically when changing between battery and AC power). Now, as AMD switched the HD4xxx series to legacy drivers, there has been no unified driver package since Catalyst 12.4, which is currently installed on the laptop. Therefore, if I go the autodetect route it will perform a new driver install but will not detect either GPU after reboot. Is it possible to just update the individual drivers through Windows Device Manager, or is that asking for more trouble (OS is Win 7 x64)? The integrated GPU cannot be disabled in the BIOS. Some fellow on the Guru3D forums reported that he had found a 13.1 reloaded driver package which seemed to work for him. Does anyone else have experience with both current and legacy driver issues (comedy option: wait for the HD5xxx series to move to legacy drivers)? The combo is quite powerful for a 3-year-old media laptop; even Crysis 3 is close to playable at native 1366x768 resolution and low details. I'm testing BioShock Infinite next. Also, cleaning the cooling system helped immensely. While the fan noise changed in pitch, temps dropped by close to 15 degrees Celsius.
|
# ? Jul 20, 2013 11:37 |
|
Rosoboronexport posted:I'm not sure if this should be on HoTS or here, but has anyone experience in HD5650 mobility + HD4200 IGP driver combo? I have an HP Envy 14 with a HD 5650 Mobility and whatever iGPU is built into an i5-520m. Swapping is manual, like with your computer, and the only way I can update drivers is manually: go "latest HP official drivers" >>> manual download of ATI 11.2 >>> manual download of something in ATI 12.x like ATI 12.4 >>> manual download of ATI 13.4. The iGPU cannot be updated on my computer beyond the latest HP official drivers. e: If I don't use ATI 11.2 my computer crashes when updating the drivers because, well... HP might have done some weird things with this laptop. The initial set of drivers wouldn't let you update to even the next set of official drivers without killing the display. You'd have to just wait ten minutes or so and then reboot to get the screen back (the drivers would be updated if you waited long enough). e the 2nd: A manual update for 13.4 is over here: http://www.guru3d.com/files_details/amd_catalyst_13_4_whql_download.html Go pick the mobility Win8/7/Vista 64-bit drivers. MagusDraco fucked around with this message at 19:03 on Jul 20, 2013 |
# ? Jul 20, 2013 18:59 |
|
Reminder for anyone using MSI Afterburner: Beta version 3.10 is expiring today (7/21/13) so grab the new version here: http://event.msi.com/vga/afterburner/download.htm
|
# ? Jul 21, 2013 05:48 |
|
Precision X also just updated from 4.2.0 to 4.2.1, for anyone using that. Nab it; it updates the RivaTuner backend that supports the monitoring and screen-capture functions, which probably fixes things we didn't really know were broken.
|
# ? Jul 21, 2013 06:09 |
|
Agreed posted:Precision X also just updated from 4.2.0 to 4.2.1 for anyone using that. Nab it, updates made to the Rivatuner backend that supports monitoring and screen capture functions which probably fix things we didn't really know were broken Does the update fix Precision X's total inability to not crash at least once a day? That's why I stopped using it.
|
# ? Jul 21, 2013 07:35 |
|
Of course this is anecdotal, but I've never had Precision X crash on me with version 3 or 4.
|
# ? Jul 21, 2013 07:44 |
|
SomekindaZombie posted:Of course this is anecdotal, but I've never had Precision X crash on me with version 3 or 4. Hah, I hadn't either, until I literally came back to my computer after a day out and it said EVGA Precision X crashed.
|
# ? Jul 21, 2013 07:57 |
|
SomekindaZombie posted:Of course this is anecdotal, but I've never had Precision X crash on me with version 3 or 4. This was last November when I got my 680.
|
# ? Jul 21, 2013 08:19 |
|
havenwaters posted:I have an HP Envy 14 with a HD 5650 Mobility and whatever iGPU is built into an i5-520m. Swapping is manual like with your computer. Your situation is a bit easier than mine, as your integrated GPU uses Intel drivers and isn't AMD as well (like in my case). The situation is solved (sorta). The method for at least this laptop was as follows: 1. Get both the legacy and normal drivers and extract them to two different directories. 2. Install the HD5650 driver manually from Device Manager -> update driver. 3. Install the legacy driver manually for the HD4200 (and for some reason I had to use "specify my own driver" as it had some certificate issues). 4. Start the CCC installation from 13.4 Folder\packages\binaries\setup.exe (the setup.exe in the main dir won't work) and choose custom installation. Untick the video driver from the installed components and install the latest CCC and other applications. (5. Optional: if after all this you get a kdbsync.exe crash on startup, you need to disable OpenCL by uninstalling APP SDK & Runtime from the Catalyst package manager. I'm not sure if OpenCL is broken in that driver pack anyway.) I guess you can skip step 4 if you only update the HD5650 driver, but now everything except OpenCL works. Edit: according to some bitcoin forums this might be due to poorly uninstalled CCC drivers/settings. If I have the time later I might try to reinstall everything from scratch. Rosoboronexport fucked around with this message at 12:45 on Jul 21, 2013 |
# ? Jul 21, 2013 12:35 |
|
This is a weeeeeird bug I'm getting with Nvidia's 326.19 drivers. It seems like the frame timer is running at half speed. As in, all Windows Aero animations are running at half speed, and games which increment frame counters for sim timing (like Kerbal Space Program) run at half speed. I hope a reboot fixes this. E: It does.
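A halved frame timer only halves game speed in engines that advance the simulation by a fixed amount per rendered frame, which matches the symptom described above. A minimal Python sketch of the two timing styles (this is illustrative only, not KSP's or Windows' actual code; all names are made up):

```python
# Contrast frame-count timing (slows down when the frame timer runs at
# half speed) with wall-clock delta timing (does not).

FIXED_STEP = 1.0 / 60.0  # seconds of sim time advanced per rendered frame


def simulate_frame_counted(frames_rendered):
    """Sim time is a fixed step per frame: half the frames -> half speed."""
    return frames_rendered * FIXED_STEP


def simulate_delta_timed(real_elapsed):
    """Sim time tracks the wall clock regardless of frame rate."""
    return real_elapsed


# One real second at full speed (60 frames) vs. at half speed (30 frames):
full = simulate_frame_counted(60)   # ~1.0 s of sim time
half = simulate_frame_counted(30)   # ~0.5 s of sim time: game runs half speed
```

With delta timing the bug would only drop frame rate, not game speed, which is why the half-speed Aero animations point at the frame timer itself.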
|
# ? Jul 21, 2013 15:39 |
|
Rosoboronexport posted:Your situation is a bit easier than mine, as your integrated GPU uses Intel drivers and isn't AMD as well (like in my case). I've been in a similar situation with my HD4250 chipset with an HD6470 discrete in my HP laptop. Unfortunately, any update after 11.10 (manual or automatic, clean or upgrade) either bluescreens or locks the HD6470 to 100MHz core with 150MHz memory (as opposed to the 800/900 it usually runs at), and none of the fixes on the web do anything about it. Also, I need OpenCL, so I can't uninstall it. 'Tis annoying.
|
# ? Jul 21, 2013 16:05 |
|
The Lord Bude posted:Does the update fix Precision X's total inability to not crash at least once a day? That's why I stopped using it. Symptomatic of a current driver bug. It crashes; MSI rides it out (?). Tends to happen between the 24- and 36-hour mark of uptime.
|
# ? Jul 21, 2013 17:04 |
|
One Eye Open posted:I've been in a similar situation with my HD4250 chipset with a HD6470 discrete in my HP laptop. Unfortunately any update after 11.10 (manual or automatic, clean or upgrade) either bluescreens or locks the HD6470 to 100Mhz with 150Mhz memory (as opposed to the 800/900 it usually runs at), and none of the fixes on the web do anything about it. Also I need OpenCL, so I can't uninstall it. 'Tis annoying. Well, 12.4 Mobility is the latest driver that worked for this laptop without any hassle, as it contained both HD4- and HD5-series drivers. You could try to uninstall the current drivers and then run the AMD Catalyst Uninstall Utility: http://sites.amd.com/us/game/downloads/Pages/catalyst-uninstall-utility.aspx When the HD6470 is locked to a low frequency, have you checked the power-saving settings? If you put the video card into power saving, it will clock very low (this HP laptop will go to 300/300 instead of 550/800).
|
# ? Jul 21, 2013 21:29 |
|
Rosoboronexport posted:Well, the 12.4 Mobility is the latest driver that worked for this laptop without any hassle as it contained both HD4- and HD5- series drivers You could try to uninstall the current drivers and then run the AMD Catalyst Uninstall Utility, http://sites.amd.com/us/game/downloads/Pages/catalyst-uninstall-utility.aspx I just installed that, and again it went to 100/150MHz, so I toggled PowerPlay off and on (as I did on previous occasions), as that is the common problem with some of the 6-series according to troubleshooting forums - it locks into the low power mode whether PowerPlay is on or off. As a last resort before reinstalling 11.10, I rebooted and tried toggling PowerPlay again, and this time it worked! Thank you so much. Edit: And now it's locked back down to 100/150 again :( I'd better go back to the old drivers. One Eye Open fucked around with this message at 00:22 on Jul 23, 2013 |
# ? Jul 22, 2013 00:25 |
|
Kind of a random question, but my graphics card is really old (Radeon HD2600 Pro, yeah, it's old) and I can't test this feature, but I heard that newer cards have a scaling ability built in that looks nicer than the scaling in most LCDs. How much better is it, and are there any screenshot comparisons out there? The reason I'm asking is that I'm deciding between a 27-inch (1440p) and a 24-inch (1080p) monitor, but I don't want to spend a whole lot of money on a graphics card (thinking the GeForce 760). I mostly play older games, but if some newer ones look decent with scaling (1080->1440), then that would be all I care about.
|
# ? Jul 23, 2013 05:25 |
|
Way better - please take my word for it, as my iPhone's camera is incapable of capturing the difference, and you really need to see the whole picture rather than just trying to get a feel for it in a screenshot. Most LCDs just have an obligatory nearest-neighbor scaler built in, which sucks and looks awful. GPUs do it fancy, nearly for free, and it looks great.
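For what it's worth, the gap between the two approaches is easy to see even in one dimension. A rough Python sketch (function names are made up; real GPU scalers use fancier filters like bicubic, but plain linear interpolation already shows the idea):

```python
def upscale_nearest(samples, factor):
    # Nearest-neighbour: each output sample copies the closest input sample,
    # so edges stay hard and blocky when blown up.
    return [samples[min(len(samples) - 1, int(i / factor))]
            for i in range(len(samples) * factor)]


def upscale_linear(samples, factor):
    # Linear interpolation: each output sample blends the two nearest inputs,
    # so edges get intermediate values instead of blocks.
    out = []
    for i in range(len(samples) * factor):
        pos = i / factor
        lo = int(pos)
        hi = min(lo + 1, len(samples) - 1)
        t = pos - lo
        out.append(samples[lo] * (1 - t) + samples[hi] * t)
    return out


edge = [0, 0, 10, 10]          # a hard edge along one scanline
blocky = upscale_nearest(edge, 2)   # the edge stays a hard 0 -> 10 jump
smooth = upscale_linear(edge, 2)    # the edge picks up an in-between value
```

The same thing happens in 2D: the monitor's scaler duplicates pixels into chunky blocks, while the GPU's filter produces smooth transitions.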
|
# ? Jul 23, 2013 05:41 |
|
Quadro K6000 announcement. Now witness the firepower of this fully armed and operational battle-GPU. 15 SMX GK110 (2880 shaders) running at about 900 MHz with full FP64 performance and 12 GB of 6 GHz ECC GDDR5. In a 225W TDP.
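Back-of-envelope math on those numbers (assuming 2 FLOPs per shader per clock from fused multiply-add, GK110's full 1/3-rate FP64, and its 384-bit memory bus):

```python
# Rough throughput estimate from the K6000 figures quoted above.
shaders = 2880
clock_ghz = 0.9

sp_gflops = shaders * 2 * clock_ghz   # ~5184 GFLOPS single precision
dp_gflops = sp_gflops / 3             # ~1728 GFLOPS double precision

mem_clock_ghz = 6.0                   # effective GDDR5 data rate
bus_width_bits = 384                  # GK110's memory bus width
bandwidth_gbs = mem_clock_ghz * bus_width_bits / 8   # ~288 GB/s
```

Roughly 1.7 TFLOPS of double precision in 225W is what makes the card remarkable; the clock and FP64 ratio here are assumptions based on GK110's published architecture, not official K6000 specs.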
|
# ? Jul 23, 2013 16:04 |
|
Okay now, what? It has 12GB of ECC GDDR5. It's the biggest chip there is by transistor count - how does it somehow fit inside the same thermal envelope that a GTX 770 runs in? I don't understand this at all; they're wizards in real life. Edit: Real talk, all the 780s having really really low ASIC scores kinda makes sense now. EVERYTHING would have to be super high leakage compared to that magnificent bastard of a card. Agreed fucked around with this message at 17:09 on Jul 23, 2013 |
# ? Jul 23, 2013 17:03 |
|
PhysX chat: I wasn't happy with the performance of Borderlands 2 on a GTX 760 with PhysX on high (it was often dropping from a solid 60 FPS down to the 30s in heavy combat, especially multiplayer), so I started looking into getting a cheap second video card to use as a dedicated PhysX card. I settled on a used Galaxy GT 640 for $40 on eBay because it seemed to have the most CUDA cores in that price range and didn't require me to run any extra PCI-E power cables or anything. I don't have anything other than MSI Afterburner FPS overlays to go on, but adding a dedicated PhysX card did make the FPS drop less noticeable, although it did not eliminate it completely as I'd hoped. Preliminary research indicates that there are probably just problems with the way PhysX is implemented in Borderlands 2, as people with PC specs far beyond mine seem to still have the same problem, and there might be some sort of CPU-bound issue as well. I haven't had a chance to check out other PhysX games like Metro or Arkham City yet. In theory the 760 should probably be able to handle everything on its own, but I sort of like the idea of having a card dedicated to PhysX (for the few games that can use it), and for $40 I guess I'll hang onto the GT 640.
|
# ? Jul 23, 2013 21:53 |
|
I think the recommended physx card is the 650ti if I am remembering right.
|
# ? Jul 23, 2013 21:58 |
|
Dogen posted:I think the recommended physx card is the 650ti if I am remembering right. Yep. Performance analogous to a 560Ti in graphics, better performance in PhysX. Runs quiet and cool, too. No performance loss with the highest possible PhysX settings (.ini adjusted), using a GTX 780 to render and the 650Ti as a headless coprocessor. I had thought about a 650 when it was on sale for $80, but after that sale ended I opted for a (very slightly used, not even registered yet) EVGA 650Ti for superior performance. It overclocks well and doesn't use the boost stuff, so it's kind of a trip down memory lane there; it's about as low as I would recommend going for "I want all the PhysX!" and it didn't reduce my FPS in games or my score in PhysX benchmarks compared to a GTX 580 as the PhysX card. For reference, the GTX 580 is ludicrously overpowered for simple CUDA workloads like PhysX, and that was just a waste of a good card. Hope Flute is enjoying it for its real purpose. FluidMark might run faster if I unlock emitters and run twenty or thirty of them at once on a 580 over a 650Ti, but no game has that absurd level of resource utilization for the coprocessor. FluidMark is the Furmark of PhysX.
|
# ? Jul 23, 2013 22:11 |
|
The problem with the GT 640 as a PhysX card is that it can be hard to know what you're getting, especially used. Depending on the model, GT 640s ranged from ~400-800 GFLOPS; the GTX 650 Ti is about 1400, and the GTX 650 Ti Boost about 1500 (but with more ROPs and memory bandwidth). I don't know if we ever determined how much compute performance is required for PhysX in modern games, but it seems wise to leave some breathing room. Additionally, a card with more capabilities (particularly a GTX 650 Ti Boost) can be used as a gaming card in a pinch.
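Those GFLOPS figures fall out of shaders × 2 FLOPs per clock × clock speed. A quick sketch (the clocks below are approximate reference clocks and should be treated as assumptions; GT 640 variants differed in both shader count and clock, hence the wide range):

```python
# Single-precision GFLOPS ~= shader_count * 2 FLOPs/clock (FMA) * clock in GHz.
def gflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz

gt640_gddr5 = gflops(384, 0.90)      # ~691, near the top of the GT 640 range
gtx650ti = gflops(768, 0.925)        # ~1421, matching the ~1400 quoted
gtx650ti_boost = gflops(768, 0.98)   # ~1505, matching the ~1500 quoted
```

So even the best GT 640 has roughly half the compute of a 650 Ti, which is why a used one is such a gamble as a PhysX coprocessor.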
|
# ? Jul 23, 2013 23:14 |
|
Alereon posted:The problem with the GT 640 as a PhysX card is that it can be hard to know what you're getting, especially used. Depending on the model GT 640s ranged from ~400-800 GFLOPS, the GTX 650 Ti is about 1400, and the GTX 650 Ti Boost about 1500 (but with more ROPs and memory bandwidth). I don't know if we ever determined how much compute performance was required for PhysX in modern games, but it seems wise to leave some breathing room. Additionally, a card with more capabilities (particularly a GTX 650 Ti Boost) can be used as a gaming card in a pinch. I nabbed a free 20% or so overclock out of the core and upped the VRAM base clock by over 200MHz on the 650Ti; while it lacks the memory bandwidth of the Boost and won't perform like a true price:performance gaming card, it will outrun a 560Ti, and that's still good enough at 1080p that if my 780 ever shits the bed I won't be hosed 'til I get it fixed. I can link some benches if you're interested.
|
# ? Jul 24, 2013 00:09 |
|
All this talk about using the 560 Ti as a PhysX card is making me jealous. I'm stuck with one until next month, when I can finally get my hands on a 770. Now, though, I have one final question, and it's a doozy: 2GB, or 4GB? I'm at 1080p. The 770 is already overkill, but, frankly, I like overkill (to an extent; I'm not going towards the 780 :P). 4GB is, obviously, a bit more "future-proof", but it comes with a premium. The big thing is that, for example, EVGA's ACX-cooler 2GB 770s are only about $30 cheaper than Gigabyte's 4GB 770 (w/ Windforce 3x). I can shoot for MSI's 2GB Gaming edition Twin Frozr thing for $409. ASUS's 2GB 770 is $405. Really, my question is: outside of a heavily modified Skyrim, do I really need 4GB of VRAM at 1080p? I'm reasoning that I'll probably be replacing this card in 3 or so years anyway, so... well... the hell do I do? GOONS! HELP ME! I AM INDECISIVE! Gonkish fucked around with this message at 05:07 on Jul 24, 2013 |
# ? Jul 24, 2013 00:45 |
|
Tonight I took out the GT 640 and put in my old 660 Ti that I still haven't gotten around to selling yet to try that out as the PhysX card. Combination of GTX 760 as main card and 660 Ti as PhysX card wasn't noticeably better than GTX 760 + GT 640 (or really even just the 760 on its own). It also doesn't make any sense to me to pair a 760 with a second card that costs more than the $40 I spent on the 640 because I'd quickly start approaching the price where I could have just gotten a 770 and enjoyed better performance in all my games, not just the handful of PhysX ones. My conclusion is that it's not worth bothering with a second card to dedicate to PhysX, particularly for Borderlands 2 which seems to have issues that prevent it from maintaining 60 FPS in heavy PhysX scenes regardless of hardware. Parker Lewis fucked around with this message at 05:12 on Jul 24, 2013 |
# ? Jul 24, 2013 05:09 |
|
Gonkish posted:All this talk about using the 560 Ti as a PhysX card is making me jealous. I'm stuck with one until next month when I can finally get my hands on a 770. For 1080p I'd personally go for one of the 2GB $400 cards, and I've had good luck with the MSI Twin Frozr series over the past few years. 4GB smells too much like "futureproofing" to me and I'd rather get the 2GB card now and trade it in towards a 4GB card in a year or two when I actually need the extra memory (plus that future card will be faster than the 770).
|
# ? Jul 24, 2013 05:15 |
|
Parker Lewis posted:For 1080p I'd personally go for one of the 2GB $400 cards, and I've had good luck with the MSI Twin Frozr series over the past few years. 4GB smells too much like "futureproofing" to me and I'd rather get the 2GB card now and trade it in towards a 4GB card in a year or two when I actually need the extra memory (plus that future card will be faster than the 770). Yeah, I was thinking about it earlier and was wondering what 4gb really gets me right now.
|
# ? Jul 24, 2013 05:24 |
|
Parker Lewis posted:My conclusion is that it's not worth bothering with a second card to dedicate to PhysX, particularly for Borderlands 2 which seems to have issues that prevent it from maintaining 60 FPS in heavy PhysX scenes regardless of hardware. I dunno, my 780 and 650Ti combo doesn't have this problem. At all. I can make so much gunk fly you'd think I were doing some horrible monster-bukkake game, and yet it runs a solid 60 vsync'd, or just takes the gently caress off to crazy-town FPS-wise if uncapped.
|
# ? Jul 24, 2013 05:55 |