|
Paul MaudDib posted:Dumb question but have you tried replacing the cable? I have not. It's pretty intermittent for the most part so it's definitely a possibility. Looks like I'm 1 day over Microcenter's return policy anyway, so hopefully it's something simple.
|
# ? Sep 17, 2016 00:01 |
|
|
Paul MaudDib posted:The 1070 is current-gen/newer, has big improvements in virtual reality performance, has slightly more VRAM, does better in DX12/Vulkan, has more DisplayPort connectors, and is a lot more power efficient. Are there any games out or that have been announced that use the Nvidia VR features? What is this speedup supposed to even be worth? This is probably a comment for another thread, but the VR game landscape doesn't look much better now than it did 6 months ago.
|
# ? Sep 17, 2016 00:08 |
Paul MaudDib posted:MSI Afterburner is a third-party tweaking app, not a driver stack. You don't need Afterburner if you don't want to mess with overclocking/undervolting/fan curves, but either way you need Crimson. Any reason why you don't run it at 144Hz all the time?
|
|
# ? Sep 17, 2016 01:19 |
|
Naffer posted:Are there any games out or that have been announced that use the Nvidia VR features? What is this speedup supposed to even be worth? This is probably a comment for another thread, but the VR game landscape doesn't look much better now than it did 6 months ago. Obduction is the first thing on my radar, but the game's out and VR support has been delayed. Bah.
|
# ? Sep 17, 2016 01:28 |
|
AVeryLargeRadish posted:Any reason why you don't run it at 144Hz all the time? My 980 Ti clocks up and pulls an additional 60 watts when I use 144 Hz, even if I'm just at the desktop. On the flip side it idles fine even with a 4K and a 1440p monitor - as long as you're at 60 Hz. AFAIK this actually still affects Pascal too, it's just that Pascal pulls less power in general. Paul MaudDib fucked around with this message at 01:53 on Sep 17, 2016 |
# ? Sep 17, 2016 01:43 |
Have you updated the driver? I have a 1070 and two monitors, one 1080/60 and the other 1440/144, and it has no issue staying at idle. I think they fixed it recently.
|
|
# ? Sep 17, 2016 02:53 |
|
From last page:japtor posted:Probably gonna wait it out until Black Friday (considering I'm basically getting this for Pac Man of all things), but assuming nothing really changes in terms of any new cards coming, is the EVGA 1060 6GB the one to get if I want a short card? Doesn't seem like there's that many other options and I vaguely recall some positive posts or something about it. And is the SC version worth the extra or however much it is? I don't really plan on OCing or anything so maybe it'd be worth the higher base clock? some dillweed fucked around with this message at 20:53 on Sep 17, 2016 |
# ? Sep 17, 2016 03:35 |
|
Would it be unwise to use a riser cable so I can put my video card back-to-back with my motherboard? Not literally touching, but within like a half inch. I have an idea for an mITX setup.
|
# ? Sep 17, 2016 20:19 |
|
Kaleidoscopic Gaze posted:Would it be unwise to use a riser cable so I can put my video card back-to-back with my motherboard? Not literally touching, but within like a half inch. I have an idea for a mITX setup Yeah, I've had that idea too and it can work, but the riser has to be long enough to fold under the motherboard if you want the graphics card fan to be facing out. Check out li-heat, they're a Chinese company that sells some nicely shielded risers that I've ordered from. Depending on your motherboard, you might still have to drop to PCIe 2.0 to get it stable with a riser that long.
|
# ? Sep 17, 2016 21:10 |
|
Grog posted:From last page: Kaleidoscopic Gaze posted:Would it be unwise to use a riser cable so I can put my video card back-to-back with my motherboard? Not literally touching, but within like a half inch. I have an idea for a mITX setup
|
# ? Sep 17, 2016 21:21 |
|
Kaleidoscopic Gaze posted:Would it be unwise to use a riser cable so I can put my video card back-to-back with my motherboard? Not literally touching, but within like a half inch. I have an idea for a mITX setup Smallest full-size-GPU-capable mITX case. https://www.dan-cases.com/dana4.php Basically the same size as a Razer Core external GPU dock.
|
# ? Sep 17, 2016 21:21 |
|
Another small size case alternative http://zaber.com.pl/sentry/
|
# ? Sep 17, 2016 22:29 |
|
So if I have a gsync monitor and card and want to set up everything optimally I should:

1) set the monitor refresh as high as possible in nvidia control panel
2) enable gsync in the control panel also
3) under vsync, set it to fast in the control panel
4) in games' settings, set vsync to off

Is that generally correct?
|
# ? Sep 17, 2016 23:59 |
|
Set vsync to on.
|
# ? Sep 18, 2016 00:18 |
|
That last post reminded me to actually go and look at the settings for my new 1070/144hz monitor, and I didn't know I could set the desktop to 144Hz. Now I'm freaking out, everything is so smooth, maybe even way too smooth. When I bring the mouse cursor over to my old monitor I can see the cursor trails now
|
# ? Sep 18, 2016 00:23 |
|
So, is Nvidia ever gonna back off on the sign-in requirement to the GeForce Experience stuff or should I just uninstall it and stick with manually downloading drivers until they stop offering that too?
|
# ? Sep 18, 2016 00:42 |
|
No. But you could just make a bs account...
|
# ? Sep 18, 2016 01:20 |
|
Fuzzy Mammal posted:So if I have a gsyc monitor and card and want to set up everything optimally I should: 3) Fast vsync is probably not needed for every game -- yes to competitive shooters or twitch games, no real benefit to anything else. Because of the way it works, the GPU will be rendering and discarding frames as fast as it can, keeping itself 100% maxed. Less input latency, but kinda wasteful for games where you don't care about input latency (and maybe don't want your fans spinning all the way up). Fabricated posted:So, is Nvidia ever gonna back off on the sign-in requirement to the GeForce Experience stuff or should I just uninstall it and stick with manually downloading drivers until they stop offering that too?
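The "rendering and discarding frames" point can be sketched with a toy simulation (purely illustrative, not NVIDIA's actual Fast Sync implementation; the function name and numbers are made up):

```python
# Fast Sync, roughly: the GPU renders into spare buffers as fast as it
# can, and at each refresh the display scans out the newest completed
# frame, silently discarding the rest.

def fast_sync(render_fps, refresh_hz, seconds=1):
    rendered = render_fps * seconds           # GPU never idles
    displayed = min(render_fps, refresh_hz) * seconds
    discarded = rendered - displayed          # wasted work = heat/noise
    return rendered, displayed, discarded

# At 300 FPS on a 144 Hz panel, over half the rendered frames are
# thrown away -- lower latency, but the card stays pegged at 100%:
print(fast_sync(300, 144))  # (300, 144, 156)
```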
|
# ? Sep 18, 2016 01:31 |
|
CharlieFoxtrot posted:That last post reminded me to actually go and look at the settings for my new 1070/144hz monitor, and I didn't know I could set the desktop to 144Hz. Welcome to the dark side, friend. Time to buy a whole new rack of 144 Hz monitors
|
# ? Sep 18, 2016 01:34 |
|
Subjunctive posted:Set vsync to on. No, for Gsync keep vsync off. Your monitor handles timing, you don't need to artificially limit it.
|
# ? Sep 18, 2016 01:51 |
|
Lockback posted:No, for Gsync keep vsync off. Your monitor handles timing, you don't need to artificially limit it. Vsync doesn't artificially limit it, it literally goes when the display is ready to accept a frame. Vsync off gives you tearing.
|
# ? Sep 18, 2016 02:17 |
|
Subjunctive posted:Vsync doesn't artificially limit it, it literally goes when the display is ready to accept a frame. Vsync off gives you tearing. I'm assuming it's the same with gsync but with freesync you don't need vsync on to prevent tearing. You just need to set a FPS limit below your max freesync range.
|
# ? Sep 18, 2016 02:22 |
|
When you are using gsync, the vsync toggle in the driver turns into an FPS cap. Enable it and gsync will cap your framerate at the monitor's maximum 144 Hz refresh rate, so you won't experience any tearing even at high framerates. Disable it and the GPU will run past 144 FPS if it can and you will experience tearing (which may be less visible to you at >144 FPS). I say leave the global vsync toggle enabled for most people on gsync monitors, with the possible exception of competitive twitch shooter players. There is little point to going over the monitor's maximum refresh otherwise, and it comes with some potential power/heat/noise savings beyond just eliminating tearing. Also set your Windows desktop to 120 Hz and you can enjoy a higher desktop refresh while still having the GPU idle down to standard 100-200 MHz 2D clocks.
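The behavior described above can be summarized as a toy decision function (an illustrative model only, not actual driver logic):

```python
# Toy model of G-Sync plus the driver's vsync toggle: inside the G-Sync
# range the toggle doesn't matter; above it, the toggle picks between
# capping the framerate and letting it tear.

def frame_behavior(fps, max_hz, vsync_on):
    """Return (mode, effective_fps) for a given uncapped render rate."""
    if fps <= max_hz:
        # Inside the G-Sync range: the monitor refreshes when each
        # frame is ready, so there is no tearing either way.
        return ("gsync", fps)
    if vsync_on:
        # Above the range with vsync on: the driver caps at max_hz.
        return ("capped", max_hz)
    # Above the range with vsync off: frames scan out as they arrive,
    # giving more than max_hz throughput but visible tearing.
    return ("tearing", fps)

print(frame_behavior(100, 144, True))   # ('gsync', 100)
print(frame_behavior(250, 144, True))   # ('capped', 144)
print(frame_behavior(250, 144, False))  # ('tearing', 250)
```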
|
# ? Sep 18, 2016 02:53 |
|
C'mon, nVidia, put out official iCafes for the 10x0 series so I can stop using a mid-2015 driver.
|
# ? Sep 18, 2016 03:24 |
|
BIG HEADLINE posted:C'mon, nVidia, put out official iCafes for the 10x0 series so I can stop using a mid-2015 driver. What do these drivers do? I see some "power saving" feature, but that's it.
|
# ? Sep 18, 2016 03:28 |
|
SlayVus posted:What do these drivers do? I see some "power saving" feature, but that's it. They're on the whole far more stable than the dice-rolling GameReady drivers, because they're made for Asian internet/gaming cafes where the sysadmins don't want to have to gently caress around updating drivers for tens or hundreds of clients at once. They haven't put out a new one in over a year. You also have to disable some Chinese startup element in MSConfig, but otherwise they're the most stable drivers I've ever used.
|
# ? Sep 18, 2016 03:56 |
|
Are you specifically looking for one that works with Windows 10? Just doing a basic Google search and filtering for the past month brought up 372.60 as the most recent iCafe driver, but it only mentions being for 7 through 8.1. It doesn't look like they're really working on iCafe drivers that are compatible with 10, so if that's what you're waiting on then you'll probably be waiting a while. That Guru3D download is through the Chinese Nvidia site, LaptopVideo2Go has a link to the German site if it makes any difference. Those drivers are supposed to support all three of the 10 series cards.
|
# ? Sep 18, 2016 04:45 |
|
I'm still on Windows 7, but yeah, it seems like it's no longer a priority for them, especially since the games iCafe machines are running are more often than not MOBAs, which aren't really that taxing on systems. Once a DX12 MOBA comes out that's worth a drat, we'll get new iCafes.
|
# ? Sep 18, 2016 04:47 |
|
I had a fun graphical hiccup today. My screen became pixelated, some of the colors messed up, and the whole thing "vibrated." It looked like something out of a movie where the bad guy subverts a computer system. After a few seconds it stopped and everything was normal, not even a notification saying the driver had to restart.
|
# ? Sep 18, 2016 04:57 |
|
Subjunctive posted:Vsync doesn't artificially limit it, it literally goes when the display is ready to accept a frame. Vsync off gives you tearing. That's how it works for regular displays, but for G-sync you need to keep V-sync OFF. Otherwise there is no effect at all. You will not get tearing if G-Sync is working.
|
# ? Sep 18, 2016 10:22 |
|
What in the world is going on here? Why is this apparently esoteric information? I had to dig up an update article from a year and a half ago to actually get the answer straight from the horse's mouth:Nvidia posted:For enthusiasts, we've included a new advanced control option that enables G-SYNC to be disabled when the frame rate of a game exceeds the maximum refresh rate of the G-SYNC monitor. For instance, if your frame rate can reach 250 on a 144Hz monitor, the new option will disable G-SYNC once you exceed 144 frames per second. Doing so will disable G-SYNC's goodness and reintroduce tearing, which G-SYNC eliminates, but it will improve input latency ever so slightly in games that require lightning-fast reactions.
|
# ? Sep 18, 2016 10:33 |
|
It's pretty simple. The vsync setting only changes the behaviour when FPS > Max Hz. With vsync enabled it swaps from GSync to VSync when you go over; with VSync disabled it goes to default non-gsync non-vsync behaviour for FPS > Max Hz.
|
# ? Sep 18, 2016 11:31 |
|
BurritoJustice posted:It's pretty simple. The vsync setting only changes the behaviour when FPS > Max Hz. With vsync enabled it swaps from GSync to VSync when you go over, with VSync disabled it goes to default non-gsync non-vsync behaviour for FPS> Max Hz. But do you want to swap to vsync for input lag concerns if FPS exceeds refresh rate? Wouldn't capping your FPS at, say, 143 frames be a better alternative? (I have no idea).
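For what it's worth, the arithmetic behind a just-below-refresh cap works out like this (a back-of-the-envelope sketch; 143 is the poster's example figure, not an official number):

```python
# Frame time budget at each rate, in milliseconds.
refresh_interval = 1000 / 144   # ~6.944 ms between monitor refreshes
capped_frame_time = 1000 / 143  # ~6.993 ms between rendered frames

# Because each frame takes slightly longer than one refresh interval,
# the render rate can never outrun the monitor, so the card stays in
# the G-Sync range and the over-the-cap behavior (vsync fallback or
# tearing) never comes into play.
assert capped_frame_time > refresh_interval
print(f"refresh: {refresh_interval:.3f} ms, capped: {capped_frame_time:.3f} ms")
```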
|
# ? Sep 18, 2016 11:54 |
|
BurritoJustice posted:It's pretty simple. The vsync setting only changes the behaviour when FPS > Max Hz. With vsync enabled it swaps from GSync to VSync when you go over, with VSync disabled it goes to default non-gsync non-vsync behaviour for FPS> Max Hz. Right, that's my understanding.
|
# ? Sep 18, 2016 12:55 |
|
Shrimp or Shrimps posted:But do you want to swap to vsync for input lag concerns if FPS exceeds refresh rate? Wouldn't capping your FPS at, say, 143 frames be a better alternative?
|
# ? Sep 18, 2016 13:00 |
|
Is there any place where there is a good, up to date, writeup of all this poo poo and also what all graphics settings do and how they compare in a computational cost vs visual quality improvement point of view? Because I am always astonished at how loving esoteric all this knowledge seems to be.
|
# ? Sep 18, 2016 13:45 |
|
Geemer posted:Is there any place where there is a good, up to date, writeup of all this poo poo and also what all graphics settings do and how they compare in a computational cost vs visual quality improvement point of view? Not really, that'd be pretty long. Though it would be neat if a very generalized guide of sorts existed. But I imagine the main problem is it would be arguably wrong all the time. But honestly just avoid all that by throwing lots of money at the GPU so you can just set everything to All Please and ignore the rest
|
# ? Sep 18, 2016 13:50 |
|
The amount of visual improvement and the cost will both vary substantially according to what content you're running, too. Except Hairworks, that's always torture.
|
# ? Sep 18, 2016 14:00 |
|
Also the cost of different settings in different games can vary pretty wildly due to different implementation of features.
|
# ? Sep 18, 2016 14:00 |
|
|
Wrar posted:Also the cost of different settings in different games can vary pretty wildly due to different implementation of features. nVidia does sometimes write some really in-depth analyses of performance for popular games on their cards, diving into the impact of individual settings. The one for The Witcher 3, for example, is extremely detailed: http://www.geforce.com/whats-new/guides/the-witcher-3-wild-hunt-graphics-performance-and-tweaking-guide
|
# ? Sep 18, 2016 15:10 |