fknlo
Jul 6, 2009


Fun Shoe

Paul MaudDib posted:

Dumb question but have you tried replacing the cable?

I have not. It's pretty intermittent for the most part so it's definitely a possibility. Looks like I'm 1 day over Microcenter's return policy anyway, so hopefully it's something simple.


Naffer
Oct 26, 2004

Not a good chemist

Paul MaudDib posted:

The 1070 is current-gen/newer, has big improvements in virtual reality performance, has slightly more VRAM, does better in DX12/Vulkan, has more DisplayPort connectors, and is a lot more power efficient.

Are there any games out or that have been announced that use the Nvidia VR features? What is this speedup supposed to even be worth? This is probably a comment for another thread, but the VR game landscape doesn't look much better now than it did 6 months ago.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Paul MaudDib posted:

MSI Afterburner is a third-party tweaking app, not a driver stack. You don't need Afterburner if you don't want to mess with overclocking/undervolting/fan curves, but either way you need Crimson.

It's certainly true that partner drivers have been poo poo since forever though. I haven't used one since like 2000 or something, but there are a few devices like laptops that need custom drivers. Mostly nowadays it's just a really old copy of the official drivers so the guy at the South Pole who doesn't have internet can have something better than the Default VGA Adapter driver.

edit: In other news, I just discovered that the NVIDIA control panel is smart enough to remember refresh rates in specific monitor configurations. When I turn on my second monitor (it disconnects when off) it automatically drops my main monitor to 60 Hz, which I use for working, and when I turn it off it boosts my main monitor back to 144 Hz, which I use for gaming :swoon:

Any reason why you don't run it at 144Hz all the time?

teh_Broseph
Oct 21, 2010

THE LAST METROID IS IN
CAPTIVITY. THE GALAXY
IS AT PEACE...
Lipstick Apathy

Naffer posted:

Are there any games out or that have been announced that use the Nvidia VR features? What is this speedup supposed to even be worth? This is probably a comment for another thread, but the VR game landscape doesn't look much better now than it did 6 months ago.

Obduction is the first thing on my radar, but the game's out and VR support has been delayed. Bah.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

AVeryLargeRadish posted:

Any reason why you don't run it at 144Hz all the time?

My 980 Ti clocks up and pulls an additional 60 watts when I use 144 Hz, even if I'm just at the desktop. On the flip side it idles fine even with a 4K and a 1440p monitor - as long as you're at 60 Hz.

AFAIK this actually still affects Pascal too, it's just that Pascal pulls less power in general.

Paul MaudDib fucked around with this message at 01:53 on Sep 17, 2016

Watermelon Daiquiri
Jul 10, 2010
I TRIED TO BAIT THE TXPOL THREAD WITH THE WORLD'S WORST POSSIBLE TAKE AND ALL I GOT WAS THIS STUPID AVATAR.
Have you updated the driver? I have a 1070 and two monitors, one 1080/60 and the other 1440/144, and it has no issue staying at idle. I think they fixed it recently.

some dillweed
Mar 31, 2007

From last page:

japtor posted:

Probably gonna wait it out until Black Friday (considering I'm basically getting this for Pac Man of all things), but assuming nothing really changes in terms of any new cards coming, is the EVGA 1060 6GB the one to get if I want a short card? Doesn't seem like there's that many other options and I vaguely recall some positive posts or something about it. And is the SC version worth the extra :10bux: or however much it is? I don't really plan on OCing or anything so maybe it'd be worth the higher base clock?
According to some reddit posts, customer reviews, and posts on EVGA's forums, the non-SC cooler is pretty bad. Some people talk about 85+°C temperatures under regular gaming loads using the stock fan settings, and the throttle point is supposed to be 83°C. That might be due to poor airflow in their cases, I don't know. The standard one supposedly uses EVGA's GTX 950 cooler, which is just a solid block heatsink, while the SC version uses their 960 SC cooler, which has a couple of heatpipes and seems to generally run around 10°C cooler. If there isn't much of a price difference (it's $15 CAD here), you should probably get the SC.

some dillweed fucked around with this message at 20:53 on Sep 17, 2016

lock stock and Cheryl
Dec 19, 2009

by zen death robot
Would it be unwise to use a riser cable so I can put my video card back-to-back with my motherboard? Not literally touching, but within like a half inch. I have an idea for a mITX setup

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Kaleidoscopic Gaze posted:

Would it be unwise to use a riser cable so I can put my video card back-to-back with my motherboard? Not literally touching, but within like a half inch. I have an idea for a mITX setup

Yeah, I've had that idea too and it can work, but the riser has to be long enough to fold under the motherboard if you want the graphics card fan to be facing out. Check out li-heat, they're a Chinese company that sells some nicely shielded risers that I've ordered from. Depending on your motherboard, you might still have to drop to PCIe 2.0 to get it stable with a riser that long.

japtor
Oct 28, 2005

Grog posted:

From last page:
According to some reddit posts, customer reviews, and posts on EVGA's forums, the non-SC cooler is pretty bad. Some people talk about 85+°C temperatures under regular gaming loads using the stock fan settings, and the throttle point is supposed to be 83°C. That might be due to poor airflow in their cases, I don't know. The standard one supposedly uses EVGA's GTX 950 cooler, which is just a solid block heatsink, while the SC version uses their 960 SC cooler, which has a couple of heatpipes and seems to generally run around 10°C cooler. If there isn't much of a price difference (it's $15 CAD here), you should probably get the SC.
Oh thanks, I thought it was the same just binned for higher clocks or something. I'll just shell out the little extra for the SC then.

Kaleidoscopic Gaze posted:

Would it be unwise to use a riser cable so I can put my video card back-to-back with my motherboard? Not literally touching, but within like a half inch. I have an idea for a mITX setup
That's the whole basis for the Dan A4 case isn't it?

SlayVus
Jul 10, 2009
Grimey Drawer

Kaleidoscopic Gaze posted:

Would it be unwise to use a riser cable so I can put my video card back-to-back with my motherboard? Not literally touching, but within like a half inch. I have an idea for a mITX setup

Smallest full-size-GPU-capable mITX case.

https://www.dan-cases.com/dana4.php

Basically the same size as a Razer Core external GPU dock.

tima
Mar 1, 2001

No longer a newbie
Another small size case alternative http://zaber.com.pl/sentry/

Fuzzy Mammal
Aug 15, 2001

Lipstick Apathy
So if I have a gsync monitor and card and want to set up everything optimally I should:

1) set the monitor refresh as high as possible in nvidia control panel

2) enable gsync in the control panel also

3) under vsync, set it to fast in the control panel

4) in games' settings, set vsync to off

Is that generally correct?

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Set vsync to on.

CharlieFoxtrot
Mar 27, 2007

organize digital employees



That last post reminded me to actually go and look at the settings for my new 1070/144hz monitor, and I didn't know I could set the desktop to 144Hz.

Now I'm freaking out, everything is so smooth, maybe even way too smooth. When I bring the mouse cursor over to my old monitor I can see the cursor trails now

Fabricated
Apr 9, 2007

Living the Dream
So, is Nvidia ever gonna back off on the sign-in requirement to the GeForce Experience stuff or should I just uninstall it and stick with manually downloading drivers until they stop offering that too?

penus penus penus
Nov 9, 2014

by piss__donald
No. But you could just make a bs account...

Klyith
Aug 3, 2007

GBS Pledge Week

Fuzzy Mammal posted:

So if I have a gsync monitor and card and want to set up everything optimally I should:

3) under vsync, set it to fast in the control panel

Is that generally correct?

3) Fast vsync is probably not needed for every game -- yes to competitive shooters or twitch games, no real benefit to anything else. Because of the way it works, the GPU will be rendering and discarding frames as fast as it can, keeping itself 100% maxed. Less input latency, but kinda wasteful for games where you don't care about input latency (and maybe don't want your fans spinning all the way up).
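The render-and-discard behavior described above can be pictured with a toy model (a hypothetical sketch, not NVIDIA's actual pipeline): the GPU finishes frames as fast as it can, and at each display scanout only the newest completed frame is shown, with the rest thrown away.

```python
# Toy model of Fast Sync's render-and-discard behavior (hypothetical
# simplification for illustration; times are in milliseconds).
def fast_sync_scanout(render_times, refresh_interval, duration):
    """Return the frame index shown at each scanout: the newest frame
    completed before that scanout; earlier unseen frames are discarded."""
    shown = []
    t = refresh_interval
    while t <= duration:
        done = [i for i, rt in enumerate(render_times) if rt <= t]
        shown.append(done[-1] if done else None)  # newest completed frame
        t += refresh_interval
    return shown

# GPU finishing a frame every 4 ms, display refreshing every 16 ms:
frames = [4 * k for k in range(1, 13)]  # completion times of frames 0..11
print(fast_sync_scanout(frames, 16, 48))  # → [3, 7, 11]
```

Frames 0-2, 4-6, and 8-10 get rendered but never displayed, which is why the GPU stays pegged at 100% even though the monitor only ever shows one frame per refresh.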


Fabricated posted:

So, is Nvidia ever gonna back off on the sign-in requirement to the GeForce Experience stuff or should I just uninstall it and stick with manually downloading drivers until they stop offering that too?
I think they backed off on the drivers thing, probably forever. They got lots of backlash from that idea, not just from randos but also from the tech press. Reviewers don't want to deal with GFE just to get drivers, when it otherwise makes their jobs harder by messing with game settings.

Kazinsal
Dec 13, 2011

CharlieFoxtrot posted:

That last post reminded me to actually go and look at the settings for my new 1070/144hz monitor, and I didn't know I could set the desktop to 144Hz.

Now I'm freaking out, everything is so smooth, maybe even way too smooth. When I bring the mouse cursor over to my old monitor I can see the cursor trails now

Welcome to the dark side, friend. Time to buy a whole new rack of 144 Hz monitors :kheldragar:

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

Subjunctive posted:

Set vsync to on.

No, for Gsync keep vsync off. Your monitor handles timing, you don't need to artificially limit it.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Lockback posted:

No, for Gsync keep vsync off. Your monitor handles timing, you don't need to artificially limit it.

Vsync doesn't artificially limit it, it literally goes when the display is ready to accept a frame. Vsync off gives you tearing.

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

Subjunctive posted:

Vsync doesn't artificially limit it, it literally goes when the display is ready to accept a frame. Vsync off gives you tearing.

I'm assuming it's the same with gsync but with freesync you don't need vsync on to prevent tearing. You just need to set a FPS limit below your max freesync range.

Indiana_Krom
Jun 18, 2007
Net Slacker
When you are using gsync, the vsync toggle in the driver turns into an FPS cap. Enable it and gsync will cap your framerate at the monitor's maximum 144 Hz refresh rate so you won't experience any tearing even at high framerates. Disable it and the GPU will run past 144 FPS if it can and you will experience tearing (which may be less visible to you at >144 FPS). I say leave the global vsync toggle enabled for most people on gsync monitors, with the possible exception of competitive twitch shooter players. There is little point to going over the monitor's maximum refresh otherwise, and it comes with some potential power/heat/noise savings beyond just eliminating tearing.

Also set your Windows desktop to 120 Hz and you can enjoy a higher desktop refresh while still having the GPU idle down to standard 100-200 MHz 2D clocks.
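The behavior described above boils down to one branch, sketched here as a simplified model (an illustration of the post's description, not NVIDIA's actual driver logic):

```python
# Simplified model of G-Sync plus the driver's vsync toggle.
# Returns (displayed_fps, tearing_possible).
def effective_behavior(fps, max_hz, vsync_on):
    """Within the G-Sync range the monitor follows the GPU; above it,
    the vsync toggle decides between a hard cap and tearing."""
    if fps <= max_hz:
        return (fps, False)     # G-Sync active: no cap, no tearing
    if vsync_on:
        return (max_hz, False)  # capped at the panel's max refresh
    return (fps, True)          # uncapped: runs past max Hz, can tear
```

So with the toggle on, a GPU pushing 200 FPS on a 144 Hz panel gets held to 144 with no tearing; with it off, it runs at 200 and tears.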

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
C'mon, nVidia, put out official iCafes for the 10x0 series so I can stop using a mid-2015 driver.

SlayVus
Jul 10, 2009
Grimey Drawer

BIG HEADLINE posted:

C'mon, nVidia, put out official iCafes for the 10x0 series so I can stop using a mid-2015 driver.

What do these drivers do? I see some "power saving" feature, but that's it.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

SlayVus posted:

What do these drivers do? I see some "power saving" feature, but that's it.

They're on the whole far more stable than the dice-rolling GameReady drivers, because they're made for Asian internet/gaming cafes where the sysadmins don't want to have to gently caress around updating drivers for tens or hundreds of clients at once. They haven't put out a new one in over a year.

You also have to disable some Chinese startup element in MSConfig, but otherwise they're the most stable drivers I've ever used.

some dillweed
Mar 31, 2007

Are you specifically looking for one that works with Windows 10? Just doing a basic Google search and filtering for the past month brought up 372.60 as the most recent iCafe driver, but it only mentions being for 7 through 8.1. It doesn't look like they're really working on iCafe drivers that are compatible with 10, so if that's what you're waiting on then you'll probably be waiting a while. That Guru3D download is through the Chinese Nvidia site, LaptopVideo2Go has a link to the German site if it makes any difference. Those drivers are supposed to support all three of the 10 series cards.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
I'm still on Windows 7, but yeah, it seems like it's no longer a priority for them, especially since the games iCafe machines are most often running are MOBAs, which aren't really that taxing on systems. Once a DX12 MOBA comes out that's worth a drat, we'll get new iCafes.

Yaoi Gagarin
Feb 20, 2014

I had a fun graphical hiccup today. My screen became pixelated, some of the colors messed up, and the whole thing "vibrated." It looked like something out of a movie where the bad guy subverts a computer system. After a few seconds it stopped and everything was normal, not even a notification saying the driver had to restart. :v:

pigdog
Apr 23, 2004

by Smythe

Subjunctive posted:

Vsync doesn't artificially limit it, it literally goes when the display is ready to accept a frame. Vsync off gives you tearing.

That's how it works for regular displays, but for G-sync you need to keep V-sync OFF. Otherwise there is no effect at all. You will not get tearing if G-Sync is working.

HMS Boromir
Jul 16, 2011

by Lowtax
What in the world is going on here? Why is this apparently esoteric information? I had to dig up an update article from a year and a half ago to actually get the answer straight from the horse's mouth:

Nvidia posted:

For enthusiasts, we’ve included a new advanced control option that enables G-SYNC to be disabled when the frame rate of a game exceeds the maximum refresh rate of the G-SYNC monitor. For instance, if your frame rate can reach 250 on a 144Hz monitor, the new option will disable G-SYNC once you exceed 144 frames per second. Doing so will disable G-SYNC's goodness and reintroduce tearing, which G-SYNC eliminates, but it will improve input latency ever so slightly in games that require lightning fast reactions.

To use this new mode, set “Vertical sync” to “Off” on a global or per-game basis in the “Manage 3D settings” section of the NVIDIA Control Panel. When your frame rate exceeds your monitor’s rated G-SYNC refresh rate, for example 144Hz, G-SYNC will be disabled.

BurritoJustice
Oct 9, 2012

It's pretty simple. The vsync setting only changes the behaviour when FPS > max Hz. With vsync enabled it swaps from GSync to VSync when you go over; with VSync disabled it goes to default non-gsync non-vsync behaviour for FPS > max Hz.

Shrimp or Shrimps
Feb 14, 2012


BurritoJustice posted:

It's pretty simple. The vsync setting only changes the behaviour when FPS > max Hz. With vsync enabled it swaps from GSync to VSync when you go over; with VSync disabled it goes to default non-gsync non-vsync behaviour for FPS > max Hz.

But do you want to swap to vsync for input lag concerns if FPS exceeds refresh rate? Wouldn't capping your FPS at, say, 143 frames be a better alternative?

(I have no idea).

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

BurritoJustice posted:

It's pretty simple. The vsync setting only changes the behaviour when FPS > max Hz. With vsync enabled it swaps from GSync to VSync when you go over; with VSync disabled it goes to default non-gsync non-vsync behaviour for FPS > max Hz.

Right, that's my understanding.

Indiana_Krom
Jun 18, 2007
Net Slacker

Shrimp or Shrimps posted:

But do you want to swap to vsync for input lag concerns if FPS exceeds refresh rate? Wouldn't capping your FPS at, say, 143 frames be a better alternative?

(I have no idea).
Not really, capping at 143 would produce pretty much identical results (or 1 frame per second slower, actually). On a gsync or a freesync display, input lag at 144 Hz is going to be no more than ~7 ms with or without vsync. There is no fixed refresh interval to miss; if you are off by 0.1 ms the monitor will simply wait for it. The stacking input lag from a double-buffered vsync just doesn't happen anymore thanks to the nature of the variable refresh technologies.
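The ~7 ms figure is just one refresh interval at 144 Hz, and the gap between 144 and a 143 cap is a fraction of a millisecond, as a quick sanity check shows:

```python
# One refresh/frame interval in milliseconds at a given rate.
def frame_time_ms(hz):
    return 1000.0 / hz

print(round(frame_time_ms(144), 2))  # 6.94
print(round(frame_time_ms(143), 2))  # 6.99
```

So a 143 FPS cap versus running at the full 144 Hz is a ~0.05 ms difference per frame, which is why the two approaches are effectively identical on a variable-refresh display.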

Geemer
Nov 4, 2010



Is there any place where there is a good, up to date, writeup of all this poo poo and also what all graphics settings do and how they compare in a computational cost vs visual quality improvement point of view?

Because I am always astonished at how loving esoteric all this knowledge seems to be.

penus penus penus
Nov 9, 2014

by piss__donald

Geemer posted:

Is there any place where there is a good, up to date, writeup of all this poo poo and also what all graphics settings do and how they compare in a computational cost vs visual quality improvement point of view?

Because I am always astonished at how loving esoteric all this knowledge seems to be.

Not really, that'd be pretty long. Though it would be neat if a very generalized guide of sorts existed. But I imagine the main problem is it would be arguably wrong all the time.

But honestly just avoid all that by throwing lots of money at the gpu so you can just set everything to All Please and ignore the rest

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

The amount of visual improvement and the cost will both vary substantially according to what content you're running, too. Except Hairworks, that's always torture.

Wrar
Sep 9, 2002


Soiled Meat
Also the cost of different settings in different games can vary pretty wildly due to different implementation of features.


Col.Kiwi
Dec 28, 2004
And the grave digger puts on the forceps...

Wrar posted:

Also the cost of different settings in different games can vary pretty wildly due to different implementation of features.
Yup, it's true. The cost of any particular feature or setting also sometimes varies depending on video card, driver, etc.

nVidia does sometimes write some really in depth analysis of performance for popular games on their cards. They dive into the impact of individual settings. The one for The Witcher 3, for example, is extremely detailed: http://www.geforce.com/whats-new/guides/the-witcher-3-wild-hunt-graphics-performance-and-tweaking-guide
