Beve Stuscemi
Jun 6, 2001




So, turn on vsync in the game then? I have gsync on already

Kibner
Oct 21, 2008

Acguy Supremacy

Jim Silly-Balls posted:

So, turn on vsync in the game then? I have gsync on already

No, turn on VSync through the Nvidia control panel. Turn in-game VSync off.

CoolCab
Apr 17, 2005

glem
sometimes the "just loving cap it" option is called v sync, or is bafflingly listed in the same box as v sync, but real v sync will lock you to 60 iirc

when the card exceeds the refresh rate, and therefore the variable refresh rate range of your monitor, you get tearing (although it's much less noticeable than at 60fps, since each torn frame is on screen for a much shorter time). your GPU and CPU will also exhibit peaking behaviour: the card pushes itself into working harder than it needs to, starts hitting thermal or power limits on one of the components, and throttling brings it back down. this means if you give it a cap it can sit at or close to that cap more often than if you hadn't given it one.

Indiana_Krom
Jun 18, 2007
Net Slacker

CoolCab posted:

sometimes the "just loving cap it" option is called v sync, or is bafflingly listed in the same box as v sync, but real v sync will lock you to 60 iirc

Vsync will lock you to whatever rate your monitor's refresh rate is set to, so if you have a 144 Hz monitor running at 144 Hz then vsync will lock you to 144 FPS.

Question Time
Sep 12, 2010



What I was advised to do was to use “Rivatuner Statistics Server” (or RTSS) to cap FPS at 4 below the maximum of the screen. This allows the Freesync or Gsync to always work. This may be outdated now, though.

repiv
Aug 13, 2009

The advice to cap your FPS to slightly below your refresh rate is still good, but you don't need RTSS anymore; the NV control panel has a perfectly good FPS limiter built in now

not sure about the AMD situation though

Quaint Quail Quilt
Jun 19, 2006


Ask me about that time I told people mixing bleach and vinegar is okay
I link this guide every 100 pages or so.
https://blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/14/

Yeah, use the Nvidia built-in frame limiter if you are on Nvidia.
I don't think you need park control either; Windows and AMD scheduling now work with each other and the old issues are fixed.
Other than that the guide is still valid.

Dr. Video Games 0031
Jul 17, 2004

Jim Silly-Balls posted:

As for dlss I don’t really understand the settings. Again, in F1, Ultra performance mode states that it’s only for 8k screens and it looks terrible. Performance/quality/balanced modes do make a difference in framerate but I’m not sure what they’re actually doing?

You should try to stick to quality mode when it comes to DLSS. Basically, if you're at 4K, then the quality mode will make the game render at 1440p (66.67%) and DLSS will scale it up to 4K. For balanced, 1253p (58%), for performance it's 1080p (50%), and ultra performance is 720p (33.33%). This scales with your output resolution, so at 1440p, ultra performance mode is rendering internally at 480p before being upscaled. Trying to upscale 480p into a clean 1440p image is basically impossible, which is why it looks like poo poo. No amount of AI wizardry can make that look good (with real-time rendering, at least). At 4K, quality mode will often look nearly indistinguishable from native 4K. At 1440p, quality mode usually looks visibly softer, though it's often hard to notice in motion. If the game has a sharpening slider, I recommend ticking that up a little bit to compensate. DLSS can also introduce some occasional motion artifacts, most often in the form of ghosting or trails behind moving objects. The most recent revisions of the technology have cleaned this up a lot, and it also depends on the game's implementation of it.
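
To put rough numbers on those ratios, here's a minimal sketch; the mode-to-scale table just encodes the percentages quoted above, and the helper itself is hypothetical rather than any real NVIDIA API:

```typescript
// Internal render resolution for each DLSS mode, using the scale factors
// described above (hypothetical helper, not part of any NVIDIA API).
const DLSS_SCALE = {
  quality: 2 / 3,           // ~66.67%
  balanced: 0.58,           // ~58%
  performance: 0.5,         // 50%
  ultra_performance: 1 / 3, // ~33.33%
} as const;

type DlssMode = keyof typeof DLSS_SCALE;

function internalResolution(outW: number, outH: number, mode: DlssMode): [number, number] {
  const s = DLSS_SCALE[mode];
  return [Math.round(outW * s), Math.round(outH * s)];
}

console.log(internalResolution(3840, 2160, "quality"));           // [2560, 1440] -> 1440p
console.log(internalResolution(3840, 2160, "balanced"));          // [2227, 1253] -> ~1253p
console.log(internalResolution(2560, 1440, "ultra_performance")); // [853, 480]   -> 480p
```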

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

Jim Silly-Balls posted:

So, turn on vsync in the game then? I have gsync on already

To provide an explanation for why you should do what others have told you:

VRR (Gsync/Freesync) cannot speed your display up. It can only slow it down. So on a 144hz display, VRR enables handling frames that come more than 6.94ms apart, but it does absolutely nothing if there is less than 6.94ms between frames. In that situation, you are getting the same behavior you would see with VRR off. Either you have vsync on and you have backpressure leading to input & display lag, or you have vsync off and you get tearing. Capping your framerate so that frames come slowly enough to keep VRR active is the only way to get both low latency and no tearing. You also want to force Vsync on in the control panel to catch any frames that come faster than intended and slow them down slightly rather than tearing. In-game you can turn vsync off for the most part, although there are some games that won't work right unless you enable it. The reason to turn it off in-game is that some games will behave differently if Vsync is on, and those behavior changes are generally negative for VRR.
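
As a quick sanity check on the frame-time arithmetic above, here's a minimal sketch; nothing vendor-specific, and the margin below the refresh rate is a judgment call (the answers further down the thread range from 1 to 4 FPS under):

```typescript
// Frame time at a given refresh rate, and a cap a few FPS below that rate
// so frames always arrive slowly enough to keep VRR engaged.
function frameTimeMs(refreshHz: number): number {
  return 1000 / refreshHz;
}

function suggestedCap(refreshHz: number, marginFps: number = 3): number {
  return refreshHz - marginFps;
}

console.log(frameTimeMs(144).toFixed(2)); // "6.94" ms between frames at 144 Hz
console.log(suggestedCap(144));           // 141
console.log(suggestedCap(165, 4));        // 161
```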

Beve Stuscemi
Jun 6, 2001




So a 165hz display should be capped at what rate then?

Indiana_Krom
Jun 18, 2007
Net Slacker

Jim Silly-Balls posted:

So a 165hz display should be capped at what rate then?

161 would be a reasonable number. Note that unless you are playing some pretty old or graphically simple games, you will hardly ever hit the cap anyway. I generally never run into mine because I have a 1440p/240 Hz display and basically nothing made in the last 10 years that I play even gets close, and the games that do aren't latency sensitive at all, so I can just force vsync globally and be done with it.

Bondematt
Jan 26, 2007

Not too stupid

Jim Silly-Balls posted:

So a 165hz display should be capped at what rate then?

164 to get Gsync/Freesync to do its magic.

Jenny Agutter
Mar 18, 2009

Are we stuck with a current nvidia driver with idle clock issues? My 3060 is idling at 1880mhz connected to a single 1080p60 television. Any common steps I should take? Already changed power management in the nvidia control panel to normal and in windows settings to balanced

e: will running scaling on windows cause it to run higher clocks?

e2: yup it’s windows scaling. that sucks. get to choose what’s more important: legible interface from the couch or no fan noise

Jenny Agutter fucked around with this message at 00:37 on Jan 31, 2022

Beve Stuscemi
Jun 6, 2001




How is windows scaling a load on any gpu, let alone a 3060?

That’s weird.

Dr. Video Games 0031
Jul 17, 2004

I used to use windows scaling a lot, and that's not something that has ever happened to me on nvidia or amd cards. How strange. Chalk it up to another nvidia driver mystery.

kliras
Mar 27, 2021

Jenny Agutter posted:

Are we stuck with a current nvidia driver with idle clock issues? My 3060 is idling at 1880mhz connected to a single 1080p60 television. Any common steps I should take? Already changed power management in the nvidia control panel to normal and in windows settings to balanced

e: will running scaling on windows cause it to run higher clocks?

e2: yup it’s windows scaling. that sucks. get to choose what’s more important: legible interface from the couch or no fan noise

Do you have the latest manufacturer-specific firmware for the GPU (VBIOS)? A lot of cards have weird idle and multi-monitor behaviour that gets fixed by it.

CaptainSarcastic
Jul 6, 2013



I frequently just set my frame cap at 120fps on my 144Hz monitor since that is a preset framerate in a lot of games, and it works just fine. The games I play aren't really going to benefit from crazy refresh rates, though, as far as I know.

TOOT BOOT
May 25, 2010

Supply seems to have improved at Micro Center. Prices are still very high though.

litany of gulps
Jun 11, 2001

Fun Shoe

TOOT BOOT posted:

Supply seems to have improved at Micro Center. Prices are still very high though.

Can vouch for this. The local MicroCenter seems to have received a bunch of AMD cards. I had mostly been looking at GeForce cards, but the prospect of being able to pick up a card in person for just 50% over MSRP was too much to pass up. Grabbed an MSI 6900 XT for just over $1500 (plus tax) today.

Hasturtium
May 19, 2020

And that year, for his birthday, he got six pink ping pong balls in a little pink backpack.

TOOT BOOT posted:

Supply seems to have improved at Micro Center. Prices are still very high though.

Somehow the Dallas Micro Center got some Radeon Pro W5500 cards in at MSRP, and I’ve reserved one for pickup tomorrow. My long quest for an adequate low-end Linux professional graphics solution is over.

Maybe the tide’s finally turning?

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
I need a reality check. I just enabled auto HDR a week ago and have since played two enabled games with it: Dark Souls 3 and Monster Hunter World.

Am I crazy or is this poo poo really good? There's no way to get a real comparison since these titles never had HDR (hence the whole gimmick) but I'm actually kind of amazed at how good it seems? Or am I being duped...

Honestly I hadn't turned it on before because I figured it probably sucked.

CaptainSarcastic
Jul 6, 2013



Taima posted:

I need a reality check. I just enabled auto HDR a week ago and have since played two enabled games with it: Dark Souls 3 and Monster Hunter World.

Am I crazy or is this poo poo really good? There's no way to get a real comparison since these titles never had HDR (hence the whole gimmick) but I'm actually kind of amazed at how good it seems? Or am I being duped...

Honestly I hadn't turned it on before because I figured it probably sucked.

I think it really depends on what you are using for a display. Are you outputting to an OLED TV or something? Most monitors that claim to be able to do HDR are not bright enough and don't have the hardware to actually do it well.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
Oh yeah I probably should have mentioned I do use a real HDR panel (OLED in this case)

CaptainSarcastic
Jul 6, 2013



Taima posted:

Oh yeah I probably should have mentioned I do use a real HDR panel (OLED in this case)

Yeah, it makes sense you would see a benefit there. The problem is that 90% of the displays that claim to do HDR can't do HDR - you happen to have one of the 10% that can.

Dr. Video Games 0031
Jul 17, 2004

I have one of those fake HDR monitors where it's basically just SDR but a bit brighter and with a wider color palette. Auto HDR tends to look indistinguishable from SDR for me since the one thing my monitor is good at is colors, and that feature does nothing for colors as far as I know. I've heard from people with really good HDR displays that the Auto HDR feature is surprisingly good though. It seems like something I might want to default to using whenever I eventually get a real HDR display.

Enos Cabell
Nov 3, 2004


Yeah HDR is pretty great on my OLED tv, but not too noticeable on my Dell monitor that technically supports it.

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

I would actually use HDR on my PC monitor if it was possible to take a screenshot while it's enabled; they get saved as SDR profile images and end up totally blown out

DerekSmartymans
Feb 14, 2005

The
Copacetic
Ascetic

change my name posted:

I would actually use HDR on my PC monitor if it was possible to take a screenshot while it's enabled; they get saved as SDR profile images and end up totally blown out

I’ve no need for it personally, but there must be some software out there that allows an HDR-enabled PrntScrn or something. I don’t know about “free” or equivalent, but there just has to be some program for graphics/photo/video people who actually calibrate their professional monitors as part of a job/serious hobby that will color-match their screens 1:1 with a screengrab.

Maybe “free” is the sticking point, though, because I still have print-shop software on several different backups and in the cloud; the company doesn’t exist anymore, and it’s thirty-year-old $1500+ software we include for certain clients running well-maintained graphics machines almost as old as me. It’s harder to maintain than a 70-80 year old mechanical offset press, too.

Alan Smithee
Jan 4, 2005


A man becomes preeminent, he's expected to have enthusiasms.

Enthusiasms, enthusiasms...
https://twitter.com/Gizmodo/status/1488271268499451906?s=20&t=WlNz5vjrsMakWFYC9hTIcQ

GPU: *shuts down from my posting*

buffalo all day
Mar 13, 2019


only a 3090ti would have the power to keep up...

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
Can browsers/pcs be built to prevent that kind of fingerprinting, or is it always going to be an issue?

Too Many Birds
Jan 8, 2020


well i won a shuffle

3070 incoming.

time to build the rest of a computer i guess

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Rinkles posted:

Can browsers/pcs be built to prevent that kind of fingerprinting, or is it always going to be an issue?

it’s tough as long as webGL exists, there’s a seemingly endless parade of things you could measure to try and fingerprint users (which is all this is), but on the other hand everyone is intent on turning the browser into a second OS inside your existing OS and this is the eventual consequence of that. The more things you expose to the web, the easier it is to fingerprint and the more vulnerabilities that will exist in the long term.

Unfortunately even if you turn off these capabilities, that is still a useful fingerprint as well, as it distinctly identifies you as a user.

Paul MaudDib fucked around with this message at 00:28 on Feb 1, 2022

Sininu
Jan 8, 2014

Paul MaudDib posted:

it’s tough as long as webGL exists, there’s a seemingly endless parade of things you could measure to try and fingerprint users (which is all this is), but on the other hand everyone is intent on turning the browser into a second OS inside your existing OS and this is the eventual consequence of that. The more things you expose to the web, the easier it is to fingerprint and the more vulnerabilities that will exist in the long term.

Unfortunately even if you turn off these capabilities, that is still a useful fingerprint as well, as it distinctly identifies you as a user.

Is using a VM OS just to browse the web a decent idea?

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

CaptainSarcastic posted:

Yeah, it makes sense you would see a benefit there. The problem is that 90% of the displays that claim to do HDR can't do HDR - you happen to have one of the 10% that can.

Yeah I mean... I get what you're saying. I wasn't really asking if HDR is good; I'm a big fan, hence the OLED. I was just specifically wondering whether people felt that Auto HDR in particular, which is a Windows 11 technology that auto-applies HDR to games that never supported it in the first place, is good or placebo.

Because frankly, I think it's awesome, but again since I can't compare apples to apples I was wondering if I was just getting hoodwinked by a supposedly great feature.

It's the kind of feature that could easily be placebo snake oil, ya know? So i was trying to gauge the general response to it re: people who have tried it before. :cheers:

repiv
Aug 13, 2009

Paul MaudDib posted:

it’s tough as long as webGL exists, there’s a seemingly endless parade of things you could measure to try and fingerprint users (which is all this is), but on the other hand everyone is intent on turning the browser into a second OS inside your existing OS and this is the eventual consequence of that. The more things you expose to the web, the easier it is to fingerprint and the more vulnerabilities that will exist in the long term.

Unfortunately even if you turn off these capabilities, that is still a useful fingerprint as well, as it distinctly identifies you as a user.

that and it's yet another timing attack, showing once again that timers were a mistake and we've been played for absolute fools

firefox already quantizes timers to 100ms resolution if you enable privacy.resistFingerprinting for this reason

Dr. Video Games 0031
Jul 17, 2004

repiv posted:

that and it's yet another timing attack, showing once again that timers were a mistake and we've been played for absolute fools

firefox already quantizes timers to 100ms resolution if you enable privacy.resistFingerprinting for this reason

What does this mean? What is a timer in the context of web browsing, and what are the effects of that firefox setting?

repiv
Aug 13, 2009

a timer in a web browser is just a timer: web pages can run arbitrary logic using javascript, and that logic might need to know how much time has passed

when a page queries the current time the value is nominally in milliseconds, but that extra-paranoid firefox mode rounds the returned value to the nearest multiple of 100 milliseconds

that has the effect of making it far more difficult to measure timing discrepancies between machines, which is one way to fingerprint users

the unwanted side effect is that 100ms is way too coarse for things like games to animate smoothly
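
as a rough illustration, the quantization looks something like this in page terms (illustrative only; the real behaviour lives inside firefox itself, and quantizeMs is just a made-up helper name):

```typescript
// Round a high-resolution timestamp to the nearest multiple of 100 ms,
// similar in spirit to what privacy.resistFingerprinting does to timers.
// (Illustrative sketch only; quantizeMs is a made-up helper, not a web API.)
function quantizeMs(t: number, resolutionMs: number = 100): number {
  return Math.round(t / resolutionMs) * resolutionMs;
}

const precise = performance.now();  // e.g. 1234.567 ms since page load
const coarse = quantizeMs(precise); // e.g. 1200 ms
console.log(precise, coarse);
// coarse timestamps make the tiny per-machine timing differences that
// fingerprinting relies on much harder to measure, but any page logic
// (like a game loop) reading a clock this coarse only sees 100 ms steps.
```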

Alan Smithee
Jan 4, 2005


A man becomes preeminent, he's expected to have enthusiasms.

Enthusiasms, enthusiasms...
btw don't buy from mtechtx

rug got pulled

or Sean T's Custom PC

unless that helps me get my refund, in which case please do

Alan Smithee
Jan 4, 2005


A man becomes preeminent, he's expected to have enthusiasms.

Enthusiasms, enthusiasms...
Wrong thread
