|
So, turn on vsync in the game then? I have gsync on already
|
# ? Jan 30, 2022 20:34 |
|
Jim Silly-Balls posted:So, turn on vsync in the game then? I have gsync on already No, turn on VSync through the Nvidia control panel. Turn in-game VSync off.
|
# ? Jan 30, 2022 21:04 |
|
sometimes the "just loving cap it" option is called v sync or is bafflingly listed in the same box as v sync, but real v sync will lock you to 60 iirc. when the card exceeds the refresh rate, and as such the variable refresh rate range of your monitor, you get tearing (although it's much less noticeable, since each torn frame is on screen for much less time than at 60fps). your GPU and CPU will also exhibit peaking behaviour: they push themselves into working harder than they need to, start hitting thermal or power limits on one of the components, and the throttling brings them back down. this means if you give it a cap it can wind up at or close to that cap more often than if you hadn't.
|
# ? Jan 30, 2022 21:06 |
|
CoolCab posted:sometimes "just loving cap it" it's called v sync or is listed in the same box as v sync bafflingly, but real v sync will lock you to 60 iirc
|
# ? Jan 30, 2022 21:11 |
|
What I was advised to do was to use “Rivatuner Statistics Server” (or RTSS) to cap FPS at 4 below the maximum of the screen. This allows the Freesync or Gsync to always work. This may be outdated now, though.
|
# ? Jan 30, 2022 21:11 |
|
The advice to cap your FPS to slightly below your refresh rate is still good, but you don't need RTSS anymore; the NV control panel has a perfectly good FPS limiter built in now. Not sure about the AMD situation, though.
|
# ? Jan 30, 2022 21:27 |
|
I link this guide every 100 pages or so. https://blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/14/ Yeah, use the Nvidia built-in frame limiter if you are on Nvidia. I don't think you need Park Control either; Windows and AMD scheduling work with each other and are fixed. Other than that it's still valid.
|
# ? Jan 30, 2022 21:41 |
|
Jim Silly-Balls posted:As for dlss I don’t really understand the settings. Again, in F1, Ultra performance mode states that it’s only for 8k screens and it looks terrible. Performance/quality/balanced modes do make a difference in framerate but I’m not sure what they’re actually doing?

You should try to stick to quality mode when it comes to DLSS. Basically, if you're at 4K, quality mode will make the game render at 1440p (66.67%) and DLSS will scale it up to 4K. For balanced it's 1253p (58%), for performance it's 1080p (50%), and ultra performance is 720p (33.33%). These factors scale with your output resolution, so at 1440p, ultra performance mode renders internally at 480p before upscaling. Trying to upscale 480p into a clean 1440p image is basically impossible, which is why it looks like poo poo. No amount of AI wizardry can make that look good (with real-time rendering, at least).

At 4K, quality mode will often look nearly indistinguishable from native 4K. At 1440p, quality mode usually looks visibly softer, though it's often hard to notice in motion. If the game has a sharpening slider, I recommend ticking that up a little bit to compensate. DLSS can also introduce occasional motion artifacts, most often ghosting or trails behind moving objects. The most recent revisions of the technology have cleaned this up a lot, and it also depends on the game's implementation.
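The mode-to-resolution arithmetic above can be sketched in a few lines of Python. The scale factors are the ones quoted in the post; the function and table are illustrative only, not any real DLSS API (the actual internal resolution is chosen by each game's integration):

```python
# Illustrative DLSS per-axis scale factors, as described in the post above.
DLSS_SCALE = {
    "quality": 2 / 3,            # ~66.67% per axis
    "balanced": 0.58,            # ~58%
    "performance": 0.5,          # 50%
    "ultra_performance": 1 / 3,  # ~33.33%
}

def internal_resolution(output_w, output_h, mode):
    """Return the internal render resolution DLSS would upscale from."""
    s = DLSS_SCALE[mode]
    return round(output_w * s), round(output_h * s)

print(internal_resolution(3840, 2160, "quality"))            # (2560, 1440)
print(internal_resolution(2560, 1440, "ultra_performance"))  # (853, 480)
```

This makes the "it scales with output resolution" point concrete: ultra performance at 4K renders from 720p, but at 1440p it's working from roughly 480p, which is why it falls apart.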
|
# ? Jan 30, 2022 22:30 |
|
Jim Silly-Balls posted:So, turn on vsync in the game then? I have gsync on already

To provide an explanation for why you should do what others have told you: VRR (Gsync/Freesync) cannot speed your display up; it can only slow it down. So on a 144hz display, VRR can handle frames that come more than 6.94ms apart, but it does absolutely nothing if there is less than 6.94ms between frames. In that situation, you are getting the same behavior you would see with VRR off: either you have vsync on and you have backpressure leading to input and display lag, or you have vsync off and you get tearing.

Capping your framerate so that frames come slowly enough to keep VRR active is the only way to get both low latency and no tearing. You also want to force Vsync on in the control panel to catch any frames that come faster than intended and slow them down slightly rather than tearing. In-game you can turn vsync off for the most part, although some games won't work right unless you enable it. The reason to turn it off in-game is that some games behave differently when Vsync is on, and those behavior changes are generally negative for VRR.
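The 6.94ms figure above is just the frame time at the display's maximum refresh rate. A quick sketch (the function name is made up for illustration):

```python
def vrr_frame_time_ms(refresh_hz):
    """Minimum frame time (ms) a display can show at its max refresh.
    Frames arriving faster than this fall outside the VRR range and
    behave as if VRR were off, per the explanation above."""
    return 1000.0 / refresh_hz

print(f"{vrr_frame_time_ms(144):.2f} ms")  # 6.94 ms
print(f"{vrr_frame_time_ms(165):.2f} ms")  # 6.06 ms
```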
|
# ? Jan 30, 2022 22:45 |
|
So a 165hz display should be capped at what rate then?
|
# ? Jan 30, 2022 22:51 |
|
Jim Silly-Balls posted:So a 165hz display should be capped at what rate then? 161 would be a reasonable number. Note that unless you are playing some pretty old or pretty graphically simple games, you will hardly ever hit the cap. I generally never run into mine because I have a 1440p/240 Hz display; basically nothing made in the last 10 years that I play even gets close, and the games that do aren't latency sensitive at all, so I can just enable global vsync and be done with it.
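The capping rule of thumb in this thread reduces to a trivial helper. The margin is an assumption pulled from the posts here (they suggest 3-4 FPS below refresh), not an official formula:

```python
def suggested_fps_cap(refresh_hz, margin=3):
    """Cap a few FPS below the refresh rate so frame times stay inside
    the VRR window. margin of 3-4 is what posters in this thread use;
    the exact value is a judgment call, not a spec."""
    return refresh_hz - margin

print(suggested_fps_cap(165))            # 162
print(suggested_fps_cap(165, margin=4))  # 161, as suggested above
```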
|
# ? Jan 30, 2022 23:27 |
|
Jim Silly-Balls posted:So a 165hz display should be capped at what rate then? 164 to get Gsync/Freesync to do its magic.
|
# ? Jan 30, 2022 23:27 |
|
Are we stuck with a current nvidia driver with idle clock issues? My 3060 is idling at 1880mhz connected to a single 1080p60 television. Any common steps I should take? Already changed power management in the nvidia control panel to normal and in windows settings to balanced. e: will running scaling on windows cause it to run higher clocks? e2: yup, it's windows scaling. that sucks. get to choose what's more important: legible interface from the couch or no fan noise. Jenny Agutter fucked around with this message at 00:37 on Jan 31, 2022 |
# ? Jan 31, 2022 00:30 |
|
How is windows scaling a load on any gpu, let alone a 3060? That’s weird.
|
# ? Jan 31, 2022 00:42 |
|
I used to use windows scaling a lot, and that's not something that has ever happened to me on nvidia or amd cards. How strange. Chalk it up to another nvidia driver mystery.
|
# ? Jan 31, 2022 00:50 |
|
Jenny Agutter posted:Are we stuck with a current nvidia driver with idle clock issues? My 3060 is idling at 1880mhz connected to a single 1080p60 television. Any common steps I should take? Already changed power management in the nvidia control panel to normal and in windows settings to balanced
|
# ? Jan 31, 2022 00:53 |
|
I frequently just set my frame cap at 120hz on my 144hz monitor since that is a preset framerate in a lot of games, and it works just fine. The games I play aren't really going to benefit from crazy refresh times, though, as far as I know.
|
# ? Jan 31, 2022 01:14 |
|
Supply seems to have improved at Micro Center. Prices are still very high though.
|
# ? Jan 31, 2022 01:34 |
|
TOOT BOOT posted:Supply seems to have improved at Micro Center. Prices are still very high though. Can vouch for this. The local MicroCenter seems to have received a bunch of AMD cards. I had mostly been looking at GeForce cards, but the prospect of being able to pick up in-person a card for just 50% over MSRP was too much to pass up. Grabbed an MSI 6900 XT for just over 1500 (plus tax) today.
|
# ? Jan 31, 2022 02:25 |
|
TOOT BOOT posted:Supply seems to have improved at Micro Center. Prices are still very high though. Somehow the Dallas Micro Center got some Radeon Pro W5500 cards in at MSRP, and I’ve reserved one for pickup tomorrow. My long quest for an adequate low-end Linux professional graphics solution is over. Maybe the tide’s finally turning?
|
# ? Jan 31, 2022 03:11 |
|
I need a reality check. I just enabled auto HDR a week ago and have since played two enabled games with it: Dark Souls 3 and Monster Hunter World. Am I crazy or is this poo poo really good? There's no way to get a real comparison since these titles never had HDR (hence the whole gimmick) but I'm actually kind of amazed at how good it seems? Or am I being duped... Honestly I hadn't turned it on before because I figured it probably sucked.
|
# ? Jan 31, 2022 07:12 |
|
Taima posted:I need a reality check. I just enabled auto HDR a week ago and have since played two enabled games with it: Dark Souls 3 and Monster Hunter World. I think it really depends on what you are using for a display. Are you outputting to an OLED TV or something? Most monitors that claim to be able to do HDR are not bright enough and don't have the hardware to actually do it well.
|
# ? Jan 31, 2022 07:25 |
|
Oh yeah I probably should have mentioned I do use a real HDR panel (OLED in this case)
|
# ? Jan 31, 2022 07:34 |
|
Taima posted:Oh yeah I probably should have mentioned I do use a real HDR panel (OLED in this case) Yeah, it makes sense you would see a benefit there. The problem is that 90% of the displays that claim to do HDR can't do HDR - you happen to have one of the 10% that can.
|
# ? Jan 31, 2022 07:38 |
|
I have one of those fake HDR monitors where it's basically just SDR but a bit brighter and with a wider color palette. Auto HDR tends to look indistinguishable from SDR for me since the one thing my monitor is good at is colors, and that feature does nothing for colors as far as I know. I've heard from people with really good HDR displays that the Auto HDR feature is surprisingly good though. It seems like something I might want to default to using whenever I eventually get a real HDR display.
|
# ? Jan 31, 2022 07:44 |
|
Yeah HDR is pretty great on my OLED tv, but not too noticeable on my Dell monitor that technically supports it.
|
# ? Jan 31, 2022 15:53 |
|
I would actually use HDR on my PC monitor if it was possible to take a screenshot while it's enabled; they get saved as SDR profile images and end up totally blown out
|
# ? Jan 31, 2022 16:00 |
|
change my name posted:I would actually use HDR on my PC monitor if it was possible to take a screenshot while it's enabled, they get saved as SDR profile images and end up totally blown out I’ve no need for it personally, but there must be some software out there that allows an HDR-enabled PrntScrn or something. I don’t know about “free” or equivalent, but there just has to be some program for graphics/photo/video people who actually calibrate their professional monitors as part of a job/serious hobby that will color-match their screens 1:1 with a screengrab. Maybe “free” is the sticking point, though, because I still have print-shop software on several different backups and the cloud even though the company doesn’t exist anymore; it’s thirty-year-old, $1500+ software we include for certain clients running well-maintained graphics machines almost as old as me. It’s harder to maintain than a 70-80 year old mechanical offset press, too.
|
# ? Jan 31, 2022 16:14 |
|
https://twitter.com/Gizmodo/status/1488271268499451906?s=20&t=WlNz5vjrsMakWFYC9hTIcQ GPU: *shuts down from my posting*
|
# ? Jan 31, 2022 23:04 |
|
Alan Smithee posted:https://twitter.com/Gizmodo/status/1488271268499451906?s=20&t=WlNz5vjrsMakWFYC9hTIcQ only a 3090ti would have the power to keep up...
|
# ? Jan 31, 2022 23:05 |
|
Can browsers/pcs be built to prevent that kind of fingerprinting, or is it always going to be an issue?
|
# ? Jan 31, 2022 23:10 |
|
well i won a shuffle, 3070 incoming. time to build the rest of a computer i guess
|
# ? Feb 1, 2022 00:09 |
|
Rinkles posted:Can browsers/pcs be built to prevent that kind of fingerprinting, or is it always going to be an issue? it’s tough as long as webGL exists, there’s a seemingly endless parade of things you could measure to try and fingerprint users (which is all this is), but on the other hand everyone is intent on turning the browser into a second OS inside your existing OS and this is the eventual consequence of that. The more things you expose to the web, the easier it is to fingerprint and the more vulnerabilities that will exist in the long term. Unfortunately even if you turn off these capabilities, that is still a useful fingerprint as well, as it distinctly identifies you as a user. Paul MaudDib fucked around with this message at 00:28 on Feb 1, 2022 |
# ? Feb 1, 2022 00:25 |
|
Paul MaudDib posted:it’s tough as long as webGL exists, there’s a seemingly endless parade of things you could measure to try and fingerprint users (which is all this is), but on the other hand everyone is intent on turning the browser into a second OS inside your existing OS and this is the eventual consequence of that. The more things you expose to the web, the easier it is to fingerprint and the more vulnerabilities that will exist in the long term. Is using a VM OS just to browse the web a decent idea?
|
# ? Feb 1, 2022 00:40 |
|
CaptainSarcastic posted:Yeah, it makes sense you would see a benefit there. The problem is that 90% of the displays that claim to do HDR can't do HDR - you happen to have one of the 10% that can. Yeah I mean... I get what you're saying, I wasn't really asking if HDR is good, I'm a big fan, hence the OLED, I was just specifically wondering if people felt that Auto HDR in particular, which is a Windows 11 technology that auto-applies HDR to games that never supported it in the first place, is good or placebo. Because frankly, I think it's awesome, but again since I can't compare apples to apples I was wondering if I was just getting hoodwinked by a supposedly great feature. It's the kind of feature that could easily be placebo snake oil, ya know? So i was trying to gauge the general response to it re: people who have tried it before.
|
# ? Feb 1, 2022 00:44 |
|
Paul MaudDib posted:it’s tough as long as webGL exists, there’s a seemingly endless parade of things you could measure to try and fingerprint users (which is all this is), but on the other hand everyone is intent on turning the browser into a second OS inside your existing OS and this is the eventual consequence of that. The more things you expose to the web, the easier it is to fingerprint and the more vulnerabilities that will exist in the long term. that and it's yet another timing attack, showing once again that timers were a mistake and we've been played for absolute fools firefox already quantizes timers to 100ms resolution if you enable privacy.resistFingerprinting for this reason
|
# ? Feb 1, 2022 01:51 |
|
repiv posted:that and it's yet another timing attack, showing once again that timers were a mistake and we've been played for absolute fools What does this mean? What is a timer in the context of web browsing, and what are the effects of that firefox setting?
|
# ? Feb 1, 2022 02:13 |
|
a timer in a web browser is just a timer; web pages can run arbitrary logic using javascript, and that logic might need to know how much time has passed.

when a page queries the current time, the value is nominally in milliseconds, but that extra-paranoid firefox mode rounds the returned value to the nearest multiple of 100 milliseconds.

that has the effect of making it far more difficult to measure timing discrepancies between machines, which is one way to fingerprint users. the unwanted side effect is that 100ms is way too coarse for things like games to animate smoothly
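A rough sketch of the quantization being described, assuming simple rounding to the nearest multiple of the resolution (Firefox's actual implementation details may differ):

```python
def quantize_ms(t_ms, resolution_ms=100):
    """Round a timestamp to the nearest multiple of resolution_ms,
    roughly what privacy.resistFingerprinting does to the values
    pages see from the clock. Fine-grained timing differences
    between machines disappear below the resolution floor."""
    return round(t_ms / resolution_ms) * resolution_ms

print(quantize_ms(1234.567))  # 1200
print(quantize_ms(67))        # 100
```

With every page seeing only 100ms-granular time, two machines whose GPUs finish a WebGL workload 3ms apart report identical timestamps, which is exactly why this defeats the timing-based fingerprint.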
|
# ? Feb 1, 2022 02:31 |
|
btw don't buy from mtechtx, rug got pulled. or Sean T's Custom PC. unless that helps me get my refund, in which case please do
|
# ? Feb 1, 2022 06:55 |
|
Wrong thread
|
# ? Feb 1, 2022 12:06 |