|
K8.0 posted:The chrome bug still exists. Probably will be fixed in about... 10 years. That being said, the new Nvidia app looks really nice. People act like the old control panel is perfect in every way, but I personally don't like how it takes a whole thirty seconds to find and load up the settings for a game so I can do something basic like cap the fps
|
# ? Feb 22, 2024 22:30 |
|
SlowBloke posted:Nvidia also had one of the few persistent FIDO2 implementations, meaning you could log in to Nvidia with a YubiKey and nothing else. They recently added passkeys, so again, no need to use passwords unless you really want to. Hey, that's great... for a bank's website, on the internet. You know what it's not great for? An application running locally on my computer that updates drivers and adjusts GPU settings. Geese. For all eternity.
|
# ? Feb 22, 2024 22:44 |
|
I'm curious, what do you guys think is the best "bang for your buck" refresh rate? (With VRR obviously, since it's so standard at this point.) I've been thinking about it a lot because path tracing has effectively reset the capabilities of even top-tier graphics cards. On some games I find myself having to worry about whether 60 fps is enough, or 50, or 70... A lot of people would say that above 60 fps is where you start to get negligible improvements. I would definitely disagree with that; going from 60 fps to 80-90 fps is a huge quality of life increase imo. Above 90-100 fps, though, is where I think it starts to matter less. On my path-traced games I've therefore been aiming for about 80 fps. I think that's the sweet spot: 60 fps is too low, and above 80 fps involves DLSS concessions that I'm generally not willing to make (Balanced is the lowest I'll go). Just curious where everyone is with it though?
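For what it's worth, the diminishing returns people describe fall straight out of frame-time arithmetic: equal fps jumps save less and less render time per frame. A quick illustrative sketch (the rates are just ones mentioned in the thread, nothing authoritative):

```python
# Frame time in ms at each rate; the time saved per step shrinks as fps climbs,
# which is the usual argument for diminishing returns above ~90 fps.
rates = [60, 80, 90, 120, 165, 240]
for lo, hi in zip(rates, rates[1:]):
    t_lo, t_hi = 1000 / lo, 1000 / hi
    print(f"{lo:3d} -> {hi:3d} fps: saves {t_lo - t_hi:.2f} ms per frame")
```

Going 60 to 80 fps shaves over 4 ms per frame, while 165 to 240 shaves under 2 ms, so perceptually the early jumps do the most work.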
|
# ? Feb 22, 2024 22:51 |
|
165 Hz is pretty good for me, but it can get a little trickier if you want 4K HDR over older DisplayPort specs, since you run into bandwidth problems, which means having to use chroma subsampling or DSC. With 165, your fps is probably going to be lower than your refresh rate anyway, so you might not even have to cap fps for G-Sync etc. to kick in, unless it's to deal with a lack of 60 fps caps in main menus. Consoles obviously aren't going to need more than 120 Hz, unless you're some kind of input latency pervert
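The bandwidth problem is easy to see with back-of-the-envelope numbers. This sketch ignores blanking intervals (so real demand is even higher) and uses the DP 1.4 HBR3 rate of 32.4 Gbit/s with 8b/10b line coding:

```python
# Why 4K 165 Hz HDR doesn't fit in DisplayPort 1.4 without subsampling or DSC.
# Simplified: ignores blanking overhead, so actual demand is somewhat higher.
width, height, hz = 3840, 2160, 165
bpp_rgb10 = 30                                   # 10 bits/channel RGB for HDR
needed = width * height * hz * bpp_rgb10 / 1e9   # Gbit/s of pixel data
dp14 = 32.4 * 0.8                                # HBR3 raw rate after 8b/10b coding
print(f"needed: ~{needed:.1f} Gbit/s vs DP 1.4 payload: ~{dp14:.2f} Gbit/s")
```

Roughly 41 Gbit/s of pixel data against about 26 Gbit/s of usable link, hence the compromise.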
|
# ? Feb 22, 2024 23:00 |
|
Cyrano4747 posted:When is the last time a driver update has really hosed up an older, established game? At least with the established players, maybe Intel is still in their wild early days. Forza Horizon 4 was broken on Nvidia GPUs from Aug/Sep 2022 to like Jan/Feb 2023. It would just crash after about 30 mins of gameplay; the solution was to roll back to older drivers. There's also that Chrome bug that's still here. And it took them 6+ months to fix Discord streaming: they hosed up their encoder in some release, which made the image occasionally go monochrome and ultra low quality for a minute or two. They fixed it about two months ago. Sininu fucked around with this message at 23:06 on Feb 22, 2024 |
# ? Feb 22, 2024 23:00 |
|
I remember Nvidia drivers throwing minor errors in Doom 2016’s Vulkan mode a few years ago - single frame polygon rendering glitches in models, missing effects on some surfaces (especially during a key cinematic where their absence is really notable), and other weirdness that didn’t manifest in the OpenGL renderer. Nobody’s drivers are perfect - GPUs are too complicated and programming provides too many edge cases for perfection to last - but Nvidia’s made it a point to throw lots of engineering at these problems and then work to keep solutions from breaking afterward. There’s a reason I snagged a GeForce for the family PC recently: the drivers work 99+% of the time, the feature set is solid to leading edge, and they’ll be supported for much longer than the hardware will be generally relevant or performant. AMD’s made huge strides but still tends toward a reactive stance on problems and doesn’t have the same engineering budget as Nvidia - I genuinely think their near-starvation/construction core years stunted their GPGPU ambitions for good. Intel’s doing better than I expected considering how late they started and how historically shabby their IGP drivers were for Windows. They’ll never tunnel down to the engineering challenge of ensuring pre-DX 9 games run well on Arc, but Intel is throwing a lot of work into becoming proficient for modern software, and AMD’s graphics division has something to worry about if Battlemage executes well. edit: The worst GPU driver snafu I remember was playing Unreal II on the last released Radeon 7500 drivers, shooting a spider egg, and having my otherwise stable machine bluescreen and poo poo its pants. This was reproducible. I reported the bug, the driver maintainer wrote back to me, and basically said “yep, does the same thing here, but we’re done with these drivers, so good luck.” Clownassery.
|
# ? Feb 22, 2024 23:14 |
|
If I have RTX HDR enabled, do I want to turn off Windows AutoHDR?
|
# ? Feb 22, 2024 23:42 |
|
Can't say for sure, but RTX HDR is supposed to have a better handle on the gamma curve IIRC, so you could try disabling AutoHDR and see if the brightness of the game visibly changes (on account of a difference in gamma)
|
# ? Feb 22, 2024 23:45 |
|
You can press Alt+F3 in-game and modify peak brightness, midpoint, contrast, and saturation in real time; it's great.
|
# ? Feb 23, 2024 00:17 |
|
K8.0 posted:The chrome bug still exists. Probably will be fixed in about... 10 years. It's not an NVIDIA driver bug, it's a bug with Microsoft DirectComposition. It's already fixed in Windows build 26002 and newer; you need to be on W11 Dev or Canary builds to get it. Hopefully they backport it to stable.
|
# ? Feb 23, 2024 00:20 |
|
How do folks feel about the 4070S for VR? I want to hold out for 5000 series but I'm impatient.
|
# ? Feb 23, 2024 00:21 |
|
5000 will probably be over a year away
|
# ? Feb 23, 2024 00:24 |
|
MixMasterMalaria posted:How do folks feel about the 4070S for VR? I want to hold out for 5000 series but I'm impatient. A lot of VR apps tend to target the rendering hardware native in the headset or significantly lower-end hardware, despite (or because of) the high effective resolutions the games have to run at. A 4070S will chunk through 95% of VR apps without a care in the world.
|
# ? Feb 23, 2024 00:40 |
|
Subjunctive posted:Microsoft should set a better example there, then! Well, they kind of do. For stuff like Settings in Win11, I think WinUI works decently. It's when they have this hodgepodge of stuff like combining WinForms/XAML/WinUI in Explorer, and the laughably glacial pace in modernizing all of the other legacy control panels, that it goes to poo poo. The main problem with recent MS attempts at a modern UX is they don't follow their own examples! An NVCP that used WinUI, looked like any other modern Win app, and properly conformed to your DPI settings would have been preferable imo. Still, this is already a huge improvement over the old; the lack of a login requirement and the massive speed increase alone are what I've been waiting for - going on a decade now. Hell, even if you do want to log in, it now properly (imo) opens up a web browser. drat, I hate when apps integrate their own login UI so you have to manually cut and paste from your password manager. Quite a bit of stuff to come though. I don't like how the GFE Optimize settings are now more prominent than the regular control panel settings when you select a game's profile; there should be an option to either remove that entirely or have it come 2nd. Ideally you would be able to adjust each game's settings individually instead of just a slider too, but I've wanted that for years even in GFE and they've never done it, so who knows. Also, adding game profiles manually is a bitch: you only have folder navigation, no way to click on recently used games - adding profiles for Game Pass games is far more cumbersome atm. No way that I can see to search for game titles either. So yeah, beta. But the fact they're pretty explicit in what will not be migrating over from the NVCP gives me hope that it will eventually be feature complete. If not, there's always NVinspector. But goddamn, finally. I realize I'm in the minority who gave a poo poo about the NVCP but christ did I hate it. 
Happy_Misanthrope fucked around with this message at 02:56 on Feb 23, 2024 |
# ? Feb 23, 2024 02:49 |
|
Taima posted:I'm curious, what do you guys think is the best "bang for your buck" refresh rate? (with VRR obviously, since it's so standard at this point). ~120 FPS IMO. The leap from 60-120 is absolutely mindblowing and you really can't go back once you've crossed it. I haven't seen 240 in person before so I can't tell if it'll be similar, but I can't really care too much about the difference between 120 and 160.
|
# ? Feb 23, 2024 03:34 |
|
Shipon posted:~120 FPS IMO. The leap from 60-120 is absolutely mindblowing and you really can't go back once you've crossed it. I haven't seen 240 in person before so I can't tell if it'll be similar, but I can't really care too much about the difference between 120 and 160. A laptop at my job that has a 2080 Ti in it has a monitor that says it does 300 Hz. Didn't know that was a thing till I saw that monitor.
|
# ? Feb 23, 2024 04:22 |
|
Thanks for the responses on driver updates. Sounds like I’m kinda doing the right thing.
|
# ? Feb 23, 2024 06:44 |
|
RodShaft posted:I don't know how far off topic this is, but I just bought a 6750 XT off eBay. Its power draw is 250W. My processor is an i5-11400 at 65W. I have a single SSD, 2 16GB RAM sticks, and nothing else but a couple fans in there. It's an Inspiron 3891 that had a 250W PSU. I upgraded to a 500W from a server I wasn't using anymore. Thanks for everyone's opinions; my server PSU would have been fine according to everyone here and every online calculator I checked, but it was old enough that it only had one 6-pin GPU cord, so I decided this seemed like too good of a deal to pass up when I'd have to buy Molex or SATA adapters of questionable quality. Filled in with Pokemon cards for free shipping. Got a good enough deal on the graphics card that I still came in under budget.
|
# ? Feb 23, 2024 12:58 |
|
Shipon posted:~120 FPS IMO. The leap from 60-120 is absolutely mindblowing and you really can't go back once you've crossed it. I haven't seen 240 in person before so I can't tell if it'll be similar, but I can't really care too much about the difference between 120 and 160. I've been on 240 for many years now. Up in the 180+ range it is pretty smooth, but not mind-blowingly better than 120; well into diminishing returns (360 Hz is even an option with some 1080p displays). I will mainly stick with 240 Hz because on a native G-Sync display with up to 240 Hz I can pretty much skip all that frame capping bullshit and just run vsync, because I will hardly ever reach it, and even if I do, the maximum latency penalty for buffering 2 frames is 9 ms, so who gives a gently caress...
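The ~9 ms figure is just two frame times at 240 Hz, rounded up. A minimal sketch of that arithmetic (the helper name is my own, not from any tool):

```python
# Worst-case extra latency from vsync buffering N frames at a given refresh rate.
def buffered_latency_ms(hz: float, frames: int = 2) -> float:
    return frames * 1000.0 / hz

for hz in (60, 120, 240):
    print(f"{hz:3d} Hz: ~{buffered_latency_ms(hz):.1f} ms worst case (2 frames)")
```

At 240 Hz two buffered frames cost about 8.3 ms; at 60 Hz the same situation costs over 33 ms, which is why the capping advice matters much more on slower displays.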
|
# ? Feb 23, 2024 13:03 |
|
SpaceDrake posted:A lot of VR apps tend to target the rendering hardware native in the headset or significantly lower-end hardware, despite (or because of) the high effective resolutions the games have to run at. A 4070S will chunk through 95% of VR apps without a care in the world. This changes somewhat with newer 4K-and-up headsets, particularly ones like the Quest 3 that use streaming rather than native video output for PCVR (and hence the GPU needs to encode the stream at high framerate and bitrate at the same time as rendering). Sim games especially can be heavily GPU-limited all the way up to a 4090. e: Or to put it another way, the streaming quality/resolution selection in Virtual Desktop looks like this (it's an old screenshot for the Quest 2 and they've added more options for the 40 series and Quest 3, but you get the idea): I'm using that selection on a 3070 Ti/Quest 3 and I get dips below 90 FPS on some fairly basic things like Tabletop Simulator, never mind something like modded Skyrim/Elite/Flight Simulator/No Man's Sky. I also still see artifacting on occasion. Compared to that, I used to run the Lenovo Explorer without trouble on my 1050 Ti, but that was native output at 60 FPS and 1440x1440 per eye. E: All that said, a 4070S (especially when using AV1 streaming or native output) should be able to handle everything but a few sims maxed. Private Speech fucked around with this message at 18:25 on Feb 23, 2024 |
# ? Feb 23, 2024 14:05 |
|
Taima posted:I'm curious, what do you guys think is the best "bang for your buck" refresh rate? (with VRR obviously, since it's so standard at this point). I'm happy with 90. Diminishing returns after that.
|
# ? Feb 23, 2024 14:24 |
|
PirateBob posted:I'm happy with 90. Diminishing returns after that. Yup, 60 to 90 is a noticeable jump but past 90 I don't feel much difference. I almost exclusively play single player stuff though, so might be different if I was into MP.
|
# ? Feb 23, 2024 14:39 |
|
Depends slightly on the game for me, but yeah. My monitor goes to 165, but past 90 the difference (to my middle-aged eyes) is negligible, and I usually don't bother trying to get beyond that.
|
# ? Feb 23, 2024 15:50 |
|
I'm going to miss the nvidia control panel. It was functional. I also like that older aesthetic because it's what I grew up with and associate with good functionality.
|
# ? Feb 23, 2024 15:57 |
|
Freakazoid_ posted:I'm going to miss the nvidia control panel. It was functional. I also like that older aesthetic because it's what I grew up with and associate with good functionality. Is it straight up gone with the new drivers or is it still in there somewhere? Is there a way to sideload it? Like I think someone said there's no contrast slider bar in the new thing. That would make desktop HDR unusable for me. Windows literally doesn't understand the concept of "black" unless you crank the NVidia contrast to 100%, that's the only way to make a black background the same black level as turning off HDR.
|
# ? Feb 23, 2024 15:59 |
|
it's still there in the version the press got, because the new UI doesn't cover everything the old one did yet, but that might be temporary. comedy option: they pull a microsoft and never get around to porting everything to the new UI
|
# ? Feb 23, 2024 16:00 |
|
repiv posted:it's still there in the version the press got because the new UI doesn't cover everything the old one did yet, but that might be temporary
|
# ? Feb 23, 2024 16:20 |
|
Zero VGS posted:Is it straight up gone with the new drivers or is it still in there somewhere? Is there a way to sideload it? Are you running win 10 or 11? I don't think I have my contrast setting up to the max and my black levels seem okay with HDR on all the time and I have an OLED panel so I feel like I'd notice raised black levels, like how in cyberpunk's HDR mode I have to use reshade to get the black level to go from 0.2nits or so to properly 0.
|
# ? Feb 23, 2024 17:38 |
|
Kagrenak posted:Are you running win 10 or 11? I don't think I have my contrast setting up to the max and my black levels seem okay with HDR on all the time and I have an OLED panel so I feel like I'd notice raised black levels, like how in cyberpunk's HDR mode I have to use reshade to get the black level to go from 0.2nits or so to properly 0. I'm on Windows 11 and picked a solid black desktop background. If I turn the NVidia contrast back down to the default 50 the black washes out, and pure white windows (even if I use a website to generate #FFFFFF) dim dramatically. This is with the Windows settings panel having HDR enabled and "SDR Content Brightness" set to 100. It just doesn't give a poo poo as far as I can tell.
|
# ? Feb 23, 2024 17:59 |
|
Looking at the Gigabyte 4080 SUPER Gaming OC on Tom's Hardware to get an idea of whether or not I wanna pull the trigger on it. It's loud-ish on the stock BIOS but honestly seems like it runs just as cool on quiet. How are Gigabyte in terms of warranty and service? I'm not expecting much because I know just about every Taiwan/China-based company is a nightmare to RMA through, but worth asking. ASUS, MSI, and Gigabyte aren't really better or worse on that front, right? Is the Strix model actually bigger? Why is its cooler so much quieter on the 4080 Supers?
|
# ? Feb 23, 2024 21:29 |
|
PirateBob posted:I'm happy with 90. Diminishing returns after that. Yeah, I recall 85 Hz in the CRT days being where flickering entirely stopped in my peripheral vision. Under that it was visible. There's a reason Valve decided on 90 for the Vive. More is surely going to help with stability to some degree, but diminishing returns beyond that are very steep
|
# ? Feb 23, 2024 21:33 |
|
CatelynIsAZombie posted:Looking at the Gigabyte 4080 SUPER Gaming OC on tom's hardware to get an idea on whether or not I wanna pull the trigger on it. It's loud-ish on stock bios but honestly seems like it runs just as cool on quiet. My main concern was noise & temps as well and it's luckily exceeded expectations.
|
# ? Feb 23, 2024 22:01 |
|
HalloKitty posted:Yeah, I recall 85 Hz in the CRT days where flickering entirely stopped in my perpherial vision. This was also true for me.
|
# ? Feb 23, 2024 22:09 |
|
The Joe Man posted:My main concern was noise & temps as well and it's luckily exceeded expectations.
|
# ? Feb 23, 2024 22:18 |
|
HalloKitty posted:There's a reason Valve decided on 90 for the Vive. I think they wanted 120 as much as we did, but a) those screens were impossible to get in quantity and quality needed even if the pricing wasn't prohibitive (and goddamn it was prohibitive), and b) it was hard enough to get game developers to hold a solid 90 and making it 120 was just not feasible at that time. Quest 3 has a 120 Hz display but by default runs at 90 Hz for content, because asking to cut the frame budget to 8ms is not going to work for a lot of content.
|
# ? Feb 23, 2024 22:20 |
|
Zero VGS posted:I'm on Windows 11 and picked a solid black desktop background. If I turn the NVidia contrast back down to the default 50 the black washes out, and pure white windows (even if I use a website to generate #FFFFFF) dim dramatically. This is with the Windows settings panel having HDR enabled and "SDR Content Brightness" set to 100. It just doesn't give a poo poo as far as I can tell. Did you do the HDR calibration? That's the only thing I can think of. I just tested my slider and blacks only raise when I bring it under 50.
|
# ? Feb 23, 2024 22:38 |
|
CatelynIsAZombie posted:Looking at the Gigabyte 4080 SUPER Gaming OC on tom's hardware to get an idea on whether or not I wanna pull the trigger on it. It's loud-ish on stock bios but honestly seems like it runs just as cool on quiet. I found Gigabyte support dramatically worse than other similar companies, but support among these companies is pretty random, so experiences always vary among people
|
# ? Feb 23, 2024 22:41 |
|
FuzzySlippers posted:I found gigabyte support dramatically worse than other similar companies, but support among these companies is pretty random so experiences always vary among people RIP EVGA
|
# ? Feb 23, 2024 22:45 |
|
CatelynIsAZombie posted:thanks for the info, did you try out the quiet bios?
|
# ? Feb 23, 2024 23:08 |
|
I'm currently running an MSI 390X that is starting to show its age; I upgraded everything else 2-3 years ago when GPUs were hard to come by at reasonable prices. I'm thinking of upgrading to a 4070 Super or a 4070 Ti Super. Is there a consensus on what makes the most sense for 1440p? I'm currently gaming at 1080p but planning on getting a 1440p monitor soonish.
|
# ? Feb 23, 2024 23:30 |