kliras
Mar 27, 2021

K8.0 posted:

The chrome bug still exists. Probably will be fixed in about... 10 years.
gonna need another trillion of market cap to fix that

that being said, the new nvidia looks really nice. people act like the old control panel is perfect in every way, but i personally don't like how it takes a whole thirty seconds to find and load up the settings for a game so i can do something basic like cap the fps


VorpalFish
Mar 22, 2007
reasonably awesome™

SlowBloke posted:

Nvidia also had one of the few persistent FIDO2 implementations, meaning you could log on to Nvidia with a Yubikey and nothing else. They recently added passkeys, so again, no need to use passwords unless you really want to.

Hey that's great... for a bank's website, on the internet.

You know what it's not great for? An application running locally on my computer that updates drivers and adjusts GPU settings.

Geese. For all eternity.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
I'm curious, what do you guys think is the best "bang for your buck" refresh rate? (with VRR obviously, since it's so standard at this point).

I've been thinking about it a lot because path tracing has effectively reset the capabilities of even top-tier graphics cards. On some games I find myself having to worry about whether 60 fps is enough, or 50 fps, or 70...

A lot of people would say that above 60 fps is where you start to get negligible improvements. I would definitely disagree with that; going from 60 fps to 80-90 fps is a huge quality of life increase imo. Though, going above 90-100 fps, that's where I think it starts to matter less.

On my path-traced games I've therefore been angling for about 80 fps. I think that is the sweet spot. 60 fps is too low, and above 80 fps involves DLSS concessions that I'm generally not willing to make (Balanced is the lowest I'll go). Just curious where everyone is with it though?
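A rough way to see why the returns diminish is frame time rather than fps. A quick sketch, nothing but 1000/fps arithmetic:

```python
# Frame time is what you actually feel; equal fps steps buy less and
# less time per frame the higher you already are.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for lo, hi in [(60, 90), (90, 120), (120, 150)]:
    saved = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{lo} -> {hi} fps: frame time drops by {saved:.1f} ms")

# 60 -> 90 fps: frame time drops by 5.6 ms
# 90 -> 120 fps: frame time drops by 2.8 ms
# 120 -> 150 fps: frame time drops by 1.7 ms
```

Each equal step buys roughly half the improvement of the one before it, which lines up with 80-90 feeling like the sweet spot.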

kliras
Mar 27, 2021
165 hz is pretty good for me. but it can get a little trickier if you want 4k hdr over older displayport specs, since you run into bandwidth problems which means having to use chroma subsampling or dsc
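rough numbers on the bandwidth problem, as a sketch (assuming dp 1.4 hbr3 rates and full 4:4:4 10-bit color, and ignoring blanking overhead, which only makes it worse):

```python
# can dp 1.4 carry 4k 165 hz 10-bit hdr uncompressed? quick check
width, height, refresh_hz = 3840, 2160, 165
bits_per_pixel = 30          # 10 bits per channel, rgb 4:4:4

required_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
dp14_payload_gbps = 25.92    # hbr3, 4 lanes, after 8b/10b line coding

print(f"need ~{required_gbps:.1f} gbps, dp 1.4 carries ~{dp14_payload_gbps} gbps")
# need ~41.1 gbps, dp 1.4 carries ~25.92 gbps
# -> doesn't fit, hence dsc or chroma subsampling
```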

with 165, your fps is probably going to be lower than your refresh rate anyway, so you might not even have to cap fps for g-sync etc to kick in, unless it's to deal with a lack of 60fps caps in main menus

consoles obviously aren't going to need more than 120 hz, unless you're some kind of input latency pervert

Sininu
Jan 8, 2014

Cyrano4747 posted:

When is the last time a driver update really hosed up an older, established game? At least with the established players - maybe Intel is still in their wild early days.

Forza Horizon 4 was broken on Nvidia GPUs from Aug/Sep 2022 to like Jan/Feb 2023. It would just crash after about 30 mins of gameplay. The solution was to roll back to older drivers.

There's also that Chrome bug that's still here.
And it took them 6+ months to fix Discord streaming. They hosed up their encoder in some release, which made the image occasionally go monochrome and ultra-low-quality for a minute or two. They fixed it about two months ago.

Sininu fucked around with this message at 23:06 on Feb 22, 2024

Hasturtium
May 19, 2020

And that year, for his birthday, he got six pink ping pong balls in a little pink backpack.
I remember Nvidia drivers throwing minor errors in Doom 2016’s Vulkan mode a few years ago - single frame polygon rendering glitches in models, missing effects on some surfaces (especially during a key cinematic where their absence is really notable), and other weirdness that didn’t manifest in the OpenGL renderer. Nobody’s drivers are perfect - GPUs are too complicated and programming provides too many edge cases for perfection to last - but Nvidia’s made it a point to throw lots of engineering at these problems and then work to keep solutions from breaking afterward. There’s a reason I snagged a GeForce for the family PC recently: the drivers work 99+% of the time, the feature set is solid to leading edge, and they’ll be supported for much longer than the hardware will be generally relevant or performant.

AMD’s made huge strides but still tends toward a reactive stance on problems and doesn’t have the same engineering budget as Nvidia - I genuinely think their near-starvation/construction core years stunted their GPGPU ambitions for good. Intel’s doing better than I expected considering how late they started and how historically shabby their IGP drivers were for Windows. They’ll never tunnel down to the engineering challenge of ensuring pre-DX 9 games run well on Arc, but Intel is throwing a lot of work into becoming proficient for modern software, and AMD’s graphics division has something to worry about if Battlemage executes well.

edit: The worst GPU driver snafu I remember was playing Unreal II on the last released Radeon 7500 drivers, shooting a spider egg, and having my otherwise stable machine bluescreen and poo poo its pants. This was reproducible. I reported the bug, the driver maintainer wrote back to me, and basically said “yep, does the same thing here, but we’re done with these drivers, so good luck.” Clownassery.

ijyt
Apr 10, 2012

If I have RTX HDR enabled, do I want to turn off Windows AutoHDR?

kliras
Mar 27, 2021
can't say for sure, but rtx hdr is supposed to have a better handle on the gamma curve iirc, so you could try to disable autohdr and see if the brightness of the game visibly changes (on account of a difference in gamma)

Profanity
Aug 26, 2005
Grimey Drawer
You can press Alt+F3 in-game and modify peak brightness, midpoint, contrast, and saturation in real time; it's great.

BurritoJustice
Oct 9, 2012

K8.0 posted:

The chrome bug still exists. Probably will be fixed in about... 10 years.

It's not an NVIDIA driver bug; it's a bug in Microsoft DirectComposition. It's already fixed in Windows build 26002 and newer, but you need to be on W11 Dev or Canary builds to get it.

Hopefully they backport it to stable.

MixMasterMalaria
Jul 26, 2007
How do folks feel about the 4070S for VR? I want to hold out for 5000 series but I'm impatient.

YerDa Zabam
Aug 13, 2016




5000 will probably be over a year away

SpaceDrake
Dec 22, 2006

I can't avoid filling a game with awful memes, even if I want to. It's in my bones...!

MixMasterMalaria posted:

How do folks feel about the 4070S for VR? I want to hold out for 5000 series but I'm impatient.

A lot of VR apps tend to target the rendering hardware native in the headset or significantly lower-end hardware, despite (or because of) the high effective resolutions the games have to run at. A 4070S will chunk through 95% of VR apps without a care in the world.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Subjunctive posted:

Microsoft should set a better example there, then!

Well, they kind of do. For stuff like Settings in Win11, I think WinUI works decently. It's when they have this hodgepodge of stuff, like combining WinForms/XAML/WinUI in Explorer, and the laughably glacial pace of modernizing all the other legacy control panels, that it goes to poo poo. The main problem with recent MS attempts at a modern UX is they don't follow their own examples!

An NVCP that used WinUI, looked like any other modern Win app, and properly conformed to your DPI settings would have been preferable imo.

Still, this is already a huge improvement over the old one; the lack of a login requirement and the massive speed increase alone are what I've been waiting for - going on a decade now. Hell, even if you do want to log in, it now properly (imo) opens up a web browser; drat, I hate when apps integrate their own UI for login so you have to manually cut and paste from your password manager.

Quite a bit of stuff to come though. I don't like how the GFE Optimize settings are now more prominent than the regular control panel settings when you select a game's profile; there should be an option to either remove them entirely or have them come 2nd. Ideally you would be able to adjust each game's settings individually instead of just a slider too, but I've wanted that for years even in GFE and they've never done it, so who knows. Also, adding game profiles manually is a bitch: you can only navigate folders, with no way to click on recently used games - adding profiles for Gamepass games is far more cumbersome atm. No way that I can see to search for game titles either.

So yeah, beta. But the fact that they're pretty explicit about what will not be migrating over from the NVCP gives me hope that it will eventually be feature complete. If not, there's always NVinspector. But goddamn, finally. I realize I'm in the minority who gave a poo poo about the NVCP, but christ did I hate it.

Happy_Misanthrope fucked around with this message at 02:56 on Feb 23, 2024

Shipon
Nov 7, 2005

Taima posted:

I'm curious, what do you guys think is the best "bang for your buck" refresh rate? (with VRR obviously, since it's so standard at this point).

I've been thinking about it a lot because path tracing has effectively reset the capabilities of even top-tier graphics cards. On some games I find myself having to worry about whether 60 fps is enough, or 50 fps, or 70...

A lot of people would say that above 60 fps is where you start to get negligible improvements. I would definitely disagree with that; going from 60 fps to 80-90 fps is a huge quality of life increase imo. Though, going above 90-100 fps, that's where I think it starts to matter less.

On my path-traced games I've therefore been angling for about 80 fps. I think that is the sweet spot. 60 fps is too low, and above 80 fps involves DLSS concessions that I'm generally not willing to make (Balanced is the lowest I'll go). Just curious where everyone is with it though?

~120 FPS IMO. The leap from 60 to 120 is absolutely mind-blowing and you really can't go back once you've crossed it. I haven't seen 240 in person before so I can't tell if it'll be similar, but I don't really care too much about the difference between 120 and 160.

hark
May 10, 2023

I'm sleep

Shipon posted:

~120 FPS IMO. The leap from 60 to 120 is absolutely mind-blowing and you really can't go back once you've crossed it. I haven't seen 240 in person before so I can't tell if it'll be similar, but I don't really care too much about the difference between 120 and 160.

A laptop at my job that has a 2080 Ti in it has a monitor that says it does 300 Hz. Didn't know that was a thing til I saw that monitor.

Red_Fred
Oct 21, 2010


Fallen Rib
Thanks for the responses on driver updates. Sounds like I’m kinda doing the right thing.

:tipshat:

RodShaft
Jul 31, 2003
Like an evil horny Santa Claus.


RodShaft posted:

I don't know how far off topic this is, but I just bought a 6750 XT off eBay. Its power draw is 250W. My processor is an i5-11400 at 65W. I have a single SSD and two 16GB RAM sticks, and nothing else but a couple of fans in there. It's an Inspiron 3891 that had a 250W PSU. I upgraded to a 500W unit from a server I wasn't using anymore.

AMD recommends a 650W PSU for the 6750 XT, but since it draws 250W and I've already upgraded by that many watts, do I need a new PSU?


Here's my exact setup right now, if you need it. I'll be swapping out the 1060 for the 6750 XT.

Thanks for everyone's opinions. My server PSU would have been fine according to everyone here and every online calculator I checked, but it was old enough that it only had one 6-pin GPU cable, so when the alternative was buying Molex or SATA adapters of questionable quality, this seemed like too good of a deal to pass up.
Filled in with Pokemon cards for free shipping.

Got a good enough deal on the graphics card that I still came in under budget.
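For what it's worth, the math those online calculators run is roughly this kind of headroom check. A sketch using the figures from the post above (the 50 W for the rest of the system is a guess, and these are spec-sheet board powers, not measured draws):

```python
# Rough PSU headroom check for the build described above.
gpu_w = 250    # 6750 XT board power, per AMD
cpu_w = 65     # i5-11400 rated TDP (it boosts above this briefly)
rest_w = 50    # motherboard, SSD, RAM, fans - a generous guess

load_w = gpu_w + cpu_w + rest_w
psu_w = 500

print(f"estimated load: {load_w} W on a {psu_w} W unit ({load_w / psu_w:.0%} of rating)")
# estimated load: 365 W on a 500 W unit (73% of rating)
# AMD's 650 W recommendation mostly buys margin for transient spikes,
# cheap or aged units, and hungrier CPUs.
```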

Indiana_Krom
Jun 18, 2007
Net Slacker

Shipon posted:

~120 FPS IMO. The leap from 60 to 120 is absolutely mind-blowing and you really can't go back once you've crossed it. I haven't seen 240 in person before so I can't tell if it'll be similar, but I don't really care too much about the difference between 120 and 160.

I've been on 240 for many years now. Up in the 180+ range it is pretty smooth, but not mind-blowingly better than 120 - well into diminishing returns (360 Hz is even an option with some 1080p displays). I will mainly stick with 240 Hz because on a native G-Sync display with up to 240 Hz I can pretty much skip all that frame-capping bullshit and just run vsync: I will hardly ever reach it, and even if I do, the maximum latency penalty for buffering 2 frames is about 9 ms, so who gives a gently caress...
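The arithmetic behind that latency figure, as a quick sketch (worst case of two frames queued at the panel's refresh rate):

```python
# Worst-case added latency from vsync queueing frames at a given refresh.
def buffered_latency_ms(refresh_hz: float, frames_buffered: int = 2) -> float:
    return frames_buffered * 1000.0 / refresh_hz

for hz in (60, 120, 240):
    print(f"{hz} Hz: ~{buffered_latency_ms(hz):.1f} ms worst case for 2 frames")

# 60 Hz: ~33.3 ms worst case for 2 frames
# 120 Hz: ~16.7 ms worst case for 2 frames
# 240 Hz: ~8.3 ms worst case for 2 frames (the "about 9 ms" above)
```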

Private Speech
Mar 30, 2011

I HAVE EVEN MORE WORTHLESS BEANIE BABIES IN MY COLLECTION THAN I HAVE WORTHLESS POSTS IN THE BEANIE BABY THREAD YET I STILL HAVE THE TEMERITY TO CRITICIZE OTHERS' COLLECTIONS

IF YOU SEE ME TALKING ABOUT BEANIE BABIES, PLEASE TELL ME TO

EAT. SHIT.


SpaceDrake posted:

A lot of VR apps tend to target the rendering hardware native in the headset or significantly lower-end hardware, despite (or because of) the high effective resolutions the games have to run at. A 4070S will chunk through 95% of VR apps without a care in the world.

This changes somewhat with newer 4K-and-up headsets, particularly ones like the Quest 3 that use streaming rather than native video output for PCVR (and hence the GPU needs to encode the stream at a high framerate and bitrate at the same time as it's rendering).

Sim games especially can be heavily GPU-limited all the way up to a 4090.

e: Or to put it another way, the streaming quality/resolution selection in Virtual Desktop looks like this (it's an old screenshot from the Quest 2 era, and they've added more options for the 40 series and Quest 3, but you get the idea):



I'm using that selection on a 3070 Ti/Quest 3 and I get dips below 90 FPS in some fairly basic things like Tabletop Simulator, never mind something like modded Skyrim/Elite/Flight Simulator/No Man's Sky. I also still see artifacting on occasion.

Compared to that, I used to run a Lenovo Explorer without trouble on my 1050 Ti, but that was native output at 60 FPS and 1440x1440 per eye.

E: All that said, a 4070S (especially when using AV1 streaming, or native output) should be able to handle everything but a few sims maxed out.
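To put rough numbers on the jump between those two setups - a sketch that assumes the Quest 3's panel resolution of about 2064x2208 per eye (actual render resolution depends on the supersampling you pick in Virtual Desktop) against the Explorer figures above, ignoring encode cost entirely:

```python
# Rough pixel-throughput comparison of the two headsets mentioned above.
def pixels_per_second(w: int, h: int, eyes: int, hz: int) -> int:
    return w * h * eyes * hz

quest3 = pixels_per_second(2064, 2208, eyes=2, hz=90)    # assumed panel res
explorer = pixels_per_second(1440, 1440, eyes=2, hz=60)  # from the post above

print(f"Quest 3 pushes ~{quest3 / explorer:.1f}x the pixels per second")
# Quest 3 pushes ~3.3x the pixels per second - and the streamed path also
# has to encode all of it, which the Explorer's native output never did.
```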

Private Speech fucked around with this message at 18:25 on Feb 23, 2024

PirateBob
Jun 14, 2003

Taima posted:

I'm curious, what do you guys think is the best "bang for your buck" refresh rate? (with VRR obviously, since it's so standard at this point).

I've been thinking about it a lot because path tracing has effectively reset the capabilities of even top-tier graphics cards. On some games I find myself having to worry about whether 60 fps is enough, or 50 fps, or 70...

A lot of people would say that above 60 fps is where you start to get negligible improvements. I would definitely disagree with that; going from 60 fps to 80-90 fps is a huge quality of life increase imo. Though, going above 90-100 fps, that's where I think it starts to matter less.

On my path-traced games I've therefore been angling for about 80 fps. I think that is the sweet spot. 60 fps is too low, and above 80 fps involves DLSS concessions that I'm generally not willing to make (Balanced is the lowest I'll go). Just curious where everyone is with it though?

I'm happy with 90. Diminishing returns after that.

Enos Cabell
Nov 3, 2004


PirateBob posted:

I'm happy with 90. Diminishing returns after that.

Yup, 60 to 90 is a noticeable jump but past 90 I don't feel much difference. I almost exclusively play single player stuff though, so might be different if I was into MP.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
Depends slightly on the game for me, but yeah. My monitor goes to 165, but past 90 the difference (to my middle-aged eyes) is negligible, and I usually don't bother trying to get beyond that.

Freakazoid_
Jul 5, 2013


Buglord
I'm going to miss the nvidia control panel. It was functional. I also like that older aesthetic because it's what I grew up with and associate with good functionality.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Freakazoid_ posted:

I'm going to miss the nvidia control panel. It was functional. I also like that older aesthetic because it's what I grew up with and associate with good functionality.

Is it straight up gone with the new drivers or is it still in there somewhere? Is there a way to sideload it?

Like I think someone said there's no contrast slider bar in the new thing. That would make desktop HDR unusable for me. Windows literally doesn't understand the concept of "black" unless you crank the NVidia contrast to 100%; that's the only way to make a black background the same black level as turning off HDR.

repiv
Aug 13, 2009

it's still there in the version the press got because the new UI doesn't cover everything the old one did yet, but that might be temporary

comedy option they pull a microsoft and never get around to porting everything to the new UI

orcane
Jun 13, 2012

Fun Shoe

repiv posted:

it's still there in the version the press got because the new UI doesn't cover everything the old one did yet, but that might be temporary

comedy option they pull a microsoft and never get around to porting everything to the new UI
Microsoft is getting there, give it another 10 years. I'm sure Nvidia can beat that.

Kagrenak
Sep 8, 2010

Zero VGS posted:

Is it straight up gone with the new drivers or is it still in there somewhere? Is there a way to sideload it?

Like I think someone said there's no contrast slider bar in the new thing. That would make desktop HDR unusable for me. Windows literally doesn't understand the concept of "black" unless you crank the NVidia contrast to 100%; that's the only way to make a black background the same black level as turning off HDR.

Are you running Win 10 or 11? I don't think I have my contrast setting up to the max, and my black levels seem okay with HDR on all the time. I have an OLED panel, so I feel like I'd notice raised black levels - like how in Cyberpunk's HDR mode I have to use ReShade to get the black level to go from 0.2 nits or so to a proper 0.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Kagrenak posted:

Are you running Win 10 or 11? I don't think I have my contrast setting up to the max, and my black levels seem okay with HDR on all the time. I have an OLED panel, so I feel like I'd notice raised black levels - like how in Cyberpunk's HDR mode I have to use ReShade to get the black level to go from 0.2 nits or so to a proper 0.

I'm on Windows 11 and picked a solid black desktop background. If I turn the NVidia contrast back down to the default 50, the black washes out, and pure white windows (even if I use a website to generate #FFFFFF) dim dramatically. This is with the Windows settings panel having HDR enabled and "SDR Content Brightness" set to 100. It just doesn't give a poo poo as far as I can tell.
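If anyone wants to reproduce that test without hunting for a website, a couple of lines of Pillow will spit out known-value images. A sketch (filenames are just examples; pip install pillow first):

```python
# Generate pure-black and pure-white test images for eyeballing black and
# white levels with HDR on vs. off. These are 8-bit SDR values; Windows
# maps them into the HDR signal via the "SDR content brightness" slider.
from PIL import Image

Image.new("RGB", (3840, 2160), (0, 0, 0)).save("black_test.png")
Image.new("RGB", (3840, 2160), (255, 255, 255)).save("white_test.png")
# Set black_test.png as the wallpaper, then toggle HDR and compare.
```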

CatelynIsAZombie
Nov 16, 2006

I can't wait to bomb DO-DON-GOES!
Looking at the Gigabyte 4080 SUPER Gaming OC on Tom's Hardware to get an idea of whether or not I wanna pull the trigger on it. It's loud-ish on the stock BIOS but honestly seems like it runs just as cool on quiet.

How is Gigabyte in terms of warranty and service? I'm not expecting much, because I know just about every Taiwan/China-based company is a nightmare to RMA through, but it's worth asking. ASUS, MSI, and Gigabyte aren't really better or worse than each other on that front, right?

Is the Strix model actually bigger? Why is its cooler so much quieter on the 4080 Supers?

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

PirateBob posted:

I'm happy with 90. Diminishing returns after that.

Yeah, I recall 85 Hz in the CRT days as the point where flickering entirely stopped in my peripheral vision. Under that it was visible. There's a reason Valve decided on 90 for the Vive.

More is surely going to help with stability to some degree, but the diminishing returns beyond that are very steep.

The Joe Man
Apr 7, 2007

Flirting With Apathetic Waitresses Since 1984

CatelynIsAZombie posted:

Looking at the Gigabyte 4080 SUPER Gaming OC on Tom's Hardware to get an idea of whether or not I wanna pull the trigger on it. It's loud-ish on the stock BIOS but honestly seems like it runs just as cool on quiet.

How is Gigabyte in terms of warranty and service? I'm not expecting much, because I know just about every Taiwan/China-based company is a nightmare to RMA through, but it's worth asking. ASUS, MSI, and Gigabyte aren't really better or worse than each other on that front, right?

Is the Strix model actually bigger? Why is its cooler so much quieter on the 4080 Supers?
I've had the Gigabyte model you linked installed for a few days now. Highly recommend. Dead silent, the mounting bracket is well engineered, and temps haven't gone above 40ish when testing with Hogwarts (everything Ultra/RTX, 1440p, 60 FPS). 4yr warranty if you can hammer through their garbage website to register it.

My main concern was noise & temps as well and it's luckily exceeded expectations.

Kibner
Oct 21, 2008

Acguy Supremacy

HalloKitty posted:

Yeah, I recall 85 Hz in the CRT days as the point where flickering entirely stopped in my peripheral vision. Under that it was visible. There's a reason Valve decided on 90 for the Vive.

More is surely going to help with stability to some degree, but the diminishing returns beyond that are very steep.

This was also true for me.

CatelynIsAZombie
Nov 16, 2006

I can't wait to bomb DO-DON-GOES!

The Joe Man posted:

My main concern was noise & temps as well and it's luckily exceeded expectations.
Thanks for the info, did you try out the quiet BIOS?

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

HalloKitty posted:

There's a reason Valve decided on 90 for the Vive.

I think they wanted 120 as much as we did, but a) those screens were impossible to get in the quantity and quality needed even if the pricing wasn't prohibitive (and goddamn it was prohibitive), and b) it was hard enough to get game developers to hold a solid 90, and making it 120 was just not feasible at that time.

Quest 3 has a 120 Hz display but by default runs at 90 Hz for content, because asking to cut the frame budget to 8 ms is not going to work for a lot of content.

Kagrenak
Sep 8, 2010

Zero VGS posted:

I'm on Windows 11 and picked a solid black desktop background. If I turn the NVidia contrast back down to the default 50, the black washes out, and pure white windows (even if I use a website to generate #FFFFFF) dim dramatically. This is with the Windows settings panel having HDR enabled and "SDR Content Brightness" set to 100. It just doesn't give a poo poo as far as I can tell.

Did you do the HDR calibration? That's the only thing I can think of. I just tested my slider, and blacks only get raised when I bring it under 50.

FuzzySlippers
Feb 6, 2009

CatelynIsAZombie posted:

Looking at the Gigabyte 4080 SUPER Gaming OC on Tom's Hardware to get an idea of whether or not I wanna pull the trigger on it. It's loud-ish on the stock BIOS but honestly seems like it runs just as cool on quiet.

How is Gigabyte in terms of warranty and service? I'm not expecting much, because I know just about every Taiwan/China-based company is a nightmare to RMA through, but it's worth asking. ASUS, MSI, and Gigabyte aren't really better or worse than each other on that front, right?

Is the Strix model actually bigger? Why is its cooler so much quieter on the 4080 Supers?

I found Gigabyte support dramatically worse than other similar companies', but support among these companies is pretty random, so experiences always vary among people.

FuturePastNow
May 19, 2014


FuzzySlippers posted:

I found Gigabyte support dramatically worse than other similar companies', but support among these companies is pretty random, so experiences always vary among people.

RIP EVGA

The Joe Man
Apr 7, 2007

Flirting With Apathetic Waitresses Since 1984

CatelynIsAZombie posted:

Thanks for the info, did you try out the quiet BIOS?
Yep, I've had it enabled since the beginning. The black switch is microscopically tiny but it's there, to the left of the power port. Switch it to the right.


Zampano
Jun 23, 2010

Shut. The. Fuck. Up.
I'm currently running an MSI 390X that is starting to show its age; I upgraded everything else 2-3 years ago when GPUs were hard to come by at reasonable prices. I'm thinking of upgrading to a 4070 Super or a 4070 Ti Super. Is there a consensus on what makes the most sense to buy for 1440p? I'm currently gaming at 1080p but planning on getting a 1440p monitor soonish.
