Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
It's another part of the :pcgaming: experience that consoles do better. On PS5 when I save a live clip it immediately lets me choose the length and lets me preview it. It's snappy and straightforward. Sony even added automatic temporary uploads for recent captures.


BurritoJustice
Oct 9, 2012

Rinkles posted:

It's another part of the :pcgaming: experience that consoles do better. On PS5 when I save a live clip it immediately lets me choose the length and lets me preview it. It's snappy and straightforward. Sony even added automatic temporary uploads for recent captures.

The PS5 clip quality is really poor though, I'm guessing it's H.264 using RDNA's notoriously weak hardware encoder.

I was watching back some of my Ragnarok replays and they're extremely blocky.

I do love the convenience though, I screenshare PIP with my SIL and we both play singleplayer story games together/simultaneously.

Dr. Video Games 0031
Jul 17, 2004

Rinkles posted:

I'm getting big frame drops in Shadowplay recordings, that weren't there during gameplay. Turning down the bitrate might have helped a bit, but not entirely. Any ideas? This is a 3060ti.

Not sure what's up here. I have to correct the previous goon and say that nvidia's background recording is about as free as it gets. It takes up virtually no CPU resources since it's all done on the GPU, and it's very light on the GPU. Most benchmarks that test this find an in-game performance drop of a few percent at most, and it's generally designed to record smoothly with little to no fuss. I've never had any choppiness in my recordings with any of my nvidia GPUs (or AMD, for that matter). Try locking your in-game frame rate and recording at that frame rate? (or a number that divides evenly into it.) You could also try recording at 1080p or something if you're currently recording at 1440p.

Alternatively, try switching to something like OBS for your background recording. It's more powerful than GFE, but it takes a bit more setup.
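The "record at the locked rate or a number that divides evenly into it" advice can be sketched quickly; this is just illustrative arithmetic, not anything GFE or OBS exposes:

```python
def recording_rates(game_fps):
    """Frame rates that divide evenly into a locked in-game rate,
    so the recorder captures every Nth frame without judder."""
    return [game_fps // n for n in range(1, game_fps + 1) if game_fps % n == 0]

# For a 60 FPS lock, 60/30/20/15/... are safe; 24 is not, since it
# doesn't divide evenly and would sample frames unevenly.
print(recording_rates(60))
```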

Dr. Video Games 0031 fucked around with this message at 07:28 on Jan 16, 2023

Rinkles
Oct 24, 2010

I already locked the framerate because I'm on a 60Hz display. The only complication I can think of is that I'm using DLDSR (because Just Cause 4's TAA is dreadful). The GPU is still only at 60-70% use, but maybe there's a hitch in the downsampling process. The video still comes out 1080p (tv's native), if you didn't know.

e:does OBS have a similarly light footprint?

Rinkles fucked around with this message at 07:48 on Jan 16, 2023

Dr. Video Games 0031
Jul 17, 2004

Rinkles posted:

e:does OBS have a similarly light footprint?

It can use NVENC, so the encoding part at least should be about as unobtrusive. You can also set different encoding speed settings, unlike GFE, which may lighten (or increase) the load on the GPU. You can also set it to keep the replay buffer in memory if you don't want to be constantly writing to disk. Which, come to think of it, could also be a cause of recording stutters. Especially if you're writing to an HDD that is being accessed by other apps at the same time.

The app itself is fairly lightweight, though I don't know how exactly it compares to the GFE overlay in terms of resource usage.

Dr. Video Games 0031 fucked around with this message at 08:14 on Jan 16, 2023

tehinternet
Feb 14, 2005

Semantically, "you" is both singular and plural, though syntactically it is always plural. It always takes a verb form that originally marked the word as plural.

Also, there is no plural when the context is an argument with an individual rather than a group. Somfin shouldn't put words in my mouth.
To be honest, I haven’t pulled the trigger on a 4090 because I’ve only had EVGA cards since I got back into PC gaming in 2013. Spending $1600 on an unknown for me seems not great tbh

This 2080 Ti is gonna last a while, prob til 5 or 6 series if pricing stays like this and EVGA stays out.

Enos Cabell
Nov 3, 2004


tehinternet posted:

To be honest, I haven’t pulled the trigger on a 4090 because I’ve only had EVGA cards since I got back into PC gaming in 2013. Spending $1600 on an unknown for me seems not great tbh

This 2080 Ti is gonna last a while, prob til 5 or 6 series if pricing stays like this and EVGA stays out.

EVGA was great, and most of my recent cards were EVGA branded, but Gigabyte, Asus and MSI have been making GPUs for longer than EVGA has been a company. Obviously don't buy a 4090 if you can't use the performance, but it's not exactly a gamble buying a non-EVGA card.

Shipon
Nov 7, 2005

Enos Cabell posted:

EVGA was great, and most of my recent cards were EVGA branded, but Gigabyte, Asus and MSI have been making GPUs for longer than EVGA has been a company. Obviously don't buy a 4090 if you can't use the performance, but it's not exactly a gamble buying a non-EVGA card.

EVGA's service was top tier, which was important because their cards did also fail like anyone else's. On the other hand I have heard many horror stories about Gigabyte being "gently caress you" with regard to RMA service

Copper Vein
Mar 14, 2007

...and we liked it that way.
Can I throw a PSU question in here? Since my new GPU was why I bought a new PSU.

I bought a Seasonic Prime TX-1600 to get overhead for my 4090 compared to my old 1000w PSU. I used the 4x8pin to 12VHPWR cable that came with the GPU and four 8 pin PCIe cables from the PSU and it's all fine.

Only now did I go back through the bag of cables that came with the PSU and find that it came with its own 12VHPWR cables that are labeled 600w. However, each 12VHPWR cable only has two PCIe plugs on one end, and the 12VHPWR end only has two sensing pins. Also, the PCIe plugs on the cables are the 8-pin form factor, but only 6 pins are populated.

Would these 12VHPWR cables have been any use to a 4090? I don't know if these cables simply have half the current capacity that the 4090 wants, or if this PSU can supply the same power over half the conductors. I'm guessing it's the former.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Copper Vein posted:

I bought a Seasonic Prime TX-1600 to get overhead for my 4090 compared to my old 1000w PSU.

I hope that someone here has an answer, but I'm just floored that 1600W PSUs exist. Won't that trip a household breaker if you spend a lot of time over 1200W, especially with efficiency losses? 15A breakers are only rated for 12A continuous current, which is 1440W from the outlet. This gaming computer PSU can deliver more wattage than a level 1 EV charger!
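The breaker math in this post is easy to sketch; the 80% figure is the standard continuous-load derating being invoked, and the 90% PSU efficiency is an assumed round number:

```python
def max_continuous_watts(breaker_amps, volts=120.0, continuous_factor=0.8):
    """Continuous-load limit: 80% of the breaker's rated amperage."""
    return breaker_amps * continuous_factor * volts

def wall_draw(dc_load_watts, psu_efficiency=0.9):
    """Approximate AC draw at the outlet for a given DC load,
    assuming a round 90% PSU efficiency."""
    return dc_load_watts / psu_efficiency

print(max_continuous_watts(15))  # 1440.0, the figure in the post
print(wall_draw(1200))           # a 1200 W DC load pulls ~1333 W at the wall
```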

njsykora
Jan 23, 2012

Robots confuse squirrels.


If they’re labelled 600W it’s probably the latter. But I’d check the documentation to make sure. The only reason I can think of to pack that cable in with a high power PSU would be specifically for a 4090 though.

KS
Jun 10, 2003
Outrageous Lumpwad
https://www.reddit.com/r/nvidia/comments/y1gw03/12vhpwr_cables_new_seasonic_power_supplies_and/

I had the same question and found this that purports to be an official answer.

power crystals
Jun 6, 2007

Who wants a belly rub??

Twerk from Home posted:

I hope that someone here has an answer, but I'm just floored that 1600W PSUs exist. Won't that trip a household breaker if you spend a lot of time over 1200W, especially with efficiency losses? 15A breakers are only rated for 12A continuous current, which is 1440W from the outlet. This gaming computer PSU can deliver more wattage than a level 1 EV charger!

In the US at least 20A breakers exist and I think are reasonably common in new construction even. Though then you'd need to trust that the builders/electricians used 20A rated wiring in the walls, which is an even worse problem...

repiv
Aug 13, 2009

Twerk from Home posted:

I hope that someone here has an answer, but I'm just floored that 1600W PSUs exist. Won't that trip a household breaker if you spend a lot of time over 1200W, especially with efficiency losses? 15A breakers are only rated for 12A continuous current, which is 1440W from the outlet. This gaming computer PSU can deliver more wattage than a level 1 EV charger!

the absolute highest wattage PSUs on the market aren't sold in the US, or are de-rated to a lower wattage when used at US voltage, since as you say they're brushing up against the amperage limits of standard wiring/breakers there

silverstone makes a 2050W PSU but it's only rated for that in 230V markets, in 110V markets it's de-rated to 1650W

KS posted:

https://www.reddit.com/r/nvidia/comments/y1gw03/12vhpwr_cables_new_seasonic_power_supplies_and/

I had the same question and found this that purports to be an official answer.

the reason for the discrepancy is the one-size-fits-all dongle that comes with the card has to assume the user has a garbo PSU where each 8pin can only supply 150W (the bare minimum the spec requires) and so they need four of them to make up 600W

with the official cable seasonic controls both ends of the connection, so they can know for a fact that their PSU-side connectors are good for at least 300W and they only need two of them to drive a 12VHPWR to 600W
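The connector-count reasoning above is just a ceiling division; the 150 W and 300 W per-plug figures are the ones claimed in this post, not numbers from any spec document I've verified:

```python
from math import ceil

def connectors_needed(target_watts, per_connector_watts):
    """How many PSU-side 8-pin plugs are needed to feed a 12VHPWR cable."""
    return ceil(target_watts / per_connector_watts)

# The one-size-fits-all adapter assumes the bare-minimum 150 W per 8-pin:
print(connectors_needed(600, 150))  # 4 plugs on the bundled dongle
# Seasonic rates its own PSU-side connectors at 300 W or more:
print(connectors_needed(600, 300))  # 2 plugs on the native cable
```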

repiv fucked around with this message at 16:42 on Jan 16, 2023

mobby_6kl
Aug 9, 2009

by Fluffdaddy

Twerk from Home posted:

I hope that someone here has an answer, but I'm just floored that 1600W PSUs exist. Won't that trip a household breaker if you spend a lot of time over 1200W, especially with efficiency losses? 15A breakers are only rated for 12A continuous current, which is 1440W from the outlet. This gaming computer PSU can deliver more wattage than a level 1 EV charger!

It's pretty crazy but... there's no way a gaming PC pulls more than 1000W, even with a 4090. So I don't think it'll be an actual problem.

kliras
Mar 27, 2021
speaking of gpu's and power, are there any risks to look out for with evga's powerlink? think i might get one just as a fiddle project

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
I can't decide where the best place is to post this question but I think here is good:

I use a S95B QD OLED as a gaming monitor, and it's fine, but it does that OLED thing (I forget what it's called) maybe pixel shift? Where it shifts the entire screen around a little bit all the time?

It's unobtrusive with media, but a Windows Desktop gets all hosed up in the sense that often times part of the taskbar is shuffled into a spot where it can't be seen, etc.

I'm just wondering, for people who game and use an OLED as their primary screen for other purposes: can I turn off pixel shift or does it make a huge difference to the longevity of the panel?

I knew burn in would be a possibility on this panel, so I did get a burn in warranty; if the worst happens, I'm covered. But I don't want to go SEEKING burn in either, so I just want to make sure I'm not totally loving my display up by turning off pixel shift.

e: by the way if anyone is interested in using a QD OLED to game, Samsung runs an extremely awesome program for educators and people in front line professions where you can get their panels cheaply. You can get a 55 inch S95B for just over a grand. And in the past (a deal that would probably come back) you could get a 65 inch for $1400.

It's the best gaming display right now, so if you already have a 4090, it's worth considering. And QD OLED panel tech is more resistant to burn in than a regular OLED, for technical reasons that aren't worth getting into.

Taima fucked around with this message at 18:00 on Jan 16, 2023

MarcusSA
Sep 23, 2007

So I just ordered an LG 42 and everyone said to just turn off pixel shift and then set the taskbar to auto hide.

I don’t have mine in yet but I’m going to give the pixel shift a try and see how it goes but I have to imagine I’ll be turning it off.

Burn in is more of a risk but as you said you bought the warranty so :shrug:

Out of curiosity what’s the link for the Samsung deal? I got my 42 for $650 so I can’t imagine it’s a much better deal.

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

MarcusSA posted:

So I just ordered an LG 42 and everyone said to just turn off pixel shift and then set the taskbar to auto hide.

I don’t have mine in yet but I’m going to give the pixel shift a try and see how it goes but I have to imagine I’ll be turning it off.

Burn in is more of a risk but as you said you bought the warranty so :shrug:

Out of curiosity what’s the link for the Samsung deal? I got my 42 for $650 so I can’t imagine it’s a much better deal.

Where did you get a C2 for $650? That's an insane price

MarcusSA
Sep 23, 2007

change my name posted:

Where did you get a C2 for $650? That's an insane price

It was from the LG refurb store on eBay.

According to slick deals though it wasn’t an amazing price because about a month ago it was $699 new so :shrug:

Even at $699 it’s a hell of deal for an oled monitor.

Enos Cabell
Nov 3, 2004


I've had my C2 for 3-4 weeks now, and pixel shift hasn't really bugged me.

CatHorse
Jan 5, 2008

Taima posted:

I use a S95B QD OLED as a gaming monitor, and it's fine, but it does that OLED thing (I forget what it's called) maybe pixel shift? Where it shifts the entire screen around a little bit all the time?

It's unobtrusive with media, but a Windows Desktop gets all hosed up in the sense that often times part of the taskbar is shuffled into a spot where it can't be seen, etc.


Make sure you are using pc mode on the tv and check in gpu driver software if you have the correct resolution and not using some over/underscan.

Taima
Dec 31, 2006


MarcusSA posted:

So I just ordered an LG 42 and everyone said to just turn off pixel shift and then set the taskbar to auto hide.

I don’t have mine in yet but I’m going to give the pixel shift a try and see how it goes but I have to imagine I’ll be turning it off.

Burn in is more of a risk but as you said you bought the warranty so :shrug:

Out of curiosity what’s the link for the Samsung deal? I got my 42 for $650 so I can’t imagine it’s a much better deal.

Thank you for your thoughts friend.

Here is the deal:
https://slickdeals.net/f/16340866-s...teSearchV2Algo1

QD OLED is really loving good and I highly recommend it. $650 for a C2 is pretty dope though.

MikusR posted:

Make sure you are using pc mode on the tv and check in gpu driver software if you have the correct resolution and not using some over/underscan.

I think I do? Pixel shift just shifts the whole image around, so my understanding is that it has to be disabled to fix this, and that it's separate from over/underscan and things of that nature.

Hey by the way does anyone know what Change ECC State is in the Nvidia Control panel? Should I enable it?

repiv
Aug 13, 2009

ECC is the error correcting memory mode, it protects against errant bit flips at the expense of performance

if you're only gaming you'll want to leave it disabled
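You can check which mode the card is in with nvidia-smi's `--query-gpu=ecc.mode.current` field; this sketch wraps that call, and the canned `"Disabled"` string in the example is illustrative, not output captured from a real card:

```python
import subprocess

def current_ecc_mode(sample_output=None):
    """Return the GPU's current ECC mode via nvidia-smi's CSV query.
    Pass sample_output to parse a canned string instead of running the tool."""
    if sample_output is None:
        sample_output = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=ecc.mode.current",
             "--format=csv,noheader"], text=True)
    return sample_output.strip()

# Gaming cards typically report Disabled (or N/A if ECC is unsupported):
print(current_ecc_mode(sample_output="Disabled\n"))
```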

Shipon
Nov 7, 2005
Just saw that the Gigabyte Gaming OC 4090 is in stock again at Newegg. I was able to order one on Friday and they just shipped it out today so I should be getting it this week.

MarcusSA
Sep 23, 2007

Shipon posted:

Just saw that the Gigabyte Gaming OC 4090 is in stock again at Newegg. I was able to order one on Friday and they just shipped it out today so I should be getting it this week.

It’s a chonky boi

Also it’s pretty drat quiet, my new system is way quieter and it’s awesome.

Lord Stimperor
Jun 13, 2018

I'm a lovable meme.

MarcusSA posted:

Man the 4090 is ridiculous.

I just finished my Lancool III build and this is a big case lol




I’m not particularly happy with that video card cable there but there’s not much I can do about that.


With that ridiculous CPU cooler it looks almost proportional again


What a neat little SFF system and a cute little 10 inch screen

tehinternet
Feb 14, 2005


Enos Cabell posted:

EVGA was great, and most of my recent cards were EVGA branded, but Gigabyte, Asus and MSI have been making GPUs for longer than EVGA has been a company. Obviously don't buy a 4090 if you can't use the performance, but it's not exactly a gamble buying a non-EVGA card.

Oh, it’s not that their failure rate was so much better, it’s that if I’m spending $1600 on a card, I want service to match so I’m not out $1600 or dealing with “lol sux 2 b u” service when I get an unlucky failure at that price point

strange feelings re Daisy
Aug 2, 2000

EVGA was also great for selling cards at MSRP on a waiting list. Instead of refreshing a bunch of sites with scalped prices 10 times a day and following a discord server I just signed up for the EVGA wait list and got a card that way as cheap as possible.

Animal
Apr 8, 2003

strange feelings re Daisy posted:

EVGA was also great for selling cards at MSRP on a waiting list. Instead of refreshing a bunch of sites with scalped prices 10 times a day and following a discord server I just signed up for the EVGA wait list and got a card that way as cheap as possible.

If EVGA were still around I'd buy a new 4090. As things are now, I'll stretch my 3080 and leisurely wait for FE 4090's to be available used at a discount.

8-bit Miniboss
May 24, 2005

CORPO COPS CAME FOR MY :filez:
12VHPWR cable installed. Looks way better:



Before with the adapter for reference:

MarcusSA
Sep 23, 2007

8-bit Miniboss posted:

12VHPWR cable installed. Looks way better:



Before with the adapter for reference:



That is significantly better. I’m definitely not happy with the 4 cable setup.

Lamquin
Aug 11, 2007
Received my Zotac 4070 TI in the mail, and while the installation wasn't painful (Same size and power cables into an adapter as my old 1080), I really wasn't prepared for the coil whine this thing has at higher frame rates. Yikes. It's easily solved by capping the frame rate (so 100 FPS instead of 120), but hoooly moly it's loud at full tilt. Any tricks on how to counteract it?

Other than that, so far so good! It's nice to finally not pull settings down to medium/low. :shobon:

njsykora
Jan 23, 2012



Which version Zotac card? I have their base Trinity card and haven't noticed any whine even when running Portal RTX as high as possible for shits and giggles. Though granted I haven't used it for very long since I shoved it in my TV PC briefly to try it for a day before putting it back in the box in anticipation of the rest of my new build parts that were supposed to be here on Saturday and were delayed again this morning. So I have a case, SSD, GPU and CPU on my desk taunting me.

Lamquin
Aug 11, 2007

njsykora posted:

Which version Zotac card? I have their base Trinity card and haven't noticed any whine even when running Portal RTX as high as possible for shits and giggles.

The baseline (cheapest) variant, as apparently the performance gains were so low they didn't justify the extra cost one bit.
Luckily, my coil whine was easily resolved; I switched out from using the PSU PCI-E 2x8 pin cable (Y) to connecting a second 8-pin to the PSU and running them separately. I don't get how electricity works and why it made a difference, but hey, it made it much more bearable. I can still hear it if I listen for it with the game muted, but the fans and game audio drown it out. Huzzah!

Lamquin fucked around with this message at 08:56 on Jan 17, 2023

ughhhh
Oct 17, 2012

Give undervolting or a power limit a try if it's really bothering you. I have the same card and was gonna play around with all those settings next weekend, but I had done that with my 2070 Super for both temperature and noise reasons without any problems.

pairofdimes
May 20, 2001

blehhh
What tool do people typically use to undervolt a GPU? I got an MSI Gaming X Trio 4090 today and the card starts making a loud buzzing noise once it draws close to 200W and it only gets worse above that. It's loud enough I can hear it easily over the fans after they spool up to deal with the heat.

SuperTeeJay
Jun 14, 2015

Coil whine (the buzzing) can often be mitigated by capping frames per second through the Nvidia control panel. Try setting it to 2-5 frames lower than your monitor’s refresh rate.

MSI Afterburner is the most popular tool for changing power settings, but I doubt undervolting the GPU would help and you probably don’t want to power limit the card to 200W or so as you’d be giving up a lot of performance.
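The frame-cap rule of thumb above is trivial arithmetic, but as a sketch (the 3-frame margin here is just one value from the suggested 2-5 range):

```python
def fps_cap(refresh_hz, margin=3):
    """Suggested frame cap a few FPS under the display's refresh rate,
    which often eases coil whine by limiting peak frame throughput."""
    return refresh_hz - margin

print(fps_cap(144))  # cap a 144 Hz display at 141 FPS
print(fps_cap(60))   # cap a 60 Hz display at 57 FPS
```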

Rinkles
Oct 24, 2010


SuperTeeJay posted:

MSI Afterburner is the most popular tool for changing power settings, but I doubt undervolting the GPU would help and you probably don’t want to power limit the card to 200W or so as you’d be giving up a lot of performance.

Well, it helped quiet my 3060ti down.


gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
shouldn't we be looking for an alternative to MSI Afterburner?
