repiv
Aug 13, 2009

Twibbit posted:

The switch version has taa by default

yeah it's funny, aside from the modded PC version the switch version has the most stable image by far

https://www.youtube.com/watch?v=u8kvSGUwAyc&t=299s

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

repiv posted:

yeah it's funny, aside from the modded PC version the switch version has the most stable image by far

https://www.youtube.com/watch?v=u8kvSGUwAyc&t=299s

What's the iOS version doing? I've heard really good things about the mobile port being a high-effort one that runs and looks better than PS4 / bone.

repiv
Aug 13, 2009

i didn't know it was on mobile

judging by how stable the videos of it look, and it being ported by feral who also did the switch port, i assume it's using the same TAA

Sininu
Jan 8, 2014

chippy posted:

I've never had onboard sound that didn't have that IC interference really badly, especially when using headphones and a mic. But I've used an external interface for years as I do music production. Is it less of a problem nowadays?

I don't think it's fully solved yet, even on mobos that advertise high-end or "advanced" audio. My Asus ROG Strix B450 is unusable for outputting to my speakers; headphones are fine, though. I had to use an externally powered DAC with a Toslink cable to get rid of all the noise.

Sininu fucked around with this message at 08:42 on Jul 7, 2022

ZombieCrew
Apr 1, 2019

CaptainSarcastic posted:

motherboards with vacuum tubes to run my audio

That sounds reeeally cool though, even if it's just a custom PC design idea and not truly functional. Not steampunk with bulbs and pipes everywhere, but more like a 1960s/'70s head unit for a guitar or bass.

chippy
Aug 16, 2006

OK I DON'T GET IT

Sininu posted:

I don't think it's fully solved yet, even on mobos that advertise high-end or "advanced" audio. My Asus ROG Strix B450 is unusable for outputting to my speakers; headphones are fine, though. I had to use an externally powered DAC with a Toslink cable to get rid of all the noise.

Coincidentally, shortly after I asked on here, a friend messaged me saying he was getting buzzing in his speakers when his GPU was loaded and asking for advice.

Sininu
Jan 8, 2014

chippy posted:

Coincidentally, shortly after I asked on here, a friend messaged me saying he was getting buzzing in his speakers when his GPU was loaded and asking for advice.

Yuuuup, I had the exact same issue. Solved it by using a 20 euro Fiio Taishan DAC.

Sininu fucked around with this message at 09:34 on Jul 7, 2022

Cygni
Nov 12, 2005

raring to post

While there are still some boards out there with issues, the vast majority of problems with audio hissing or interference are not coming purely from the board side, in my experience.

Kivi
Aug 1, 2006
I care

repiv posted:

GPUs only output digital audio so all the quality-affecting analog conversion is happening in the monitor

Maybe your monitor has decent audio but nobody tests that so it's a complete crapshoot

My Dell monitor had shockingly bad audio quality on the headphone output socket. Super tinny and awful. Maybe it was "tuned" for voice / office use?

E: Vacuum tubes on MBs eat an awful lot of space, so they're not as convenient, see https://global.aopen.com/products_detail.aspx?auno=53 :v:

Kivi fucked around with this message at 10:47 on Jul 7, 2022

CoolCab
Apr 17, 2005

glem
no it's just insanely fuckin cheap because no one bothers to test or shop on them. a bit like front panel audio really - if you're the kind of person who it would bother you'd never use it anyway.

Kivi
Aug 1, 2006
I care

CoolCab posted:

no it's just insanely fuckin cheap because no one bothers to test or shop on them. a bit like front panel audio really - if you're the kind of person who it would bother you'd never use it anyway.

They sell speakers for these monitors, so I'd guess it wouldn't be that awful? However, the GPU audio is unusable for me: my GPU passthrough setup has it constantly stuttering. Same with cheap USB DACs; the only ones that work properly are the ones with separate wall warts for power, for some reason.

orcane
Jun 13, 2012

Fun Shoe

Cygni posted:

While there are still some boards out there with issues, the vast majority of problems with audio hissing or interference are not coming purely from the board side, in my experience.

Onboard audio can have the newest chip with the best codecs and all the Japanese gold capacitors, but almost none of them are actually shielded against the interference a 400W GPU and 1 kW PSU can put out.

Hardware uncle Igor did several tests on this, the most recent one was just a few weeks ago. A soundcard (yes really) or an external DAC will massively improve the situation (some DACs will require further steps if their USB power connection is transferring the interference from the GPU/PSU).

CoolCab
Apr 17, 2005

glem
do you guys suspect LHR will come back, or they won't bother? if it does come back will it be better, i wonder.

track day bro!
Feb 17, 2005

#essereFerrari
Grimey Drawer
I run my audio from the GPU's HDMI into my TV, then from the TV's optical out (forget what it's called) into a soundbar. Although I treat my gaming PC like a glorified console anyway.

Kibner
Oct 21, 2008

Acguy Supremacy

CoolCab posted:

do you guys suspect LHR will come back, or they won't bother? if it does come back will it be better, i wonder.

I assume it will be the default, so it won't have any special title.

CoolCab
Apr 17, 2005

glem
well right now it's defeated i mean, but i would hope fixing that is just a matter of issuing new keys or whatever and not having catastrophic leaks again. i do wonder if it will be expanded though. it's funny, but it kind of turned out that they targeted exactly what the problem was with 100% precision. if they wanted to future proof it they could potentially expand it to other coins, or detect mining-like behaviour and be coin agnostic somehow, that would be ideal.

orcane
Jun 13, 2012

Fun Shoe
I don't expect them to. Miners are (preferred) customers driving soaring revenue, and GPUs going to mining farms is only ever addressed once the negative effect on consumers gets overwhelming and risks outweighing the profits from selling in bulk to miners, i.e. usually when mining demand starts to drop anyway.

Both Nvidia and AMD have zero incentive to cripple mining in advance, unless they manage to properly limit consumer cards while providing enough chips that higher-priced cards without LHR etc. become available and worthwhile to miners.

Limiting mining is hard enough while also keeping the longest benchmark bars for games, and the supply issues aren't gone either, given the state the foundry market is in (with a near-monopoly for TSMC).

Cygni
Nov 12, 2005

raring to post

orcane posted:

Onboard audio can have the newest chip with the best codecs and all the Japanese gold capacitors, but almost none of them are actually shielded against the interference a 400W GPU and 1 kW PSU can put out.

Hardware uncle Igor did several tests on this, the most recent one was just a few weeks ago. A soundcard (yes really) or an external DAC will massively improve the situation (some DACs will require further steps if their USB power connection is transferring the interference from the GPU/PSU).

Yeah, I read the piece. I'm not gonna say a lot, cause I think it will piss off a lot of the Real AudioHeads and I'm certainly not one of them, but if you are using high-impedance headphones to play video games or hooking an oscilloscope up to the output of a $100 motherboard, I think you already know you don't want to use the onboard sound regardless of the perceived quality. For the vast majority of users, I am skeptical that an external DAC/amp is really a good use of money. But if the built-in sound is doing something that bothers you, or you have already maxed everything else out on a gaming PC and want to try audio, why not go wild!

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Mods please don't ban me; I don't own crypto, but I do follow some poo poo about it to see what nonsense my fellow man is up to, and this is topically relevant.

Eth was the only really profitable GPU-mined coin. The alternatives were never as good, and in the current environment it's just kind of a joke trying to make money mining anything, hence miners unloading their cards in massive quantities trying to make something back, and crypto firms eating poo poo left and right.

Even if it were still profitable, there's now a high-efficiency ASIC for mining Eth (which has long been the case with Bitcoin, where ASICs drastically outperform GPUs at the same power usage, so GPUs just aren't used for it anymore). It's been behind the scenes for a while but is releasing into the market; it uses a bit under 2000W but offers the hash rate of something like 24 RTX 3080s. And even it is not profitable right now, costing more to run than it generates. There's a "difficulty bomb" and upcoming proof-of-stake changes, starting with Eth, looming to make GPU mining even more unprofitable. Unless something changes drastically I think the era of GPU mining might be wrapping up.
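
As a back-of-the-envelope check on that efficiency gap, with assumed ballpark numbers for the GPU side (roughly 100 MH/s at roughly 230W is a commonly quoted figure for a 3080 on Ethash) and the ASIC numbers taken from above:

code:
# Rough hash-per-watt comparison; every number here is a ballpark assumption
gpu_hashrate_mh = 100.0                   # assumed: one RTX 3080 on Ethash, in MH/s
gpu_power_w = 230.0                       # assumed board power while mining

asic_hashrate_mh = 24 * gpu_hashrate_mh   # "the hash rate of like 24 RTX 3080s"
asic_power_w = 2000.0                     # "a bit under 2000W"

gpu_eff = gpu_hashrate_mh / gpu_power_w       # ~0.43 MH/s per watt
asic_eff = asic_hashrate_mh / asic_power_w    # ~1.2 MH/s per watt
print(f"GPU: {gpu_eff:.2f} MH/s/W, ASIC: {asic_eff:.2f} MH/s/W, "
      f"about {asic_eff / gpu_eff:.1f}x the hash per watt")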

Poor nVidia and AMD may need to actually try to sell the next generation to gamers first, not as money printers but just for entertainment.



On the motherboard audio stuff, I actually recorded four or five tracks last year using motherboard audio as my only interface for guitar, and it worked fine. Not the world's best signal-to-noise ratio, not the world's best anything honestly, but the tracks still turned out. Synths, bass, and percussion elements were all in-the-box, on my now very old Haswell 4770K build at that, so I had to freeze everything to work on anything else and couldn't have most of my master bus processing running at any one time. Basically you only hear what it "really" sounds like on the final render and keep it in your head in the meantime. Constraints aren't ruinous, they're just constraints; it was still a more powerful audio processing setup than shitloads of hits were made on in the '90s and 2000s (though they certainly had better preamps; and, to be clear, I didn't achieve anywhere near the level of their results, I just did my best learning to use my tools). The guitar parts on all the tracks pinned at the top went into an old Z87 Sabertooth motherboard's line in: https://soundcloud.com/user-612970086

This year has been dramatically more fun: I built a 12900K-based PC and got a real audio interface, kitted out for recording guitar through pedals and an amp rather than relying solely on modelers. But I also have my baby boy to take care of in addition to my two older kiddos, and my wife's new job this year doesn't involve any working from home, so I have to be in bed at a decent time to get up and take care of everybody and the baby, and as such I haven't actually released a track done with the new PC yet. Can't buy time!

Agreed fucked around with this message at 17:04 on Jul 7, 2022

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
I strongly suspect the "memory hard" nature of mining can be detected fairly easily at a perfcounter level if they really put their minds to it.
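
To illustrate the idea (a toy sketch, emphatically not how Nvidia's limiter actually works, and real perf counters like DRAM bandwidth vs. SM occupancy would be far more precise than what NVML exposes): assuming the pynvml package and some made-up thresholds, even coarse utilization sampling shows how flat and memory-bound that kind of workload is.

code:
# Toy heuristic only: sample NVML's coarse utilization counters and flag a
# sustained "memory controller pinned" signature, which is roughly what an
# Ethash-style memory-hard workload looks like (games bounce around; miners
# sit flat near 100%). Thresholds below are arbitrary illustrative values.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

SAMPLES = 60            # one-minute observation window, one sample per second
MEM_BUSY_MIN = 90       # % of time the memory controller must be busy

flagged = 0
for _ in range(SAMPLES):
    util = pynvml.nvmlDeviceGetUtilizationRates(handle)
    # util.memory = % of the sample period the memory controller was busy
    if util.memory >= MEM_BUSY_MIN:
        flagged += 1
    time.sleep(1.0)

if flagged >= SAMPLES * 0.9:
    print("sustained memory-bound load: mining-like behaviour?")

pynvml.nvmlShutdown()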

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Didn't they already delay proof-of-stake in 2021? Seems like if they have the ability to vote on it, greed is going to win out each time.

The other points are good, but I wouldn't bank on that one.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

They just did it on the Sepolia test network, within the last 24 hours - we'll see how it shakes out but it does seem to be genuinely looming for this year, at least as of now.

orcane
Jun 13, 2012

Fun Shoe
Eth has been about to go to PoS any day now for how many years?

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Agreed posted:

They just did it on the Sepolia test network, within the last 24 hours - we'll see how it shakes out but it does seem to be genuinely looming for this year, at least as of now.

There's going to be a fork, right? I feel like the miners are going to insist that some form of PoW ethereum stays around, which means we get Ethereum Classic 2.0.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Wouldn't the non-profitability of PoW be a good environment in which to make the switch? I am far from a crypto expert, but if greed is the usual limiter, a time when greed is getting poo poo-hammered in the market seems riper for the switch than a time when the money printers did indeed go brrrrrrr.

Factor in that some of this may be wishful thinking on my part, as I long to return to simpler times for the sake of shiny pixels ofc

chippy
Aug 16, 2006

OK I DON'T GET IT
Trying to decide between a 1440p and a 4K monitor. Is a (10GB) 3080 ok to run stuff at 4K without worrying too much about having to optimize settings etc.? I'm not worried about consistently hitting 144Hz or anything like that, I'm perfectly happy with 60FPS+, but I do generally like to be able to just max out settings and get a smooth experience without too much fiddling. Maybe on High rather than Ultra if necessary. Is 1440p a better choice for that sort of experience with this card, or is 4K going to be ok, letting DLSS do its magic to keep the framerate nice?

Gamewise I'm playing through DOOM Eternal at the moment which I know isn't taxing, planning on getting Cyberpunk 2077 at some point, probably Elden Ring too.

I suppose another consideration might be that I think 27" is about the max I want to go size-wise. I don't have much space on my desk and I sit quite close to the monitor. I'm also a remote-working developer, so this will be the screen I'll be coding on all day too. For that reason I'm inclined towards 4K, but I don't know if it will make much difference at this size. I don't have room for two monitors, so I have my work laptop on a stand for a second screen instead.

Also, does G-Sync actually work well? Don't think it even existed last time I had a new PC, bought a monitor, etc. I absolutely hate screen tearing so it seems like a pretty dope thing to have.

I know this is the GPU thread, not the monitor thread; figured I'd head over there once I've got the GPU performance question answered.

hobbesmaster
Jan 28, 2008

Nothing can play Cyberpunk at 4K native with all settings maxed. Even a 3090 needs some RT effects turned off or DLSS enabled. Given that, I'd be uncertain of a 3080's longevity if you must have 4K performance, but it should be excellent at 1440p for some time to come.

fat bossy gerbil
Jul 1, 2007

I know that nothing is really a good buy right now but is an RX6600 at $280 likely to drop by a significant margin anytime soon? I’m considering building a budget rig and I’d hate to pay nearly $300 now only to find out I could have had it for $200 a couple months down the line. After the first crypto related shortage I just checked the gently caress out of PC gaming when I realized the days of a decent $200 video card were gone and probably weren’t coming back.

Kibner
Oct 21, 2008

Acguy Supremacy

chippy posted:

I suppose another consideration might be that I think 27" is about the max I want to go size-wise. I don't have much space on my desk and I sit quite close to the monitor. I'm also a remote-working developer, so this will be the screen I'll be coding on all day too. For that reason I'm inclined towards 4K, but I don't know if it will make much difference at this size. I don't have room for two monitors, so I have my work laptop on a stand for a second screen instead.

For this situation, it can often be helpful to put a second monitor into portrait mode to save space, but still have very usable screen real estate.

chippy posted:

Also, does G-Sync actually work well? Don't think it even existed last time I had a new PC, bought a monitor, etc. I absolutely hate screen tearing so it seems like a pretty dope thing to have.

Yes, it works very well. There are some settings that you will have to enable to have the best experience, but it works very well.

chippy
Aug 16, 2006

OK I DON'T GET IT

Kibner posted:

For this situation, it can often be helpful to put a second monitor into portrait mode to save space, but still have very usable screen real estate.

Yes, it works very well. There are some settings that you will have to enable to have the best experience, but it works very well.

It's a good suggestion, thank you, but unfortunately also not an option. The main reason being I also have a couple of studio monitors (speakers) on the desk, and any additional screens would block those. Ideally I need a bigger desk or some stands for the speakers, but I'm really pushed for space. I could maybe get a monitor arm, but the screen would probably have to go way off to the side or above my existing screen, neither of which would be great ergonomically.


hobbesmaster posted:

Nothing can play Cyberpunk at 4K native with all settings maxed. Even a 3090 needs some RT effects turned off or DLSS enabled. Given that, I'd be uncertain of a 3080's longevity if you must have 4K performance, but it should be excellent at 1440p for some time to come.

I suppose being able to play Cyberpunk maxed out is not the be-all and end-all if it's a bit of an anomaly, but I guess it doesn't bode well for games coming out over the next few years either. I'm not averse to using DLSS though, if that's all it takes to make it work.

CaptainSarcastic
Jul 6, 2013



chippy posted:

Also, does G-Sync actually work well? Don't think it even existed last time I had a new PC, bought a monitor, etc. I absolutely hate screen tearing so it seems like a pretty dope thing to have.

Kibner posted:

Yes, it works very well. There are some settings that you will have to enable to have the best experience, but it works very well.

Also be aware that G-Sync and Freesync are pretty much interchangeable nowadays, so if you have a newer Nvidia GPU and a newer Freesync monitor then it should work. Official G-Sync monitors tend to cost extra and often have things that could be annoying, like cooling fans built into the monitor.

Going with an Nvidia card does not mean you have to get a G-Sync monitor - Freesync-compatible will achieve basically the same results.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Cyberpunk does look awesome, but compared to other current-gen games that look awesome and take advantage of ray tracing and all that, it's still an anomaly for performance. I think it might not be the world's most optimized engine, any more than it's the world's most bug-free game.

Enos Cabell
Nov 3, 2004


chippy posted:

I suppose being able to play Cyberpunk maxed out is not the be-all and end-all if it's a bit of an anomaly, but I guess it doesn't bode well for games coming out over the next few years either. I'm not averse to using DLSS though, if that's all it takes to make it work.

I play at 1440p with a 3080ti, and there are several games I have to turn down settings on to hit a 90-120fps target.

Dr. Video Games 0031
Jul 17, 2004

Cyberpunk is an anomaly, but it does represent what kind of performance you should expect from "next-gen" AAA gaming experiences. Which I'm sure we'll start getting more of at some point... right?

The 3080 can handle 4K fine with DLSS for now. But, I dunno. Am I alone in feeling like 4K gaming at 27" isn't all that large of an upgrade over 1440p gaming at 27"? Like, it's nicer, sure, but I feel like it's not worth the performance you lose. I'd rather have 1440p 120fps than 4K 60fps at that size, but maybe that just speaks to my personal preference for higher frame rates. For a 32" monitor or bigger, I can see how 4K makes more sense though.
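
Purely in terms of raw pixel throughput (a crude proxy, since per-frame cost doesn't scale perfectly with resolution), the two targets are closer than they sound:

code:
# Pixels pushed per second for the two targets being compared
def pixel_rate(width, height, fps):
    return width * height * fps

qhd_120 = pixel_rate(2560, 1440, 120)   # ~442 million pixels/s
uhd_60 = pixel_rate(3840, 2160, 60)     # ~498 million pixels/s
print(f"4K60 is {uhd_60 / qhd_120:.2f}x the pixel rate of 1440p120")

So on paper 4K60 only pushes about 12% more pixels per second than 1440p120; in practice the gap varies by game, which is why the preference ends up being personal.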

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

Dr. Video Games 0031 posted:

The 3080 can handle 4K fine with DLSS for now. But, I dunno. Am I alone in feeling like 4K gaming at 27" isn't all that large of an upgrade over 1440p gaming at 27"?

I think you're completely right. Personally, the only reason I game at 4K is because I use a 65 inch TV that I'm about 6 feet away from. I would go 1440p without worrying about it on a 27 inch.

We're firmly in the age of diminishing returns.

I switched to 4K 144Hz yesterday with my new QD-OLED, and frankly the difference between 144 and 120 (in the games that even work at such high frame rates) is hard to discern. I can see the 4090 being the last card I purchase for a long time, because we're hitting that wall where bigger numbers are starting not to matter.

Taima fucked around with this message at 19:21 on Jul 7, 2022

CoolCab
Apr 17, 2005

glem
27 inch 1440p supremacy

chippy
Aug 16, 2006

OK I DON'T GET IT
Hmm, starting to lean towards 1440p given these replies. Although part of me feels like that's not going to seem like much of an upgrade from 1920x1200 at 24". The PPI difference isn't huge.
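
Putting rough numbers on that (PPI is just the diagonal pixel count divided by the diagonal size in inches):

code:
import math

def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(f'24" 1920x1200: {ppi(1920, 1200, 24):.0f} PPI')   # ~94
print(f'27" 2560x1440: {ppi(2560, 1440, 27):.0f} PPI')   # ~109
print(f'27" 3840x2160: {ppi(3840, 2160, 27):.0f} PPI')   # ~163

So 1200p at 24" to 1440p at 27" is a real but modest density bump; the big jump is 4K at 27".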

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
You're probably right, that is quite a small monitor. But you'll definitely see the difference for things like productivity/spreadsheet work. 1440p is a larger step up than you probably think.

But yeah man, if you're happy with your current 1200p monitor, you could certainly make the argument that the step up you are describing won't be life changing. Maybe you should save the cash.

If you do want a 4K monitor, I don't think the benefits really start appearing until around 30 inches or larger.

e: For productivity though, PPI is huge. I used to run a 34 inch ultrawide 5K2K monitor, which is very high PPI. That was too small for me (I now run a 40 inch 5K2K monitor instead for that very reason), but if you are coding or in spreadsheets or both all day? PPI can be worth it in a way that it simply isn't if you aren't consistently doing those things.

Taima fucked around with this message at 19:28 on Jul 7, 2022

Hemish
Jan 25, 2005

chippy posted:

Hmm, starting to lean towards 1440p given these replies. Although part of me feels like that's not going to seem like much of an upgrade from 1920x1200 at 24". The PPI difference isn't huge.

I went from 1200p to 1440p and it was a big difference. I have bad eyesight even with glasses, and it took me quite a while to adjust; I had to zoom text and such where I didn't feel the need at all with my old 1200p 24".

I don't really see the difference between 2k and 4k in games but I do notice how smooth it looks when I'm in a game where I can crank the settings and have something close to 144fps because I'm on a 27" 1440p 144hz monitor. I think even if we could do 4k 60fps 100% of the time, I would still prefer 2k 144fps 100% of the time.

Enos Cabell
Nov 3, 2004


Hemish posted:

I think even if we could do 4k 60fps 100% of the time, I would still prefer 2k 144fps 100% of the time.

100% this
