BurritoJustice
Oct 9, 2012

repiv posted:

yes, it's been the norm on consoles since the PS4Pro/XB1X generation, as those systems weren't really capable of driving native 4K, and that tech is filtering its way onto PC now

speaking of which, spiderman PC offers FSR2, DLSS and insomniac's own TAAU that they use on the playstation, i wonder how that stacks up to the usual PC scalers

I fiddled with it a bit. TAAU has a greater-than-margin-of-error (~10%) performance advantage over DLSS and FSR2, but it looks significantly worse than both.

Performance:

TAAU > DLSS > FSR

Image:

DLSS > FSR >> TAAU

I wouldn't really ever use it, given that FSR2 should work on anything that TAAU does for Spiderman and the performance difference shouldn't be a dealbreaker. I'm using DLAA personally because I end up single-core CPU bottlenecked most of the time when DXR is enabled (3080 10GB); raytracing seems to add a massive load to one core, which caps performance on my 9900K. It's slightly oversharpened but otherwise looks fantastic.

tehinternet
Feb 14, 2005

Semantically, "you" is both singular and plural, though syntactically it is always plural. It always takes a verb form that originally marked the word as plural.

Also, there is no plural when the context is an argument with an individual rather than a group. Somfin shouldn't put words in my mouth.

teagone posted:

I've always used my PCs primarily for gaming, and I would likely see no difference in gaming quality/experience between 4K and 1080p on a 24-25" panel at the distance I sit from the monitor. I've also always skewed towards mainstream GPUs, and the models in that price bracket ($200-$250ish) typically haven't been able to push past 200 FPS at 4K resolution.

Ohhhhhhhh okay, yeah, if you're at 24" then 4K makes zero sense. I didn't factor in PPI or screen size.

DoombatINC
Apr 20, 2003

Here's the thing, I'm a feminist.

teagone posted:

Yeah, if I ever moved up beyond 24 or 25" displays, I'd go for 27" 1440p at the most. But my old XG2560 (240Hz TN G-Sync panel) and my newer XG2405 (FreeSync Premium panel that I also have an Xbox Series S hooked up to) are pretty much perfect for me.

Reason I want OLED is for better colors and eliminating backlight bleed. I care more about higher refresh rate than I do higher resolution.

This is where I'm at, too - I'm happy with 1080p because I'm happy with ~24 inch monitors, and I'd be more willing to spend a couple extra bucks on a nicer quality / higher refresh rate / lower latency monitor than on a bigger one with a higher resolution.

repiv
Aug 13, 2009

BurritoJustice posted:

I fiddled with it a bit. TAAU has a greater-than-margin-of-error (~10%) performance advantage over DLSS and FSR2, but it looks significantly worse than both.

Performance:

TAAU > DLSS > FSR

Image:

DLSS > FSR >> TAAU

I wouldn't really ever use it, given that FSR2 should work on anything that TAAU does for Spiderman and the performance difference shouldn't be a dealbreaker. I'm using DLAA personally because I end up single-core CPU bottlenecked most of the time when DXR is enabled (3080 10GB); raytracing seems to add a massive load to one core, which caps performance on my 9900K. It's slightly oversharpened but otherwise looks fantastic.

i suppose it makes sense that they'd prioritize performance over image quality when they were developing their own TAAU for the dinky little PS4Pro GPU

Indiana_Krom
Jun 18, 2007
Net Slacker
Having used a 24" 1080p/240 Hz TN monitor before, and now using a 27" 1440p/240 Hz IPS monitor, I honestly think the 24"/1080p setup was better. 27"/1440p is almost too big, the higher pixel density isn't particularly useful and it is much harder to keep the frame rates above 100. I was using 150% scaling at 1080p and I'm still using it at 1440p so I have slightly more usable desktop space, but otherwise there is not much benefit. The 27" monitor has more/deeper colors than the old TN (which itself had a very respectable ~98% sRGB with a delta E in the low to mid 1 range). This monitor has more/brighter/deeper colors but it is actually LESS accurate because it is running like ~130% sRGB but you get used to it. The biggest benefit is less color/contrast shifting at angles thanks to the IPS panel, but it came at a cost of uniformity and some glow. The glow is easy to mitigate; nobody should ever use a LCD of any type in a dark room but the uniformity is a bit harder to deal with, especially on an edge lit display like this.

tehinternet
Feb 14, 2005

I got a Korean special 1440p 27” 60hz monitor back in 2013 and haven’t looked back.

I use 24” 1440p monitors at work and yeah, the sharpness is nice, but the selling point IMO is how much stuff you can fit on the screen and still have it legible.

Switching to 32” 4K has been a huge quality of life improvement even from 1440p. 4K being a multiple of 1080p means that it plays nice with the limited output of screen sharing programs like Teams, Zoom, etc. since if you want to share something in 1080p, you only need to share a quarter of your screen.

It’s a pet peeve — and a negative of higher resolution screens — that streaming is limited to 1080p in those programs, though from a datacenter perspective I can see the benefit of the limit. It just makes collaboration a little more difficult that you can’t share at native resolution without losing clarity. Lotta window resizing/rescaling.

I agree though that we’re rapidly approaching the point where a higher resolution monitor won’t make sense. We’re already almost to the point that you can’t see pixels at 4K on 32”. I’d imagine you’d be hard pressed to see them on a 27” monitor. And at that point, what’s the point of going higher?
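
If anyone wants to sanity-check that, pixel density is a one-liner. A quick sketch using the standard diagonal formula:

code:
import math

# PPI: diagonal resolution in pixels / diagonal size in inches
def ppi(w_px, h_px, diag_in):
    return math.hypot(w_px, h_px) / diag_in

print(round(ppi(3840, 2160, 32)))  # ~138 PPI: 4K at 32"
print(round(ppi(3840, 2160, 27)))  # ~163 PPI: 4K at 27"
print(round(ppi(2560, 1440, 27)))  # ~109 PPI: 1440p at 27", for reference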

powderific
May 13, 2004

Grimey Drawer
I brought my 60Hz 4K 27” to the office and replaced it with a 120Hz 1440p 27” because it was cheaper and I figured it’d be better for games, but I can really, really tell the difference in non-game stuff, and it bugs me. A month in, I still haven’t had a chance to game on it, so maybe the high refresh rate and lighter GPU load will win me over, but right now I wish I had just gotten another 60Hz 4K. OTOH, my 24” 1920x1200 screen doesn’t bother me at all, even though it’s slightly lower PPI.

Cygni
Nov 12, 2005

raring to post

https://twitter.com/videocardz/status/1558398025600565248?s=21&t=-cmkppzCjdvBA-G5SBSRmg

That’s it. It’s time to give up on making architecture “code names” that really only get used in the press. I really don’t want to be overheating and cranking on my Pink Sardine, nor overworking my tired Hotpink Blowfish.

njsykora
Jan 23, 2012

Robots confuse squirrels.


BRB maintaining my Wheat NAS.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
GPU Megat[H]read - Lil Wheat Nas X

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Hotpink and the Bonefish.

wargames
Mar 16, 2008

official yospos cat censor

Cygni posted:

https://twitter.com/videocardz/status/1558398025600565248?s=21&t=-cmkppzCjdvBA-G5SBSRmg

That’s it. It’s time to give up on making architecture “code names” that really only get used in the press. I really don’t want to be overheating and cranking on my Pink Sardine, nor overworking my tired Hotpink Blowfish.

these are so much better than what NVidia is doing.

Cream-of-Plenty
Apr 21, 2010

"The world is a hellish place, and bad writing is destroying the quality of our suffering."
You open the door to my office. The wall of heat from my Pink Sardine hits you head-on; it is stifling.

tehinternet
Feb 14, 2005


wargames posted:

these are so much better than what NVidia is doing.

“Plum Bonito” and “Wheat Nas” are better than the names of notable scientists/computer people?

I mean, I guess we are talking about these stupid rear end names so there’s that I guess?

hobbesmaster
Jan 28, 2008

Cygni posted:

https://twitter.com/videocardz/status/1558398025600565248?s=21&t=-cmkppzCjdvBA-G5SBSRmg

That’s it. It’s time to give up on making architecture “code names” that really only get used in the press. I really don’t want to be overheating and cranking on my Pink Sardine, nor overworking my tired Hotpink Blowfish.

The problem is that engineers need to be working on stuff for years before marketing even needs to look at a product. The product may even be cancelled, in which case putting a real name together would have been a waste of time. So engineering has to call the thing they’re working on something. Product management and engineering will then use that name everywhere, from MRDs to code to sprint planning. Then, when it can run or they need a partner’s help on something, application engineering (or whatever the customer-facing engineers are called) needs to call it something too. Now the name is outside the company, and that partner will probably need to call the project something as well, and so forth.

Now that more companies are involved, the probability of leaks increases no matter how tight the security is. The enthusiast press learns the code name and starts asking everyone they can about it. Then readers pick up on it and show off how knowledgeable they are about the latest tech by using the engineering code names.

Now your codenames are everywhere and you may as well just make it official.

njsykora
Jan 23, 2012

tehinternet posted:

“Plum Bonito” and “Wheat Nas” are better than the names of notable scientists/computer people?

I mean, I guess we are talking about these stupid rear end names so there’s that I guess?

I can hear the name of some old guy anywhere, though; I can't say I'd heard the combination of words that is "Hotpink Bonefish" before today.

repiv
Aug 13, 2009

hobbesmaster posted:

Now that more companies are involved, the probability of leaks increases no matter how tight the security is. The enthusiast press learns the code name and starts asking everyone they can about it. Then readers pick up on it and show off how knowledgeable they are about the latest tech by using the engineering code names.

Now your codenames are everywhere and you may as well just make it official.

that's what happened with nvidia, they've been doing the scientist codenames almost since the very beginning but they only started leaning into them as branding once they started leaking into public discussions

the early nvidia codenames you may not have heard of were Fahrenheit, Celsius, Kelvin, Rankine, Curie and Tesla, before Fermi, Kepler, Maxwell, etc

DoombatINC
Apr 20, 2003

tehinternet posted:

“Plum Bonito” and “Wheat Nas” are better than the names of notable scientists/computer people?

I mean, I guess we are talking about these stupid rear end names so there’s that I guess?

I would take an RNG spitting out a random assortment of color + fish combinations over any corporate codenames feigning meaning or pathos, absolutely and without question
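
That RNG practically writes itself. A toy sketch, with word lists invented for illustration rather than taken from anyone's actual generator:

code:
import random

# Toy "color + fish" codename generator, in the spirit of Pink Sardine
# and Hotpink Bonefish (word lists are made up for this example)
COLORS = ["Pink", "Hotpink", "Plum", "Wheat", "Teal", "Maroon"]
FISH = ["Sardine", "Blowfish", "Bonito", "Bonefish", "Grouper", "Mackerel"]

def codename():
    return f"{random.choice(COLORS)} {random.choice(FISH)}"

print(codename())  # e.g. "Teal Mackerel"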

Cygni
Nov 12, 2005

At least from what I've heard lately, most companies don't actually use the codenames we hear in public internally anymore; they're mostly made up for the press (or to purposefully "leak" :ssh:). Someone more connected probably knows more about that than I do, though. I doubt anyone in the lunchroom is talking about their work on Hotpink Blowfish.

Shipon
Nov 7, 2005

tehinternet posted:

I don’t really get this — how can you not care for a certain resolution? It seems immaterial beyond higher is better, but higher isn’t better for you?

Is it because it’s cost prohibitive or is it a nostalgia thing or something else? I could see wanting to play on old tech for older games, that could make sense.

I actually can't tell the difference between 1440p and 4k on a computer monitor (my 32" 4K monitor vs my 34" 1440p ultrawide) when it comes to gaming, but oddly enough I can when it comes to browsing and doing productivity work.

GuyonthecoucH
Apr 10, 2004

Integer scaling old FPS games to get the sharp pixels of my dreams is magic. I caved and bought an FE 3060 Ti and this has been my favorite feature. It works much better than the Lossless Scaling app on Steam. I'm on a 24" 1440p 165Hz Dell monitor, and I'm now torn between the ViewSonic blurbuster monitor and something 32" 4K for work, with integer scaling in play to get better frame rates when necessary for games.
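
For anyone curious what the driver is conceptually doing there: integer scaling is just nearest-neighbour duplication at a whole-number factor. A rough sketch of the idea, not Nvidia's actual implementation:

code:
# Pick the largest whole-number factor that fits the panel, then
# duplicate every source pixel into an s x s block
def integer_scale_factor(src_w, src_h, dst_w, dst_h):
    return min(dst_w // src_w, dst_h // src_h)

def integer_upscale(pixels, src_w, src_h, s):
    """pixels: row-major list of length src_w * src_h."""
    out = []
    for y in range(src_h):
        row = []
        for x in range(src_w):
            row.extend([pixels[y * src_w + x]] * s)  # widen each pixel
        out.extend(row * s)                          # repeat the row s times
    return out

# e.g. 640x480 content on a 2560x1440 panel scales cleanly at 3x (1920x1440)
print(integer_scale_factor(640, 480, 2560, 1440))  # 3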

LRADIKAL
Jun 10, 2001

Fun Shoe
Maybe we need to disclose not only distance and size of display, but also vision quality at that range.
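
Half-joking aside, there is a single number that folds all three together: pixels per degree of visual angle, where ~60 PPD is the usual 20/20 rule of thumb (one pixel per arcminute). A sketch, with that threshold assumed rather than gospel:

code:
import math

# Pixels per degree: combines panel size, resolution and viewing distance
def pixels_per_degree(w_px, h_px, diag_in, distance_in):
    ppi = math.hypot(w_px, h_px) / diag_in
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return ppi * inches_per_degree

print(round(pixels_per_degree(3840, 2160, 32, 24)))  # ~58 PPD: 4K 32" at 24"
print(round(pixels_per_degree(1920, 1080, 24, 24)))  # ~38 PPD: 1080p 24" at 24"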

Dr. Video Games 0031
Jul 17, 2004

BurritoJustice posted:

I fiddled with it a bit. TAAU has a greater-than-margin-of-error (~10%) performance advantage over DLSS and FSR2, but it looks significantly worse than both.

Performance:

TAAU > DLSS > FSR

Image:

DLSS > FSR >> TAAU

I wouldn't really ever use it, given that FSR2 should work on anything that TAAU does for Spiderman and the performance difference shouldn't be a dealbreaker. I'm using DLAA personally because I end up single-core CPU bottlenecked most of the time when DXR is enabled (3080 10GB); raytracing seems to add a massive load to one core, which caps performance on my 9900K. It's slightly oversharpened but otherwise looks fantastic.

Digital Foundry's analysis of these techniques in spiderman is coming later, but they briefly showed one comparison in the video they posted yesterday, and yeah, their TAAU solution ("IGTI") was noticeably blurrier than DLSS and FSR 2.0. It looked more like FSR 1.0. FSR 2.0 and DLSS looked almost identical except there was a spot in DF's comparison that flickered distractingly with FSR 2.0 and IGTI but not with DLSS. Seen at 50 seconds in:

https://www.youtube.com/watch?v=xI2VUQsPqJo&t=49s

repiv posted:

i suppose it makes sense that they'd prioritize performance over image quality when they were developing their own TAAU for the dinky little PS4Pro GPU

It may be faster at the same source resolution, but I suspect that FSR 2.0 and DLSS would give more performance when normalizing for image quality and look better when normalizing for performance, so I don't think IGTI has any real advantage.

MarcusSA
Sep 23, 2007

LRADIKAL posted:

Maybe we need to disclose not only distance and size of display, but also vision quality at that range.

Yeah I’m gonna start needing to see some prescriptions here.

Listerine
Jan 5, 2005

Exquisite Corpse

Dr. Video Games 0031 posted:

I figured it would be something like that, and I'm glad it wasn't a sneaky crypto miner.

I still think 82C for 148W on a 3060 Ti Founders edition is kind of alarming, but maybe F@H was hammering it in a way that heats up the GPU more? I'm looking at that case, and it seems like the GPUs should at least have plenty of room to breathe. That's also a lot of fans, though the positioning of them is a little odd and not conducive to great airflow (nothing you can do about it, that's just how the case is built). But if everything works well during the normal, intended usage of those GPUs, then I guess I wouldn't worry about it.

I'm also glad it was something relatively benign, but man, I have no idea what F@H was doing; I did a few render tests with subsurface scattering settings cranked up and neither card got as hot as the 3060 Ti did yesterday.

These cards are dynamite though, rendering in 1 minute what used to take me 9. I might regret it when the 4000 series drops, but I think I'd rather have the bird in the hand than gamble on being able to get the new cards at launch, after being stuck using a 780 in my rig for the past two years.

Shipon
Nov 7, 2005
So something odd started happening today. I have a C2 OLED connected to my PC and play Flight Sim on it with HDR enabled; it worked fine until today, when I started getting random flickering. I first saw it happen in Flight Sim, but I just launched Dyson Sphere Program and saw similar behavior as well. Turning off HDR seems to get rid of the issue - is this just another example of HDR being broken on PC, or could it be something like the cable going bad?

https://i.imgur.com/9fOWcvQ.mp4

kliras
Mar 27, 2021
what graphics card is it? i randomly read this the other day about amd cards doing some weird stuff with hdr that you need to account for

https://old.reddit.com/r/OLED_Gaming/comments/wlfjkx/psa_in_the_latest_amd_drivers_10bit_pixel_format/

repiv
Aug 13, 2009

it could be the cable, enabling HDR bumps the required bandwidth up by 25% so it might be pushing it over the point where the signal becomes flaky
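
the 25% is just the pixel format: HDR typically means 10 bits per channel instead of 8, and 30/24 = 1.25. rough uncompressed-signal math, ignoring blanking intervals and DSC:

code:
# uncompressed bandwidth: pixels/sec * 3 channels * bits per channel
# 10 bpc (HDR) vs 8 bpc (SDR) is 30 vs 24 bits per pixel = exactly +25%
def gbps(w, h, fps, bits_per_channel):
    return w * h * fps * 3 * bits_per_channel / 1e9

sdr = gbps(3840, 2160, 120, 8)   # ~23.9 Gbps
hdr = gbps(3840, 2160, 120, 10)  # ~29.9 Gbps
print(f"{sdr:.1f} -> {hdr:.1f} Gbps ({hdr / sdr - 1:+.0%})")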

tehinternet
Feb 14, 2005


Shipon posted:

I actually can't tell the difference between 1440p and 4k on a computer monitor (my 32" 4K monitor vs my 34" 1440p ultrawide) when it comes to gaming, but oddly enough I can when it comes to browsing and doing productivity work.

I have the same setup and find that the only difference I really 100% notice is in FPS games where long distance shooting is a thing. Having that extra resolution lets you see people more clearly.

But then you’re giving up the aspect ratio benefit of the ultrawide. It’s a tough call for me. UW def still shines in strategy games too.

DoombatINC posted:

I would take an RNG spitting out a random assortment of color + fish combinations over any corporate codenames feigning meaning or pathos, absolutely and without question

Oh, I don’t really read into the pathos, as they’re still a corporation. I think having a consistent naming convention is better than a meme-generated one. Admittedly, I’ve got ZANY IRONY burnout, having been on SA for almost 20 years.

Also, get off my lawn.

Shipon
Nov 7, 2005

repiv posted:

it could be the cable, enabling HDR bumps the required bandwidth up by 25% so it might be pushing it over the point where the signal becomes flaky
I was sort of thinking that might be the case too, but I bought the C2 and the cable a month ago; it would be odd for a cable to crap out that early, no?

kliras
Mar 27, 2021
it wouldn't crap out as much as run into a new bottleneck. with tv and movies, you have 12-bit dolby vision, but usually at 24p, whereas with videogames, your framerate can go up to 120

Dr. Video Games 0031
Jul 17, 2004

Shipon posted:

So something odd started happening today. I have a C2 OLED connected to my PC and play Flight Sim on it with HDR enabled; it worked fine until today, when I started getting random flickering. I first saw it happen in Flight Sim, but I just launched Dyson Sphere Program and saw similar behavior as well. Turning off HDR seems to get rid of the issue - is this just another example of HDR being broken on PC, or could it be something like the cable going bad?

https://i.imgur.com/9fOWcvQ.mp4

Have you done a driver update in the last day? If so, try rolling it back.

Shipon
Nov 7, 2005

Dr. Video Games 0031 posted:

Have you done a driver update in the last day? If so, try rolling it back.

No driver updates...but you did just make me check and realize that I'm still on geforce drivers from december lmao

EDIT: Yeah rebooted to install updates and the problem is gone now.

Shipon fucked around with this message at 22:45 on Aug 13, 2022

hobbesmaster
Jan 28, 2008

Cygni posted:

At least from what I've heard lately, most companies don't actually use the codenames we hear in public internally anymore; they're mostly made up for the press (or to purposefully "leak" :ssh:). Someone more connected probably knows more about that than I do, though. I doubt anyone in the lunchroom is talking about their work on Hotpink Blowfish.

Yeah they’d probably shorten it to blowfish.

Dr. Video Games 0031
Jul 17, 2004

MarcusSA
Sep 23, 2007

Not a terrible deal tbh

ijyt
Apr 10, 2012

tehinternet posted:

I don’t really get this — how can you not care for a certain resolution? It seems immaterial beyond higher is better, but higher isn’t better for you?

Is it because it’s cost prohibitive or is it a nostalgia thing or something else? I could see wanting to play on old tech for older games, that could make sense.

If you offer me a 16:9 4K 144Hz panel or a 21:9 1440p 120Hz QD-OLED panel, I'm taking the ultrawide every time. The wider horizontal is way more valuable to me for media consumption than any boost in pixel density. It's also way easier to run, so I can turn on all the bells and whistles in a game's settings without seeing it drop below 30fps.
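
The "way easier to run" part is straightforward pixel arithmetic. A quick sketch, assuming the common 3440x1440 ultrawide against 3840x2160:

code:
# Frame cost scales with total pixels: 21:9 1440p ultrawide vs 16:9 4K
uw_pixels = 3440 * 1440   # 4,953,600
uhd_pixels = 3840 * 2160  # 8,294,400
print(f"4K renders {uhd_pixels / uw_pixels:.2f}x the pixels")  # ~1.67x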

Lord Stimperor
Jun 13, 2018

I'm a lovable meme.


That is not at all enough explanation, I need like 5-6 effort posts on this

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
It's a Frankenstein's monster that probably flashes 'please kill me' for a single frame every once in a while

tehinternet
Feb 14, 2005


Zedsdeadbaby posted:

It's a Frankenstein's monster that probably flashes 'please kill me' for a single frame every once in a while

This is all the explanation that’s needed for that photo.
