|
repiv posted:yes, it's been the norm on consoles since the PS4Pro/XB1X generation since those systems weren't really capable of driving native 4K, and that tech is filtering its way onto PC now I fiddled with it a bit, TAAU has a greater-than-margin-of-error (~10%) performance advantage over DLSS and FSR2, but it looks significantly worse than both. Performance: TAAU > DLSS > FSR. Image: DLSS > FSR >> TAAU. I wouldn't really ever use it, given that FSR2 should work on anything TAAU does in Spiderman and the performance difference shouldn't be a dealbreaker. I'm using DLAA personally because I end up single-core CPU bottlenecked most of the time when DXR is enabled (3080 10GB); raytracing seems to add a massive load to one core that caps performance on my 9900K. It's slightly oversharpened but otherwise looks fantastic.
|
# ? Aug 13, 2022 16:13 |
|
|
teagone posted:I've always used my PCs primarily for gaming, and I would likely see no difference in gaming quality/experience between 4k and 1080p on a 24-25" panel at the distance I'm sitting away from the monitor. I also have always skewed towards mainstream GPUs, and the models in that price bracket ($200-$250ish) typically haven't been able to push past 200+ FPS at 4K resolution. Ohhhhhhhh okay, yeah if you’re at 24”, 4K makes zero sense. Didn’t factor in PPI or screen size.
|
# ? Aug 13, 2022 17:25 |
|
teagone posted:Yeah, if I ever moved up beyond 24 or 25" displays, I'd go for 27" 1440p at the most. But my old XG2560 (240Hz TN G-Sync panel) and my newer XG2405 (FreeSync Premium panel that I also have an Xbox Series S hooked up to) are pretty much perfect for me. This is where I'm at, too - I'm happy with 1080p because I'm happy with ~24 inch monitors, and I'd be more willing to spend a couple extra bucks on a nicer quality / higher refresh rate / lower latency monitor than a bigger one with higher resolutions
|
# ? Aug 13, 2022 17:33 |
|
BurritoJustice posted:I fiddled with it a bit, TAAU has a greater-than-margin-of-error (~10%) performance advantage over DLSS and FSR2, but it looks significantly worse than both. i suppose it makes sense that they'd prioritize performance over image quality when they were developing their own TAAU for the dinky little PS4Pro GPU
|
# ? Aug 13, 2022 17:39 |
|
Having used a 24" 1080p/240 Hz TN monitor before, and now using a 27" 1440p/240 Hz IPS monitor, I honestly think the 24"/1080p setup was better. 27"/1440p is almost too big, the higher pixel density isn't particularly useful, and it is much harder to keep frame rates above 100. I was using 150% scaling at 1080p and I'm still using it at 1440p, so I have slightly more usable desktop space, but otherwise there is not much benefit. The 27" monitor has more/brighter/deeper colors than the old TN (which itself had a very respectable ~98% sRGB with a delta E in the low-to-mid 1 range), but it is actually LESS accurate because it is running at something like ~130% sRGB; you get used to it, though. The biggest benefit is less color/contrast shifting at angles thanks to the IPS panel, but it came at the cost of uniformity and some glow. The glow is easy to mitigate; nobody should ever use an LCD of any type in a dark room. The uniformity is a bit harder to deal with, especially on an edge-lit display like this.
|
# ? Aug 13, 2022 17:49 |
|
I got a Korean special 1440p 27” 60 Hz monitor back in 2013 and haven’t looked back. I use 24” 1440p monitors at work and yeah, the sharpness is nice, but the selling point IMO is how much stuff you can fit on the screen and still have it legible. Switching to 32” 4K has been a huge quality-of-life improvement even from 1440p. 4K being an exact multiple of 1080p means it plays nice with the limited output of screen-sharing programs like Teams, Zoom, etc., since if you want to share something in 1080p, you only need to share a quarter of your screen. It’s a pet peeve (and a negative of higher-resolution screens) that streaming is limited to 1080p with those programs, though from a datacenter perspective I can see the benefit of limiting it. Not being able to share at native resolution without losing clarity just makes collaboration a little more difficult; lotta window resizing/rescaling. I agree, though, that we’re rapidly approaching the point where a higher-resolution monitor won’t make sense. We’re already almost at the point where you can’t see pixels at 4K on 32”. I’d imagine you’d be hard pressed to see them on a 27” monitor. And at that point, what’s the point of going higher?
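Quick sanity check on the "can't see pixels" point, assuming roughly 20/20 vision (which resolves on the order of 60 pixels per degree) and a typical ~24" desktop viewing distance; both numbers are assumptions, not measurements:

```python
import math

def pixels_per_degree(diag_in, res_w, res_h, distance_in):
    """Pixels per degree of visual angle for a 16:9 display.

    diag_in: diagonal size in inches; distance_in: viewing distance in inches.
    """
    # Physical panel width from the diagonal, assuming a 16:9 aspect ratio
    width_in = diag_in * 16 / math.hypot(16, 9)
    ppi = res_w / width_in
    # Inches subtended by one degree of visual angle at this distance,
    # times pixels per inch, gives pixels per degree
    return ppi * 2 * distance_in * math.tan(math.radians(0.5))

# 32" 4K vs 27" 1440p at a ~24" viewing distance
print(round(pixels_per_degree(32, 3840, 2160, 24)))
print(round(pixels_per_degree(27, 2560, 1440, 24)))
```

Under those assumptions, 32" 4K at desk distance lands right around the ~60 PPD acuity limit, which lines up with the claim above, while 27" 1440p sits noticeably below it.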
|
# ? Aug 13, 2022 18:08 |
|
I brought my 60 Hz 4K 27” to the office and replaced it with a 120 Hz 1440p 27” because it was cheaper and I figured it’d be better for games, and I can really, really tell the difference during non-game stuff and it bugs me. A month in, I still haven’t had a chance to game on it, so maybe the high refresh rate and lighter GPU load will win me over, but right now I wish I’d just gotten another 60 Hz 4K. OTOH my 24” 1920x1200 screen doesn’t bother me at all even though it’s slightly lower PPI.
|
# ? Aug 13, 2022 18:14 |
|
https://twitter.com/videocardz/status/1558398025600565248?s=21&t=-cmkppzCjdvBA-G5SBSRmg That’s it. It’s time to give up on making architecture “code names” that only really get used in the press. I don’t really want to be overheating and cranking on my Pink Sardine, nor overworking my tired Hotpink Blowfish.
|
# ? Aug 13, 2022 18:40 |
|
BRB maintaining my Wheat NAS.
|
# ? Aug 13, 2022 18:44 |
|
GPU Megat[H]read - Lil Wheat Nas X
|
# ? Aug 13, 2022 18:51 |
|
Hotpink and the Bonefish.
|
# ? Aug 13, 2022 18:55 |
|
Cygni posted:https://twitter.com/videocardz/status/1558398025600565248?s=21&t=-cmkppzCjdvBA-G5SBSRmg these are so much better than what Nvidia is doing.
|
# ? Aug 13, 2022 19:56 |
You open the door to my office. The wall of heat from my Pink Sardine hits you head-on; it is stifling.
|
|
# ? Aug 13, 2022 19:57 |
|
wargames posted:these are so much better than what Nvidia is doing. “Plum Bonito” and “Wheat Nas” are better than the names of notable scientists/computer people? I mean, I guess we are talking about these stupid rear end names, so there’s that?
|
# ? Aug 13, 2022 20:06 |
|
Cygni posted:https://twitter.com/videocardz/status/1558398025600565248?s=21&t=-cmkppzCjdvBA-G5SBSRmg The problem is that engineers need to be working on stuff for years before marketing even needs to look at a product. And the product may even be cancelled, so it’d be a waste of time to try to put a real name together. So instead engineering has to call the thing they’re working on something. Product management and engineering will have to use that name everywhere from MRDs to code, in sprint planning and so forth. Then when it can run, or they need a partner’s help on something, application engineering (or whatever customer-facing engineers are called) needs to call it something. Now the name is outside the company. That company will probably need to call the project something, and so forth. Now that more companies are involved, the probability of leaks increases no matter how tight the security is. Enthusiast press learns the code name and starts asking everyone they can about it. Then readers pick up on it and show that they are knowledgeable about the latest tech by using the engineering code names. Now your codenames are everywhere and you may as well just make it official.
|
# ? Aug 13, 2022 20:11 |
|
tehinternet posted:“Plum Bonito” and “Wheat Nas” are better than the names of notable scientists/computer people? I can hear the name of some old guy anywhere, though; I can't say before today I'd heard the combination of words that is "Hotpink Bonefish"
|
# ? Aug 13, 2022 20:17 |
|
hobbesmaster posted:Now that more companies are involved, the probability of leaks increases no matter how tight the security is. Enthusiast press learns the code name and starts asking everyone they can about it. Then readers pick up on it and show that they are knowledgeable about the latest tech by using the engineering code names. that's what happened with nvidia, they've been doing the scientist codenames almost since the very beginning but they only started leaning into them as branding once they started leaking into public discussions the early nvidia codenames you may not have heard of were Fahrenheit, Celsius, Kelvin, Rankine, Curie and Tesla, before Fermi, Kepler, Maxwell, etc
|
# ? Aug 13, 2022 20:18 |
|
tehinternet posted:“Plum Bonito” and “Wheat Nas” are better than the names of notable scientists/computer people? I would take an RNG spitting out a random assortment of color + fish combinations over any corporate codenames feigning meaning or pathos, absolutely and without question
|
# ? Aug 13, 2022 20:19 |
|
At least from what I've heard lately, most companies don't actually use the codenames we hear in public internally anymore; they are mostly made up for the press (or to purposefully "leak"). Someone more connected probably knows more about that than I do though. I doubt anyone in the lunchroom is talking about their work on Hotpink Blowfish.
|
# ? Aug 13, 2022 20:21 |
|
tehinternet posted:I don’t really get this — how can you not care for a certain resolution? It seems immaterial beyond higher is better, but higher isn’t better for you? I actually can't tell the difference between 1440p and 4k on a computer monitor (my 32" 4K monitor vs my 34" 1440p ultrawide) when it comes to gaming, but oddly enough I can when it comes to browsing and doing productivity work.
|
# ? Aug 13, 2022 20:47 |
|
Integer scaling old FPS games to get the sharp pixels of my dreams is magic. I caved and bought an FE 3060 Ti, and this has been my favorite feature. It works much better than the Lossless Scaling app off Steam. I'm on a 24" 1440p 165 Hz Dell monitor and I'm now torn between the ViewSonic Blur Busters-approved monitor or something 32" 4K for work, with integer scaling in play to get better frames when necessary for games.
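For anyone wondering why integer scaling stays so sharp: each source pixel maps to an exact NxN block with no filtering, so nothing ever gets blended. A minimal sketch of the idea (conceptual only, not how the driver actually implements it), assuming the image is a numpy array:

```python
import numpy as np

def integer_scale(img, factor):
    """Nearest-neighbour integer upscale: each source pixel becomes an
    exact factor x factor block, so edges stay perfectly crisp."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

# A 2x2 checkerboard "image" scaled 3x becomes 6x6, with no blended
# in-between values anywhere (compare to bilinear, which would produce greys)
tiny = np.array([[0, 255],
                 [255, 0]], dtype=np.uint8)
big = integer_scale(tiny, 3)
print(big.shape)  # (6, 6)
```

The output contains only the original pixel values, which is exactly the "sharp pixels" look; a bilinear or lanczos upscale of the same checkerboard would introduce intermediate grey values at every edge.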
|
# ? Aug 13, 2022 21:14 |
|
Maybe we need to disclose not only distance and size of display, but also vision quality at that range.
|
# ? Aug 13, 2022 21:23 |
|
BurritoJustice posted:I fiddled with it a bit, TAAU has a greater-than-margin-of-error (~10%) performance advantage over DLSS and FSR2, but it looks significantly worse than both. Digital Foundry's analysis of these techniques in Spiderman is coming later, but they briefly showed one comparison in the video they posted yesterday, and yeah, their TAAU solution ("IGTI") was noticeably blurrier than DLSS and FSR 2.0. It looked more like FSR 1.0. FSR 2.0 and DLSS looked almost identical, except there was a spot in DF's comparison that flickered distractingly with FSR 2.0 and IGTI but not with DLSS. Seen at 50 seconds in: https://www.youtube.com/watch?v=xI2VUQsPqJo&t=49s repiv posted:i suppose it makes sense that they'd prioritize performance over image quality when they were developing their own TAAU for the dinky little PS4Pro GPU It may be faster at the same source resolution, but I suspect that FSR 2.0 and DLSS would give more performance when normalizing for image quality and look better when normalizing for performance, so I don't think IGTI has any real advantage.
|
# ? Aug 13, 2022 21:26 |
|
LRADIKAL posted:Maybe we need to disclose not only distance and size of display, but also vision quality at that range. Yeah I’m gonna start needing to see some prescriptions here.
|
# ? Aug 13, 2022 21:27 |
|
Dr. Video Games 0031 posted:I figured it would be something like that, and I'm glad it wasn't a sneaky crypto miner. I'm also glad it was something relatively benign, but man, I have no idea what FaH was doing. I did a few render tests with subsurface scattering settings cranked up and neither card got as hot as the 3060 Ti yesterday. These cards are dynamite though, rendering in 1 minute what used to take me 9. I might regret it when the 4000 series drops, but I think I'd rather have the bird in the hand than gamble on being able to get the new cards at launch, after being stuck using a 780 in my rig for the past two years.
|
# ? Aug 13, 2022 22:14 |
|
So something odd started happening today, I have a C2 OLED connected to my PC and playing Flight Sim on it with HDR enabled, it worked fine until today when I started getting this random flickering. I first saw this happen in Flight Sim, but just launched Dyson Sphere Program and saw similar behavior as well. Turning off HDR seems to get rid of this issue - is this just another example of HDR being broken on PC, or could it be something like the cable going bad? https://i.imgur.com/9fOWcvQ.mp4
|
# ? Aug 13, 2022 22:16 |
|
what graphics card is it? i randomly read this the other day about amd cards doing some weird stuff with hdr that you need to account for https://old.reddit.com/r/OLED_Gaming/comments/wlfjkx/psa_in_the_latest_amd_drivers_10bit_pixel_format/
|
# ? Aug 13, 2022 22:18 |
|
it could be the cable, enabling HDR bumps the required bandwidth up by 25% so it might be pushing it over the point where the signal becomes flaky
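The 25% figure falls straight out of the jump from 8-bit to 10-bit per channel. A rough back-of-the-envelope estimate (uncompressed full RGB, deliberately ignoring blanking intervals and link encoding overhead, so the absolute numbers are approximate):

```python
def video_bandwidth_gbps(w, h, fps, bits_per_channel):
    """Rough uncompressed data rate in Gbit/s for full RGB,
    ignoring blanking intervals and link encoding overhead."""
    return w * h * fps * 3 * bits_per_channel / 1e9

sdr = video_bandwidth_gbps(3840, 2160, 120, 8)   # 8-bit SDR
hdr = video_bandwidth_gbps(3840, 2160, 120, 10)  # 10-bit HDR
print(round(hdr / sdr - 1, 2))  # 0.25, i.e. the 25% bump
```

Going from ~24 to ~30 Gbit/s of pixel data at 4K120 is exactly the kind of jump that can push a marginal cable from "works" to "flickers."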
|
# ? Aug 13, 2022 22:20 |
|
Shipon posted:I actually can't tell the difference between 1440p and 4k on a computer monitor (my 32" 4K monitor vs my 34" 1440p ultrawide) when it comes to gaming, but oddly enough I can when it comes to browsing and doing productivity work. I have the same setup and find that the only difference I really 100% notice is in FPS games where long-distance shooting is a thing. Having that extra resolution lets you see people more clearly. But then you’re giving up the aspect-ratio benefit of the ultrawide. It’s a tough call for me. UW def still shines in strategy games too. DoombatINC posted:I would take an RNG spitting out a random assortment of color + fish combinations over any corporate codenames feigning meaning or pathos, absolutely and without question Oh, I don’t really read into the pathos as they’re still a corporation. I think having a consistent naming convention is better than a meme-generated one. Admittedly, I’ve got ZANY IRONY burnout, having been on SA for almost 20 years. Also, get off my lawn.
|
# ? Aug 13, 2022 22:20 |
|
repiv posted:it could be the cable, enabling HDR bumps the required bandwidth up by 25% so it might be pushing it over the point where the signal becomes flaky
|
# ? Aug 13, 2022 22:22 |
|
it wouldn't crap out so much as run into a new bottleneck. with tv and movies, you have 12-bit dolby vision, but usually at 24p, whereas with videogames, your framerate can go up to 120
|
# ? Aug 13, 2022 22:26 |
|
Shipon posted:So something odd started happening today, I have a C2 OLED connected to my PC and playing Flight Sim on it with HDR enabled, it worked fine until today when I started getting this random flickering. I first saw this happen in Flight Sim, but just launched Dyson Sphere Program and saw similar behavior as well. Turning off HDR seems to get rid of this issue - is this just another example of HDR being broken on PC, or could it be something like the cable going bad? Have you done a driver update in the last day? If so, try rolling it back.
|
# ? Aug 13, 2022 22:29 |
|
Dr. Video Games 0031 posted:Have you done a driver update in the last day? If so, try rolling it back. No driver updates... but you did just make me check and realize that I'm still on GeForce drivers from December lmao EDIT: Yeah, rebooted to install updates and the problem is gone now. Shipon fucked around with this message at 22:45 on Aug 13, 2022 |
# ? Aug 13, 2022 22:31 |
|
Cygni posted:At least from what I've heard lately, most companies dont actually use the codenames we hear in the public internally anymore and they are mostly made up for the press (or to purposefully "leak" ). Someone more connected probably knows more about that than i do though. i doubt anyone in the lunchroom is talking about their work on Hotpink Blowfish. Yeah they’d probably shorten it to blowfish.
|
# ? Aug 13, 2022 22:50 |
|
|
# ? Aug 13, 2022 23:12 |
|
Not a terrible deal tbh
|
# ? Aug 14, 2022 00:38 |
|
tehinternet posted:I don’t really get this — how can you not care for a certain resolution? It seems immaterial beyond higher is better, but higher isn’t better for you? If you offer me a 16:9 4K 144 Hz panel or a 21:9 1440p 120 Hz QD-OLED panel, I'm taking the ultrawide every time. The wider horizontal is way more valuable to me for media consumption than any boost in pixel density. It's also way easier to run, so I can turn on all the bells and whistles in a game's settings without it dropping below 30fps.
|
# ? Aug 14, 2022 11:07 |
|
That is not at all enough explanation, I need like 5-6 effort posts on this
|
# ? Aug 14, 2022 11:29 |
|
It's a Frankenstein's monster that probably flashes 'please kill me' for a single frame every once in a while
|
# ? Aug 14, 2022 11:33 |
|
|
Zedsdeadbaby posted:It's a Frankenstein's monster that probably flashes 'please kill me' for a single frame every once in a while This is all the explanation that’s needed for that photo.
|
# ? Aug 14, 2022 12:37 |