|
Paul MaudDib posted:as an amateur photographer it was very interesting getting an LG C1 OLED and then watching the LOTR trilogy. It's sort of similar to a "zone system" for luminosity in addition to tonality, things can have greater/lesser luminosity in addition to the actual tonal color. The shift between dark caves and bright plains etc. And really the peak luminosity is only used for certain things like lightning and gandalf the white's staff and the balrog's whip etc, it was kinda obvious when they were firing up 1000 nits for an effect. the quality of HDR re-grades is all over the place, i saw a heatmap for one of the early star wars films in "HDR" and nearly everything was in the LDR range with the sole exception of the lightsabers which they'd manually masked out and cranked up to eye searing
|
# ? Apr 5, 2023 02:13 |
|
|
The Smaug encounter in the second Hobbit movie was a really interesting showcase for HDR. They really cranked the luminosity for the lava and dragon's breath. They also used the wider dci-p3 color gamut to show more saturated reds. i switched back and forth between the sdr and hdr masters when watching that (because I'm a giant nerd), and on sdr the dragon breath was almost just an orange smear, while it was way more detailed in hdr. kinda interesting, but absolutely a gimmick for that scene. an impressive gimmick nonetheless
|
# ? Apr 5, 2023 02:21 |
|
repiv posted:the quality of HDR re-grades is all over the place, i saw a heatmap for one of the early star wars films in "HDR" and nearly everything was in the LDR range with the sole exception of the lightsabers which they'd manually masked out and cranked up to eye searing exactly, that's what I'm saying: it's very deliberately mastered right now, where only some specific things deploy max nits. But instead of just a full-LDR master built around, let's say, 400 nits, you also have low-mastered 200-nit content, high-mastered 600-nit content, and "gently caress it, 1000 nits" content like lightsaber blades and lightning and balrog whip tips and gandalf's staff glow etc. And content seems to be very consciously/deliberately binned into one of those zones - this is a low-mastered 200-nit scene with 400-nit lantern light, this is a 400-nit mastered scene with 1000-nit balrog whips, etc. I personally think it's very important to re-master perceptually, because the response of the human eye isn't linear at all and staring at a ~400-nit average panel isn't natural either. HDR increases the amount of effect range that you can use - it's like a "zone system" luminosity range expansion (but with a similar amount of contrast, since that's a more localized perceptual phenomenon - you are comparing "backlight zones" against each other, just more fluidly). But at the same time you can't go too far with it or it looks tacky. nobody likes the "can't look at it" bloom or the "wow, that's just 1000 nits on that effect" kinda stuff. this cinematographic trope seems lessened in newer content, which I assume is mastered with a bit more restraint, and I wonder whether games these days are mastered for the masses at 400 nits or with HDR in mind? consoles do have a decent HDR TV population after all. Paul MaudDib fucked around with this message at 02:50 on Apr 5, 2023 |
# ? Apr 5, 2023 02:48 |
|
the new asus ally handheld looks neat https://twitter.com/VideoCardz/status/1643177208310120448?s=20 quote:The console is said to offer 50% higher performance at 15W than Steam Deck and twice the performance at 35W.
|
# ? Apr 5, 2023 03:13 |
|
And how much of that performance will be lost to running windows 11 instead of steamOS?
|
# ? Apr 5, 2023 03:15 |
|
Arivia posted:And how much of that performance will be lost to running windows 11 instead of steamOS? The difference on the Steam Deck isn't all that big, honestly, and some games even run faster on Win11. This 50% performance improvement was presumably already measured with the Ally on Win11 and the Steam Deck on SteamOS.
|
# ? Apr 5, 2023 03:20 |
|
Everyone forgets Microsoft made Windows 8 run on loving Snapdragon.
|
# ? Apr 5, 2023 03:25 |
|
i wouldn't say it's that surprising that windows games tend to run a bit better on windows with native directx drivers than through a linux/vulkan compatibility layer. proton is impressive but it's not magic
|
# ? Apr 5, 2023 03:27 |
|
the shader caching stuff is a plus for steam deck tho my main issue with it these days is newer games are getting out of its performance envelope even with heavy tweaking
|
# ? Apr 5, 2023 03:30 |
|
No guys seriously this phoronix benchmark from 2013 clearly shows that team fortress 2 runs better on linux than on windows vista and therefore
|
# ? Apr 5, 2023 03:33 |
|
Kazinsal posted:No guys seriously this phoronix benchmark from 2013 clearly shows that team fortress 2 runs better on linux than on windows vista and therefore
|
# ? Apr 5, 2023 03:37 |
|
sleep/suspend would be the biggest potential issue with a windows handheld microsoft keeps on breaking it for laptops with a clear suspend signal (lid is shut)
|
# ? Apr 5, 2023 03:41 |
|
Kazinsal posted:No guys seriously this phoronix benchmark from 2013 clearly shows that team fortress 2 runs better on linux than on windows vista and therefore i'd legit love to see an apple silicon port, I bet M1/M2/M3 would actually do pretty well given the "interpreter" format of goldsrc/src/src2/respawn/etc. can u guys imagine if there was a mac update of tf2, that would be awesome
|
# ? Apr 5, 2023 04:00 |
|
shrike82 posted:the shader caching stuff is a plus for steam deck tho the shader caching stuff is a bit of give and take, when it successfully primes the cache it's nice to have but if it doesn't (i.e. right after an update when the shared cache hasn't propagated yet) the stuttering can be much worse than it is on windows apex for example was notorious for becoming basically unplayable on linux after every update, although apparently it's better now if you opt in to the DX12 beta repiv fucked around with this message at 04:04 on Apr 5, 2023 |
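(editor's aside: the give-and-take described above comes down to compile-on-miss caching. Here's a toy model — class and names are mine, not how Valve's fossilize cache actually works — showing why a game or driver update means a round of fresh compiles, i.e. stutter, until the cache warms back up:)

```python
import hashlib

class ShaderCache:
    """Toy model of a persistent shader cache. Illustrative only."""

    def __init__(self):
        self._cache = {}   # key -> compiled blob
        self.compiles = 0  # count of slow compile-on-miss events ("stutters")

    def get(self, source: str, driver_version: str) -> bytes:
        # Keys incorporate the driver version: a driver (or game) update
        # changes the key, invalidating every previously cached entry.
        key = hashlib.sha256(f"{driver_version}:{source}".encode()).hexdigest()
        if key not in self._cache:
            self.compiles += 1  # the hitch happens here
            self._cache[key] = b"blob:" + source.encode()
        return self._cache[key]

cache = ShaderCache()
for _ in range(3):
    cache.get("water_surface.frag", "531.41")  # one miss, then two hits
cache.get("water_surface.frag", "532.03")      # "update" -> cold cache again
print(cache.compiles)                          # -> 2
```

the pre-shared cache Steam ships is just someone else's warm copy of this; if it hasn't propagated yet after an update, every client pays the miss cost itself.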
# ? Apr 5, 2023 04:01 |
|
I've asked this in a couple of different places, but the PS5 thread in Games suggested I ask here. My current laptop has a Laptop 3060 inside. Specifically, this laptop from Newegg. Sometimes when I have company I'll hook it up to the living room TV (which is a 4K) to play games. I'm considering getting a PS5 for the main room instead, but I don't want to buy one if it's about the same performance as using the laptop. I've had a hard time finding good comparisons between the laptop 3060 and a PS5. Does anyone have any advice about comparing the performance between the two and if getting the PS5 is worth it?
|
# ? Apr 5, 2023 21:39 |
|
YorexTheMad posted:I've asked this in a couple of different places, but the PS5 thread in Games suggested I ask here. Common wisdom suggests that the PS5 is roughly equivalent to the RTX 2070 Super or RX 6650 XT (rasterization only). https://www.youtube.com/watch?v=S1sCLpkOkhY From this, we know the laptop 3060 is around 8% slower at 1440p than the desktop 3060. Reviews of the desktop 3060 tell us that it's another 8% slower than the 2070 Super. Multiply both together, and the laptop 3060 is around 15 - 16% slower than the PS5. However, and this is very important, this only applies to the highest-spec 3060 laptops. There are large differences in performance depending on how much power your 3060 is allowed to use. 3060s that use over 100W perform something like 30 - 40% better than 65W 3060s. This could end up with a pretty large gap between the laptop 3060 and the PS5 depending on your exact laptop model, potentially up to a 2x difference if you have one of the weaker 3060 laptops.
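(editor's aside: a quick sketch of the compounding math above — relative performance figures multiply, they don't add, and the 0.92 factors are the two ~8% gaps cited:)

```python
# Chaining relative performance figures: each step is "X runs at 92% of Y".
laptop_vs_desktop_3060 = 0.92  # laptop 3060 ~8% slower than desktop 3060
desktop_3060_vs_2070s = 0.92   # desktop 3060 ~8% slower than 2070 Super

# 0.92 * 0.92 = 0.8464, i.e. roughly 15% slower than PS5-class hardware,
# matching the 15 - 16% estimate in the post.
laptop_3060_vs_ps5 = laptop_vs_desktop_3060 * desktop_3060_vs_2070s
print(f"laptop 3060 ~ {laptop_3060_vs_ps5:.0%} of PS5-class performance")
```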
|
# ? Apr 6, 2023 00:42 |
|
Dr. Video Games 0031 posted:However, and this is very important, this only applies to the highest-spec 3060 laptops. There's are large differences in performance depending on how much power your 3060 is allowed to use. 3060s that use over 100W perform something like 30 - 40% better than 65W 3060s. This could end up with a pretty large gap between the laptop 3060 and the PS5 depending on your exact laptop model, potentially up to a 2x difference if you have one of the weaker 3060 laptops. How can I find out about the power the 3060 is allowed? I'm under no illusions that I got a top of the line laptop, but this is helping me understand the potential comparison.
|
# ? Apr 6, 2023 01:01 |
|
Kinda hard to compare them since there's often a gap between console and PC gaming beyond what specs would suggest (e.g., TLOU PC being poo poo) A PS5 would be less fiddly for a TV setup imo especially relative to a laptop
|
# ? Apr 6, 2023 01:07 |
|
YorexTheMad posted:How can I find out about the power the 3060 is allowed? I'm under no illusions that I got a top of the line laptop, but this is helping me understand the potential comparison. It may say in your laptop's software suite/control center somewhere, but if not, then you can share your laptop's exact model and I'll try to find out. Laptop manufacturers don't always make this clear, which is pretty lovely considering how important it is.
|
# ? Apr 6, 2023 01:36 |
|
Dr. Video Games 0031 posted:It may say in your laptop's software suite/control center somewhere, but if not, then you can share your laptop's exact model and I'll try to find out. Laptop manufacturers don't always make this clear, which is pretty lovely considering how important it is. Gigabyte A5 K1-AUS1130SB (from here) I've tried to dig around system settings, but I can't find any useful control panel or anything from System Information that gives the info. Sorry, I'm not sure where else to look. Edit: The Newegg page says 'maximum graphics power' is 130W, if that's accurate. YorexTheMad fucked around with this message at 02:55 on Apr 6, 2023 |
# ? Apr 6, 2023 02:50 |
|
Then yeah, that is actually pretty much a top-spec 3060. There's still some laptop-to-laptop variance with stuff like CPU and memory configurations and how they balance power, but it's pretty much impossible to judge that at a glance. I'd say, when comparing a good PC port to the PS5 version of that game, the PS5 may be able to hit 60fps at 1440p while you may have to drop a few settings or go down to 1080p. 4k 30fps may be possible with some games, but probably not all the games the PS5 can do that in. The saving grace is that the 3060 supports DLSS, which is much better upscaling technology than anything available on the PS5, so you can possibly make up for any deficits that way. Honestly, I don't think you'd be seeing a very big improvement in terms of image quality or performance by replacing the laptop with a PS5, but it may still be worth it if you value convenience and reliability. PC ports have been kinda lovely lately with lots of performance issues and stuttering from shader compilation. The PS5 just works, on the other hand. It will also be getting some timed exclusives, so if you want to play games early, that's the way to go.
|
# ? Apr 6, 2023 03:02 |
|
got around to watching the cyberpunk overdrive mode video and a) odd that they didn't compare against original RT, and b) even against RTX OFF, some of the overdrive shots didn't look great
|
# ? Apr 6, 2023 03:19 |
|
Thanks so much for your time and helping me understand my situation better. Not sure yet what I'm going to do but this helps a ton.
|
# ? Apr 6, 2023 03:20 |
|
shrike82 posted:got around to watching the cyberpunk overdrive mode video and a) odd that they didn't compare against original RT, and b) even against RTX OFF, some of the overdrive shots didn't look great even in their cherry picked shots i noticed the telltale "sparkling" artifacts of RTXDI, also present in portal rtx turning up the quality knobs would probably fix it but lmao it's already demanding enough
|
# ? Apr 6, 2023 03:26 |
|
I think all of the shots show a big improvement overall, but I wish they compared it against the Psycho RT preset instead of RT off. My biggest complaint is the bloom/overexposure, which I'm hoping was just because some idiot at nvidia thought that cranking the bloom would dazzle people better for their promotional video, and it will be adjustable in the game. Some stuff like noisy shadows or slowly dissipating light/shadows are probably going to be present too. I'm expecting it to be pretty similar to Portal RTX overall in that regard. There's definitely a lot of room for improvement with these current implementations which is why they call it a "technology preview," but I'm excited to try it anyway.
|
# ? Apr 6, 2023 04:03 |
|
I have a laptop with a 65W 3060 and I'm still happy with it. I figure it's better than a 3050 at least.
|
# ? Apr 6, 2023 04:13 |
|
Dr. Video Games 0031 posted:It's a lovely port, but it's also the third game this year already that has run into issues with 8GB cards, after HogLeg and RE4make, and there were a few last year too (such as miles morales for one example). At some point, you have to stop calling these outliers, because there's bound to be more games like this this year. lol can you imagine if HUB was as breathless about the future implications of Ryzen 3600 being critically cpu-bottlenecked while just standing in an empty hall as they are about VRAM? not even at 1080p with a 4090, Alex was cpu bottlenecked at 1440p with a 2070S... not even doing anything, just standing in a hall, every single core and thread loaded up to 97-100%. Can only imagine the frametimes. just like 8GB, the R5 3600 is something that had definite alternatives with much better capability at the time (9900K, etc), and that's a tier of performance AMD is continuing to offer as a current-gen product. R5 5500 is obsolete right? nobody should be buying that kind of thing in 2023, it's junk, same or worse performance as a 3600. And AMD is trying to position that as a midrange offering, R5 branding, launched within the last year. Wow, greedy, right? or maybe is there a lifecycle thing here, and 8GB was acceptable in 2018 and continues to be acceptable in certain product segments, just like the R5 5500 is acceptable in certain product segments today? almost as if there is... no bad hardware, only bad prices. and apparently bad ports Paul MaudDib fucked around with this message at 07:35 on Apr 6, 2023 |
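(editor's aside: for a rough sense of why texture quality is the first thing to blow an 8GB budget, some back-of-envelope texture memory math — figures are generic illustrations, not measurements from any particular game:)

```python
# Texture footprint: width * height * bytes-per-texel, plus ~1/3 extra
# for a full mip chain (each mip level is a quarter of the one above).
def texture_mib(width: int, height: int, bytes_per_texel: float,
                mips: bool = True) -> float:
    base = width * height * bytes_per_texel
    return base * (4 / 3 if mips else 1) / 2**20

# BC7 block compression stores 16 bytes per 4x4 block = 1 byte per texel.
print(f"4K BC7 texture:   {texture_mib(4096, 4096, 1):.1f} MiB")  # ~21.3
# Uncompressed RGBA8 is 4 bytes per texel, so 4x the size.
print(f"4K RGBA8 texture: {texture_mib(4096, 4096, 4):.1f} MiB")  # ~85.3
```

a few hundred unique 4K materials resident at once and the budget disappears fast, which is why "medium textures" usually just means lower-resolution mips of the same assets.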
# ? Apr 6, 2023 06:17 |
|
I feel that we need to accept that we're back in the wild west era of bad PC ports once again. It was good for a time during the last few years but not anymore. Think the botched rollout of DX12 had a fair bit to do with this. The old chestnut of 'performance is now more down to developers and less on the drivers' can't keep flying forever. At some point developers need to take responsibility for their performance issues, and they just aren't.
|
# ? Apr 6, 2023 08:30 |
|
So... is TLOU playable on an 8GB 1070? I was going to try to emulate the original one but gave up because of some emulator bugs that needed workarounds and I just couldn't be bothered.
|
# ? Apr 6, 2023 09:22 |
|
mobby_6kl posted:So... is TLOU playable on an 8GB 1070? I was going to try to emulate the original one but gave up because of some emulator bugs that needed workarounds and I just couldn't be bothered. drop it down to medium and you can probably expect something between the RTX 3050 and the Intel Arc A750
|
# ? Apr 6, 2023 09:35 |
|
unfortunately, medium quality textures in tlou look absolutely horrid
|
# ? Apr 6, 2023 09:46 |
|
Zedsdeadbaby posted:I feel that we need to accept that we're back in the wild west era of bad PC ports once again. It was good for a time during the last few years but not anymore. Think the botched rollout of DX12 had a fair bit to do with this. The old chestnut of 'performance is now more down to developers and less on the drivers' can't keep flying forever. At some point developers need to take responsibility for their performance issues, and they just aren't. vulkan has effectively admitted defeat here, they just announced a new extension which more or less brings the looser DX11 shader compilation model into vulkan as an option alongside the rigid "pre-compile your PSOs or else" model that vulkan and DX12 were originally designed around. https://www.khronos.org/blog/you-can-use-vulkan-without-pipelines-today microsoft hasn't made a similar move with DX12 yet, but I wouldn't be surprised if they do. PSOs are theoretically optimal for performance but if some engines are never going to adapt to use them properly (looking at you unreal engine) they're doing more harm than good in some games. repiv fucked around with this message at 12:44 on Apr 6, 2023 |
# ? Apr 6, 2023 12:35 |
|
The textures might have problems, but whenever they're fixed for people with less VRAM, I think they'll find it was worth the wait. The textures in TLOU pc are pretty glorious. It's pretty clear that they were told in no uncertain terms that the game needed to come out after the show ended, and the c suite didn't give a poo poo how it performed. The other issue is that the game is rock solid and plays pretty great if you do have the beef to run it, so if the devs/testers had nicer rigs, they may have had an objectively great experience. Of course they should test it on lower hardware too, I'm just saying I get how people were like "eh, this is kind of ok" if they were playing on rigs with the juice for it, and the execs were shouting at everyone to get it out yesterday no matter the cost or state of the game. Taima fucked around with this message at 12:47 on Apr 6, 2023 |
# ? Apr 6, 2023 12:44 |
|
gradenko_2000 posted:
Dr. Video Games 0031 posted:unfortunately, medium quality textures in tlou look absolutely horrid
|
# ? Apr 6, 2023 13:07 |
|
mobby_6kl posted:Thanks, I guess we'll see soon as long as I can get the shaders compiled before the refund window closes lol The character textures are still good, but the environment textures go to poo poo, especially in larger outdoors scenes (indoors they're still serviceable). edit: there's also zero difference in texture quality between high and ultra presets, and the only reason the PC ultra setting is sharper than the PS5 setting is because the PC is at 4K there and the PS5 is at 1440p. So it goes straight from sharp high/ultra textures to a blurry mess on medium textures with nothing in between. Dr. Video Games 0031 fucked around with this message at 13:33 on Apr 6, 2023 |
# ? Apr 6, 2023 13:30 |
|
speaking of running games on older hardware, i just tried the third-party fsr 2 for resident evil 3, and the results were ... not good not entirely different from the concerns we had about dlss 3 messing with ui elements and everything that basically shouldn't be affected. and having to mess with manual lod bias is a mess
|
# ? Apr 6, 2023 13:53 |
|
Dr. Video Games 0031 posted:The character textures are still good, but the environment textures go to poo poo, especially in larger outdoors scenes (indoors they're still serviceable). Sorry, printing text on a can requires 16GB of VRAM now
|
# ? Apr 6, 2023 15:19 |
|
That specific barrel looks like half life 2 on the DX7 renderer. Wild.
|
# ? Apr 6, 2023 15:23 |
|
If that’s medium how bad is low?
|
# ? Apr 6, 2023 15:24 |
|
|
MarcusSA posted:If that’s medium how bad is low?
|
# ? Apr 6, 2023 15:27 |