repiv
Aug 13, 2009

Paul MaudDib posted:

as an amateur photographer it was very interesting getting an LG C1 OLED and then watching the LOTR trilogy. It's sort of similar to a "zone system" for luminosity in addition to tonality, things can have greater/lesser luminosity in addition to the actual tonal color. The shift between dark caves and bright plains etc. And really the peak luminosity is only used for certain things like lightning and gandalf the white's staff and the balrog's whip etc, it was kinda obvious when they were firing up 1000 nits for an effect.

perhaps it's done less deliberately in stuff that wasn't retroactively mastered up to HDR after the fact; I haven't noticed it as much since, even on HDR content. but yeah, it was kinda used as an eye-searer in LOTR at least, and that was the film where I studied it the most deliberately.

the quality of HDR re-grades is all over the place. i saw a heatmap for one of the early star wars films in "HDR" and nearly everything was in the LDR range, with the sole exception of the lightsabers, which they'd manually masked out and cranked up to eye-searing levels
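
for the curious, those heatmaps are easy to make yourself. here's a minimal sketch in C that decodes SMPTE ST 2084 ("PQ") code values back to absolute nits and bins them; the constants are straight from the spec, but the sample values are made up and a real tool would decode actual frame data:

```c
/* minimal sketch: decode SMPTE ST 2084 (PQ) code values back to nits and
 * bin them, i.e. the kind of analysis behind those luminance heatmaps.
 * constants are from the ST 2084 spec; sample values are hypothetical.
 * build with -lm. */
#include <math.h>
#include <stdio.h>

static double pq_to_nits(double e) /* e = normalized PQ signal in [0,1] */
{
    const double m1 = 2610.0 / 16384.0;
    const double m2 = 2523.0 / 4096.0 * 128.0;
    const double c1 = 3424.0 / 4096.0;
    const double c2 = 2413.0 / 4096.0 * 32.0;
    const double c3 = 2392.0 / 4096.0 * 32.0;

    double p = pow(e, 1.0 / m2);
    return 10000.0 * pow(fmax(p - c1, 0.0) / (c2 - c3 * p), 1.0 / m1);
}

int main(void)
{
    /* hypothetical pixels: shadows, midtones, one lightsaber-ish value */
    double frame[] = { 0.10, 0.25, 0.40, 0.50, 0.58, 0.75, 0.90 };
    int bins[4] = { 0 }; /* <100, 100-400, 400-1000, >1000 nits */

    for (size_t i = 0; i < sizeof frame / sizeof frame[0]; i++) {
        double nits = pq_to_nits(frame[i]); /* ~0.3 up to ~3900 nits here */
        if      (nits < 100.0)  bins[0]++;
        else if (nits < 400.0)  bins[1]++;
        else if (nits < 1000.0) bins[2]++;
        else                    bins[3]++;
        printf("PQ %.2f -> %8.1f nits\n", frame[i], nits);
    }
    printf("LDR-ish: %d  low: %d  high: %d  eye-searing: %d\n",
           bins[0], bins[1], bins[2], bins[3]);
    return 0;
}
```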


Dr. Video Games 0031
Jul 17, 2004

The Smaug encounter in the second Hobbit movie was a really interesting showcase for HDR. They really cranked the luminosity for the lava and the dragon's breath, and they also used the wider DCI-P3 color gamut to show more saturated reds. i switched back and forth between the sdr and hdr masters while watching that (because I'm a giant nerd), and on sdr the dragon's breath was almost just an orange smear, while in hdr it was way more detailed. kinda interesting, but absolutely a gimmick for that scene. an impressive gimmick nonetheless

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

repiv posted:

the quality of HDR re-grades is all over the place. i saw a heatmap for one of the early star wars films in "HDR" and nearly everything was in the LDR range, with the sole exception of the lightsabers, which they'd manually masked out and cranked up to eye-searing levels

exactly, that's what I'm saying: it's very deliberately mastered right now, where only some specific things deploy max nits. Instead of just a full-LDR master graded around, let's say, 400 nits, you also have low-mastered 200-nit content, high-mastered 600-nit content, and "fuck it, 1000 nits" content like lightsaber blades and lightning and balrog whip tips and gandalf's staff glow, etc. And content seems to be very consciously/deliberately binned into one of those zones - this is a low-mastered 200-nit scene with 400-nit lantern light, this is a 400-nit mastered scene with 1000-nit balrog whips, etc.

I personally think it's very important to master perceptually, because the response of the human eye isn't linear at all, and staring at a ~400-nit average panel isn't natural either. HDR increases the amount of effect range you can use - it's like a "zone system" expansion of the luminosity range (with a similar amount of contrast, since contrast is a more localized perceptual phenomenon - you're comparing "backlight zones" against each other, just more fluidly). But at the same time you can't go too far with it or it looks tacky. nobody likes the "can't look at it" bloom or the "wow, that's just 1000 nits on that effect" kind of stuff.
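
(for reference, the textbook first-order models of that nonlinearity are Weber-Fechner and Stevens' power law; the ~0.33 exponent for brightness is the standard psychophysics number, not something from the HDR specs:)

```latex
% Weber-Fechner: perceived brightness grows roughly with log luminance
S = k \,\ln\!\left(\frac{I}{I_0}\right)
% Stevens' power law, the usual refinement for brightness
\psi(I) = k\, I^{\,0.33}
```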

this cinematographic trope seems less pronounced in newer content, which I assume is mastered with a bit more restraint. and I wonder whether games are mastered for the masses at 400 nits these days, or with HDR in mind? consoles do have a decent HDR TV population, after all.

Paul MaudDib fucked around with this message at 02:50 on Apr 5, 2023

shrike82
Jun 11, 2005

the new asus ally handheld looks neat

https://twitter.com/VideoCardz/status/1643177208310120448?s=20

quote:

The console is said to offer 50% higher performance at 15W than Steam Deck and twice the performance at 35W.

Arivia
Mar 17, 2011
And how much of that performance will be lost to running windows 11 instead of steamOS?

Dr. Video Games 0031
Jul 17, 2004

Arivia posted:

And how much of that performance will be lost to running windows 11 instead of steamOS?

The difference on the Steam Deck isn't all that big, honestly, and some games even run faster on Win11. This 50% performance improvement was presumably already measured with the Ally on Win11 and the Steam Deck on SteamOS.

Shumagorath
Jun 6, 2001
Everyone forgets Microsoft made Windows 8 run on fucking Snapdragon.

repiv
Aug 13, 2009

i wouldn't say it's that surprising that windows games tend to run a bit better on windows with native directx drivers than through a linux/vulkan compatibility layer

proton is impressive but it's not magic

shrike82
Jun 11, 2005

the shader caching stuff is a plus for steam deck tho
my main issue with it these days is newer games are getting out of its performance envelope even with heavy tweaking

Kazinsal
Dec 13, 2011
No guys seriously this phoronix benchmark from 2013 clearly shows that team fortress 2 runs better on linux than on windows vista and therefore

Shumagorath
Jun 6, 2001

Kazinsal posted:

No guys seriously this phoronix benchmark from 2013 clearly shows that team fortress 2 runs better on linux than on windows vista and therefore
apple_powerpc_benchmark.qt

shrike82
Jun 11, 2005

sleep/suspend would be the biggest potential issue with a windows handheld
microsoft keeps breaking it even on laptops, which have a clear suspend signal (the lid is shut)

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Kazinsal posted:

No guys seriously this phoronix benchmark from 2013 clearly shows that team fortress 2 runs better on linux than on windows vista and therefore

i'd legit love to see an apple silicon port, I bet M1/M2/M3 would actually do pretty well given the "interpreter" format of goldsrc/src/src2/respawn/etc

can u guys imagine if there was a mac update for tf2, that would be awesome



repiv
Aug 13, 2009

shrike82 posted:

the shader caching stuff is a plus for steam deck tho
my main issue with it these days is newer games are getting out of its performance envelope even with heavy tweaking

the shader caching stuff is a bit of give and take. when it successfully primes the cache it's nice to have, but when it doesn't (e.g. right after an update, before the shared cache has propagated) the stuttering can be much worse than it is on windows

apex for example was notorious for becoming basically unplayable on linux after every update, although apparently it's better now if you opt in to the DX12 beta
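
the failure mode is easy to picture as a toy program: a pipeline getter that pays the full compile cost on a cache miss, mid-frame. everything below is made up - it's the shape of the problem, not valve's actual fossilize system:

```c
/* toy model of shader-cache stutter: first request for a pipeline pays
 * the full compile cost at draw time; later requests are cheap hits.
 * names/timings are hypothetical. (POSIX, for nanosleep.) */
#include <stdio.h>
#include <string.h>
#include <time.h>

#define MAX_ENTRIES 64

static char cache[MAX_ENTRIES][32];
static int  cache_len;

static void slow_compile(const char *key)
{
    struct timespec t = { 0, 100 * 1000 * 1000 }; /* ~100ms: a visible hitch */
    nanosleep(&t, NULL);
    printf("compiled %s (hitch!)\n", key);
}

static void get_pipeline(const char *key)
{
    for (int i = 0; i < cache_len; i++)
        if (strcmp(cache[i], key) == 0) {
            printf("cache hit for %s (smooth)\n", key);
            return;
        }
    slow_compile(key); /* cold cache: the stutter happens right here */
    if (cache_len < MAX_ENTRIES)
        snprintf(cache[cache_len++], sizeof cache[0], "%s", key);
}

int main(void)
{
    get_pipeline("opaque_pbr");    /* miss: hitch */
    get_pipeline("opaque_pbr");    /* hit: smooth */
    /* a game update changes the shader hashes, invalidating every key: */
    get_pipeline("opaque_pbr_v2"); /* miss again: post-update stutter */
    return 0;
}
```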

repiv fucked around with this message at 04:04 on Apr 5, 2023

YorexTheMad
Apr 16, 2007
OBAMA IS A FALSE MESSIAH

ABANDON ALL HOPE
I've asked this in a couple of different places, but the PS5 thread in Games suggested I ask here.

My current laptop has a Laptop 3060 inside. Specifically, this laptop from Newegg. Sometimes when I have company I'll hook it up to the living room TV (which is 4K) to play games.

I'm considering getting a PS5 for the main room instead, but I don't want to buy one if it performs about the same as the laptop. I've had a hard time finding good comparisons between the laptop 3060 and a PS5. Does anyone have advice on comparing the performance of the two and whether getting the PS5 is worth it?

Dr. Video Games 0031
Jul 17, 2004

YorexTheMad posted:

I've asked this in a couple of different places, but the PS5 thread in Games suggested I ask here.

My current laptop has a Laptop 3060 inside. Specifically, this laptop from Newegg. Sometimes when I have company I'll hook it up to the living room TV (which is 4K) to play games.

I'm considering getting a PS5 for the main room instead, but I don't want to buy one if it performs about the same as the laptop. I've had a hard time finding good comparisons between the laptop 3060 and a PS5. Does anyone have advice on comparing the performance of the two and whether getting the PS5 is worth it?

Common wisdom suggests that the PS5 is roughly equivalent to an RTX 2070 Super or RX 6650 XT (in rasterization only).

https://www.youtube.com/watch?v=S1sCLpkOkhY

From this, we know the laptop 3060 is around 8% slower than the desktop 3060 at 1440p. Reviews of the desktop 3060 tell us it's another 8% slower than the 2070 Super. Multiply the two together, and the laptop 3060 comes out around 15 - 16% slower than the PS5.

However, and this is very important, this only applies to the highest-spec 3060 laptops. There are large differences in performance depending on how much power your 3060 is allowed to use: 3060s running at over 100W perform something like 30 - 40% better than 65W 3060s. Depending on your exact laptop model, this could mean a pretty large gap between the laptop 3060 and the PS5, potentially up to a 2x difference if you have one of the weaker 3060 laptops.
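
If you want to sanity-check that math, the ratios just chain multiplicatively. A quick sketch (the percentages are the ones quoted above; the 35% power-limit penalty is a rough midpoint of the 30 - 40% figure, so treat it as an assumption):

```c
/* worked version of the comparison above: relative-performance ratios
 * chain multiplicatively. percentages come from this post. */
#include <stdio.h>

int main(void)
{
    double laptop_vs_desktop = 0.92; /* laptop 3060 ~8% behind desktop 3060 */
    double desktop_vs_2070s  = 0.92; /* desktop 3060 ~8% behind 2070 Super  */

    /* the 2070 Super stands in for the PS5 (rasterization only) */
    double best_case = laptop_vs_desktop * desktop_vs_2070s;
    printf("130W laptop 3060 vs PS5: %.0f%%\n", best_case * 100); /* ~85% */

    /* a 65W 3060 runs ~30-40% behind a 100W+ one, so roughly: */
    double low_power = best_case / 1.35;
    printf("65W laptop 3060 vs PS5:  %.0f%%\n", low_power * 100); /* ~63% */
    return 0;
}
```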

YorexTheMad
Apr 16, 2007
OBAMA IS A FALSE MESSIAH

ABANDON ALL HOPE

Dr. Video Games 0031 posted:

However, and this is very important, this only applies to the highest-spec 3060 laptops. There are large differences in performance depending on how much power your 3060 is allowed to use: 3060s running at over 100W perform something like 30 - 40% better than 65W 3060s. Depending on your exact laptop model, this could mean a pretty large gap between the laptop 3060 and the PS5, potentially up to a 2x difference if you have one of the weaker 3060 laptops.

How can I find out how much power the 3060 is allowed? I'm under no illusions that I got a top-of-the-line laptop, but this is helping me understand the potential comparison.

shrike82
Jun 11, 2005

Kinda hard to compare them, since there's often a gap between console and PC gaming beyond what the specs would suggest (e.g., the TLOU PC port being shit)
A PS5 would also be less fiddly for a TV setup imo, especially relative to a laptop

Dr. Video Games 0031
Jul 17, 2004

YorexTheMad posted:

How can I find out how much power the 3060 is allowed? I'm under no illusions that I got a top-of-the-line laptop, but this is helping me understand the potential comparison.

It may say in your laptop's software suite/control center somewhere, but if not, you can share your laptop's exact model and I'll try to find out. Laptop manufacturers don't always make this clear, which is pretty shitty considering how important it is.

YorexTheMad
Apr 16, 2007
OBAMA IS A FALSE MESSIAH

ABANDON ALL HOPE

Dr. Video Games 0031 posted:

It may say in your laptop's software suite/control center somewhere, but if not, you can share your laptop's exact model and I'll try to find out. Laptop manufacturers don't always make this clear, which is pretty shitty considering how important it is.

Gigabyte A5 K1-AUS1130SB (from here)

I've tried to dig around system settings, but I can't find any useful control panel or anything from System Information that gives the info. Sorry, I'm not sure where else to look.

Edit: The Newegg page says 'maximum graphics power' is 130W, if that's accurate.

YorexTheMad fucked around with this message at 02:55 on Apr 6, 2023

Dr. Video Games 0031
Jul 17, 2004

Then yeah, that is actually pretty much a top-spec 3060. There's still some laptop-to-laptop variance with stuff like CPU and memory configurations and how they balance power, but it's pretty much impossible to judge that at a glance. I'd say, when comparing a good PC port to the PS5 version of the same game, the PS5 may be able to hit 60fps at 1440p while you may have to drop a few settings or go down to 1080p. 4K 30fps may be possible in some games, but probably not in all the games the PS5 can do that in. The saving grace is that the 3060 supports DLSS, which is much better upscaling technology than anything available on the PS5, so you can possibly make up for any deficits that way.

Honestly, I don't think you'd see a very big improvement in image quality or performance by replacing the laptop with a PS5, but it may still be worth it if you value convenience and reliability. PC ports have been kinda shitty lately, with lots of performance issues and stuttering from shader compilation. The PS5, on the other hand, just works. It will also be getting some timed exclusives, so if you want to play those games early, that's the way to go.

shrike82
Jun 11, 2005

got around to watching the cyberpunk overdrive mode video, and a) it's odd that they didn't compare against the original RT mode, and b) even against RTX OFF, some of the overdrive shots didn't look great

YorexTheMad
Apr 16, 2007
OBAMA IS A FALSE MESSIAH

ABANDON ALL HOPE

Thanks so much for your time and helping me understand my situation better. Not sure yet what I'm going to do but this helps a ton.

repiv
Aug 13, 2009

shrike82 posted:

got around to watching the cyberpunk overdrive mode video, and a) it's odd that they didn't compare against the original RT mode, and b) even against RTX OFF, some of the overdrive shots didn't look great

even in their cherry-picked shots i noticed the telltale "sparkling" artifacts of RTXDI, which are also present in portal rtx

turning up the quality knobs would probably fix it but lmao it's already demanding enough

Dr. Video Games 0031
Jul 17, 2004

I think all of the shots show a big improvement overall, but I wish they compared it against the Psycho RT preset instead of RT off. My biggest complaint is the bloom/overexposure, which I'm hoping was just because some idiot at nvidia thought that cranking the bloom would dazzle people better for their promotional video, and it will be adjustable in the game.

Some stuff like noisy shadows and slowly dissipating light/shadows is probably going to be present too; I'm expecting it to be pretty similar to Portal RTX overall in that regard. There's definitely a lot of room for improvement in these current implementations, which is why they call it a "technology preview," but I'm excited to try it anyway.

FuturePastNow
May 19, 2014


I have a laptop with a 65W 3060 and I'm still happy with it. I figure it's better than a 3050 at least.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Dr. Video Games 0031 posted:

It's a shitty port, but it's also already the third game this year to run into issues with 8GB cards, after HogLeg and RE4make, and there were a few last year too (miles morales, for one). At some point you have to stop calling these outliers, because there are bound to be more games like this this year.

lol, can you imagine if HUB were as breathless about the future implications of the Ryzen 3600 being critically cpu-bottlenecked while just standing in an empty hall as they are about VRAM? and that's not even at 1080p with a 4090: Alex was cpu-bottlenecked at 1440p with a 2070S... not even doing anything, just standing in a hall, with every single core and thread loaded up to 97-100%. Can only imagine the frametimes.

just like 8GB cards, the R5 3600 had definite alternatives with much better capability at the time (9900K, etc), and that's a tier of performance AMD continues to offer as a current-gen product. the R5 5500 is obsolete, right? nobody should be buying that kind of thing in 2023, it's junk, with the same or worse performance than a 3600. And AMD is trying to position that as a midrange offering, with R5 branding, launched within the last year. Wow, greedy, right?

or maybe there's a lifecycle thing here, and 8GB was acceptable in 2018 and continues to be acceptable in certain product segments, just like the R5 5500 is acceptable in certain product segments today? almost as if there is... no bad hardware, only bad prices.

and apparently bad ports

Paul MaudDib fucked around with this message at 07:35 on Apr 6, 2023

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
I feel we need to accept that we're back in the wild west era of bad PC ports once again. It was good for a time during the last few years, but not anymore. I think the botched rollout of DX12 had a fair bit to do with this. The old chestnut of 'performance is now more down to developers and less on the drivers' can't keep flying forever. At some point developers need to take responsibility for their performance issues, and they just aren't.

mobby_6kl
Aug 9, 2009

by Fluffdaddy
So... is TLOU playable on an 8GB 1070? I was going to try to emulate the original one but gave up because of some emulator bugs that needed workarounds and I just couldn't be bothered.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

mobby_6kl posted:

So... is TLOU playable on an 8GB 1070? I was going to try to emulate the original one but gave up because of some emulator bugs that needed workarounds and I just couldn't be bothered.



drop it down to medium and you can probably expect something between the RTX 3050 and the Intel Arc A750

Dr. Video Games 0031
Jul 17, 2004

unfortunately, medium quality textures in tlou look absolutely horrid

repiv
Aug 13, 2009

Zedsdeadbaby posted:

I feel we need to accept that we're back in the wild west era of bad PC ports once again. It was good for a time during the last few years, but not anymore. I think the botched rollout of DX12 had a fair bit to do with this. The old chestnut of 'performance is now more down to developers and less on the drivers' can't keep flying forever. At some point developers need to take responsibility for their performance issues, and they just aren't.

vulkan has effectively admitted defeat here: they just announced a new extension that more or less brings the looser DX11 shader compilation model into vulkan, as an option alongside the rigid "pre-compile your PSOs or else" model that vulkan and DX12 were originally designed around.

https://www.khronos.org/blog/you-can-use-vulkan-without-pipelines-today

microsoft hasn't made a similar move with DX12 yet, but I wouldn't be surprised if they did. PSOs are theoretically optimal for performance, but if some engines are never going to adapt to use them properly (looking at you, unreal engine), they're doing more harm than good in some games.
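
for reference, the extension in that blog post is VK_EXT_shader_object. a rough sketch of what the new path looks like, not a complete vulkan program (assumes an initialized VkDevice and a SPIR-V blob, and skips descriptor layouts, push constants, and error handling):

```c
/* sketch of the VK_EXT_shader_object path: compile a free-standing
 * shader object instead of baking a monolithic VkGraphicsPipeline. */
#include <vulkan/vulkan.h>

VkShaderEXT make_fragment_shader(VkDevice dev,
                                 const void *spirv, size_t spirv_size)
{
    /* extension entry points have to be fetched at runtime */
    PFN_vkCreateShadersEXT createShaders =
        (PFN_vkCreateShadersEXT)vkGetDeviceProcAddr(dev, "vkCreateShadersEXT");

    VkShaderCreateInfoEXT info = {
        .sType    = VK_STRUCTURE_TYPE_SHADER_CREATE_INFO_EXT,
        .stage    = VK_SHADER_STAGE_FRAGMENT_BIT,
        .codeType = VK_SHADER_CODE_TYPE_SPIRV_EXT,
        .codeSize = spirv_size,
        .pCode    = spirv,
        .pName    = "main",
    };

    VkShaderEXT shader = VK_NULL_HANDLE;
    createShaders(dev, 1, &info, NULL, &shader);
    return shader; /* no VkGraphicsPipeline, no up-front PSO permutations */
}
```

at draw time you bind it with vkCmdBindShadersEXT and set everything else through dynamic state, so the driver is free to do its final ISA compile lazily, much like the DX11 model.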

repiv fucked around with this message at 12:44 on Apr 6, 2023

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
The textures might have problems now, but whenever they're fixed for people with less VRAM, I think they'll find it was worth the wait. The textures in TLOU pc are pretty glorious.

It's pretty clear the devs were told in no uncertain terms that the game needed to come out right after the show ended, and the c-suite didn't give a shit how it performed. The other issue is that the game is rock solid and plays pretty great if you do have the beef to run it, so if the devs/testers had nicer rigs, they may have had an objectively great experience.

Of course they should test it on lower hardware too, I'm just saying I get how people were like "eh, this is kind of ok" if they were playing on rigs with the juice for it, and the execs were shouting at everyone to get it out yesterday no matter the cost or state of the game.

Taima fucked around with this message at 12:47 on Apr 6, 2023

mobby_6kl
Aug 9, 2009

by Fluffdaddy

gradenko_2000 posted:



drop it down to medium and you can probably expect something between the RTX 3050 and the Intel Arc A750

Dr. Video Games 0031 posted:

unfortunately, medium quality textures in tlou look absolutely horrid
Thanks, I guess we'll see soon as long as I can get the shaders compiled before the refund window closes lol

Dr. Video Games 0031
Jul 17, 2004

mobby_6kl posted:

Thanks, I guess we'll see soon as long as I can get the shaders compiled before the refund window closes lol

The character textures are still good, but the environment textures go to shit, especially in larger outdoor scenes (indoors they're still serviceable).



edit: there's also zero difference in texture quality between the high and ultra presets, and the only reason the PC ultra setting looks sharper than the PS5 setting is that the PC shot is at 4K while the PS5 is at 1440p. So it goes straight from sharp high/ultra textures to a blurry mess on medium, with nothing in between.

Dr. Video Games 0031 fucked around with this message at 13:33 on Apr 6, 2023

kliras
Mar 27, 2021
speaking of running games on older hardware, i just tried the third-party fsr 2 for resident evil 3, and the results were ... not good

not entirely different from the concerns we had about dlss 3 messing with ui elements and other things that basically shouldn't be affected. and having to set a manual lod bias is a mess
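
for anyone wiring it up themselves, the usual rule of thumb for the bias is log2(render width / display width), with an extra -1.0 on top that i recall fsr 2's integration docs suggesting - treat that part as an assumption. tiny sketch:

```c
/* the usual mip bias rule of thumb when rendering below display res and
 * upscaling: bias sampling toward sharper mips so texture detail matches
 * the output. the -1.0 term is an assumption from memory. build with -lm. */
#include <math.h>
#include <stdio.h>

static double mip_bias(double render_w, double display_w, double extra)
{
    return log2(render_w / display_w) + extra; /* negative = sharper */
}

int main(void)
{
    /* fsr 2 "quality" at 1440p: 1.5x scale, so a 1707-wide internal res */
    printf("base bias: %.3f\n", mip_bias(1707.0, 2560.0,  0.0)); /* -0.585 */
    printf("fsr2 bias: %.3f\n", mip_bias(1707.0, 2560.0, -1.0)); /* -1.585 */
    return 0;
}
```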

Inept
Jul 8, 2003

Dr. Video Games 0031 posted:

The character textures are still good, but the environment textures go to shit, especially in larger outdoor scenes (indoors they're still serviceable).



Sorry, printing text on a can requires 16GB of VRAM now

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.
That specific barrel looks like half life 2 on the DX7 renderer. Wild.

MarcusSA
Sep 23, 2007

If that’s medium how bad is low?


Kibner
Oct 21, 2008

Acguy Supremacy

MarcusSA posted:

If that’s medium how bad is low?
