Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION




:lol:

BlankSystemDaemon posted:

RT is great for screenshots, but I sure as poo poo don't notice it when I'm playing a game where it's enabled - except that the performance usually gets hit pretty bad, and that to compensate usually involves some kind of upsampling where the game is rendered at a lower resolution.

Seems kinda like that defeats the whole purpose of making things look good.

Yeah, when I've enabled it in the games I care about, it either hasn't been that impressive (which I'll chalk up to the developers half-assing its implementation), or it has looked good, but not worth the performance impacts.

It seems like we've entered a period where you need DLSS/FSR/XeSS just to enable features like RT and not crash performance, but it's all to chase marginal improvements...

Then they'll start pushing 8K and the cycle will start all over.


Indiana_Krom
Jun 18, 2007
Net Slacker
RT global illumination and path tracing are just nice because games with them can do dynamic lighting with day/night cycles where the lighting never breaks. In pure raster games, anything not pre-baked and fixed in the dynamic cycle will eventually fail and look bad/wrong, and even plenty of the stuff that is pre-baked/fixed will still crumble completely the moment the player interacts with it or looks at it from a different angle.

BlankSystemDaemon
Mar 13, 2009




SourKraut posted:

:lol:

Yeah, when I've enabled it in the games I care about, it either hasn't been that impressive (which I'll chalk up to the developers half-assing its implementation), or it has looked good, but not worth the performance impacts.

It seems like we've entered a period where you need DLSS/FSR/XeSS just to enable features like RT and not crash performance, but it's all to chase marginal improvements...

Then they'll start pushing 8K and the cycle will start all over.
gently caress, I can't even find a legitimate reason to go 4k - I can't imagine 8k would make sense unless you're doing monitors the size of walls.

Indiana_Krom posted:

RT global illumination and path tracing are just nice because games with them can do dynamic lighting with day/night cycles where the lighting never breaks. In pure raster games, anything not pre-baked and fixed in the dynamic cycle will eventually fail and look bad/wrong, and even plenty of the stuff that is pre-baked/fixed will still crumble completely the moment the player interacts with it or looks at it from a different angle.
The thing is, all of this pre-supposes some kind of fixation on making things look as realistic as possible.

The only thing this results in is getting closer to the bottom of the uncanny valley.

Dr. Video Games 0031
Jul 17, 2004

kliras posted:

we're probably about to go back to the old days of a lot of areas with fans spinning to show dynamic rendering of raytraced light

Funnily enough, fast-moving shadows and lighting changes are things that current ray tracing methods are bad at, because they accumulate and cache rays over a period of multiple frames to help reduce noise. So light passing through fans can sometimes look pretty bad.
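To put a number on that lag: here's a toy sketch (plain Python, made-up blend factor) of the exponential-moving-average accumulation real-time denoisers build on. A light that flips off in one frame takes dozens of frames to fade out of the history buffer:

```python
# Toy model of the temporal accumulation used by real-time RT denoisers:
# each frame blends the new noisy sample into a history buffer (EMA).
# The blend factor is illustrative; real denoisers adapt it per pixel.
def accumulate(history, sample, alpha=0.1):
    """Low alpha = smoother image, but more lag behind lighting changes."""
    return (1 - alpha) * history + alpha * sample

# A shadow that flips from lit (1.0) to shadowed (0.0) in a single frame:
history = 1.0
frames_to_settle = 0
while history > 0.05:  # history still visibly "ghosting" the old lighting
    history = accumulate(history, 0.0)
    frames_to_settle += 1

print(frames_to_settle)  # 29 frames -- about half a second at 60 fps
```

Which is exactly why a fan blade sweeping light and shadow across a surface several times a second never lets the history converge.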

Dr. Video Games 0031
Jul 17, 2004

BlankSystemDaemon posted:

The thing is, all of this pre-supposes some kind of fixation on making things look as realistic as possible.

The only thing this results in is getting closer to the bottom of the uncanny valley.

It really doesn't. Ray tracing/real-time global illumination can be used to enhance the look of stylized games too. Just look at the Jusant demo on Steam, which uses Unreal Engine 5 and its Lumen lighting system. It's not just about photorealism.

repiv
Aug 13, 2009

Dr. Video Games 0031 posted:

It really doesn't. Ray tracing/real-time global illumination can be used to enhance the look of stylized games too. Just look at the Jusant demo on Steam, which uses Unreal Engine 5 and its Lumen lighting system. It's not just about photorealism.

or more broadly, look to modern 3D animated movies/TV/anime which are all using pathtracing at this point regardless of their art style. even spiderverse is pathtraced.

how long it's going to take to get there is up for debate, but the endgame is for games to eventually converge on pathtracing like offline rendering already has

BlankSystemDaemon
Mar 13, 2009




Dr. Video Games 0031 posted:

It really doesn't. Ray tracing/real-time global illumination can be used to enhance the look of stylized games too. Just look at the Jusant demo on Steam, which uses Unreal Engine 5 and its Lumen lighting system. It's not just about photorealism.
I don't see the raytracing in the screenshots, except possibly in this:

Having said that, my point still stands that I absolutely won't notice it when playing the game - except that performance will be tanked, unless upsampling is enabled, which will make other parts look bad in ways that're way more obvious to spot in screenshots, but also won't be noticed while playing.

It's one thing E:D got right; if you're in solo-mode, you can take extreme high-resolution screenshots that take several seconds to make, but it makes things look gorgeous - and that's in an engine that's not exactly new.

Dr. Video Games 0031
Jul 17, 2004

The Lumen lighting system is a ray tracing/rasterization hybrid system. It accomplishes similar things to fully ray-traced global illumination, but with better performance at the cost of less accuracy. My point was that high-quality global illumination and shadows (Jusant uses UE5's new shadow system too) aren't just for photorealism.

repiv
Aug 13, 2009

it doesn't appear that jusant even allows you to disable lumen, it's built around it. same as the upcoming ubisoft games that are going to have RTGI always on.

this is just realtime graphics chat now though, we should probably take it to the GPU thread

hobbesmaster
Jan 28, 2008

Hey, AMD RDNA2 APUs are a very important target for all this stuff!

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

BlankSystemDaemon posted:

gently caress, I can't even find a legitimate reason to go 4k - I can't imagine 8k would make sense unless you're doing monitors the size of walls.
Higher quality fonts via HiDPI rendering. It's like high refresh rate. Once you have it, you don't want to miss it.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Shipon
Nov 7, 2005

Combat Pretzel posted:

Higher quality fonts via HiDPI rendering. It's like high refresh rate. Once you have it, you don't want to miss it.
This is how I feel about 4K already. The biggest advantage of high resolutions is crisper text and UI elements. The actual rendering of the scene can be upscaled from 1440p and you'll barely notice but the UI looks immediately sharper.

8K is going to be utterly pointless, given that 4K is already near the limits of visual acuity at typical viewing distances anyway.
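For anyone who wants to sanity-check the acuity claim, here's the back-of-envelope math. The 65-inch screen, 2.5 m viewing distance, and the ~60 pixels-per-degree figure for 20/20 vision are all assumptions, not gospel:

```python
import math

# Back-of-envelope pixels-per-degree (PPD) check against the ~60 PPD
# often cited for 20/20 acuity. Screen size and viewing distance are
# assumptions: a 65-inch 16:9 TV watched from 2.5 m.
def pixels_per_degree(h_pixels, screen_width_m, distance_m):
    fov_deg = 2 * math.degrees(math.atan(screen_width_m / (2 * distance_m)))
    return h_pixels / fov_deg

width_65in_16x9 = 65 * 0.0254 * 16 / math.sqrt(16**2 + 9**2)  # ~1.44 m wide
for name, h in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
    print(f"{name}: {pixels_per_degree(h, width_65in_16x9, 2.5):.0f} PPD")
```

At that distance 1080p lands almost exactly on 60 PPD, 4K doubles it, and 8K doubles it again - the "towards the limits" argument in numbers.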

shrike82
Jun 11, 2005

a popular tip for gaming on PC handhelds these days is to disable cpu boost - a regedit for the ROG Ally with the new AMD APU lets you turn it off and keep freqs <=3.3ghz
is it generally a good thing to do? it seems to make sense - the iGPU on handhelds needs all the power it can get in preference to the CPU, but not sure if i'm missing anything

Stanley Pain
Jun 16, 2001

by Fluffdaddy
People in this thread not seeing a difference with 4k or RT on. :eyepop:

I'm visually impaired and can see the difference lmfao..

Suspect A
Jan 1, 2015

Nap Ghost
Hard to see the difference on my 60hz 1080p monitor with no RT

Klyith
Aug 3, 2007

GBS Pledge Week

Stanley Pain posted:

People in this thread not seeing a difference with 4k or RT on. :eyepop:

I'm visually impaired and can see the difference lmfao..

I can see that I'm getting 25 frames per second

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



repiv posted:

or more broadly, look to modern 3D animated movies/TV/anime which are all using pathtracing at this point regardless of their art style. even spiderverse is pathtraced.

how long it's going to take to get there is up for debate, but the endgame is for games to eventually converge on pathtracing like offline rendering already has

I’m not even sure how common this long-term convergence will be, because I think there’s a difference between how you observe media that you have no direct input into (such as movies) versus media that you can have input into, such as games.

Which isn’t to say that it shouldn’t be done, just that I think there will always be a difference in biological response to what the end user is observing based upon the method in which they engage with it.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Shipon posted:

8K is going to be utterly pointless, given that 4K is already near the limits of visual acuity at typical viewing distances anyway.

I don’t think this is true. People who have used Apple’s Pro Display XDR 6K and Dell’s 6K and 8K offerings have all praised the image quality; they’re just currently in a different use case category right now and not meant for consumers.

But companies need to keep pushing numbers bigger for profit, and eventually 4K will seem like stagnation…

Stanley Pain posted:

People in this thread not seeing a difference with 4k or RT on. :eyepop:

I'm visually impaired and can see the difference lmfao..

I can see the difference when 4K or 5K are used for HiDPI settings, for font, etc., yeah. I can’t when gaming.

Also, being visually impaired probably helps even more for noticing some of these things, but that’s a whole separate discussion…

Canned Sunshine fucked around with this message at 06:26 on Jul 4, 2023

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
The same old chestnuts pop up every time we go up in resolution

People were saying 1080p wasn't necessary because 720p was all we'd ever need, it was the pinnacle of realism, etc etc.

We will be regurgitating these posts when 16k rolls around to replace our 8k displays

Dr. Video Games 0031
Jul 17, 2004

MicroLED displays already have internal refresh rates of 12,000 Hz or more. And there are quarter-inch 5,000-DPI MicroLED displays in labs currently, so I figure it's just a matter of time before we end up reaching the goal of a "perfect" display where both the resolution and refresh rate are perceptually infinite. The only limitations on resolution and refresh rate will be the source hardware and available signal bandwidth. The point wouldn't be to actually output at those absurd resolutions but to be able to scale from any arbitrary resolution to the display's resolution with no perceptible scaling artifacts. It's probably 10 - 20 years before this ends up on our desks though.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

Dr. Video Games 0031 posted:

The point wouldn't be to actually output at those absurd resolutions but to be able to scale from any arbitrary resolution to the display's resolution with no perceptible scaling artifacts. It's probably 10 - 20 years before this ends up on our desks though.

Isn’t this less of a concern with how ubiquitous TAAU is becoming?

ijyt
Apr 10, 2012

Dr. Video Games 0031 posted:

MicroLED displays already have internal refresh rates of 12,000 Hz or more. And there are quarter-inch 5,000-DPI MicroLED displays in labs currently, so I figure it's just a matter of time before we end up reaching the goal of a "perfect" display where both the resolution and refresh rate are perceptually infinite. The only limitations on resolution and refresh rate will be the source hardware and available signal bandwidth. The point wouldn't be to actually output at those absurd resolutions but to be able to scale from any arbitrary resolution to the display's resolution with no perceptible scaling artifacts. It's probably 10 - 20 years before this ends up on our desks though.

this is lewd

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Dr. Video Games 0031 posted:

MicroLED displays already have internal refresh rates of 12,000 Hz or more. And there are quarter-inch 5,000-DPI MicroLED displays in labs currently, so I figure it's just a matter of time before we end up reaching the goal of a "perfect" display where both the resolution and refresh rate are perceptually infinite. The only limitations on resolution and refresh rate will be the source hardware and available signal bandwidth. The point wouldn't be to actually output at those absurd resolutions but to be able to scale from any arbitrary resolution to the display's resolution with no perceptible scaling artifacts. It's probably 10 - 20 years before this ends up on our desks though.

imo at that point you basically forget the idea of line scanout and just go for a minimum-error "compression" codec, and bear in mind that at 48gbit/s or whatever you can get a very low error minimum. but the idea to me there would be you draw macroblocks instead of lines, or you choose the lines that yield the minimum-error encoding. you update the perceptually-optimal parts of the image.

and again, I think that fits very interestingly with the idea of variable-rate spatial+temporal sampling with DLSS3+. You can perceptually sample that edge where the enemy is emerging at like 10,000fps and update those pixels on the absolute hotpath etc. Like I haven't looked into the tech at all but just as described, being able to draw specific bits at super high refresh, or specific lines at super high refresh, is potentially super cool. If you can do that, why draw a beam-chasing pixel stream instead of a compressed video stream format with block/macroblock updates?

one of my favorite videos, this does it with just line encoding on an IBM PC.

https://www.youtube.com/watch?v=MWdG413nNkI

https://trixter.oldskool.org/2014/06/19/8088-domination-post-mortem-part-1/

(but it's 48gbps, and writing macroblocks instead of lines)
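Rough arithmetic on why uncompressed scanout runs out of road, assuming 10-bit RGB and ignoring blanking/encoding overhead, with the 48 Gbit/s figure above taken as the ballpark link rate:

```python
# Raw scanout bandwidth: pixels * bits-per-pixel * refresh, assuming
# 10-bit RGB (30 bits/pixel), no blanking or encoding overhead.
def raw_gbps(width, height, bits_per_pixel, hz):
    return width * height * bits_per_pixel * hz / 1e9

link = 48.0  # Gbit/s, the ballpark link rate quoted above
for name, w, h, hz in [("4K @ 120", 3840, 2160, 120),
                       ("8K @ 60", 7680, 4320, 60),
                       ("8K @ 120", 7680, 4320, 120)]:
    need = raw_gbps(w, h, 30, hz)
    print(f"{name}: {need:6.1f} Gbit/s raw ({need / link:.1f}x the link)")
```

8K120 needs roughly 2.5x the raw link, which is why stream compression or something smarter than dumb line scanout gets involved at all.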

Paul MaudDib fucked around with this message at 08:04 on Jul 4, 2023

Dr. Video Games 0031
Jul 17, 2004

Yeah, my imagination is still stuck in the current paradigm, but there's room to do some very new things once MicroLED actually lands.

Anyway, enough display talk. A "Ryzen 5 7500F" 6-core AM5 CPU has just appeared out of nowhere, with a supposed release in Asia on July 7th (not sure if that includes the rest of the world). Nobody seems sure what it actually is yet: https://www.techpowerup.com/310804/amd-ryzen-5-7500f-desktop-processor-surfaces-could-this-be-phoenix-2-on-am5

There was a 5500, which was essentially a 5600G without the iGPU (6 cores, lower clocks than 5600X, half as much L3 cache, PCIe 3 only). The 7500F could be the same kind of idea, but the F part of the model name is new. TechPowerUp thinks it might be an APU, but that's just speculation.

Dr. Video Games 0031 fucked around with this message at 09:02 on Jul 4, 2023

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
Videocardz is reporting it as a no-iGPU model, hence the F, like Intel uses the F designation.

Dr. Video Games 0031
Jul 17, 2004

They also just seem to be assuming that based on the presence of the "F", but it's not guaranteed that it means the same thing with AMD CPUs and Intel CPUs. Though I think it probably does. If it were an APU, it would be more expensive.

Dr. Video Games 0031 fucked around with this message at 09:46 on Jul 4, 2023

dialhforhero
Apr 3, 2008
Am I 🧑‍🏫 out of touch🤔? No🧐, it's the children👶 who are wrong🤷🏼‍♂️

Zedsdeadbaby posted:

The same old chestnuts pop up every time we go up in resolution

People were saying 1080p wasn't necessary because 720p was all we'd ever need, it was the pinnacle of realism, etc etc.

We will be regurgitating these posts when 16k rolls around to replace our 8k displays

You are right, but there are diminishing returns. Kind of like how games no longer seem to be improving by major leaps, versus the difference from Super Mario World to Mario 64 to Mario Sunshine.

4k is pretty detailed and noticeable enough vs 1080p but it isn’t particularly revolutionary, imo. I would think beyond 8k it will be the display and rendering technology itself that has to change to make things look more “real” (LED, LCD, projectors, ray tracing, et al.) and not necessarily how many pixels are crammed in.

dialhforhero fucked around with this message at 17:56 on Jul 4, 2023

Stanley Pain
Jun 16, 2001

by Fluffdaddy
I think some people just don't see/care about image fidelity almost in the same way as some people are fine with cheap headphones vs. better ones.

ijyt
Apr 10, 2012

Looks like I may have lucked out and gotten a 7800X3D that happily takes -30 PBO, is there anything else worth tweaking? In R23 all-core stability test it boosts to around 4.7GHz, sits at 89C and draws about 82 watts. Not sure how comparable that is to stock.

Shipon
Nov 7, 2005

SourKraut posted:

I don’t think this is true. People who have used Apple’s Pro Display XDR 6K and Dell’s 6K and 8K offerings have all praised the image quality; they’re just currently in a different use case category right now and not meant for consumers.

But companies need to keep pushing numbers bigger for profit, and eventually 4K will seem like stagnation…

I can see the difference when 4K or 5K are used for HiDPI settings, for font, etc., yeah. I can’t when gaming.

Also, being visually impaired probably helps even more for noticing some of these things, but that’s a whole separate discussion…
The difference with those monitors is the use case is for UI elements and high information density, where there is a clear advantage to higher resolutions. I don't agree that the same is true for 3D game worlds for the most part. I think there's far more value in higher refresh rates than higher resolutions at this point - we may actually be able to achieve "real" motion blur if we can do 1000+ FPS.
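The 1000+ FPS figure isn't arbitrary. On a sample-and-hold display, an object your eye tracks smears across roughly (speed in px/s) ÷ (refresh in Hz) pixels per frame. Quick sketch with an illustrative pan speed of one 4K screen-width per second:

```python
# Sample-and-hold persistence blur: an eye-tracked object smears across
# roughly (speed in px/s) / (refresh rate in Hz) pixels each frame.
# The pan speed is illustrative: one 4K screen-width per second.
speed_px_s = 3840

for hz in (60, 120, 240, 1000):
    blur_px = speed_px_s / hz
    print(f"{hz:>4} Hz: ~{blur_px:.0f} px of smear")
```

Only around 1000 Hz does the smear get down into the few-pixel range where it stops reading as display blur - at which point the only blur left is what your own eye produces, hence "real" motion blur.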

hobbesmaster
Jan 28, 2008

ijyt posted:

Looks like I may have lucked out and gotten a 7800X3D that happily takes -30 PBO, is there anything else worth tweaking? In R23 all-core stability test it boosts to around 4.7GHz, sits at 89C and draws about 82 watts. Not sure how comparable that is to stock.

Sounds like you’d need more powerful cooling to improve things. The only other thing that could help would be VSoC and VDIMM to make the memory controller run cooler but I’m not familiar at all with zen4 tweaking.

Dr. Video Games 0031
Jul 17, 2004

ijyt posted:

Looks like I may have lucked out and gotten a 7800X3D that happily takes -30 PBO, is there anything else worth tweaking? In R23 all-core stability test it boosts to around 4.7GHz, sits at 89C and draws about 82 watts. Not sure how comparable that is to stock.

I get around 4.85 GHz at 86C with 92W power draw in R23. -20 all-core curve optimizer, and it's cooled by a Dark Rock Pro 4 that may or may not have a non-functional middle fan. 89C is the max operating temperature for the Zen 4 X3D chips, so it seems like some small amount of performance is being left on the table with your current setup, though I doubt it's anything you're likely to notice in regular usage.

Dr. Video Games 0031 fucked around with this message at 03:10 on Jul 5, 2023

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Shipon posted:

The difference with those monitors is the use case is for UI elements and high information density, where there is a clear advantage to higher resolutions. I don't agree that the same is true for 3D game worlds for the most part. I think there's far more value in higher refresh rates than higher resolutions at this point - we may actually be able to achieve "real" motion blur if we can do 1000+ FPS.

I honestly don't think there's value in trying to constantly push the boundary of higher resolution in pursuit of "real motion blur" in games, because you're getting into the biology of the human eye and how foveal observation is processed by the brain relative to the surrounding aperture. That's why HiDPI looks so good, and why there are still so many possibilities that can be done with HiDPI implementation, especially with gaming, when you start getting into per-pixel opportunities.

When it comes to improving motion blur, you'd be better served by upsizing the monitor enough to cover your entire area of vision, and then having the game engine target game detail within an estimated foveal area while reducing the detail on the outer aperture area, i.e. variable frame rate across the display itself. But I'm not even sure if that's currently possible in terms of display tech? But it'd be pretty cool! Especially since it would open up all kinds of opportunities for how displays present information, including the return of stereoscopic 3D (and glassless this time)!

Otherwise though, if you're looking for the entire display to do it, it's always going to appear "off" to you even if you could hit 5,000 or 10,000 FPS, because of how the brain is processing what it's observing, and I doubt you'd end up happy with it...

This is where VR/augmented gaming could really shine.
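A crude illustration of why the foveal idea pays off so heavily: treating the fovea as a ~2.5° radius disc on a display covering 100°×60° of vision (both numbers assumed, flat-angle approximation, no eccentricity falloff modeled):

```python
import math

# Crude estimate of the foveated-rendering win: the high-acuity fovea
# covers only a couple of degrees of the visual field. Both angles
# below are assumptions for illustration.
screen_h_deg, screen_v_deg = 100.0, 60.0  # display filling most of the view
fovea_radius_deg = 2.5                    # rough radius of high-acuity vision

fraction = math.pi * fovea_radius_deg ** 2 / (screen_h_deg * screen_v_deg)
print(f"full-detail region: {fraction:.2%} of the screen")
```

Under 1% of the screen needs full detail at any instant; everything else could drop shading rate (and, if displays allowed it, refresh rate), which is the same saving eye-tracked VR headsets already chase.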

ijyt
Apr 10, 2012

hobbesmaster posted:

Sounds like you’d need more powerful cooling to improve things. The only other thing that could help would be VSoC and VDIMM to make the memory controller run cooler but I’m not familiar at all with zen4 tweaking.

It's an SFF build with an AXP120-X67 so this was going to be expected in terms of temps tbh.

e: Ran OCCT stability test, found errors on core 2 on the first test and then no issues on the second test, strange. I'll leave things as is for now and then keep optimising once I have the actual case this system is going to live in.

ijyt fucked around with this message at 09:52 on Jul 5, 2023

Klyith
Aug 3, 2007

GBS Pledge Week

ijyt posted:

e: Ran OCCT stability test, found errors on core 2 on the first test and then no issues on the second test, strange. I'll leave things as is for now and then keep optimising once I have the actual case this system is going to live in.

For testing undervolt stability you want core cycler, not traditional overclock stability tests. The problem with undervolts is the point of maximum stress isn't the max load for one core or all-core, it's the transitions between idle and load. So core cycler runs a single thread of prime95 and shuffles it from core to core to produce a lot of that.

Downside, core cycler is pretty slow. You don't have to run it for the dozens of hours like the author suggests, but you at least want an overnight test.

(Cinebench can also work well as a quick & dirty test.)
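For the curious, the core-cycling idea looks roughly like this. CoreCycler itself is a PowerShell script driving Prime95 on Windows; this is just an illustrative Python sketch using Linux's sched_setaffinity, not the real tool:

```python
import os
import time

# Illustrative sketch of the core-cycling idea: pin a single stress
# thread to each core in turn, with idle gaps between, to provoke the
# load/idle transitions that trip marginal undervolts.
def stress(seconds):
    """Tight integer loop to load whichever core we're pinned to."""
    end = time.monotonic() + seconds
    x = 0
    while time.monotonic() < end:
        x = (x * 6364136223846793005 + 1442695040888963407) % 2**64
    return x

def cycle_cores(seconds_per_core=60, idle_gap=5):
    for core in sorted(os.sched_getaffinity(0)):
        os.sched_setaffinity(0, {core})  # pin to a single core
        stress(seconds_per_core)         # load phase
        time.sleep(idle_gap)             # idle phase: the risky transition

# cycle_cores(60, 5)  # a minute of load per core, five seconds idle between
```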

ijyt
Apr 10, 2012

Klyith posted:

For testing undervolt stability you want core cycler, not traditional overclock stability tests. The problem with undervolts is the point of maximum stress isn't the max load for one core or all-core, it's the transitions between idle and load. So core cycler runs a single thread of prime95 and shuffles it from core to core to produce a lot of that.

Downside, core cycler is pretty slow. You don't have to run it for the dozens of hours like the author suggests, but you at least want an overnight test.

(Cinebench can also work well as a quick & dirty test.)

Oh yeah that ran overnight last night without any hiccups, forgot to mention!

Rakeris
Jul 20, 2014

Klyith posted:

For testing undervolt stability you want core cycler, not traditional overclock stability tests. The problem with undervolts is the point of maximum stress isn't the max load for one core or all-core, it's the transitions between idle and load. So core cycler runs a single thread of prime95 and shuffles it from core to core to produce a lot of that.

Downside, core cycler is pretty slow. You don't have to run it for the dozens of hours like the author suggests, but you at least want an overnight test.

(Cinebench can also work well as a quick & dirty test.)

I ended up using prime95 to do like first pass tests and then after I didn't get any errors in a few hours ran corecycler for a few days. Ended up with 2 cores -25, 1 -27 and 6 -30.

forest spirit
Apr 6, 2009

Frigate Hetman Sahaidachny
First to Fight Scuttle, First to Fall Sink


Stanley Pain posted:

People in this thread not seeing a difference with 4k or RT on. :eyepop:

I'm visually impaired and can see the difference lmfao..

Yeah I don't get it.

Cyberpunk, with path tracing on, driving through the main drag at night, is unlike anything ever seen before. If you can't see a difference you haven't seen it IMO.

It's like Minecraft RTX, it's absolutely stunning with path tracing and I could never go back to vanilla. The sense of awe and discovery when the sun rises and fills up a valley with light, or exploring at night and seeing the lava flow in the distance illuminating the clouds above you so they're faintly glowing red...

(there is now a very beefy path tracing mod for java edition that was released very recently)

I'll give you this, ray tracing is hard to pin down sometimes because the implementation can range from "just the shadows please" to RTGI to full-on path tracing. And you might have a negative reaction because of the performance hit. But slapping on DLSS fixes all of that up for the most part.

Cyberpunk 2077 w/ path tracing at 1440p on an HDR screen was something to behold. Same with Minecraft path traced.

And it's easy to see in screenshots, even the one posted where it's just stalactites and OP is asking "where the rays": if it were standard ambient occlusion, the point where the stalactite models intersect the ceiling would be glowing, and then your mind sees that they're just Lego-block models dragged and dropped

I think in a few years when AMD pulls its ray tracing pants up we'll be hard into the generation of path-traced remakes

Alan Wake 2, I wonder if that'll get full path tracing on PC... would be nuts if it did


Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
You can’t run Minecraft RT above 60fps, right?
