|
BlankSystemDaemon posted:RT is great for screenshots, but I sure as poo poo don't notice it when I'm playing a game where it's enabled - except that the performance usually gets hit pretty bad, and compensating for that usually involves some kind of upsampling where the game is rendered at a lower resolution. Yeah, when I've enabled it in the games I care about, it either hasn't been that impressive (which I'll chalk up to the developers half-assing the implementation), or it has looked good but not worth the performance hit. It seems like we've entered a period where you need DLSS/FSR/XeSS just to enable features like RT without cratering performance, and it's all to chase marginal improvements... Then they'll start pushing 8K and the cycle will start all over.
|
# ? Jul 3, 2023 19:33 |
|
RT global illumination and path tracing are just nice because games with them can do dynamic lighting with day/night cycles and the lighting never breaks. In pure raster games, anything not pre-baked and fixed in the dynamic cycle will eventually fail and look bad/wrong, and even plenty of the stuff that is pre-baked/fixed will still crumble completely the moment the player interacts with it or looks at it from a different angle.
|
# ? Jul 3, 2023 19:58 |
SourKraut posted:Indiana_Krom posted:RT global illumination and path tracing are just nice because games with them can do dynamic lighting with day/night cycles and the lighting never breaks. In pure raster games, anything not pre-baked and fixed in the dynamic cycle will eventually fail and look bad/wrong, and even plenty of the stuff that is pre-baked/fixed will still crumble completely the moment the player interacts with it or looks at it from a different angle. The only thing this results in is getting closer to the bottom of the uncanny valley...
|
# ? Jul 3, 2023 20:50 |
|
kliras posted:we're probably about to go back to the old days of a lot of areas with fans spinning to show dynamic rendering of raytraced light Funnily enough, fast-moving shadows and lighting changes are things that current ray tracing methods are bad at, because they accumulate and cache rays over multiple frames to help reduce noise. So light passing through fans can sometimes look pretty bad.
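To illustrate the failure mode: a toy sketch of the exponential accumulation most RT denoisers build on (the blend factor is a made-up number here; real denoisers adapt it per pixel). A light flipping on and off every frame, like one strobing through fan blades, gets smeared across many frames of cached history:

ALPHA = 0.1  # hypothetical blend weight for the newest frame

def accumulate(history, current, alpha=ALPHA):
    # lower alpha = less noise, but more ghosting when lighting changes fast
    return alpha * current + (1.0 - alpha) * history

history = 0.0
for frame in range(10):
    light = 1.0 if frame % 2 == 0 else 0.0  # light blinking through fan blades
    history = accumulate(history, light)
    print(frame, round(history, 3))  # the cached value lags far behind the true on/off signal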
|
# ? Jul 3, 2023 20:52 |
|
BlankSystemDaemon posted:The thing is, all of this pre-supposes some kind of fixation on making things look as realistic as possible. It really doesn't. Ray tracing/real-time global illumination can be used to enhance the look of stylized games too. Just look at the Jusant demo on Steam, which uses Unreal Engine 5 and its Lumen lighting system. It's not just about photorealism.
|
# ? Jul 3, 2023 20:54 |
|
Dr. Video Games 0031 posted:It really doesn't. Ray tracing/real-time global illumination can be used to enhance the look of stylized games too. Just look at the Jusant demo on Steam, which uses Unreal Engine 5 and its Lumen lighting system. It's not just about photorealism. or more broadly, look to modern 3D animated movies/TV/anime which are all using pathtracing at this point regardless of their art style. even spiderverse is pathtraced. how long it's going to take to get there is up for debate, but the endgame is for games to eventually converge on pathtracing like offline rendering already has
|
# ? Jul 3, 2023 21:03 |
Dr. Video Games 0031 posted:It really doesn't. Ray tracing/real-time global illumination can be used to enhance the look of stylized games too. Just look at the Jusant demo on Steam, which uses Unreal Engine 5 and its Lumen lighting system. It's not just about photorealism. Having said that, my point still stands that I absolutely won't notice it when playing the game - except that performance will be tanked, unless upsampling is enabled, which will make other parts look bad in ways that're way more obvious to spot in screenshots but also won't be noticed while playing. It's one thing E:D got right: if you're in solo mode, you can take extremely high-resolution screenshots that take several seconds to render, and they look gorgeous - and that's in an engine that's not exactly new.
|
# ? Jul 3, 2023 21:04 |
|
The Lumen lighting system is a ray tracing/rasterization hybrid system. It accomplishes similar things to fully ray-traced global illumination, but with better performance at the cost of less accuracy. My point was that high-quality global illumination and shadows (Jusant uses UE5's new shadow system too) aren't just for photorealism.
|
# ? Jul 3, 2023 21:25 |
|
it doesn't appear that jusant even allows you to disable lumen, it's built around it. same as the upcoming ubisoft games that are going to have RTGI always on. this is just realtime graphics chat now though, we should probably take it to the GPU thread
|
# ? Jul 3, 2023 21:38 |
|
Hey, AMD RDNA2 APUs are a very important target for all this stuff!
|
# ? Jul 3, 2023 21:40 |
|
BlankSystemDaemon posted:gently caress, I can't even find a legitimate reason to go 4k - I can't imagine 8k would make sense unless you're doing monitors the size of walls. Higher quality fonts via HiDPI rendering. It's like high refresh rate. Once you have it, you don't want to miss it.
|
# ? Jul 3, 2023 22:15 |
|
orcane posted:Paul???
|
# ? Jul 3, 2023 22:38 |
|
Combat Pretzel posted:Higher quality fonts via HiDPI rendering. It's like high refresh rate. Once you have it, you don't want to miss it. 8K is going to be utterly pointless, given that 4K is already near the limits of visual acuity at typical viewing distances anyway.
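Back-of-the-envelope version, using the common 60 pixels-per-degree rule of thumb for 20/20 acuity (the screen size and couch distance below are just example numbers):

import math

PPD = 60  # ~1 arcminute per pixel, the usual 20/20 acuity figure

def pixels_needed(width_m, distance_m, ppd=PPD):
    # horizontal pixel count beyond which the eye can't resolve more detail
    degrees = 2 * math.degrees(math.atan((width_m / 2) / distance_m))
    return degrees * ppd

# a 65" 16:9 TV is about 1.44 m wide; from a 2.5 m couch:
print(round(pixels_needed(1.44, 2.5)))  # ~1930, and 4K already has 3840

At desk distances with a 27" monitor the same formula gets much closer to 4K, which is why monitors are a different story.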
|
# ? Jul 3, 2023 23:59 |
|
a popular tip for gaming on PC handhelds these days is to disable cpu boost - a regedit for the ROG Ally with the new AMD APU lets you turn it off and keep freqs <=3.3ghz. is it generally a good thing to do? it seems to make sense - the iGPU on handhelds needs all the power it can get in preference to the CPU, but not sure if i'm missing anything
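fwiw the regedit mostly just unhides a stock windows power option - here's a sketch of flipping the same switch from python via powercfg (the GUID aliases are the standard windows ones; run it from an elevated prompt, and treat it as an illustration rather than a supported ROG Ally tool):

import subprocess

def set_cpu_boost(enabled: bool):
    # processor performance boost mode: 0 = disabled, 2 = aggressive (the common default)
    value = "2" if enabled else "0"
    for flag in ("/setacvalueindex", "/setdcvalueindex"):  # plugged in + on battery
        subprocess.run(["powercfg", flag, "scheme_current",
                        "sub_processor", "perfboostmode", value], check=True)
    subprocess.run(["powercfg", "/setactive", "scheme_current"], check=True)

set_cpu_boost(False)  # caps the CPU at base clock, leaving more power budget for the iGPU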
|
# ? Jul 4, 2023 00:45 |
|
People in this thread not seeing a difference with 4k or RT on. I'm visually impaired and can see the difference lmfao..
|
# ? Jul 4, 2023 01:33 |
|
Hard to see the difference on my 60hz 1080p monitor with no RT
|
# ? Jul 4, 2023 01:54 |
|
Stanley Pain posted:People in this thread not seeing a difference with 4k or RT on. I can see that I'm getting 25 frames per second
|
# ? Jul 4, 2023 02:23 |
|
repiv posted:or more broadly, look to modern 3D animated movies/TV/anime which are all using pathtracing at this point regardless of their art style. even spiderverse is pathtraced. I’m not even sure how common this long-term convergence will be, because I think there’s a difference between how you observe media that you have no direct input into (such as movies) versus media that you can have input into, such as games. Which isn’t to say that it shouldn’t be done, just that I think there will always be a difference in biological response to what the end user is observing based upon the method in which they engage with it.
|
# ? Jul 4, 2023 03:18 |
|
Shipon posted:8K is going to be utterly pointless, given that 4K is already near the limits of visual acuity at typical viewing distances anyway. I don’t think this is true. People who have used Apple’s ProDisplay XDR 6K and Dell’s 6K and 8K offerings have all praised the image quality; they’re just in a different use case category right now and not meant for consumers. But companies need to keep pushing bigger numbers for profit, and eventually 4K will seem like stagnation… Stanley Pain posted:People in this thread not seeing a difference with 4k or RT on. I can see the difference when 4K or 5K are used for HiDPI settings, for fonts, etc., yeah. I can’t when gaming. Also, being visually impaired probably helps even more for noticing some of these things, but that’s a whole separate discussion… Canned Sunshine fucked around with this message at 06:26 on Jul 4, 2023 |
# ? Jul 4, 2023 03:27 |
|
The same old chestnuts pop up every time we go up in resolution. People were saying 1080p wasn't necessary because 720p was all we'd ever need, it was the pinnacle of realism, etc etc. We will be regurgitating these posts when 16k rolls around to replace our 8k displays
|
# ? Jul 4, 2023 04:28 |
|
MicroLED displays already have internal refresh rates of 12,000 Hz or more. And there are quarter-inch 5,000-DPI MicroLED displays in labs currently, so I figure it's just a matter of time before we end up reaching the goal of a "perfect" display where both the resolution and refresh rate are perceptually infinite. The only limitations on resolution and refresh rate will be the source hardware and available signal bandwidth. The point wouldn't be to actually output at those absurd resolutions but to be able to scale from any arbitrary resolution to the display's resolution with no perceptible scaling artifacts. It's probably 10 - 20 years before this ends up on our desks though.
|
# ? Jul 4, 2023 05:34 |
|
Dr. Video Games 0031 posted:The point wouldn't be to actually output at those absurd resolutions but to be able to scale from any arbitrary resolution to the display's resolution with no perceptible scaling artifacts. It's probably 10 - 20 years before this ends up on our desks though. Isn’t this less of a concern with how ubiquitous TAAU is becoming?
|
# ? Jul 4, 2023 06:08 |
|
Dr. Video Games 0031 posted:MicroLED displays already have internal refresh rates of 12,000 Hz or more. And there are quarter-inch 5,000-DPI MicroLED displays in labs currently, so I figure it's just a matter of time before we end up reaching the goal of a "perfect" display where both the resolution and refresh rate are perceptually infinite. The only limitations on resolution and refresh rate will be the source hardware and available signal bandwidth. The point wouldn't be to actually output at those absurd resolutions but to be able to scale from any arbitrary resolution to the display's resolution with no perceptible scaling artifacts. It's probably 10 - 20 years before this ends up on our desks though. this is lewd
|
# ? Jul 4, 2023 06:24 |
|
Dr. Video Games 0031 posted:MicroLED displays already have internal refresh rates of 12,000 Hz or more. And there are quarter-inch 5,000-DPI MicroLED displays in labs currently, so I figure it's just a matter of time before we end up reaching the goal of a "perfect" display where both the resolution and refresh rate are perceptually infinite. The only limitations on resolution and refresh rate will be the source hardware and available signal bandwidth. The point wouldn't be to actually output at those absurd resolutions but to be able to scale from any arbitrary resolution to the display's resolution with no perceptible scaling artifacts. It's probably 10 - 20 years before this ends up on our desks though. imo at that point you basically forget the idea of line scanout and just go for a minimum-error "compression" codec, and bear in mind that at 48gbit/s or whatever you can get a very low error minimum. but the idea to me there would be you draw macroblocks instead of lines, or you choose the lines that yield the minimum-error encoding. you update the perceptually-optimal parts of the image. and again, I think that fits very interestingly with the idea of variable-rate spatial+temporal sampling with DLSS3+. You can perceptually sample that edge where the enemy is emerging at like 10,000fps and update those pixels on the absolute hotpath etc. Like I haven't looked into the tech at all but just as described, being able to draw specific bits at super high refresh, or specific lines at super high refresh, is potentially super cool. If you can do that, why draw a beam-chasing pixel stream instead of a compressed video stream format with block/macroblock updates? one of my favorite videos, this does it with just line encoding on an IBM PC. https://www.youtube.com/watch?v=MWdG413nNkI https://trixter.oldskool.org/2014/06/19/8088-domination-post-mortem-part-1/ (but it's 48gbps, and writing macroblocks instead of lines) Paul MaudDib fucked around with this message at 08:04 on Jul 4, 2023 |
# ? Jul 4, 2023 07:48 |
|
Yeah, my imagination is still stuck in the current paradigm, but there's room to do some very new things once MicroLED actually lands. Anyway, enough display talk. A "Ryzen 5 7500F" 6-core AM5 CPU has just appeared out of nowhere, with a supposed release in Asia on July 7th (not sure if that includes the rest of the world). Nobody seems sure what it actually is yet: https://www.techpowerup.com/310804/amd-ryzen-5-7500f-desktop-processor-surfaces-could-this-be-phoenix-2-on-am5 There was a 5500, which was essentially a 5600G without the iGPU (6 cores, lower clocks than 5600X, half as much L3 cache, PCIe 3 only). The 7500F could be the same kind of idea, but the F part of the model name is new. TechPowerUp thinks it might be an APU, but that's just speculation. Dr. Video Games 0031 fucked around with this message at 09:02 on Jul 4, 2023 |
# ? Jul 4, 2023 08:58 |
|
Videocardz is reporting it as a no-iGPU model, hence the F, like Intel uses the F designation.
|
# ? Jul 4, 2023 09:02 |
|
They also just seem to be assuming that based on the presence of the "F", but it's not guaranteed that it means the same thing with AMD CPUs and Intel CPUs. Though I think it probably does. If it were an APU, it would be more expensive.
Dr. Video Games 0031 fucked around with this message at 09:46 on Jul 4, 2023 |
# ? Jul 4, 2023 09:05 |
|
Zedsdeadbaby posted:The same old chestnuts pop up every time we go up in resolution. You are right, but there are diminishing returns. Kind of like how games no longer seem to improve by major leaps, compared to the difference from Super Mario World to Mario 64 to Mario Sunshine. 4k is pretty detailed and noticeable enough vs 1080p, but it isn’t particularly revolutionary, imo. I would think beyond 8k it will be the display and rendering technology itself that has to change to make things look more “real” (led, lcd, projectors, ray tracing, etc.) and not necessarily how many pixels are crammed in. dialhforhero fucked around with this message at 17:56 on Jul 4, 2023 |
# ? Jul 4, 2023 17:52 |
|
I think some people just don't see/care about image fidelity, much in the same way that some people are fine with cheap headphones vs. better ones.
|
# ? Jul 4, 2023 18:50 |
|
Looks like I may have lucked out and gotten a 7800X3D that happily takes -30 PBO. Is there anything else worth tweaking? In the R23 all-core stability test it boosts to around 4.7GHz, sits at 89C, and draws about 82 watts. Not sure how comparable that is to stock.
|
# ? Jul 4, 2023 22:31 |
|
SourKraut posted:I don’t think this is true. People who have used Apple’s ProDisplay XDR 6K and Dell’s 6K and 8K offerings have all praised the image quality; they’re just in a different use case category right now and not meant for consumers. The difference with those monitors is the use case is for UI elements and high information density, where there is a clear advantage to higher resolutions. I don't agree that the same is true for 3D game worlds for the most part. I think there's far more value in higher refresh rates than higher resolutions at this point - we may actually be able to achieve "real" motion blur if we can do 1000+ FPS.
|
# ? Jul 5, 2023 02:15 |
|
ijyt posted:Looks like I may have lucked out and gotten a 7800X3D that happily takes -30 PBO. Is there anything else worth tweaking? In the R23 all-core stability test it boosts to around 4.7GHz, sits at 89C, and draws about 82 watts. Not sure how comparable that is to stock. Sounds like you’d need more powerful cooling to improve things. The only other thing that could help would be VSoC and VDIMM to make the memory controller run cooler, but I’m not familiar at all with Zen 4 tweaking.
|
# ? Jul 5, 2023 02:40 |
|
ijyt posted:Looks like I may have lucked out and gotten a 7800X3D that happily takes -30 PBO. Is there anything else worth tweaking? In the R23 all-core stability test it boosts to around 4.7GHz, sits at 89C, and draws about 82 watts. Not sure how comparable that is to stock. I get around 4.85 GHz at 86C with 92W power draw in R23. -20 all-core curve optimizer, and it's cooled by a Dark Rock Pro 4 that may or may not have a non-functional middle fan. 89C is the max operating temperature for the Zen 4 X3D chips, so it seems like some small amount of performance is being left on the table with your current setup, though I doubt it's anything you're likely to notice in regular usage. Dr. Video Games 0031 fucked around with this message at 03:10 on Jul 5, 2023 |
# ? Jul 5, 2023 02:50 |
|
Shipon posted:The difference with those monitors is the use case is for UI elements and high information density, where there is a clear advantage to higher resolutions. I don't agree that the same is true for 3D game worlds for the most part. I think there's far more value in higher refresh rates than higher resolutions at this point - we may actually be able to achieve "real" motion blur if we can do 1000+ FPS. I honestly don't think there's value in trying to constantly push the boundary of higher frame rates in pursuit of "real" motion blur in games, because you're getting into the biology of the human eye and how foveal observation is processed by the brain relative to the surrounding aperture. That's why HiDPI looks so good, and why there are still so many possibilities that can be done with HiDPI implementation, especially with gaming, when you start getting into per-pixel opportunities. When it comes to improving motion blur, you'd be better served by upsizing the monitor enough to cover your entire area of vision, and then having the game engine target detail within an estimated foveal area while reducing the detail in the outer peripheral area, i.e. variable frame rate across the display itself. I'm not even sure if that's currently possible in terms of display tech, but it'd be pretty cool! Especially since it would open up all kinds of opportunities for how displays present information, including the return of stereoscopic 3D (and glassless this time)! Otherwise, if you're looking for the entire display to do it, it's always going to appear "off" to you even if you could hit 5,000 or 10,000 FPS, because of how the brain is processing what it's observing, and I doubt you'd end up happy with it... This is where VR/augmented gaming could really shine.
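For what it's worth, the game-engine half of that already exists as foveated/variable-rate shading (eye-tracked VR headsets do it today); a toy sketch of the eccentricity mapping, with made-up pixel radii:

import math

def shading_rate(px, py, gaze, fovea_px=300, mid_px=900):
    # full detail near the gaze point, coarser shading blocks further out
    dist = math.hypot(px - gaze[0], py - gaze[1])
    if dist < fovea_px:
        return 1  # shade every pixel
    if dist < mid_px:
        return 2  # one shade per 2x2 block
    return 4      # one shade per 4x4 block in the periphery

print(shading_rate(960, 540, gaze=(1000, 500)))  # near the gaze point -> 1

The variable refresh rate per region of one panel is the part I don't think any display can do yet.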
|
# ? Jul 5, 2023 03:04 |
|
hobbesmaster posted:Sounds like you’d need more powerful cooling to improve things. The only other thing that could help would be VSoC and VDIMM to make the memory controller run cooler, but I’m not familiar at all with Zen 4 tweaking. It's an SFF build with an AXP120-X67, so these temps were to be expected tbh. e: Ran OCCT stability test, found errors on core 2 on the first test and then no issues on the second test, strange. I'll leave things as is for now and then keep optimising once I have the actual case this system is going to live in. ijyt fucked around with this message at 09:52 on Jul 5, 2023 |
# ? Jul 5, 2023 07:46 |
|
ijyt posted:e: Ran OCCT stability test, found errors on core 2 on the first test and then no issues on the second test, strange. I'll leave things as is for now and then keep optimising once I have the actual case this system is going to live in. For testing undervolt stability you want core cycler, not traditional overclock stability tests. The problem with undervolts is the point of maximum stress isn't the max load for one core or all-core, it's the transitions between idle and load. So core cycler runs a single thread of prime95 and shuffles it from core to core to produce a lot of that. Downside, core cycler is pretty slow. You don't have to run it for the dozens of hours like the author suggests, but you at least want an overnight test. (Cinebench can also work well as a quick & dirty test.)
|
# ? Jul 5, 2023 11:54 |
|
Klyith posted:For testing undervolt stability you want core cycler, not traditional overclock stability tests. The problem with undervolts is the point of maximum stress isn't the max load for one core or all-core, it's the transitions between idle and load. So core cycler runs a single thread of prime95 and shuffles it from core to core to produce a lot of that. Oh yeah that ran overnight last night without any hiccups, forgot to mention!
|
# ? Jul 5, 2023 13:17 |
|
Klyith posted:For testing undervolt stability you want core cycler, not traditional overclock stability tests. The problem with undervolts is the point of maximum stress isn't the max load for one core or all-core, it's the transitions between idle and load. So core cycler runs a single thread of prime95 and shuffles it from core to core to produce a lot of that. I ended up using prime95 for first-pass tests, and after I didn't get any errors in a few hours, I ran corecycler for a few days. Ended up with 2 cores at -25, 1 at -27, and 6 at -30.
|
# ? Jul 5, 2023 13:19 |
|
Stanley Pain posted:People in this thread not seeing a difference with 4k or RT on. Yeah I don't get it. Cyberpunk, with path tracing on, driving through the main drag at night, is unlike anything ever seen before. If you can't see a difference you haven't seen it IMO. It's like Minecraft RTX, it's absolutely stunning with path tracing and I could never go back to vanilla. The sense of awe and discovery when the sun rises and fills up a valley with light, or exploring at night and seeing the lava flow in the distance illuminating the clouds above you so they're faintly glowing red... (there is now a very beefy path tracing mod for java edition that was released very recently) I'll give you this, ray tracing is hard to pin down sometimes because the implementation can range from "just the shadows please" to RTGI to full-on path tracing. And you might have a negative reaction because of the performance hit, but slapping on DLSS fixes all of that up for the most part. Cyberpunk 2077 w/ path tracing at 1440p on an HDR screen was something to behold. Same with Minecraft path traced. And it's easy to see in screenshots, even the one posted where it's just stalactites and OP is asking "where the rays" - if it were standard ambient occlusion, the point where the stalactite models intersect with the ceiling would be glowing, and then your mind sees that it's just Lego block models dragged and dropped. I think in a few years, when AMD pulls its ray tracing pants up, we'll be hard into the generation of path-traced remakes. Alan Wake 2, I wonder if that'll get full path tracing on PC... would be nuts if it did
|
# ? Jul 5, 2023 14:02 |
|
You can’t run Minecraft RT above 60fps, right?
|
# ? Jul 5, 2023 14:10 |