|
eames posted:The phrase „needlessly high resolution“ reminds me a bit of the countless arguments of .wav/lossless audio vs lossy compression. As of 2020 the latter seems to have won out (I consume all my music via normal iTunes/Spotify/Amazon streaming), but I also think that the defenders of lossless have valid arguments. Well, the thing with audio is that most people consume it (especially when streamed) on poo poo-tier hardware where you'd be hard pressed to tell a 56kbps MP3 from a CD source in the first place, so the extra overhead of lossless was a waste (and a cost the streaming services simply couldn't absorb in many cases). Image quality, on the other hand, is going to keep being stressed as we push for higher and higher resolution and refresh-rate monitors. As a mitigating factor though, most games are about motion, and motion can hide a lot of imperfections, especially if it's combined with something like foveated rendering that pushes most of the issue areas out to the periphery. People looking for the ultimate in static image quality for screenshots, or cases where performance isn't an issue, might still opt for native-resolution purity. It'll be interesting to see where it goes!
|
# ? Apr 15, 2020 23:58 |
|
ConanTheLibrarian posted:It will be interesting to see if nvidia add tensor cores to their cheap GPUs. On the one hand they might reserve them as a premium feature, but on the other hand they could create low end cards with many fewer shaders by compensating with tensor cores. Pushing DLSS to the largest segment of the market is one way to encourage developers to invest in it. If they make an RTX 3010 I'll use it to DLSS a 480p image up to my 720p monitor
|
# ? Apr 16, 2020 00:07 |
|
With Nvidia's Lightspeed studio reportedly adding RTX to some older titles, if they throw DLSS 2.0 into the mix as well that would be massive. So tempted to get a 2080 Ti now to play around with, but I really should wait for Ampere.
|
# ? Apr 16, 2020 00:11 |
|
Per-pixel quality matters a lot less as resolution increases. A single pixel that's meaningfully off is hard to miss at 480p, and basically invisible at 4k. The reality is that you just tend not to notice, because where tradeoffs are made, it's usually for good reason.
|
# ? Apr 16, 2020 00:11 |
|
K8.0 posted:Per-pixel quality matters a lot less as resolution increases. A single pixel that's meaningfully off is hard to miss at 480p, and basically invisible at 4k. The reality is that you just tend not to notice, because where tradeoffs are made, it's usually for good reason. The caveat being that even if it's OK for a pixel to be off, it needs to be off in a temporally stable fashion, otherwise the movement/shimmering is very noticeable. This is the thing TAA really did better; jitter was sometimes very noticeable in previous AA algorithms.
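To sketch the temporal-stability point: most TAA variants keep an exponential history buffer, so per-frame jitter noise averages out instead of shimmering. A toy one-pixel version (the blend factor here is purely illustrative, not any particular engine's value):

```python
import random

def taa_accumulate(history, sample, alpha=0.1):
    # alpha = weight of the new jittered sample; smaller alpha means a
    # steadier image, but also more ghosting when things move.
    return (1.0 - alpha) * history + alpha * sample

# Feed in noisy samples around a true value of 0.5 and watch the
# history settle near the truth instead of flickering frame to frame.
random.seed(0)
history = 0.5
for _ in range(100):
    sample = 0.5 + (random.random() - 0.5) * 0.2  # jittered sample
    history = taa_accumulate(history, sample)
```

The raw samples swing by up to ±0.1 every frame; the accumulated history stays pinned near 0.5, which is exactly the stability K8.0's caveat is asking for.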
|
# ? Apr 16, 2020 00:21 |
|
I think people's intuition about resolution needs updating. We're used to the idea that native resolution is the magic point where graphics are perfect, but that's not how it works anymore. Modern PBR rendering has so many high-frequency components that even a native 1440p or 4K image is usually undersampled and aliased to poo poo; that's why TAA has become an industry standard to reconstruct those subpixel details over time. There isn't even a direct mapping between rendered pixels and screen pixels at native resolution anymore: if you use TAA, the rendered pixels have a pseudo-random subpixel offset and you're relying on the reconstruction to stabilize them. If reconstruction is practically required to get a stable signal, and reconstruction negates the benefit of perfectly matching the rendered pixels to the display pixels... what's actually special about native resolution at that point? We just need a resolution that feeds enough information into the reconstruction to get a good signal out the other side, which as we've seen can be significantly lower than native if the algorithm is smart enough. [joker voice] taa extrapolates subpixel details and nobody bats an eye, but taa extrapolates details slightly bigger than one pixel and everyone loses their minds repiv fucked around with this message at 01:20 on Apr 16, 2020 |
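For the curious, those pseudo-random subpixel offsets usually come from a low-discrepancy sequence such as Halton, which fills the pixel area evenly over a short cycle of frames. A minimal sketch (the 8-frame cycle length is illustrative; engines pick their own):

```python
def halton(index, base):
    # Radical-inverse low-discrepancy sequence: successive values cover
    # the [0, 1) interval evenly, which is why it's a popular choice
    # for TAA subpixel jitter.
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

# Per-frame (x, y) subpixel offsets in [-0.5, 0.5), cycling over 8 frames,
# using bases 2 and 3 so the two axes don't correlate.
jitter = [(halton(i + 1, 2) - 0.5, halton(i + 1, 3) - 0.5) for i in range(8)]
```

Each frame the projection matrix gets nudged by one of these offsets, so over the cycle the renderer effectively samples 8 different positions inside every pixel for the reconstruction to integrate.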
# ? Apr 16, 2020 00:54 |
|
K8.0 posted:Per-pixel quality matters a lot less as resolution increases. A single pixel that's meaningfully off is hard to miss at 480p, and basically invisible at 4k. The reality is that you just tend not to notice, because where tradeoffs are made, it's usually for good reason. It's pixel density. Lowered settings are extremely hard to pick out @ like 23'' 4k. Much easier at say, 43''
|
# ? Apr 16, 2020 01:44 |
|
Probably a lost cause, but does anyone know if it is possible to have ULMB enabled on one monitor and GSync on the other? I have two PG279Q side by side and it'd be nice to not have gsync on the secondary because it likes to sync that monitor up with the game I am playing on my primary and mess with Netflix on the second monitor. I'd love to have gsync on the primary for playing Doom and ULMB on my secondary for sweet movie-watching motion persistence. I used to be able to get it to work by setting the second monitor to 120Hz and then turning on ULMB in the monitor UI but now I can't do that now as the setting in the nvidia control panel forces GSync on everything or nothing. ULMB is normally source independent so it'd be nice to just say to the monitor "don't loving talk to the GPU please".
|
# ? Apr 16, 2020 08:34 |
|
Seamonster posted:Its pixel density. Lowered settings are extremely hard to pick out @ like 23'' 4k. Much easier at say, 43'' Exactly. I'm running 3840x2160 @ 43". All those pixels count.
|
# ? Apr 16, 2020 08:54 |
|
eames posted:I feel like it could go either way and might turn out to be a double-edged sword, because if this catches on native rendering output might be gone for good. I dunno, “native rendering” has been sort of meaningless since the advent of TAA and other methods that are more-or-less doing reconstruction/blurring even at “native” resolution. Razor sharp and clean has been gone for a long time. Stickman fucked around with this message at 09:08 on Apr 16, 2020 |
# ? Apr 16, 2020 09:05 |
|
Dynamic resolution is something I wish was more widespread in PC games. It would be great if this could somehow be combined with DLSS or something like it.
|
# ? Apr 16, 2020 10:25 |
|
Stickman posted:I dunno, “native rendering” has been sort of meaningless since the advent of TAA and other methods that are more-or-less doing reconstruction/blurring even at “native” resolution. Razor sharp and clean has been gone for a long time. But it's also not really necessary. Image/video compression relies heavily on the differences in the eye's ability to make out the various components of a visual signal. Certain things aren't perceptible. To give an example, MGS5's rendering engine used different resolutions for each layer of effects for a given frame (some native, some subsampled). That game looked great and ran really well even on potato GPUs, showing that the loss of fidelity wasn't something you'd really miss, and the performance gains were more than worth it.
|
# ? Apr 16, 2020 10:34 |
|
Pretty much everything uses multiple resolutions through the pipeline now, buffers as low as quarter resolution are often used and even at ultra settings your SSAO, SSR, volumetrics, etc are still usually half res. The fact that it's rarely noticeable just goes to show that native resolution isn't special, as long as you sample above the Nyquist frequency or close enough to integrate it temporally.
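The Nyquist point is easy to demo in 1D: sample a smooth signal at half rate and a cheap upscale recovers it almost exactly, while content above the half-rate Nyquist limit aliases and no upscale can save it. A toy sketch (64 "native" samples vs 32 "half res" ones; the frequencies are illustrative):

```python
import math

def sample(freq, n):
    # n evenly spaced samples of sin(2*pi*freq*x) over the unit interval
    return [math.sin(2 * math.pi * freq * i / n) for i in range(n)]

def upsample_2x(half):
    # cheap linear-interpolation upscale back to full rate
    out = []
    for i, v in enumerate(half):
        nxt = half[(i + 1) % len(half)]
        out += [v, (v + nxt) / 2]
    return out

def max_err(freq):
    full = sample(freq, 64)                # "native res" ground truth
    recon = upsample_2x(sample(freq, 32))  # rendered at half res, upscaled
    return max(abs(a - b) for a, b in zip(full, recon))

low = max_err(4)    # well under the half-rate Nyquist limit of 16 cycles
high = max_err(24)  # above it: aliases, reconstruction falls apart
```

Low-frequency stuff like SSAO, volumetrics, and soft reflections lives on the `low` side of this, which is why half and quarter res buffers get away with it; sharp geometric detail lives on the `high` side, which is why that part needs the temporal integration instead.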
|
# ? Apr 16, 2020 12:23 |
|
In other news Crysis Remastered just leaked and it's going to use the hardware-agnostic, non-accelerated raytracing from the Neon Noir demo, it'll be interesting to see how that stacks up against RTX/DXR.
|
# ? Apr 16, 2020 13:04 |
|
repiv posted:I think peoples intuition about resolution needs updating, we're used to the idea that native resolution is the magic point where graphics are perfect but that's not how it works anymore. Modern PBR rendering has so many high frequency components that even a native 1440p or 4K image is usually undersampled and aliased to poo poo, that's why TAA has become an industry standard to reconstruct those subpixel details over time. There isn't even a direct mapping between rendered pixels and screen pixels at native resolution anymore, if you use TAA then the rendered pixels have a pseudo-random subpixel offset and you're relying on the reconstruction to stabilize it. Yeah, this is a great point. I'm all for keeping the options in if you want to do comparisons or get slightly better quality at a performance hit, but now we're talking about graphics in terms of things like raytracing and HairWorks. It's not so much a quality setting as "you can get these visual features at the cost of a very close approximation of native res." HalloKitty posted:Exactly. I'm running 3840x2160 @ 43" How far away are you though? High res at a distance is an even better use case for "horseshoes, hand grenades, and pixel quality."
|
# ? Apr 16, 2020 14:50 |
|
Lockback posted:Yeah this is a great point. I'm all for keeping the options in if you want to do compares or get slightly better quality at a performance hit, but now we're talking about graphics in terms of things like raycasting and hairworks. It's not a quality as much as "You can get these visual features at a cost of approximating Native Res in a very close manner. Not very far, it's my main monitor. Probably 2'6" or so.
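For reference, a quick pixels-per-degree estimate (assuming a 16:9 panel and taking the ~2'6" above as a 30" viewing distance):

```python
import math

def pixels_per_degree(diag_in, h_px, v_px, dist_in):
    # Physical panel width from the diagonal and aspect ratio
    aspect = h_px / v_px
    width = diag_in * aspect / math.sqrt(aspect**2 + 1)
    px_size = width / h_px
    # Degrees of visual angle subtended by a single pixel at the eye
    deg_per_px = math.degrees(math.atan(px_size / dist_in))
    return 1 / deg_per_px

# 43" 4K at ~30" vs a 23" 4K panel at the same distance
big = pixels_per_degree(43, 3840, 2160, 30)    # roughly 54 PPD
small = pixels_per_degree(23, 3840, 2160, 30)  # roughly 100 PPD
```

At roughly 54 PPD the 43" panel sits below the often-cited ~60 PPD point where individual pixels stop being resolvable, while a 23" 4K panel at the same distance is around 100, which lines up with lowered settings being much easier to spot on the bigger screen.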
|
# ? Apr 16, 2020 14:59 |
|
Unironically looking forward to Crysis Remastered; not only will it look better, it's going to run better as well. Technical aspects and the curiosity of its own raytracing implementation aside, it's also a great game.
|
# ? Apr 16, 2020 15:33 |
|
It's going to be nice with a new reference benchmark game at the very least.
|
# ? Apr 16, 2020 15:40 |
|
Zedsdeadbaby posted:Unironically looking forward to Crysis Remastered, not only will it look better it's going to run better as well. Technical aspects and the curiosity of its own raytracing implementation aside, it's also a great game. Am I right that they are doing ray tracing without using RT cores, or did I misunderstand that? They talked about a solution that works with AMD and Nvidia and I wasn't sure what that meant.
|
# ? Apr 16, 2020 16:56 |
|
Lockback posted:Am I right that they are doing ray tracing without using RT cores or did I misunderstand that? They talked about a solution that works with AMD and NVidia and I wasn't sure what that meant. You can do raytracing without the special cores, NVidia enabled it for non-RTX cards on some things. It predictably runs like rear end, but if you're building ray tracing into your game from the ground up without planning on using the RT cores it may be more efficient. What I want to know is if Crysis Remastered will have the ability to tap into the RT cores if you do have them. I know they want a hardware-agnostic thing, but it would be kind of dumb to not take full advantage like that.
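Worth noting that raytracing is ultimately just intersection math, which runs on any general compute hardware (or even a CPU); RT cores only accelerate the intersection tests and BVH traversal. A toy version of the core ray-sphere test, far removed from anything Crytek actually ships:

```python
import math

def ray_sphere(origin, direction, center, radius):
    # Solve |o + t*d - c|^2 = r^2 for the nearest positive t,
    # assuming direction is normalized (so the quadratic's a = 1).
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

# A ray down the z axis hits a unit sphere centered at z=5 at t=4
hit = ray_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)
```

A GPU compute shader just runs millions of tests like this in parallel against an acceleration structure; RT cores do the same work in fixed-function hardware, which is where the big speedup comes from.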
|
# ? Apr 16, 2020 17:02 |
|
Ugly In The Morning posted:You can do raytracing without the special cores, NVidia enabled it for non-RTX cards on some things. It predictably runs like rear end, but if you're building ray tracing into your game from the ground up without planning on using the RT cores it may be more efficient. Yeah, that was my question. I know you can do raytracing on the GPU (or even the CPU if you really wanted to), but what I read implied they WERE NOT going to use the RT cores at all, which seems wasteful.
|
# ? Apr 16, 2020 17:05 |
|
If they want to use some form of raytracing on PS4/Xbone hardware they'll have to use a cut-down, efficient-enough version, and implementing an extra, more elaborate version for Nvidia's RT cores would be the "wasteful" option.
|
# ? Apr 16, 2020 17:21 |
|
Crytek's main business isn't selling games, it's licensing engines. They aren't going to go "Wowee check out this hot new old demonstrator game for our engine featuring raytracing but no RTX support!"
|
# ? Apr 16, 2020 17:23 |
|
Ugly In The Morning posted:What I want to know is if Crysis remastered will have the ability to tap into the RT cores if you do have them. I know they want a hardware agnostic thing but it would be kind of dumb to not take full advantage like that. Their Neon Noir demo was pure compute, it didn't use RT cores even if they were available. Which yeah, is a weird move when every hardware vendor and consoles have committed to hardware raytracing now. Ironically Turing still ran rings around everything else even with its RT cores sitting idle though
|
# ? Apr 16, 2020 17:41 |
|
https://press.crytek.com/crytek-announces-crysis-remasteredquote:Crysis Remastered will focus on the original game’s single-player campaigns and is slated to contain high-quality textures and improved art assets, an HD texture pack, temporal anti-aliasing, SSDO, SVOGI, state-of-the-art depth fields, new light settings, motion blur, and parallax occlusion mapping; particle effects will also be added where applicable. Further additions such as volumetric fog and shafts of light, software-based ray tracing, and screen space reflections provide the game with a major visual upgrade. still no hardware acceleration why tho
|
# ? Apr 16, 2020 19:15 |
|
repiv posted:https://press.crytek.com/crytek-announces-crysis-remastered Can't maintain the "But can it run Crysis?" moniker if you don't make raytracing equally expensive on all GPUs. edit: Integrating RTX support would be cheating
|
# ? Apr 16, 2020 19:22 |
|
In other other news Minecraft RTX is out, you need to install the Xbox Insider Hub app to opt-in to the RTX beta branch for now. The servers seem to be getting hammered though. https://www.youtube.com/watch?v=s_eeWr622Ss
|
# ? Apr 16, 2020 19:24 |
|
repiv posted:you need to install the Xbox Insider Hub app to opt-in to the RTX beta branch for now. i do wanna play with it, but after the hours i spent to disable all the xbox dogshit in windows 10 in the first place, this is probably gonna keep me away. ok maybe it was more like minutes than hours but still.
|
# ? Apr 16, 2020 19:31 |
|
Unironically, Minecraft RTX to me seems like the best example of ray tracing in any game I've tried so far.
|
# ? Apr 16, 2020 19:48 |
|
repiv posted:https://press.crytek.com/crytek-announces-crysis-remastered
|
# ? Apr 16, 2020 19:53 |
|
Apparently their ray tracing implementation is pretty cheap
|
# ? Apr 16, 2020 20:07 |
|
I know typical advice would be to buy the best you can afford when you're ready to purchase, but would now actually be a decent time to wait for next gen offerings? From what I gather from friends, AMD will be adding hardware ray tracing and both companies are planning on releasing next gen sometime in Q3 or Q4 this year (though who knows how coronavirus will affect that). If I bought now I think I'd be looking at a 5700 or 5700 XT (looks like those are the best value in my price range), but if I'm planning on keeping the card for many years it seems like waiting for better hardware ray tracing would be a good call. I suppose I could go for a 2060S right now, but I feel like if I'm making that choice just because of ray tracing, I should just wait.
|
# ? Apr 16, 2020 22:44 |
|
Well you should buy GPUs so you can play games. And if you buy a 5700/XT, you'll probably be waiting until December to actually play things without black screens etc. So I guess you may as well wait until then anyway.
|
# ? Apr 16, 2020 22:51 |
|
Splinter posted:I know typical advice would be to buy the best you can afford when you're ready to purchase, but would now actually be a decent time to wait for next gen offerings? From what I gather from friends, AMD will be adding hardware ray tracing and both companies are planning on releasing next gen sometime in Q3 or Q4 this year (though who knows how coronavirus will affect that). If I bought now I think I'd be looking at a 5700 or 5700 XT (looks like those are the best value in my price range), but if I'm planning on keeping the card for many years it seems like waiting for better hardware ray tracing would be a good call. I suppose I could go for a 2060S right now, but I feel like if I'm making that choice just because of ray tracing, I should just wait. If you are going to keep it for many years, then it would probably be wise to skip the 5700 cards if for no other reason than AMD will probably never fix the drivers for them. Nvidia drivers have problems too, but usually they are "crashes or causes corruption in x game that you don't play" vs the 5700 driver problems which range from "displays only a black screen when you so much as try to load a web browser" to "hard locks the machine completely if you launch a game requiring a complete power cycle to recover".
|
# ? Apr 16, 2020 22:53 |
|
If you're going to buy a graphics card, I would suggest a 2070S over any of those. Save up the extra 70-100 bucks and put it off if you have to, having the Tensor cores will be handy with hardware accelerated ray tracing and the new DLSS implementation is really well done. Just tested it on Control today and got 50 percent more frames per second with ray tracing maxed, and that's something not coming to AMD cards anytime soon. NVidia's 70 level cards have had a ton of longevity in the last few gens, too. It's kind of a bummer to see AMD drop the ball so hard on high end/mid range cards, I used to buy Radeon stuff all the time but it's just not on the same level as it was back in the day.
|
# ? Apr 16, 2020 22:58 |
|
That is good to know about the AMD drivers. Sounds like holding off/saving up, either for next gen, or a 2070S if really need something before then is the way to go. Thanks!
|
# ? Apr 16, 2020 23:27 |
|
I would not count on next gen until November/December of this year, though early indications are that it will be a much better generational step than 1000→2000 (non-Super) was. Under normal circumstances I'd say that puts us in the "probably wait" area, though you'll probably be playing more games in the next 6 months than you will during the following year, so weigh that accordingly. Personally, I think a 5700 XT or a flashed 5700 is the best bet at that price range. The advice above makes it look like the 5700 is unplayable, but that isn't right; it works fine in 99% of use cases, just know that the drivers have had problems and it hasn't been smooth. Whether it makes sense to upgrade now for the Roni sorta depends on what you have.
|
# ? Apr 17, 2020 00:03 |
|
"99% works fine" is overselling it a bit, considering the number of people in this thread alone who have had problems. It's not entirely unplayable, but you are taking a fair risk that some games may not work right, and there doesn't seem to be much consistency between affected systems to let you predict whether you'll be fine. Between driver issues and the lack of DLSS, the 5700 (XT) is a hard sell. The 2060S performs similarly to the 5700, has DLSS and working drivers, and is often basically the same price. It's a no-brainer there.
|
# ? Apr 17, 2020 00:39 |
|
A 5700 is a poor recommendation, their driver issues are legitimately woeful
|
# ? Apr 17, 2020 03:55 |
|
|
Well, the most demanding game I own now is Minecraft. I messed around with the beta today and it looks amazing; it's very clearly a title that demonstrates how immense a difference "correct" lighting, shadows, and all that other good stuff can make. It also makes my 2080 Ti cry: 40-60 FPS with conservative draw distance settings at 3440x1440 with DLSS 2.0 active. I have to suspect the reason why Minecraft RTX has been delayed as long as it has been was performance issues; DLSS 2.0 magic makes the game playable where I suspect it just wasn't without it.
|
# ? Apr 17, 2020 04:02 |