DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

eames posted:

The phrase "needlessly high resolution" reminds me a bit of the countless arguments about .wav/lossless audio vs lossy compression. As of 2020 the latter seems to have won out (I consume all my music via normal iTunes/Spotify/Amazon streaming), but I also think the defenders of lossless have valid arguments.

Well, the thing with audio is that most people consume it (especially when streamed) on poo poo-tier hardware where you'd be hard pressed to differentiate a 56kbps MP3 from a CD source in the first place, so the extra overhead of lossless was a waste (and a cost the streaming services simply couldn't absorb in many cases).

Image quality, on the other hand, is going to continue to be stressed as we keep pressing for higher and higher resolutions and refresh rates. As a mitigating factor, though, most games are about motion, and motion can hide a lot of imperfections, especially if it's combined with something like foveated rendering that pushes most of the issue areas out to the periphery. People looking for the ultimate in static image quality for screenshots, or in situations where performance isn't an issue, might still opt for native-resolution purity. It'll be interesting to see where it goes!

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

ConanTheLibrarian posted:

It will be interesting to see if nvidia add tensor cores to their cheap GPUs. On the one hand they might reserve them as a premium feature, but on the other hand they could create low end cards with many fewer shaders by compensating with tensor cores. Pushing DLSS to the largest segment of the market is one way to encourage developers to invest in it.

If they make an RTX 3010 I'll use it to DLSS a 480p image up to my 720p monitor :3:

snickothemule
Jul 11, 2016

wretched single ply might as well use my socks
With Nvidia's Lightspeed studio reportedly adding RTX to some older titles, if they throw DLSS 2.0 into the mix as well, that would be massive. So tempted to get a 2080 Ti now to play around with, but I really should wait for Ampere.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Per-pixel quality matters a lot less as resolution increases. A single pixel that's meaningfully off is hard to miss at 480p, and basically invisible at 4k. The reality is that you just tend not to notice, because where tradeoffs are made, it's usually for good reason.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

K8.0 posted:

Per-pixel quality matters a lot less as resolution increases. A single pixel that's meaningfully off is hard to miss at 480p, and basically invisible at 4k. The reality is that you just tend not to notice, because where tradeoffs are made, it's usually for good reason.

the caveat being that even if it's OK to be off, it still needs to be off in a temporally stable fashion, otherwise the movement/shimmering is very noticeable. This is the thing TAA really did better; jitter was sometimes very noticeable in previous AA algorithms.

repiv
Aug 13, 2009

I think people's intuition about resolution needs updating. We're used to the idea that native resolution is the magic point where graphics are perfect, but that's not how it works anymore. Modern PBR rendering has so many high frequency components that even a native 1440p or 4K image is usually undersampled and aliased to poo poo; that's why TAA has become an industry standard, to reconstruct those subpixel details over time. There isn't even a direct mapping between rendered pixels and screen pixels at native resolution anymore: if you use TAA then the rendered pixels have a pseudo-random subpixel offset and you're relying on the reconstruction to stabilize it.

If reconstruction is practically required to get a stable signal, and reconstruction negates the benefit of perfectly matching the rendered pixels to the display pixels... what's actually special about native resolution at that point? We just need a resolution that feeds enough information into the reconstruction to get a good signal out of the other side, which as we've seen can be significantly lower than native if the algorithm is smart enough.
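
To put some numbers on the "pseudo-random subpixel offset" bit, here's a toy Python sketch (my own, not lifted from any particular engine) of the usual approach: a Halton(2,3) sequence gives well-spread offsets within a pixel, and the projection matrix gets nudged by that offset each frame so the history buffer has genuinely different subpixel samples to integrate.

code:
def halton(index, base):
    """Radical-inverse (Halton) value in [0, 1) for a 1-based index."""
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

def taa_jitter(frame, width, height, sample_count=8):
    """Sub-pixel jitter for this frame, as a clip-space projection offset."""
    i = (frame % sample_count) + 1     # 1-based so the first frame isn't (0, 0)
    jx = halton(i, 2) - 0.5            # offsets in [-0.5, 0.5) pixels
    jy = halton(i, 3) - 0.5
    # clip space spans 2 units across the screen, so convert pixels -> clip
    return 2.0 * jx / width, 2.0 * jy / height

for frame in range(8):
    print(taa_jitter(frame, 2560, 1440))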

[joker voice] taa extrapolates subpixel details and nobody bats an eye, but taa extrapolates details slightly bigger than one pixel and everyone loses their minds

repiv fucked around with this message at 01:20 on Apr 16, 2020

Seamonster
Apr 30, 2007

IMMER SIEGREICH

K8.0 posted:

Per-pixel quality matters a lot less as resolution increases. A single pixel that's meaningfully off is hard to miss at 480p, and basically invisible at 4k. The reality is that you just tend not to notice, because where tradeoffs are made, it's usually for good reason.

It's pixel density. Lowered settings are extremely hard to pick out @ like 23'' 4K. Much easier at, say, 43''.

BurritoJustice
Oct 9, 2012

Probably a lost cause, but does anyone know if it is possible to have ULMB enabled on one monitor and GSync on the other? I have two PG279Q side by side and it'd be nice to not have gsync on the secondary because it likes to sync that monitor up with the game I am playing on my primary and mess with Netflix on the second monitor. I'd love to have gsync on the primary for playing Doom and ULMB on my secondary for sweet movie-watching motion persistence.

I used to be able to get it to work by setting the second monitor to 120Hz and then turning on ULMB in the monitor UI, but I can't do that now as the setting in the nvidia control panel forces GSync on everything or nothing. ULMB is normally source-independent, so it'd be nice to just tell the monitor "don't loving talk to the GPU please".

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Seamonster posted:

It's pixel density. Lowered settings are extremely hard to pick out @ like 23'' 4K. Much easier at, say, 43''.

Exactly. I'm running 3840x2160 @ 43"
All those pixels count

Stickman
Feb 1, 2004

eames posted:

I feel like it could go either way and might turn out to be a two-edged sword, because if this catches on, native rendering output might be gone for good.

Yes, it's indubitably cool to get a free 50% performance gain without an immediately obvious difference in image quality, but I'm not sure I'll be as thrilled when <insert big AAA publisher> forces those tradeoffs for me and the locally rendered title displays video-compression-esque artifacts to achieve playable framerates.
I realize I'm being overly negative again but :tinfoil:

I dunno, “native rendering” has been sort of meaningless since the advent of TAA and other methods that are more-or-less doing reconstruction/blurring even at “native” resolution. Razor sharp and clean has been gone for a long time.

Stickman fucked around with this message at 09:08 on Apr 16, 2020

Anime Schoolgirl
Nov 28, 2002

Dynamic resolution is something I wish was more widespread in PC games. It would be great if this could somehow be combined with DLSS or something like it.

ConanTheLibrarian
Aug 13, 2004


dis buch is late
Fallen Rib

Stickman posted:

I dunno, “native rendering” has been sort of meaningless since the advent of TAA and other methods that are more-or-less doing reconstruction/blurring even at “native” resolution. Razor sharp and clean has been gone for a long time.

But it's also not really necessary. Image/video compression relies heavily on the differences in the eye's ability to make out the various components of a visual signal. Certain things aren't perceptible. To give an example, MGS5's rendering engine used different resolutions for each layer of effects for a given frame (some native, some subsampled). That game looked great and ran really well even on potato GPUs, showing that the loss of fidelity wasn't something you'd really miss, and the performance gains were more than worth it.

repiv
Aug 13, 2009

Pretty much everything uses multiple resolutions through the pipeline now; buffers as low as quarter resolution are common, and even at ultra settings your SSAO, SSR, volumetrics, etc. are still usually half res.

The fact that it's rarely noticeable just goes to show that native resolution isn't special, as long as you sample at or above the Nyquist rate, or close enough that temporal integration can make up the difference.
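
Quick numpy illustration of that point (toy code, nothing to do with any actual renderer): sample a pattern below its Nyquist rate and you get an aliased mess, but accumulate a handful of jittered undersampled passes and you converge on exactly what supersampling would have given you for that grid.

code:
import numpy as np

def pattern(x):
    # a "scene" with detail above the Nyquist limit of the low-res grid
    return np.sin(2 * np.pi * 90 * x) + 0.5 * np.sin(2 * np.pi * 37 * x)

full_res, low_res, frames = 1024, 128, 8

# ground truth for the low-res grid: render at full res, box-filter down
full = pattern(np.linspace(0, 1, full_res, endpoint=False))
reference = full.reshape(low_res, -1).mean(axis=1)

# one undersampled pass: the 90-cycle component aliases to a bogus 38 cycles
grid = np.linspace(0, 1, low_res, endpoint=False)
single = pattern(grid)

# jittered undersampled passes accumulated over several "frames"
accum = np.zeros(low_res)
for f in range(frames):
    accum += pattern(grid + f / (frames * low_res))
accum /= frames

print(np.abs(single - reference).mean())  # large: aliasing error
print(np.allclose(accum, reference))      # True: integrates to the ground truth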

repiv
Aug 13, 2009

In other news, Crysis Remastered just leaked and it's going to use the hardware-agnostic, non-accelerated raytracing from the Neon Noir demo. It'll be interesting to see how that stacks up against RTX/DXR.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

repiv posted:

I think people's intuition about resolution needs updating. We're used to the idea that native resolution is the magic point where graphics are perfect, but that's not how it works anymore. Modern PBR rendering has so many high frequency components that even a native 1440p or 4K image is usually undersampled and aliased to poo poo; that's why TAA has become an industry standard, to reconstruct those subpixel details over time. There isn't even a direct mapping between rendered pixels and screen pixels at native resolution anymore: if you use TAA then the rendered pixels have a pseudo-random subpixel offset and you're relying on the reconstruction to stabilize it.

If reconstruction is practically required to get a stable signal, and reconstruction negates the benefit of perfectly matching the rendered pixels to the display pixels... what's actually special about native resolution at that point? We just need a resolution that feeds enough information into the reconstruction to get a good signal out of the other side, which as we've seen can be significantly lower than native if the algorithm is smart enough.

[joker voice] taa extrapolates subpixel details and nobody bats an eye, but taa extrapolates details slightly bigger than one pixel and everyone loses their minds

Yeah, this is a great point. I'm all for keeping the options in if you want to do comparisons or get slightly better quality at a performance hit, but now we're talking about graphics in terms of things like raycasting and hairworks. It's not so much a quality tradeoff as "you can get these visual features at the cost of only very closely approximating native res."

HalloKitty posted:

Exactly. I'm running 3840x2160 @ 43"
All those pixels count

How far away are you though? Hi Res at a distance is an even better use case for "Horseshoes, Hand-grenades, and Pixel Quality."

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Lockback posted:

Yeah, this is a great point. I'm all for keeping the options in if you want to do comparisons or get slightly better quality at a performance hit, but now we're talking about graphics in terms of things like raycasting and hairworks. It's not so much a quality tradeoff as "you can get these visual features at the cost of only very closely approximating native res."


How far away are you though? Hi Res at a distance is an even better use case for "Horseshoes, Hand-grenades, and Pixel Quality."

Not very far, it's my main monitor. Probably 2'6" or so.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
Unironically looking forward to Crysis Remastered; not only will it look better, it's going to run better as well. Technical aspects and the curiosity of its own raytracing implementation aside, it's also a great game.

ufarn
May 30, 2009
It'll be nice to have a new reference benchmark game, at the very least.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

Zedsdeadbaby posted:

Unironically looking forward to Crysis Remastered; not only will it look better, it's going to run better as well. Technical aspects and the curiosity of its own raytracing implementation aside, it's also a great game.

Am I right that they are doing ray tracing without using RT cores or did I misunderstand that? They talked about a solution that works with AMD and NVidia and I wasn't sure what that meant.

Ugly In The Morning
Jul 1, 2010
Pillbug

Lockback posted:

Am I right that they are doing ray tracing without using RT cores or did I misunderstand that? They talked about a solution that works with AMD and NVidia and I wasn't sure what that meant.

You can do raytracing without the special cores; NVidia enabled it for non-RTX cards on some things. It predictably runs like rear end, but if you're building ray tracing into your game from the ground up without planning on using the RT cores, it may be more efficient.

What I want to know is if Crysis remastered will have the ability to tap into the RT cores if you do have them. I know they want a hardware agnostic thing but it would be kind of dumb to not take full advantage like that.
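
For anyone wondering what raytracing "without the special cores" even means in practice: the core operation is just intersection math that any shader core (or a CPU) can grind through; RT cores just do the BVH traversal and triangle intersection in fixed-function hardware instead. A toy Python sketch of the idea, nothing to do with CryEngine or DXR specifically:

code:
import math

def ray_sphere(origin, direction, center, radius):
    """Nearest hit distance along a normalized ray, or None if it misses."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2.0 * (direction[0] * ox + direction[1] * oy + direction[2] * oz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c            # direction is unit length, so a == 1
    if disc < 0:
        return None                   # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

# camera at the origin looking down +z, unit sphere 5 units away -> hit at t = 4
print(ray_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))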

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

Ugly In The Morning posted:

You can do raytracing without the special cores; NVidia enabled it for non-RTX cards on some things. It predictably runs like rear end, but if you're building ray tracing into your game from the ground up without planning on using the RT cores, it may be more efficient.

What I want to know is if Crysis remastered will have the ability to tap into the RT cores if you do have them. I know they want a hardware agnostic thing but it would be kind of dumb to not take full advantage like that.

Yeah, that was my question. I know you can do raytracing on the GPU's regular compute (or even the CPU if you really wanted to), but what I read implied they were NOT going to use the RT cores at all, which seems wasteful.

orcane
Jun 13, 2012

Fun Shoe
If they want to use some form of raytracing on PS4/Xbone hardware they'll have to use a cut-down, efficient-enough version, and implementing an extra elaborate version for Nvidia's RT cores would be the "wasteful" option.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Crytek's main business isn't selling games, it's licensing engines. They aren't going to go "Wowee, check out this hot new old demonstrator game for our engine, featuring raytracing but no RTX support!"

repiv
Aug 13, 2009

Ugly In The Morning posted:

What I want to know is if Crysis remastered will have the ability to tap into the RT cores if you do have them. I know they want a hardware agnostic thing but it would be kind of dumb to not take full advantage like that.

Their Neon Noir demo was pure compute; it didn't use RT cores even if they were available. Which, yeah, is a weird move when every hardware vendor and the consoles have committed to hardware raytracing now.

Ironically, Turing still ran rings around everything else even with its RT cores sitting idle, though.

repiv
Aug 13, 2009

https://press.crytek.com/crytek-announces-crysis-remastered

quote:

Crysis Remastered will focus on the original game’s single-player campaigns and is slated to contain high-quality textures and improved art assets, an HD texture pack, temporal anti-aliasing, SSDO, SVOGI, state-of-the-art depth fields, new light settings, motion blur, and parallax occlusion mapping, particle effects will also be added where applicable. Further additions such as volumetric fog and shafts of light, software-based ray tracing, and screen space reflections provide the game with a major visual upgrade.

still no hardware acceleration

why tho

MagusDraco
Nov 11, 2011

even speedwagon was trolled

Can't maintain the "But can it run Crysis?" moniker if you don't make raytracing equally expensive on all GPUs.

edit: Integrating RTX support would be cheating

repiv
Aug 13, 2009

In other other news, Minecraft RTX is out; you need to install the Xbox Insider Hub app to opt in to the RTX beta branch for now. The servers seem to be getting hammered though.

https://www.youtube.com/watch?v=s_eeWr622Ss

Cygni
Nov 12, 2005

raring to post

repiv posted:

you need to install the Xbox Insider Hub app to opt in to the RTX beta branch for now.

i do wanna play with it, but after the hours i spent to disable all the xbox dogshit in windows 10 in the first place, this is probably gonna keep me away.

ok maybe it was more like minutes than hours but still.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
Unironically, Minecraft RTX to me seems like the best example of ray tracing in any game I've tried so far.

ufarn
May 30, 2009
They still might flip a feature switch once Ampere is out. But making it so Nvidia-specific is probably not a great idea, at least in the game; they might still enable it in-engine.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
Apparently their ray tracing implementation is pretty cheap

Splinter
Jul 4, 2003
Cowabunga!
I know typical advice would be to buy the best you can afford when you're ready to purchase, but would now actually be a decent time to wait for next gen offerings? From what I gather from friends, AMD will be adding hardware ray tracing and both companies are planning on releasing next gen sometime in Q3 or Q4 this year (though who knows how coronavirus will affect that). If I bought now I think I'd be looking at a 5700 or 5700 XT (looks like those are the best value in my price range), but if I'm planning on keeping the card for many years it seems like waiting for better hardware ray tracing would be a good call. I suppose I could go for a 2060S right now, but I feel like if I'm making that choice just because of ray tracing, I should just wait.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Well you should buy GPUs so you can play games. And if you buy a 5700/XT, you'll probably be waiting until December to actually play things without black screens etc. So I guess you may as well wait until then anyway.

Indiana_Krom
Jun 18, 2007
Net Slacker

Splinter posted:

I know typical advice would be to buy the best you can afford when you're ready to purchase, but would now actually be a decent time to wait for next gen offerings? From what I gather from friends, AMD will be adding hardware ray tracing and both companies are planning on releasing next gen sometime in Q3 or Q4 this year (though who knows how coronavirus will affect that). If I bought now I think I'd be looking at a 5700 or 5700 XT (looks like those are the best value in my price range), but if I'm planning on keeping the card for many years it seems like waiting for better hardware ray tracing would be a good call. I suppose I could go for a 2060S right now, but I feel like if I'm making that choice just because of ray tracing, I should just wait.

If you are going to keep it for many years, then it would probably be wise to skip the 5700 cards, if for no other reason than AMD will probably never fix the drivers for them. Nvidia drivers have problems too, but usually they're of the "crashes or corruption in some game you don't play" variety, vs the 5700 driver problems, which range from "displays only a black screen when you so much as try to load a web browser" to "hard locks the machine completely if you launch a game, requiring a complete power cycle to recover".

Ugly In The Morning
Jul 1, 2010
Pillbug
If you're going to buy a graphics card, I would suggest a 2070S over any of those. Save up the extra 70-100 bucks and put it off if you have to; having the RT and Tensor cores will be handy for hardware-accelerated ray tracing, and the new DLSS implementation is really well done. Just tested it on Control today and got 50 percent more frames per second with ray tracing maxed, and that's something not coming to AMD cards anytime soon. NVidia's 70-level cards have had a ton of longevity the last few gens, too. It's kind of a bummer to see AMD drop the ball so hard on high end/mid range cards; I used to buy Radeon stuff all the time but it's just not on the same level as it was back in the day.

Splinter
Jul 4, 2003
Cowabunga!
That is good to know about the AMD drivers. Sounds like holding off/saving up, either for next gen or for a 2070S if I really need something before then, is the way to go. Thanks!

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
I would not count on next gen until November/December of this year, though early indications are that it will be a much better generational step than the 1000 to 2000 (non-Super) jump was. Under normal circumstances I'd say that puts us into the "probably wait" area, though you'll probably be playing more games in the next 6 months than you will during the following year, so weigh that accordingly.

Personally, I think a 5700 XT or a flashed 5700 is the best bet at that price range. The advice above makes it look like the 5700 is unplayable, but that isn't right; it definitely works fine in 99% of use cases. Just know that the drivers have had problems and it hasn't been smooth. Whether it makes sense to upgrade now for the Roni or not sorta depends on what you have.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
"99% works fine" is overselling it a bit, considering the number of people in this thread alone who have had problems. It's not entirely unplayable, but you are taking a fair risk that some games may not work right, and there doesn't seem to be much consistency between issue systems to be able to predict whether you'll be fine or not.

Between the driver issues and the lack of DLSS, the 5700 (XT) is a hard sell. The 2060S performs similarly to the 5700, has DLSS and working drivers, and is often basically the same price. It's a no-brainer there.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
A 5700 is a poor recommendation; the driver issues are legitimately woeful.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.
Well, the most demanding game I own now is Minecraft. I messed around with the beta today and it looks amazing; it's very clearly a title that demonstrates how immense a difference having "correct" lighting, shadows, and all that other good stuff can make. It also makes my 2080 Ti cry: 40-60 FPS with conservative draw distance settings at 3440x1440 with DLSS 2.0 active. I have to suspect the reason Minecraft RTX has been delayed as long as it's been was performance; DLSS 2.0 magic makes the game playable where I suspect it just wasn't without it.
