buglord
Jul 31, 2010

Cheating at a raffle? I sentence you to 1 year in jail! No! Two years! Three! Four! Five years! Ah! Ah! Ah! Ah!

Buglord
I feel kinda goofy for putting my eggs in the NVIDIA basket for my 3070. Granted, AMD GPUs were even rarer at the time, and getting my 3070 was blind luck with a Discord alert.

But at 1440p, raytracing slowed games down just enough, whenever I turned it on, to make the experience feel noticeably worse. Adding insult to injury, I still had to do A/B testing to notice the differences in games like Cyberpunk. And while DLSS delivered exactly what was promised, FSR means I'm not bound to team green anymore.

I'd love to upgrade to a 7900 XT/XTX with minimal money out of pocket, considering the store credit I have plus whatever my 3070 is currently worth, but unfortunately the most graphically intensive games seem to be the most by-the-numbers and kinda boring, so here I am playing RimWorld and Vicky 3.

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

buglord posted:

I feel kinda goofy for putting my eggs in the NVIDIA basket for my 3070. Granted, AMD GPUs were even rarer at the time, and getting my 3070 was blind luck with a Discord alert.

But at 1440p, raytracing slowed games down just enough, whenever I turned it on, to make the experience feel noticeably worse. Adding insult to injury, I still had to do A/B testing to notice the differences in games like Cyberpunk. And while DLSS delivered exactly what was promised, FSR means I'm not bound to team green anymore.

I'd love to upgrade to a 7900 XT/XTX with minimal money out of pocket, considering the store credit I have plus whatever my 3070 is currently worth, but unfortunately the most graphically intensive games seem to be the most by-the-numbers and kinda boring, so here I am playing RimWorld and Vicky 3.

Same, I have zero urge to upgrade my PC when the AAA scene is now a yawnfest.

Cabbages and VHS
Aug 25, 2004

Listen, I've been around a bit, you know, and I thought I'd seen some creepy things go on in the movie business, but I really have to say this is the most disgusting thing that's ever happened to me.
I must, surely, get some kind of satisfaction out of getting almost every GPU generation and doing a full system rebuild whenever there are major architecture changes.

I do some ML stuff, but honestly I barely game, and when I do it's often via GeForce Now. I feel like I'm spending time and money simply to have capabilities I seldom fully use.

Not unlike my synth addiction, I guess. Synths tend to hold value though, sometimes they even appreciate!!

And yet I'll probably upgrade my 4090 as soon as that's possible.

Josh Lyman
May 24, 2009


I wasn't aware that the 6800 XT was still being sold at retail; Newegg has a Gigabyte for $570, about 30% cheaper than a 7900 XT. HUB has it 30% slower at 1440p, which is a reasonable tradeoff given my light gaming needs.

Given that the 4070 will be $600, I'm assuming a 7800 XT, which is supposed to release in June, won't be more. So assuming I convince myself I don't need an $800 card, waiting for the 7800 XT is probably my best bet.

shrike82 posted:

Budget PC setups are probably going to struggle as PS5/XSX only games become the norm - even without RT and lol at

Did I say something wrong? I think it's reasonable to not have to use DLSS/FSR to get 100fps+ at 1440p with an $800 GPU. RT is a different story of course.

edit: v I hadn't really thought about it that way. Upscaling, in my mind, was always a way to get extra life out of an old card or make RT playable.

Josh Lyman fucked around with this message at 03:50 on Apr 2, 2023

Kibner
Oct 21, 2008

Acguy Supremacy

Josh Lyman posted:

Did I say something wrong? I think it's reasonable to not have to use DLSS/FSR to get 100fps+ at 1440p with an $800 GPU. RT is a different story of course.

Games are being developed with the expectation that users will turn on these fancy upscaling techniques to get good/decent framerates. In the very near future, if you can get high framerates without them, it will be because you have a card that is overkill for the application.

shrike82
Jun 11, 2005

there's an argument to be made that gamedevs have begun leaning on DLSS/FSR2 as a "crutch" to get acceptable performance for PC ports

i don't know if i agree with that but from another perspective, gamers get too hung up on resolution

Cygni
Nov 12, 2005

raring to post

shrike82 posted:

there's an argument to be made that gamedevs have begun leaning on DLSS/FSR2 as a "crutch" to get acceptable performance for PC ports

i don't know if i agree with that but from another perspective, gamers get too hung up on resolution

if you give devs more resources, that means they can write worse code! problem solved?

Lord Stimperor
Jun 13, 2018

I'm a lovable meme.

Tbqh we are pretty pampered if 100+ fps becomes our baseline. A couple of years ago when VR was new, the required 90+ was considered utopian

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Josh Lyman posted:

I wasn't aware that the 6800 XT was still being sold at retail; Newegg has a Gigabyte for $570, about 30% cheaper than a 7900 XT. HUB has it 30% slower at 1440p, which is a reasonable tradeoff given my light gaming needs.

Given that the 4070 will be $600, I'm assuming a 7800 XT, which is supposed to release in June, won't be more. So assuming I convince myself I don't need an $800 card, waiting for the 7800 XT is probably my best bet.

Did I say something wrong? I think it's reasonable to not have to use DLSS/FSR to get 100fps+ at 1440p with an $800 GPU. RT is a different story of course.

edit: v I hadn't really thought about it that way. Upscaling, in my mind, was always a way to get extra life out of an old card or make RT playable.

I have attempted to point it out several times, but the base 6800 is not much slower than the 6800 XT, is more power efficient, and can often be had new for less than $500.
I purchased this card: https://www.newegg.com/sapphire-rad...&quicklink=true
It has a dual BIOS switch, with the default being overclocked. It runs practically silently, even when looping Time Spy 20+ times inside a mini-ITX case (Sliger SV540).
A couple of days ago it had a $20 off code.

HalloKitty fucked around with this message at 12:23 on Apr 2, 2023

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

Lord Stimperor posted:

Tbqh we are pretty pampered if 100+ fps becomes our baseline. A couple of years ago when VR was new, the required 90+ was considered utopian

Standards are always going up; otherwise what's the point of all the progression in CPUs, GPUs and so on?
One day we will all be playing on 8K displays at 240Hz+. And then 16K. And so on it goes. I would love to have a time machine and peer even just ten years into the future. Ten-years-ago me would be absolutely blown away by my Q70A display with 4K and 120Hz VRR.

Dr. Video Games 0031
Jul 17, 2004

to be fair, at some point the improvements to both resolution and frame rate hit extremely heavy diminishing returns. the whole "the human eye can't see more than [X]" thing is a meme now, but there is a limit to how much detail our eye can see in terms of spatial resolution. when it comes to gaming at least, there will probably be very little reason to ever go beyond 5K for desktop-sized monitors. or maybe 8K, but that's the max I'd ever go.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Zedsdeadbaby posted:

Standards are always going up; otherwise what's the point of all the progression in CPUs, GPUs and so on?
One day we will all be playing on 8K displays at 240Hz+. And then 16K. And so on it goes. I would love to have a time machine and peer even just ten years into the future. Ten-years-ago me would be absolutely blown away by my Q70A display with 4K and 120Hz VRR.

Ten-years-ago me would be disappointed by the fact that Windows 10 and 11 are uglier than Windows 7 and always doing some random poo poo in the background; the fact that DDR5 memory training times take me back to the days when you could watch the KB-checked counter in the BIOS go up in real time; and the fact that power consumption is so high that we need bigger heatsinks and fans than ever. (Although I would be impressed by the fact that 75" TVs can be bought easily for reasonable prices.)

Dr. Video Games 0031 posted:

to be fair, at some point the improvements to both resolution and frame rate hit extremely heavy diminishing returns. the whole "the human eye can't see more than [X]" thing is a meme now, but there is a limit to how much detail our eye can see in terms of spatial resolution. when it comes to gaming at least, there will probably be very little reason to ever go beyond 5K for desktop-sized monitors. or maybe 8K, but that's the max I'd ever go.

Exactly. The spatial resolution you can actually resolve is a function of distance from the screen (and your eyesight), while temporal "resolution" is more variable: peripheral vision tends to be more sensitive, with 90-ish Hz being a baseline to clear up flickering when black is shown between each viewed frame, though there are no doubt benefits to going further (I remember many years ago being able to clearly see CRT flicker in my peripheral vision until 85Hz). I'd guess that everything above 120Hz involves diminishing returns too great to be worth bothering with.
The good thing is, we're getting older! Over time we won't be able to see or hear as well (I'm certain you've all experienced some hearing loss; I doubt you can hear the flyback on a CRT any longer). So we can save a lot of cash on fancy gear...
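
Roughly speaking, it's just geometry: pixels per degree of your field of view. A quick back-of-the-envelope sketch (the 27" panel width and 60 cm viewing distance are only example numbers, and it ignores eyesight entirely):

code:
import math

def pixels_per_degree(h_pixels, screen_width_m, distance_m):
    # Horizontal field of view subtended by the screen, in degrees,
    # then spread the horizontal pixel count across it.
    fov_deg = 2 * math.degrees(math.atan(screen_width_m / (2 * distance_m)))
    return h_pixels / fov_deg

# Example: a 27" 16:9 panel is ~0.60 m wide, viewed from 0.60 m away.
for name, h_pixels in [("1440p", 2560), ("4K", 3840), ("5K", 5120), ("8K", 7680)]:
    print(f"{name}: {pixels_per_degree(h_pixels, 0.60, 0.60):.0f} px/deg")
# 1440p ~48, 4K ~72, 5K ~96, 8K ~145 px/deg; ~60 px/deg is the usual
# one-arcminute rule of thumb for 20/20 vision.

Which is roughly why 5K-ish density at normal desktop distances already looks like the practical ceiling.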

HalloKitty fucked around with this message at 12:36 on Apr 2, 2023

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

Dr. Video Games 0031 posted:

to be fair, at some point the improvements to both resolution and frame rate hit extremely heavy diminishing returns. the whole "the human eye can't see more than [X]" thing is a meme now, but there is a limit to how much detail our eye can see in terms of spatial resolution. when it comes to gaming at least, there will probably be very little reason to ever go beyond 5K for desktop-sized monitors. or maybe 8K, but that's the max I'd ever go.

8K is exciting, but mostly for the plethora of integer scaling options.
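
For what it's worth, the options line up neatly because 7680x4320 divides evenly into all the common 16:9 resolutions (a quick sketch, nothing assumed beyond the resolution math):

code:
# 8K (7680x4320) divides cleanly into the usual 16:9 resolutions, so each one
# maps to an exact NxN block of panel pixels - no blurry fractional scaling.
native_w, native_h = 7680, 4320
for factor in (2, 3, 4, 6):
    print(f"{factor}x{factor} blocks -> {native_w // factor}x{native_h // factor}")
# 2x -> 3840x2160 (4K), 3x -> 2560x1440, 4x -> 1920x1080, 6x -> 1280x720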

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Lockback posted:

8K is exciting, but mostly for the plethora of integer scaling options.

Did we really have to get there just to replace non-fixed-pixel displays? And we call it progress..

shrike82
Jun 11, 2005

I don't know if pampered is the word I'd use to describe the current era where we're back to lovely PC ports

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

shrike82 posted:

I don't know if pampered is the word I'd use to describe the current era where we're back to lovely PC ports

What, you mean you don't like games that have tacked-on keyboard and mouse support, and ugly-rear end launchers with a lack of options?

Arzachel
May 12, 2012

HalloKitty posted:

Exactly. The spatial resolution you can actually resolve is a function of distance from the screen (and your eyesight), while temporal "resolution" is more variable: peripheral vision tends to be more sensitive, with 90-ish Hz being a baseline to clear up flickering when black is shown between each viewed frame, though there are no doubt benefits to going further (I remember many years ago being able to clearly see CRT flicker in my peripheral vision until 85Hz). I'd guess that everything above 120Hz involves diminishing returns too great to be worth bothering with.

Obviously it depends on what games you play, but to me 144Hz -> 240Hz is a much more meaningful upgrade than 1440p -> 4K.

Indiana_Krom
Jun 18, 2007
Net Slacker

HalloKitty posted:

Ten-years-ago me would be disappointed by the fact that Windows 10 and 11 are uglier than Windows 7 and always doing some random poo poo in the background; the fact that DDR5 memory training times take me back to the days when you could watch the KB-checked counter in the BIOS go up in real time; and the fact that power consumption is so high that we need bigger heatsinks and fans than ever. (Although I would be impressed by the fact that 75" TVs can be bought easily for reasonable prices.)

Exactly. The spatial resolution you can actually resolve is a function of distance from the screen (and your eyesight), while temporal "resolution" is more variable: peripheral vision tends to be more sensitive, with 90-ish Hz being a baseline to clear up flickering when black is shown between each viewed frame, though there are no doubt benefits to going further (I remember many years ago being able to clearly see CRT flicker in my peripheral vision until 85Hz). I'd guess that everything above 120Hz involves diminishing returns too great to be worth bothering with.
The good thing is, we're getting older! Over time we won't be able to see or hear as well (I'm certain you've all experienced some hearing loss; I doubt you can hear the flyback on a CRT any longer). So we can save a lot of cash on fancy gear...

I can tell the difference between 120 and 180, but it is much less obvious than the difference between 60 and 120. It is worth noting that they are objectively smaller differences, too: going from 60 to 120 shaves 8.3 ms off each frame, while going from 120 to 180 shaves off only about 2.8 ms.
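
The arithmetic, for anyone who wants to check it (assuming a perfectly steady framerate):

code:
def frame_time_ms(fps):
    return 1000.0 / fps

for lo, hi in [(60, 120), (120, 180), (144, 240)]:
    saved = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{lo} -> {hi} fps: {frame_time_ms(lo):.1f} ms -> {frame_time_ms(hi):.1f} ms "
          f"per frame (saves {saved:.1f} ms)")
# 60 -> 120 saves 8.3 ms per frame; 120 -> 180 and 144 -> 240 each save only ~2.8 ms.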

repiv
Aug 13, 2009

Dr. Video Games 0031 posted:

to be fair, at some point the improvements to both resolution and frame rate hit extremely heavy diminishing returns. the whole "the human eye can't see more than [X]" thing is a meme now, but there is a limit to how much detail our eye can see in terms of spatial resolution. when it comes to gaming at least, there will probably be very little reason to ever go beyond 5K for desktop-sized monitors. or maybe 8K, but that's the max I'd ever go.

at some point the detail becomes so dense that you'll only really be able to scrutinize it when there's not much movement in the scene, and that synergizes well with temporal upscalers which resolve the most detail when not much is moving

shrike82
Jun 11, 2005

https://twitter.com/VideoCardz/status/1642469539244908545?s=20

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Temporal upsampling also works better the higher your framerate is, so as we continue to push back towards higher framerates in the PC space, its quality, both per frame and perceptually in real time, inherently increases.

Complaining about it is some boomer poo poo. It's a big step forward technologically, and there will be virtually nothing that renders without it soon enough.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
I think so too; native resolution may eventually be seen as a waste of resources, not just in GPU cycles but also in power consumption.
I immediately use FSR 2.0/DLSS if the option is there, even if the native framerate can match my refresh rate.

Indiana_Krom
Jun 18, 2007
Net Slacker

Zedsdeadbaby posted:

I think so too; native resolution may eventually be seen as a waste of resources, not just in GPU cycles but also in power consumption.
I immediately use FSR 2.0/DLSS if the option is there, even if the native framerate can match my refresh rate.

It definitely helps that the DLSS result is often higher-quality and more stable anti-aliasing than whatever native method is implemented in a lot of games. Some of this is likely because the extra performance means a greater number of temporal samples, which means fewer and/or less persistent artifacts.

Like the native TAA in Dying Light 2 is objectively worse than DLSS, with more ghosting and smearing even at the same frame rate when I hit CPU limits.

repiv
Aug 13, 2009

K8.0 posted:

Temporal upsampling also works better the higher your framerate is, so as we continue to push back towards higher framerates in the PC space, its quality, both per frame and perceptually in real time, inherently increases.

Complaining about it is some boomer poo poo. It's a big step forward technologically, and there will be virtually nothing that renders without it soon enough.

we arguably already hit that inflection point when TAA became the default (and often only) AA solution in most engines, temporal supersampling is already ubiquitous even at "native"

for some reason people tend to think of TAAU as cheating in a way that TAA isn't, but fundamentally they're doing the same thing, TAAU just extends it to decouple the input resolution from the output
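
as a toy illustration of that last point (a 1D sketch, nothing like a real engine: just jittered samples blended into a history buffer, where the only difference between the "TAA" and "TAAU" runs is the input resolution):

code:
import numpy as np

def scene(x):
    # Ground-truth "image": a 1D signal with some fine detail to resolve.
    return np.sin(40 * x) + 0.5 * np.sin(130 * x)

def temporal_accumulate(input_res, output_res, frames=32, alpha=0.1, seed=0):
    # Each frame: jitter the sample grid, "render" at input_res, resample to the
    # output grid, and blend into an exponentially weighted history buffer.
    # input_res == output_res -> plain TAA-style accumulation at "native"
    # input_res <  output_res -> TAAU-style upscaling: same loop, lower-res input
    rng = np.random.default_rng(seed)
    x_out = (np.arange(output_res) + 0.5) / output_res
    history = None
    for _ in range(frames):
        jitter = rng.uniform(-0.5, 0.5) / input_res
        x_in = (np.arange(input_res) + 0.5) / input_res + jitter
        upsampled = np.interp(x_out, x_in, scene(x_in))
        history = upsampled if history is None else (1 - alpha) * history + alpha * upsampled
    return history

truth = scene((np.arange(256) + 0.5) / 256)
taa = temporal_accumulate(input_res=256, output_res=256)    # "native" + TAA
taau = temporal_accumulate(input_res=128, output_res=256)   # half-res input, TAAU-style
for name, img in (("TAA ", taa), ("TAAU", taau)):
    print(name, "RMS error vs truth:", round(float(np.sqrt(np.mean((img - truth) ** 2))), 4))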

lordfrikk
Mar 11, 2010

Oh, say it ain't fuckin' so,
you stupid fuck!
When playing games in 4K I often turn off anti-aliasing completely because it's barely necessary.

repiv
Aug 13, 2009

there's no accounting for taste, but in general 4K is nowhere near dense enough to bruteforce a clean image with just 1spp

Cross-Section
Mar 18, 2009

I'm the weirdo who cranks the res up to 2.25x DLDSR in every game that supports it simply because he can :cool:
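
(For reference, DSR/DLDSR factors are expressed in total pixels, so 2.25x works out to 1.5x per axis; a quick sanity check, with the display resolutions just as examples:)

code:
factor = 2.25                 # DLDSR scale factor applies to total pixel count
per_axis = factor ** 0.5      # so each axis scales by sqrt(2.25) = 1.5
for w, h in [(2560, 1440), (3840, 2160)]:
    print(f"{w}x{h} display -> renders at {round(w * per_axis)}x{round(h * per_axis)}")
# 2560x1440 -> 3840x2160, 3840x2160 -> 5760x3240, then scaled back down to the display.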

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

lordfrikk posted:

When playing games in 4K I often turn off anti-aliasing completely because it's barely necessary.

Hm, I don't agree with that; even at native 4K there's a lot of shimmering and edge instability on fine details such as foliage, cables, fence posts, staircases, etc. Destiny 2 looks like a mess because of it. The graphical complexity goes up with each new area, but they haven't updated the PS3-era SMAA.

repiv
Aug 13, 2009

it's compounded by games increasingly relying on shortcuts that assume TAA will clean them up, like replacing (expensive) alpha transparency with (cheap) dither patterns that get blended into approximate transparency by TAA

in those cases if you force AA off it just looks like straight dogshit
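
to make the dither trick concrete, here's a minimal sketch of the general idea (a generic Bayer threshold pattern shifted each frame, not any particular engine's implementation):

code:
import numpy as np

# 4x4 ordered-dither (Bayer) thresholds in [0, 1)
BAYER4 = np.array([[ 0,  8,  2, 10],
                   [12,  4, 14,  6],
                   [ 3, 11,  1,  9],
                   [15,  7, 13,  5]]) / 16.0

def dithered_coverage(alpha, shape, fx=0, fy=0):
    # Binary draw-or-discard mask approximating transparency `alpha`.
    # Shifting the pattern by (fx, fy) each frame moves the holes around, so a
    # temporal filter averaging those frames recovers something close to real alpha.
    h, w = shape
    yy, xx = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    thresholds = BAYER4[(yy + fy) % 4, (xx + fx) % 4]
    return (alpha > thresholds).astype(float)

alpha = 0.3
one_frame = dithered_coverage(alpha, (8, 8))                    # hard 0/1 "screen door"
history = np.mean([dithered_coverage(alpha, (8, 8), fx=f % 4, fy=f // 4)
                   for f in range(16)], axis=0)                 # stand-in for TAA history
print("single frame values:", sorted(set(one_frame.ravel())))   # [0.0, 1.0]
print(f"temporal average: min {history.min()}, max {history.max()}")  # 0.3125 everywhere
# Real TAA blends far fewer effective samples (exponentially weighted, with jitter),
# which is why forcing AA off leaves the raw dither pattern fully visible.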

UHD
Nov 11, 2006


Cross-Section posted:

I'm the weirdo who cranks the res up to 2.25x DLDSR in every game that supports it simply because he can :cool:

meanwhile i shut DLSS off completely in cp2077 because i dont notice framerate dips at all at 1440p with RT on (4070ti, 12600K cpu) but i did immediately notice ghosting whenever some things moved (like jackie's hands) and since my framerates are almost always 60+ i didnt feel like taking the time to tweak something i didnt need so it wouldnt do a thing i didnt want. maybe thats just cp2077 jank, god knows that game has a lot of it

repiv posted:

it's compounded by games increasingly relying on shortcuts that assume TAA will clean them up, like replacing (expensive) alpha transparency with (cheap) dither patterns that get blended into approximate transparency by TAA

in those cases if you force AA off it just looks like straight dogshit



this has strong 90s "this only looks good on CRTs" energy

repiv
Aug 13, 2009

UHD posted:

this has strong 90s "this only looks good on CRTs" energy

everything old is new again

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

repiv posted:

everything old is new again



The intended look was using composite video, as on the right, but there's also pure crisp RGB out on the Mega Drive, leading to the look on the left.

It creates quite a dilemma when cabling up a "nice" retro setup

buglord
Jul 31, 2010

Cheating at a raffle? I sentence you to 1 year in jail! No! Two years! Three! Four! Five years! Ah! Ah! Ah! Ah!

Buglord

Indiana_Krom posted:

I can tell the difference between 120 and 180, but it is much less obvious than the difference between 60 and 120. It is worth noting that they are objectively smaller differences, too: going from 60 to 120 shaves 8.3 ms off each frame, while going from 120 to 180 shaves off only about 2.8 ms.

Yeah, what's really nice for me is that I've stopped noticing when things go above 100fps, which means I don't really have to worry about getting a card that can break that limit, since my eyes can't tell the difference anyway.

repiv
Aug 13, 2009

watching DF's initial rough video on the TLOU port, holy moly is it heavy on the CPU

in particular the asset streaming is extremely CPU intensive; that would have been handled by the dedicated decompression unit on the PS5, but it's done in software on PC (DirectStorage decompression didn't make the cut for this port)

Shumagorath
Jun 6, 2001

Zedsdeadbaby posted:

I think so too; native resolution may eventually be seen as a waste of resources, not just in GPU cycles but also in power consumption.
I immediately use FSR 2.0/DLSS if the option is there, even if the native framerate can match my refresh rate.

I would definitely do this if I was playing above 1200p, but even DLSS Quality at that res is noticeably muddier. It sure does drop my power consumption when RT is on, though!

kliras
Mar 27, 2021
df's playthrough of tlou was also a good example of why we shouldn't arbitrarily name a texture quality tier "medium" when it looks like absolute garbage. companies would sooner name something "very high" despite no meaningful difference than label things "low, lower, very low" etc

shrike82
Jun 11, 2005

lol how would you objectively demarcate quality tiers

repiv
Aug 13, 2009

they could at least make it clear which setting corresponds to what the console version uses

pyrotek
May 21, 2004



shrike82 posted:

lol how would you objectively demarcate quality tiers

For that game, they could use "PS5" for high and ultra, since those look exactly the same as each other and as the console, then "PS3" for medium and "PS2" for low. Poor PS4 just gets skipped.

The medium textures are really, really bad, and low is only slightly worse than that.

Josh Lyman posted:

After many hours of pained deliberation, I have decided to return my 4070 Ti tomorrow. Hogwarts Legacy and TLOU might be poorly optimized games, but for a card I'm keeping for at least 4 years until the RTX 60 series and possibly 8 years until the 80 series (that's how long I had my 970), 12GB VRAM just doesn't make sense when the 7900 XT has 20GB and (slightly) faster rasterization for the same price.

RT may be useful at some point, but not until the PS6/Xbox has powerful enough hardware, which will be in 4-5 years based on the PS4->PS5 time gap, plus a lag for devs to start targeting that hardware. Also, Fortnite RT performs comparably on both architectures, so future games using Unreal Engine 5 may be similar.

Supposedly DLSS looks better than FSR but I haven't tried it and FPS is similar. For $800, I expect to not have to use upscaling for 1440p. Maybe that will change in 4-5 years with more demanding games or if I upgrade to 4K monitors, but neither GPU is really appropriate for 4K and future versions of FSR may close the visual quality gap.

I think that is the right choice. $800 is a lot of money for a GPU, and it really should last a while. The VRAM issue is real; too many games are already going over 12GB.

I like RT more than you do, but you are correct that it won't fully come into its own until the PS6/Xbox Next comes around and can hopefully handle path tracing, at which point games will be designed around RT from the ground up.

It really is worth trying DLSS, though. It can actually look better than native in many ways, although some games have issues with ghosting and the like depending on the title and the DLSS version used, and it looks better the higher the output resolution and framerate.

Frame generation is less useful; it mostly makes sense if you are already getting 60+ fps without it, are happy with the latency and don't mind a minor hit to it, turning it on won't push the game above the VRR range of your monitor, and you just want a bit more perceptual smoothness for free. Even then you have to hope the game renders UI elements in a way that frame generation plays nicely with, or the UI artifacts and added latency might hurt more than the extra FPS helps.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

repiv posted:

it's compounded by games increasingly relying on shortcuts that assume TAA will clean them up, like replacing (expensive) alpha transparency with (cheap) dither patterns that get blended into approximate transparency by TAA

in those cases if you force AA off it just looks like straight dogshit

Diablo 2 Resurrected does this for fur, and then the Switch port can't clean it up, so it ends up looking like static
