gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
Thank you for the responses

Arrath posted:

Pixar movies have been fully ray/path traced for nearly a decade now, IIRC.

and speaking of adorable ray-traced CGI, Lego Builder's Journey is the free game on Epic Games for today. Go grab it and put your GPU through its paces

New Zealand can eat me
Aug 29, 2008

:matters:


steckles posted:

I honestly don't think deep learning is going to find too much use in graphics in the near to medium term beyond running relatively small pre-trained networks, like DLSS does, or for denoising. All the image generation networks are amazing, but they're also inscrutable black boxes that would be tough to art direct. I'd love to see somebody make a game using NeRFs though. That would be awesome.

I'm not sure I agree; things like ReSTIR DI (Reservoir Spatio-Temporal Importance Resampling) by themselves seem to indicate the opposite. Any time you can throw Monte Carlo at a problem and see gains like this, a more efficient educated guess is right around the corner

E: and I think most of the gains to be had wrt art direction at this point come from eliminating limitations like the size of the spaces you can light at a given amount of detail

New Zealand can eat me fucked around with this message at 02:32 on Dec 22, 2022

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

gradenko_2000 posted:

and speaking of adorable ray-traced CGI, Lego Builder's Journey is the free game on Epic Games for today. Go grab it and put your GPU through its paces

Thanks for this. The game looks moderately interesting, but I wasn't likely to buy it.

steckles
Jan 14, 2006

New Zealand can eat me posted:

I'm not sure I agree; things like ReSTIR DI (Reservoir Spatio-Temporal Importance Resampling) by themselves seem to indicate the opposite. Any time you can throw Monte Carlo at a problem and see gains like this, a more efficient educated guess is right around the corner
ReSTIR isn't a deep learning algorithm. It can be combined with DL-based denoising/image reconstruction and maybe neural path guiding, but it's a "classic" light transport algorithm that plays nice with GPUs. It's not even that impressive against PSSMLT or Path Guiding for complex scenes.
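To make the "classic algorithm" point concrete, the heart of ReSTIR DI is resampled importance sampling via a weighted reservoir, which involves no networks at all. A minimal single-pixel sketch in Python; the lights and the target function here are toy stand-ins invented for illustration, not anything from the actual paper:

```python
import random

# Minimal resampled importance sampling (RIS) with a weighted reservoir,
# the building block of ReSTIR DI. The lights and target function are
# toy stand-ins, not the paper's implementation.

random.seed(42)

# Toy scene: point lights as (intensity, squared distance to shading point).
lights = [(10.0, 1.0), (1.0, 0.5), (50.0, 25.0), (5.0, 4.0)]

def target_pdf(light):
    """Unnormalized importance of a light: intensity over squared
    distance (a real renderer would fold in BRDF, visibility, etc.)."""
    intensity, dist2 = light
    return intensity / dist2

class Reservoir:
    def __init__(self):
        self.sample = None  # surviving candidate
        self.w_sum = 0.0    # running sum of resampling weights
        self.count = 0      # candidates seen

    def update(self, candidate, weight):
        self.w_sum += weight
        self.count += 1
        # Keep the newcomer with probability weight / w_sum: after the
        # stream ends, each candidate survives proportionally to its weight.
        if random.random() < weight / self.w_sum:
            self.sample = candidate

def ris_pick(num_candidates=8):
    """Stream cheap candidates through a reservoir, keep one good one."""
    r = Reservoir()
    for _ in range(num_candidates):
        light = random.choice(lights)       # source pdf: uniform
        source_pdf = 1.0 / len(lights)
        r.update(light, target_pdf(light) / source_pdf)
    return r

r = ris_pick()
print("chosen light:", r.sample, "after", r.count, "candidates")
```

The spatio-temporal part is then just merging reservoirs across neighbouring pixels and previous frames - bookkeeping and reuse, not learning.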

Edit: To clarify further: Stochastic algorithms, like ReSTIR or TAA, and Deep Learning/whatever passes for AI these days have very little to do with one another. Stochastic techniques have a real part to play in network training, and the application of carefully formed noise is integral to content generation with trained networks, but just because an algorithm can adapt to its input, like almost every light transport algorithm invented in the last 25 years does, doesn't make it AI or Deep Learning. I'm sure NVidia doesn't mind the public's confusion about where one ends and the other begins, but some DL algorithms are complementary to light transport, rather than a requirement for it.

Edit2: One really cool thing that DL can do for path tracing is to compress extremely complicated BSDFs into very small networks. Super impressive stuff, but the training happens off-line and the CPU/GPU only needs to evaluate the final network.
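As a sketch of what "evaluate the final network" means at runtime: once training is done offline, a compressed BSDF is just a couple of small matrix multiplies. The weights below are random placeholders standing in for trained ones; the shape of the computation is the point:

```python
import numpy as np

# Toy "neural BSDF": a tiny fixed MLP mapping (incoming, outgoing)
# directions to RGB reflectance. The weights are random placeholders;
# in practice they'd be fit offline to a measured or simulated BSDF,
# and only this cheap forward pass would ship in the renderer.

rng = np.random.default_rng(0)
W1, b1 = 0.5 * rng.normal(size=(6, 32)), np.zeros(32)
W2, b2 = 0.5 * rng.normal(size=(32, 3)), np.zeros(3)

def eval_neural_bsdf(wi, wo):
    """wi, wo: unit direction vectors. Returns an RGB reflectance."""
    x = np.concatenate([wi, wo])          # 6 input features
    h = np.maximum(W1.T @ x + b1, 0.0)    # hidden layer, ReLU
    return 1.0 / (1.0 + np.exp(-(W2.T @ h + b2)))  # squash to [0, 1]

wi = np.array([0.0, 0.0, 1.0])            # light from straight above
wo = np.array([0.577, 0.577, 0.577])      # viewer at a grazing-ish angle
print(eval_neural_bsdf(wi, wo))
```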

steckles fucked around with this message at 03:29 on Dec 22, 2022

New Zealand can eat me
Aug 29, 2008

:matters:


That's why I said the second sentence

repiv
Aug 13, 2009

there doesn't appear to be anything like that on the horizon in any case

the next obvious step for ML in realtime rendering is to apply it to denoising; there's plenty of precedent for that in offline rendering, even from nvidia, but shrinking it down to realtime constraints has eluded researchers so far

New Zealand can eat me
Aug 29, 2008

:matters:


The paper itself highlights this as the obvious next step (upscaling lower-resolution samples and then denoising them), given the extent to which this approach benefits from temporal sampling

I'm not sure that this shadertoy implementation is 100% correct, but it's very impressive nonetheless

E: You can gently caress with the values in the Common tab to increase the number of samples/resolution, and also the number of previous frames it borrows from. If you crank them it starts to look a little muddy, but the detail is wild for the effort

Double edit: The updated version is way cooler

New Zealand can eat me fucked around with this message at 03:33 on Dec 22, 2022

Yaoi Gagarin
Feb 20, 2014

gradenko_2000 posted:

Thank you for the responses

and speaking of adorable ray-traced CGI, Lego Builder's Journey is the free game on Epic Games for today. Go grab it and put your GPU through its paces

When I took my first computer graphics class back in university, we got to pick our final project for that quarter. A number of us did some form of raytracer. I wrote a pretty naive raytracer in a couple of weeks, another student did a photon mapping engine, one did a path tracer. The principles are well understood and simple enough that even a student can generate really good looking images. There's even a series of very short books that can walk you through writing your own: https://raytracing.github.io/

All the difficulty in video games is because of the real-time aspect. You take any CS problem and say "ok now do it in less than 16 milliseconds" and you've made it way harder. Of course even Pixar and VFX companies can't afford to be wasteful, but their constraint is "render 2 hours of footage in less than a year" so to some extent they can throw more compute at making each frame look good.
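In the spirit of those books, the core primitive a student raytracer is built on fits in a few lines: intersect a ray with a sphere via the quadratic formula and shade by the surface normal. A toy sketch (the scene numbers are made up):

```python
import numpy as np

# The primitive every student raytracer is built on: ray/sphere
# intersection via the quadratic formula, shaded by the surface normal.

def hit_sphere(origin, direction, center, radius):
    """Smallest positive t where origin + t*direction hits the sphere,
    or None if the ray misses."""
    oc = origin - center
    a = direction @ direction
    half_b = oc @ direction
    c = oc @ oc - radius * radius
    disc = half_b * half_b - a * c
    if disc < 0.0:
        return None
    t = (-half_b - np.sqrt(disc)) / a
    return t if t > 0.0 else None

# One ray shot straight down the -z axis at a unit sphere 3 units away.
origin = np.array([0.0, 0.0, 0.0])
direction = np.array([0.0, 0.0, -1.0])
center = np.array([0.0, 0.0, -3.0])

t = hit_sphere(origin, direction, center, 1.0)
if t is not None:
    normal = origin + t * direction - center   # already unit length, radius is 1
    print(f"hit at t={t:.2f}, normal-mapped colour:", 0.5 * (normal + 1.0))
```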

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

steckles posted:

Edit2: One really cool thing that DL can do for path tracing is to compress extremely complicated BSDFs into very small networks. Super impressive stuff, but the training happens off-line and the CPU/GPU only needs to evaluate the final network.

this reminds me of when spherical harmonics were the big thing on the gd-algorithms list for a few years. compress that lighting down to a few exponents or whatever
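For anyone who missed that era: the trick is projecting incoming light onto a few spherical harmonic basis functions, so an environment's diffuse lighting compresses down to a handful of floats. A rough Monte Carlo sketch of projecting a made-up sky onto the first four coefficients (bands 0 and 1); the basis constants are the standard ones:

```python
import numpy as np

# Project a toy environment light onto the first four real spherical
# harmonic basis functions by Monte Carlo integration over the sphere.
# Diffuse lighting then lives in just 4 floats.

rng = np.random.default_rng(1)

def sh_basis(d):
    """First four real SH basis functions evaluated at unit direction d."""
    x, y, z = d
    return np.array([
        0.282095,        # Y_0^0
        0.488603 * y,    # Y_1^-1
        0.488603 * z,    # Y_1^0
        0.488603 * x,    # Y_1^1
    ])

def environment(d):
    """Made-up sky: bright above the horizon, dim below."""
    return 1.0 if d[2] > 0.0 else 0.1

# Uniformly distributed directions on the unit sphere.
n = 50_000
dirs = rng.normal(size=(n, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)

coeffs = np.zeros(4)
for d in dirs:
    coeffs += environment(d) * sh_basis(d)
coeffs *= 4.0 * np.pi / n   # Monte Carlo estimate of the projection integral

print("SH coefficients:", coeffs)
# Reconstruct lighting in any direction from those 4 numbers alone:
print("reconstructed toward zenith:", coeffs @ sh_basis([0.0, 0.0, 1.0]))
```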

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
For a PCIe gen3-only PC, what are the cards to avoid? Just the 6500 XT, or also the 6600/XTs?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

steckles posted:

Shadows from fans and quickly moving objects in Portal RTX have an unrealistic, odd look to them that comes from the temporal reconstruction. We put up with it now because it's new and shiny, but nobody wants that. Honestly, we're probably at the dawn of a glorious new age of hacks that developers will create to bring, I dunno, "immediacy" back to ray traced lighting.

the ray volume is so low that pixels aren't being sampled enough. when you've got a raster with motion, and it's harmonic/periodic movement that you sample at a fixed rate, you're getting an aliased sample - basically the beat frequencies between the motion and the sampling will cause aliasing in the subpixel samples themselves.

edit: "propeller distortion" with rolling shutters is a visual demonstration of such temporal aliasing, or you can think of it as the temporal analog of moire, which is spatial aliasing. grates (whether temporal or spatial) that are sliiiightly offset tend to have weird moire artifacts.

really, pathtracing needs a lot more rays; it's still barely here (although to be fair portal does have some weird poo poo that makes it more complex, like non-euclidean space). like, basically another way of putting this problem is they've fallen below the nyquist rate of that area of the image given the high-frequency data being sampled (lots of motion) and are seeing aliasing as a result.
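you can see the beat-frequency folding with one line of arithmetic - sample a blade spinning at f Hz at a fixed frame rate and the apparent motion aliases into the Nyquist band (toy numbers below):

```python
# Wagon-wheel / temporal aliasing in one formula: motion at f_true Hz,
# sampled at fs frames per second, shows up folded into the Nyquist
# band [-fs/2, fs/2). Toy numbers, but the folding is the whole effect.

def apparent_hz(f_true, fs):
    return (f_true + fs / 2.0) % fs - fs / 2.0

fs = 30.0  # sample (frame) rate
for f in [5.0, 14.0, 16.0, 29.0, 30.0, 31.0, 47.0]:
    print(f"true {f:5.1f} Hz -> apparent {apparent_hz(f, fs):6.1f} Hz")
# 29 Hz reads as -1 Hz (a slow backwards spin), 30 Hz looks frozen,
# 31 Hz crawls forward at 1 Hz: you see the beat frequency, not the motion.
```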

there are of course ways you could address it better. More rays in that area, or use ML to direct the samples to the most effective places. Train DLSS (or use the optical flow engine) to identify harmonic motion and handle it differently (as a specific training case). Or just gimmick it up in those areas with some engine tricks.

I'm surprised I didn't see anyone bring up harmonics when the bug came up around those spinning lights glitching out coming through gratings... the grate adds a beat frequency/moire to the ray samples. If you're only sampling every N frames (and N is not small here, it's like 1/16 rays per pixel or something?) that can cause problems. They're already doing lots of tricks just to get this far.

Paul MaudDib fucked around with this message at 07:41 on Dec 22, 2022

Dr. Video Games 0031
Jul 17, 2004

What they're doing now is accumulating rays across multiple frames to build the picture, which is what's causing stuff like shadows to fade in and out when they spontaneously appear or disappear. The same happens with light. When a light source suddenly vanishes in Portal RTX, it fades out over a surprisingly long length of time (sometimes over half a second). Reusing rays across multiple frames is key to getting good performance since the raw ray count just isn't there yet, but the artifacts from this technique seem more apparent in Portal RTX than other games that use it. This will probably be a common facet among all RTX Remix mods, so you better get used to the look.
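The slow fade falls straight out of the accumulation math: if each frame blends a small fraction of new signal into mostly history, a vanished light decays exponentially over dozens of frames. A toy sketch - the blend factor here is invented, not whatever Portal RTX actually uses:

```python
# Exponential temporal accumulation, the usual way rays get reused
# across frames: each frame blends a fraction alpha of the current
# (noisy) signal into carried-over history. alpha = 0.1 is a made-up
# value for illustration, not what Portal RTX actually uses.

alpha = 0.1
fps = 60
history = 1.0  # accumulated brightness while the light was on

# The light source vanishes: the true per-frame signal is now 0.
for frame in range(1, 61):
    history = (1.0 - alpha) * history + alpha * 0.0
    if frame % 10 == 0:
        print(f"{frame / fps:.2f}s after switch-off: brightness {history:.3f}")
# Half a second (30 frames) later, ~4% of the old light still lingers,
# which is exactly the slow fade described above.
```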

Shipon
Nov 7, 2005

Dr. Video Games 0031 posted:

What they're doing now is accumulating rays across multiple frames to build the picture, which is what's causing stuff like shadows to fade in and out when they spontaneously appear or disappear. The same happens with light. When a light source suddenly vanishes in Portal RTX, it fades out over a surprisingly long length of time (sometimes over half a second). Reusing rays across multiple frames is key to getting good performance since the raw ray count just isn't there yet, but the artifacts from this technique seem more apparent in Portal RTX than other games that use it. This will probably be a common facet among all RTX Remix mods, so you better get used to the look.

I noticed this in the ray-traced edition of Metro Exodus too: the lighting has this weird, randomly pulsing, fading-in-and-out appearance, especially in darker regions.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Rinkles posted:

For a PCIe gen3-only PC, what are the cards to avoid? Just the 6500 XT, or also the 6600/XTs?

the RX 6600 XT has eight lanes of PCIe 4.0, and it is fine when running at PCIe 3.0
https://www.techpowerup.com/review/amd-radeon-rx-6600-xt-pci-express-scaling/13.html

EDIT: the RX 6600 non-XT also has eight lanes, and should be fine on PCIe 3.0



the RX 6500 XT has FOUR lanes of PCIe 4.0, and it is NOT FINE
https://www.techpowerup.com/review/amd-radeon-rx-6500-xt-pci-express-scaling/14.html



the RX 6400, similarly, has just four lanes of PCIe 4.0, and it is NOT FINE
https://www.techpowerup.com/review/amd-radeon-rx-6400-pci-express-30-scaling/12.html
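The lane math makes the difference plain: PCIe 3.0 runs 8 GT/s per lane and 4.0 runs 16 GT/s, so an x4 card dropped into a gen3 slot has about a quarter of the bandwidth of an x8 card in a gen4 slot. A quick back-of-envelope calculation:

```python
# Back-of-envelope PCIe bandwidth. Gen3 signals 8 GT/s per lane with
# 128b/130b encoding; gen4 doubles that. Real throughput is a bit
# lower after protocol overhead, but the ratios are what matter.

def gb_per_s(gen, lanes):
    gt_per_s = {3: 8.0, 4: 16.0}[gen]
    return gt_per_s * (128.0 / 130.0) / 8.0 * lanes  # bits -> bytes

configs = [
    ("RX 6600 XT, x8 in a gen4 slot", 4, 8),
    ("RX 6600 XT, x8 in a gen3 slot", 3, 8),
    ("RX 6500 XT / 6400, x4 in a gen4 slot", 4, 4),
    ("RX 6500 XT / 6400, x4 in a gen3 slot", 3, 4),
]
for name, gen, lanes in configs:
    print(f"{name:38s} ~{gb_per_s(gen, lanes):5.2f} GB/s")
# The x4 cards on gen3 end up at ~3.9 GB/s, a quarter of what an
# x8 card sees on gen4, which is why they hurt once VRAM spills.
```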

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
thanks

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

gradenko_2000 posted:

the RX 6400, similarly, has just four lanes of PCIe 4.0, and it is NOT FINE
https://www.techpowerup.com/review/amd-radeon-rx-6400-pci-express-30-scaling/12.html



it's so fuckin cheap though, if you just want the cheapest thing that's current-gen (formerly), it used to be down to $200 at microcenter and now it's down to like $130. for stuff that really needs a modern output with modern drivers, rx 6400 has always been old reliable.

this seems like a deal though and I'm not sure why nobody's latched onto it. I saw that last night and figured that'd be gone, that's an ok deal for a super cheap entry-level gpu. in a world where the entry-level price has gone way way up, $130 for a not-complete-trash-tier gpu isn't bad, no reason you can't do e-sports with that at 1080p or other less intensive stuff as well.

not a lot of ram of course, and it really does kinda need pcie 4 to shine, and there are no video encoders onboard (lolamd) but there's nothing else remotely in that price class right now. so like, if you have pcie 3.0 and the card loses some performance in highly drawcall-bottlenecked games or when you pop vram limits... what's the alternative? (And remember not every game is doom either...)

yes, you will have to keep VRAM usage under control because swapping will load up the bus, but everything is a cross-gen title that has to run on a base-tier xbox one S and ps4, so that's not impossible. turn down your textures and suck it up, it's a $130 card. There are fewer than 5 non-cross-gen games released for PS5 to date lol, there's nothing you can't play at min settings at 1080p 30fps even on a 6500 XT on pcie 3.0 with 4gb lol.

the whole "6500xt is trash!!!" thing was when it was $400. Yeah, it was trash at that price. $130 is different, I'm much more accepting of problems and limitations at a genuine entry-level price point.

Also, the 6500XT/6400 are actually on 6nm; it's interesting to look at the clock and cost differences vs the mainline RDNA2 chips in this light.

Paul MaudDib fucked around with this message at 08:17 on Dec 22, 2022

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
yeah okay I'm willing to concede that:

* it doesn't quite matter that the PCIe bus is capping your FPS if you're going to be CPU-bound...

* ... or if the GPU isn't going to be capable of that much more performance. TPU recorded 26.5 FPS for Cyberpunk 2077 with an RX 6500 XT when running on a PCIe 4.0 system, which goes down to 21.2 FPS for PCIe 3.0 - it might matter for Doom Eternal, but here? Maybe not

* for new [but still budget] systems, even an Alder Lake Pentium G7400 will support PCIe 4.0

Dr. Video Games 0031
Jul 17, 2004

Paul MaudDib posted:

it's so fuckin cheap though, if you just want the cheapest thing that's current-gen (formerly), it used to be down to $200 at microcenter and now it's down to like $130. for stuff that really needs a modern output with modern drivers, rx 6400 has always been old reliable.

this seems like a deal though and I'm not sure why nobody's latched onto it. I saw that last night and figured that'd be gone, that's an ok deal for a super cheap entry-level gpu. in a world where the entry-level price has gone way way up, $130 for a not-complete-trash-tier gpu isn't bad, no reason you can't do e-sports with that at 1080p or other less intensive stuff as well.

not a lot of ram of course, and it really does kinda need pcie 4 to shine, and there are no video encoders onboard (lolamd) but there's nothing else remotely in that price class right now. so like, if you have pcie 3.0 and the card loses some performance in highly drawcall-bottlenecked games or when you pop vram limits... what's the alternative? (And remember not every game is doom either...)

yes, you will have to keep VRAM usage under control because swapping will load up the bus, but everything is a cross-gen title that has to run on a base-tier xbox one S and ps4, so that's not impossible. turn down your textures and suck it up, it's a $130 card. There are fewer than 5 non-cross-gen games released for PS5 to date lol, there's nothing you can't play at min settings at 1080p 30fps even on a 6500 XT on pcie 3.0 with 4gb lol.

the whole "6500xt is trash!!!" thing was when it was $400. Yeah, it was trash at that price. $130 is different, I'm much more accepting of problems and limitations at a genuine entry-level price point.

Also, the 6500XT/6400 are actually on 6nm; it's interesting to look at the clock and cost differences vs the mainline RDNA2 chips in this light.

The alternative is to not buy a GPU that cheap, or to buy used. No one with a PCIe 3.0 system should, under any circumstances, buy an RX 6400 or 6500 XT. You really, really need to look at any other possible alternative, because those GPUs will be the absolute bottom of the barrel for you.

"Just turn down your textures and suck it up." How about you buy a decent GPU instead so you don't have to do that?

steckles
Jan 14, 2006

Paul MaudDib posted:

the ray volume is so low that pixels aren't being sampled enough. when you've got a raster with motion, and it's harmonic/periodic movement that you sample at a fixed rate, you're getting an aliased sample - basically the beat frequencies between the motion and the sampling will cause aliasing in the subpixel samples themselves.
Hmmm, I don't think that's what is going on in this case. More likely, because shadows don't have motion vectors, the temporal reconstruction algorithm can only use a simple differential brightness heuristic to determine which samples to include when reconstructing each pixel. The developers have obviously decided that smearing is preferable to noise, so they've likely got their non-motion-vector-based rejection criteria set to fairly inclusive levels. They could likely fix the issue, but not without introducing an unacceptable amount of noise into the image. A situationally dependent magic parameter with no physical basis that only lets you choose between two undesirable results sounds kind of like a… hack to me.

I think this issue is indicative of a broader problem facing all temporal reconstruction algorithms: You need to track a potentially infinite number of motion vectors per pixel to capture all of the dis/occlusions that could affect it during any frame. You've got a pixel receiving light in the presence of moving geometry? You need to store relative vectors for all the moving lights and geometry that could be affecting it. That pixel is also receiving some bounce light? Well now you also need to store the relative motion vectors for all the moving lights and geometry for all of the points that might contribute some illumination to that pixel. And so on. You basically need as many motion vectors as there are path vertices if you want to accurately handle dis/occlusion without smearing. I'm not sure what a path tracer that could take that into account would look like. I doubt it would look anything like ReSTIR though.
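A toy version of that brightness-based rejection knob shows why it only trades one artifact for another; the threshold and blend values here are invented for illustration:

```python
import random

# Toy non-motion-vector history rejection for temporal reconstruction.
# Shadows carry no motion vectors, so all the reconstruction can do is
# compare the new sample's brightness against history and decide how
# much to trust the past. The threshold is the magic parameter: set it
# loose and you smear, set it tight and you show raw noise.

def reconstruct(history, sample, threshold, alpha=0.1):
    if abs(sample - history) > threshold:
        return sample  # history rejected: responsive, but noisy
    return (1.0 - alpha) * history + alpha * sample  # kept: smooth, but smeary

random.seed(0)
for threshold in (1.5, 0.2):      # inclusive vs. strict rejection
    history = 1.0                 # the pixel was lit last frame
    shown = []
    for _ in range(8):            # a shadow arrives: true signal is 0
        sample = random.uniform(-0.3, 0.3)   # noisy 1-spp estimate
        history = reconstruct(history, sample, threshold)
        shown.append(f"{history:+.2f}")
    print(f"threshold {threshold}:", " ".join(shown))
# threshold 1.5 drags the stale shadow out over many frames (smearing);
# threshold 0.2 snaps immediately but displays the per-frame noise.
```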

kliras
Mar 27, 2021
shame about the inevitable price though

https://twitter.com/videocardz/status/1605853190989221889

SwissArmyDruid
Feb 14, 2014

by sebmojo
Yeah, I tried buying one of those, but the 30-series version: the MSRP isn't going down on those, nor are any to be had on the aftermarket. Everyone's keeping them because Noctua. Asus hit a loving home run.

SwissArmyDruid fucked around with this message at 12:28 on Dec 22, 2022

mobby_6kl
Aug 9, 2009

by Fluffdaddy

What, a $900 4070Ti? Can't fuckin wait.

orcane
Jun 13, 2012

Fun Shoe
Let's be reasonable.

mobby_6kl posted:

What, a $1200 4070Ti? Can't fuckin wait.

mcbexx
Jul 4, 2004

British dentistry is
not on trial here!



wargames posted:

also, won't nvidia handicap the 4060 because it might compete against other nvidia products, then release a cut-down 4070 that will be called 4060, then release a 4060 ti, then another 4060 with slightly different memory?

The 4050 will have 4 GB VRAM.

Well, 3.5 GB of fast VRAM and .5 GB of slightly slower VRAM.

mobby_6kl
Aug 9, 2009

by Fluffdaddy

mcbexx posted:

The 4050 will have 4 GB VRAM.

Well, 3.5 GB of fast VRAM and .5 GB of slightly slower VRAM.

There will also be another version of the 4050 with an actual 4 GB of VRAM, but it's DDR4

kliras
Mar 27, 2021
60ti tends to be the cut-off point for sane models; all kinds of weird, scummy stuff with different vram versions and whatnot happens below that

the 4080 12gb version threw somewhat of a wrench into that rule of thumb, but at least it was so egregious they unlaunched it

nitsuga
Jan 1, 2007

kliras posted:

60ti tends to be the cut-off point for sane models; all kinds of weird, scummy stuff with different vram versions and whatnot happens below that

the 4080 12gb version threw somewhat of a wrench into that rule of thumb, but at least it was so egregious they unlaunched it…

…and will sell it for just as much with a different label.

kliras
Mar 27, 2021
it's gonna be interesting to see how nvidia pull off announcing 4070/60 if they have to get the 70ti out the door to whichever reviews it's going to get. probably means the general release schedule slides quite a bit

PBCrunch
Jun 17, 2002

Lawrence Phillips Always #1 to Me

Paul MaudDib posted:

this seems like a deal though and I'm not sure why nobody's latched onto it. I saw that last night and figured that'd be gone, that's an ok deal for a super cheap entry-level gpu. in a world where the entry-level price has gone way way up, $130 for a not-complete-trash-tier gpu isn't bad, no reason you can't do e-sports with that at 1080p or other less intensive stuff as well.

not a lot of ram of course, and it really does kinda need pcie 4 to shine, and there are no video encoders onboard (lolamd) but there's nothing else remotely in that price class right now. so like, if you have pcie 3.0 and the card loses some performance in highly drawcall-bottlenecked games or when you pop vram limits... what's the alternative? (And remember not every game is doom either...)

yes, you will have to keep VRAM usage under control because swapping will load up the bus, but everything is a cross-gen title that has to run on a base-tier xbox one S and ps4, so that's not impossible. turn down your textures and suck it up, it's a $130 card. There are fewer than 5 non-cross-gen games released for PS5 to date lol, there's nothing you can't play at min settings at 1080p 30fps even on a 6500 XT on pcie 3.0 with 4gb lol.

the whole "6500xt is trash!!!" thing was when it was $400. Yeah, it was trash at that price. $130 is different, I'm much more accepting of problems and limitations at a genuine entry-level price point.

Also, the 6500XT/6400 are actually on 6nm; it's interesting to look at the clock and cost differences vs the mainline RDNA2 chips in this light.

The Intel Arc A380 is in the ballpark in performance in some games (check the ones you want to play), has more display outputs, 6 GB of VRAM, and modern media codec support.

The Arc does stumble quite a bit if your system doesn't support ReBAR, though.

SSJ_naruto_2003
Oct 12, 2012



kliras posted:

60ti tends to be the cut-off point for sane models; all kinds of weird, scummy stuff with different vram versions and whatnot happens below that

the 4080 12gb version threw somewhat of a wrench into that rule of thumb, but at least it was so egregious they unlaunched it

Did you forget the 970

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

I wish modern video cards were as good a value as the 970, hoo boy.

jokes
Dec 20, 2012

Uh... Kupo?

1080 was great too. I got mine for like $500 and had it for 5 years, sold it for $200

Shipon
Nov 7, 2005
remember when the 970 launched with only 3.5 GB of usable VRAM? and now people are pretending it was a great deal when it had glaring issues

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

I don’t remember the 3.5G thing being a practical issue with my 970, and I ran it for years including a bunch of VR development and testing. Maybe it caused problems for other people, though.

Theophany
Jul 22, 2014

SUCCHIAMI IL MIO CAZZO DA DIETRO, RANA RAGAZZO



2022 FIA Formula 1 WDC
I remember the 1080Ti being incredible value for money at the time and prior to that the 7970 being absolutely poo poo hot in crossfire.

And, as is to be expected, AMD is gradually getting their poo poo together with their 'ages like fine wine' strategy: https://arstechnica.com/gadgets/2022/12/amd-acknowledges-high-power-use-in-rx-7900-gpus-plans-driver-fixes/

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

This is so soon after release it’s more like just letting the wine breathe than aging it.

Wiggly Wayne DDS
Sep 11, 2010



Subjunctive posted:

I don’t remember the 3.5G thing being a practical issue with my 970, and I ran it for years including a bunch of VR development and testing. Maybe it caused problems for other people, though.
it wasn't, is the problem. history's been rewritten off of some synthetic tests moving 4GB of data on and off the vram, when in games nvidia's drivers were handling the edge cases for vram limits. that it took months for the problem to even be found speaks volumes about how it impacted the card in practice

really though it's not a fight worth arguing the facts over as the most outspoken about it have the least knowledge about how the cards function. the legal issue on the card being marketed with the wrong tech specs is a different story:

quote:

Specifically, plaintiffs alleged that the GTX 970 was misrepresented as being able to “(1) operate with a full 4 gigabytes of video random access memory, (2) have 64 render output processors, and (3) have an L2 cache capacity of 2megabytes, or omitted material facts to the contrary.”

Plaintiffs claimed that instead of 4 GB of video random access memory, the graphics card operated on 3.5 GB with a separate .5 GB spillover.

In addition, the lawsuits alleged that instead of the 56 render output processors that the GTX 970 advertised, the actual device contained 64 ROPs.
the, uh, write-ups get the ROP totals mixed up...

hobbesmaster
Jan 28, 2008

Subjunctive posted:

I don’t remember the 3.5G thing being a practical issue with my 970, and I ran it for years including a bunch of VR development and testing. Maybe it caused problems for other people, though.

Huh, how many pixels in the VR tests? I thought one of the problems was it just ate poo poo at 1440p.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

hobbesmaster posted:

Huh, how many pixels in the VR tests? I thought one of the problems was it just ate poo poo at 1440p.

Honestly I don't recall, but it was variable up to a bit beyond whatever the total pixel count was for the Rift CV1. I ran at 1440p though, on those sweet Korean overclockable monitors, and never knew anything was amiss.

mobby_6kl
Aug 9, 2009

by Fluffdaddy

Charles Leclerc posted:

I remember the 1080Ti being incredible value for money at the time and prior to that the 7970 being absolutely poo poo hot in crossfire.

And, as is to be expected, AMD is gradually getting their poo poo together with their 'ages like fine wine' strategy: https://arstechnica.com/gadgets/2022/12/amd-acknowledges-high-power-use-in-rx-7900-gpus-plans-driver-fixes/
Wasn't the Ti silly expensive? Maybe it was worth it at least; Pascal overall was pretty good value and got me to pony up more than the ~$200 I'd been spending on cards before, like the 9600 Pro and 8800 GT.


Subjunctive posted:

This is so soon after release it’s more like just letting the wine breathe than aging it.
Speaking of which, have any of the reviews gone back and checked whether the Arc drivers are getting fixed in the cases where it performed like poo poo?
