Carecat
Apr 27, 2004

Buglord
Yeah one is a custom loop one is an AIO. AIO is incredibly easy to install as long as your case has a suitable location to put it.

I don't think we should recommend a custom loop but that's kind of a small radiator for a 250W card?

You shouldn't buy a 2XXX series card right now until we know what the 3XXX cards are like.


Zarin
Nov 11, 2008

I SEE YOU

Carecat posted:

Yeah one is a custom loop one is an AIO. AIO is incredibly easy to install as long as your case has a suitable location to put it.

I don't think we should recommend a custom loop but that's kind of a small radiator for a 250W card?

You shouldn't buy a 2XXX series card right now until we know what the 3XXX cards are like.

Sorry for the confusion: I'm just using these as examples; I have no plans to purchase a 20xx card. I'm just planning on going with EVGA 3070 or 3080, so I assume that we'll get similar cooling options on those.

I was also wondering about radiator size (and VRAM being air cooled) - would that "hybrid" AIO version even be worth it? Or would I be better off with a 3-fan air cooled in that case?

And for my other sorta-unasked question, just plugging in 2 PEX hoses is a pretty difficult operation, then? (I know there's more to it than that, of course; just wondering if whatever I might get out of using the fully-watercooled one wouldn't be worth the extra hassle over either the AIO or a 3-fan)

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
Water cooling requires a couple of things:

the water block that makes contact with the thing you're cooling (say, the GPU die), so that heat is transferred from the die, to the block
the liquid that takes the heat and transfers it to the radiator
the radiator, which is where the heat is dissipated out into the environment
the fan for the radiator, which helps the heat dissipate
the pump, which moves the liquid around in this loop

AIO means "All In One": a fan is already connected to the radiator, and the radiator, the pump, and the water block are all already connected to each other with tubing, and there's already liquid inside the tubes.

In the case of that first card you linked, the water block is even already attached to the die, since you're buying the whole card plus the cooling solution in one go. In cases where AIOs are sold separately, you'd have to mount the water block to the GPU (or the CPU) yourself.

And yes, it's a "hybrid" solution in that there's also a fan that's used to air-cool the VRMs and the rest of the card

___

In the case of that second card you linked, there is a water block that makes contact with everything on the card that needs to be cooled, and then the card has mounting holes for tubing. The idea there is that you have a separate water-cooling loop, and you can "just" integrate the card into being part of the loop, but it assumes that you provide the whole rest of whatever that set-up is going to be.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Zarin posted:

Some questions about watercooling, if I may:

This card looks like it comes with a small radiator that cools the GPU, and the VRAM is air-cooled. Is this considered an AIO, or is it somewhere in-between?

However, this card appears to be a fully-watercooled option, but doesn't appear to come with a radiator or hoses. I imagine I would have to source those myself? And if I did that, would that be considered a "custom loop" at that point?

I'm mostly interested in the terminology to see if I'm using it correctly; either option is probably within the realm of my mechanical aptitude, and I could probably be talked into either when the 3080 drops, depending on what's available.

Is one cooling solution above noticeably better than the other?

The first SKU you linked is a card with an AIO attached, yes. Most people mount that unit to the exhaust fan spot on their case, but if you use a large air cooler, you're going to want to make sure you have decent clearance between the HSF and AIO. The fan on the card is largely there to cool the subcomponents like the voltage regulation circuitry.

The *second* SKU is very much a "custom loop" card. Most people who go with a custom loop prefer to source their own waterblocks rather than buy them pre-installed, because they're sticklers about what TIM they want to use and buying such a card would just entail them taking off the included waterblock to "do it right."

Speaking plainly, the AIO isn't the most elegant of solutions but it is the easiest. A proper custom loop system can easily cost "very good monitor" money, and what you gain in thermals and performance is largely outweighed by losing the ability to easily service your system without having to work around the tubing. And this is assuming you go with *soft* tubing and don't go full nutjob with hard/rigid conduits.

Zarin
Nov 11, 2008

I SEE YOU

gradenko_2000 posted:

Water cooling requires a couple of things:

the water block that makes contact with the thing you're cooling (say, the GPU die), so that heat is transferred from the die, to the block
the liquid that takes the heat and transfers it to the radiator
the radiator, which is where the heat is dissipated out into the environment
the fan for the radiator, which helps the heat dissipate
the pump, which moves the liquid around in this loop

AIO means "All In One": a fan is already connected to the radiator, and the radiator, the pump, and the water block are all already connected to each other with tubing, and there's already liquid inside the tubes.

In the case of that first card you linked, the water block is even already attached to the die, since you're buying the whole card plus the cooling solution in one go. In cases where AIOs are sold separately, you'd have to mount the water block to the GPU (or the CPU) yourself.

And yes, it's a "hybrid" solution in that there's also a fan that's used to air-cool the VRMs and the rest of the card

___

In the case of that second card you linked, there is a water block that makes contact with everything on the card that needs to be cooled, and then the card has mounting holes for tubing. The idea there is that you have a separate water-cooling loop, and you can "just" integrate the card into being part of the loop, but it assumes that you provide the whole rest of whatever that set-up is going to be.

Ah, alright! Guess I forgot about the pump and all the other stuff too, yeah. Okay, so that second one would be a more serious commitment, gotcha. Not sure I'm quite ready for that level of effort, hmm.

I guess I was sort of expecting one of the AIO offerings to be the full-waterblock one, but pre-packaged like the hybrid. I don't see that on EVGA's 20xx series, though. Not sure if anything like that ever existed, either.

So I guess the big question would be if the "hybrid" would be that much more efficient at dumping heat than 3-fan air, hmm.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

interesting that the PS5 (from what's known) looks least equipped to make use of this type of technology

Zarin
Nov 11, 2008

I SEE YOU

BIG HEADLINE posted:

The first SKU you linked is a card with an AIO attached, yes. Most people mount that unit to the exhaust fan spot on their case, but if you use a large air cooler, you're going to want to make sure you have decent clearance between the HSF and AIO. The fan on the card is largely there to cool the subcomponents like the voltage regulation circuitry.

The *second* SKU is very much a "custom loop" card. Most people who go with a custom loop prefer to source their own waterblocks rather than buy them pre-installed, because they're sticklers about what TIM they want to use and buying such a card would just entail them taking off the included waterblock to "do it right."

Speaking plainly, the AIO isn't the most elegant of solutions but it is the easiest. A proper custom loop system can easily cost "very good monitor" money, and what you gain in thermals and performance is largely outweighed by losing the ability to easily service your system without having to work around the tubing. And this is assuming you go with *soft* tubing and don't go full nutjob with hard/rigid conduits.

Hah, I'm suddenly having flashbacks to the post from last week about how the cycle starts and ends with "Good Air Cooling" :v:

For where I'm at right now, the decision would be between whatever AIO is offered and a 3-fan air-cooled solution, then. Guess I'll take any guidance on the matter!

(For reference, when I'm not gaming, I run the system at 100% doing Folding@Home stuff, so cooling is nice. I've noticed that on my custom fan curve, my 3-fan 970 is slowly requiring more and more fan speed % to keep around 70C; I suspect it probably needs to be re-pasted at this point, which is something I may or may not tackle after I have a new 30xx card.)

I've done piping for industrial systems (mostly hydraulics) so I sort of have an idea of what kind of pain-in-the-rear end shitshow it is; it's also something I can see myself tackling in a decade or so once I feel like I have reasonable amounts of disposable income :P

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Zarin posted:

So I guess the big question would be if the "hybrid" would be that much more efficient at dumping heat than 3-fan air, hmm.

Those huge 3-fan kinds of coolers are entirely dependent on how efficient your case and case fans are at actively and passively exhausting the heated air within the case through positive pressure. By pushing cooler ambient air into the case, the warm air inside the case is pushed out actively through the back and passively through engineered vents.

When a case is small or doesn't have the means to generate that positive pressure, that's really the only application for "blower" coolers that exhaust directly outward. The vents on the retention bracket of those three-fan coolers are passive exhaust holes.

Would the Hybrid dump heat better? Yes. Initially. Watercooling gives you great numbers, but the caveat is that liquid takes a lot longer to cool back down than metal does. You're also adding another potential point of failure to your system, so if you do go with an EVGA Hybrid SKU, maybe consider dropping extra money on an extended warranty. EVGA lets you cover a card for 5 or 10 years and the cost is contingent on what you paid for the card.

Zarin
Nov 11, 2008

I SEE YOU

BIG HEADLINE posted:

Those huge 3-fan kinds of coolers are entirely dependent on how efficient your case and case fans are at actively and passively exhausting the heated air within the case through positive pressure. By pushing cooler ambient air into the case, the warm air inside the case is pushed out actively through the back and passively through engineered vents.

When a case is small or doesn't have the means to generate that positive pressure, that's really the only application for "blower" coolers that exhaust directly outward. The vents on the retention bracket of those three-fan coolers are passive exhaust holes.

Would the Hybrid dump heat better? Yes. Initially. Watercooling gives you great numbers, but the caveat is that liquid takes a lot longer to cool back down than metal does. You're also adding another potential point of failure to your system, so if you do go with an EVGA Hybrid SKU, maybe consider dropping extra money on an extended warranty. EVGA lets you cover a card for 5 or 10 years and the cost is contingent on what you paid for the card.

I suppose that's an important point - I have a penchant for large cases with big fans. I like to think I'm reasonably good at getting the balance right for high-throughput positive pressure in the case.

I didn't know about the extended warranty, though, that's interesting! I just looked it up, and it seems like it could be a good deal.

Riflen
Mar 13, 2009

"Cheating bitch"
Bleak Gremlin

Rinkles posted:

Could someone explain how DLSS works beyond machine learning black box technology, cause it sounds like magic. I'd think it was too good to be true if not for the evidence. Maybe an example of some artifacting mistake might show what it's trying to do.

DLSS is a superior form of Temporal Anti Aliasing (TAA).
TAA takes information about objects in motion from previous frames and upcoming frames and runs it through an algorithm run on the GPU's shaders, to produce an image that can reduce some unwanted artifacts that tend to be present in modern realtime graphics. It's essentially guessing what to draw using data available from inside the GPU at the time the game is running.

DLSS is doing that too, except that DLSS also has access to learned information about reference or 'ground truth' images from the game rendered at 32 samples per pixel. Basically how the game looks when super sampled 32 times. It uses this extra information to produce a final image that can contain information that isn't even present in the previous or upcoming frames.

The icing on the cake is that DLSS is so good at this, that you can have your GPU render the game at 960x540 and produce a better output image than 1920x1080 with TAA. As running the DLSS algorithm costs between 0.4 and 2.5 milliseconds depending on resolution and GPU used, it allows for a nice performance increase at the same time as improving image quality. These resolutions are one example. DLSS 2.0 supports up to 4x upscaling ratio, so 1920x1080 to 3840x2160 is also possible.

http://behindthepixels.io/assets/files/DLSS2.0.pdf
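A toy numpy sketch of the history-accumulation idea behind TAA described above. The function name, the blend factor, and the nearest-pixel reprojection are all illustrative simplifications, not any real engine's implementation:

```python
import numpy as np

def taa_accumulate(history, current, motion, alpha=0.1):
    """Blend the current frame into a reprojected history buffer.

    history: (H, W) accumulated result from previous frames
    current: (H, W) newly rendered (possibly jittered) frame
    motion:  (H, W, 2) per-pixel motion vectors in pixels (dy, dx)
    alpha:   weight of the new sample; small alpha = more smoothing
    """
    h, w = current.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Reproject: fetch where each pixel was in the previous frame
    prev_y = np.clip((ys - motion[..., 0]).round().astype(int), 0, h - 1)
    prev_x = np.clip((xs - motion[..., 1]).round().astype(int), 0, w - 1)
    reprojected = history[prev_y, prev_x]
    # Exponential moving average: most of the output comes from history
    return (1 - alpha) * reprojected + alpha * current
```

With zero motion this converges toward the temporal average of the jittered samples, which is where the extra sub-pixel detail comes from. Real TAA adds clamping/rejection of stale history, and that weighting step is exactly what DLSS 2.0 replaces with a learned model.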

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

Riflen posted:

DLSS is a superior form of Temporal Anti Aliasing (TAA).
TAA takes information about objects in motion from previous frames and upcoming frames and runs it through an algorithm run on the GPU's shaders, to produce an image that can reduce some unwanted artifacts that tend to be present in modern realtime graphics. It's essentially guessing what to draw using data available from inside the GPU at the time the game is running.

DLSS is doing that too, except that DLSS also has access to learned information about reference or 'ground truth' images from the game rendered at 32 samples per pixel. Basically how the game looks when super sampled 32 times. It uses this extra information to produce a final image that can contain information that isn't even present in the previous or upcoming frames.

The icing on the cake is that DLSS is so good at this, that you can have your GPU render the game at 960x540 and produce a better output image than 1920x1080 with TAA. As running the DLSS algorithm costs between 0.4 and 2.5 milliseconds depending on resolution and GPU used, it allows for a nice performance increase at the same time as improving image quality. These resolutions are one example. DLSS 2.0 supports up to 4x upscaling ratio, so 1920x1080 to 3840x2160 is also possible.

http://behindthepixels.io/assets/files/DLSS2.0.pdf

Remarkable



(pardon the imgur compression)

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
DLSS is the closest thing to actual magic we will get in a very long time. Two years ago the idea of a rendering technique that takes images lower than native resolution and makes them look even better and run even faster would have been dismissed as impossible and ludicrous.

Rinkles posted:

interesting that the PS5 (from what's known) looks least equipped to make use of this type of technology

That applies to both consoles. It's an Nvidia technology; Xbox and PlayStation use AMD GPUs. Sony has a patent for something that's similar to DLSS, but it didn't seem to appear in any of the PS5 footage shown so far.

Riflen
Mar 13, 2009

"Cheating bitch"
Bleak Gremlin

Rinkles posted:

Remarkable



(pardon the imgur compression)

I very much recommend watching the presentation when you have a spare 45 minutes. The temporal stability examples later on are very impressive.

https://www.youtube.com/watch?v=tMtMneugt0A

ConanTheLibrarian
Aug 13, 2004


dis buch is late
Fallen Rib

If Nvidia really want to aim for another leap forward for some hypothetical DLSS3, they should look at fixing the midground LOD pop in along the riverbed that's very apparent at the 11 minute mark in that video. It's far more noticeable than the stuff in the far distance that is highlighted in the vid.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

ConanTheLibrarian posted:

If Nvidia really want to aim for another leap forward for some hypothetical DLSS3, they should look at fixing the midground LOD pop in along the riverbed that's very apparent at the 11 minute mark in that video. It's far more noticeable than the stuff in the far distance that is highlighted in the vid.

That's got nothing to do with Nvidia or DLSS; that's down to a game developer decision. Level-of-detail models are as old as dinosaurs and a necessary evil; if they're too obvious or not subtle enough, that's a developer issue.

UE5 claims to solve the thorny issue of LODs with their nanite implementation. A handy demonstration of nanite can be seen here where the tree's number of triangles goes up and down depending on how close the camera is to it. After all why render more triangles than the display's pixels can see?

Zedsdeadbaby fucked around with this message at 11:29 on Aug 6, 2020

repiv
Aug 13, 2009

Zedsdeadbaby posted:

A handy demonstration of nanite can be seen here where the tree's number of triangles goes up and down depending on how close the camera is to it. After all why render more triangles than the display's pixels can see?

That's demonstrating how things work now in UE4, not how UE5/Nanite will work. You'll notice that the polygon count starts at ~5000 and instantly pops to ~1000 when the camera reaches a certain distance threshold.

The promise of Nanite is it can smoothly vary the polygon count to any value in order to maintain a specific polygon density per pixel at any distance (up to 1tri/pix at the highest quality setting but potentially lower for performance).
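The "maintain a specific polygon density per pixel" idea can be sketched with back-of-envelope arithmetic. The function below and every number in it are hypothetical, just to show the budget varying continuously with distance instead of jumping between discrete LODs:

```python
import math

def target_triangle_count(mesh_tris, screen_height_px, fov_y_rad,
                          object_size_m, distance_m, tris_per_pixel=1.0):
    """Triangle budget that keeps density roughly constant per pixel.

    Estimate how many pixels tall the object projects to at this
    distance, then budget about (pixels covered) * (target density),
    capped at the mesh's full triangle count.
    """
    # Fraction of the screen height the object covers at this distance
    frac = object_size_m / (2 * distance_m * math.tan(fov_y_rad / 2))
    pixels = max(1.0, frac * screen_height_px)
    budget = int(pixels * pixels * tris_per_pixel)  # treat coverage as ~square
    return min(mesh_tris, max(1, budget))
```

Because the budget is a smooth function of distance, there's no threshold where the count pops from ~5000 to ~1000 the way discrete LOD swaps do.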

TacticalHoodie
May 7, 2007

If I am being realistic and waiting a few months for the first-gen card issues to shake out, would New Year 2021 be a good time to get a 3000/Big Navi 2 series card without having to stake out websites waiting for stock? I am thinking of giving my 1080 Ti to my girlfriend to help keep her costs down on her new build and getting a current-gen card. I'm also looking at a new monitor, as the sleep issues on my TN Dell monitor are getting worse by the month; it has not failed yet, but monitor prices are still "pants on head" stupid at the moment in Canada.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Zarin posted:

I was also wondering about radiator size (and VRAM being air cooled) - would that "hybrid" AIO version even be worth it? Or would I be better off with a 3-fan air cooled in that case?

And for my other sorta-unasked question, just plugging in 2 PEX hoses is a pretty difficult operation, then? (I know there's more to it then that, of course; just wondering if whatever I might get out of using the fully-watercooled one wouldn't be worth the extra hassle from either the AIO or a 3-fan)

Smallish radiators are totally fine for a GPU, honestly. Big gently caress-off radiators are good for cooling the water back down as it passes, and that means a bigger temp differential between the water and the block, and that is helpful for cooling a CPU where the CPU package tends to struggle to get the heat out in the first place. GPUs with their giant dies have a much easier time with heat transfer, so the temp differential isn't nearly as important. I've got the 1080Ti EVGA Hybrid, and it works very, very well.

If you did want to go a custom loop route, plugging the hoses in isn't difficult at all. Getting everything routed the way you want it to look is usually the bigger challenge. In terms of functionality, a custom loop doesn't really get you much that you couldn't get with an AIO, other than the ability to look sweet as hell (which is a legitimate end goal unto itself). Cooling wise they're about the same.
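The point about radiator size and temperature differentials can be made concrete with a back-of-envelope steady-state estimate. The W/K dissipation figures below are rough illustrative guesses, not measured radiator specs:

```python
def steady_state_water_temp(power_w, rad_w_per_k, ambient_c=25.0):
    """At steady state the radiator must dump all the heat, so the
    coolant settles at ambient + power / (dissipation per kelvin)."""
    return ambient_c + power_w / rad_w_per_k

# Hypothetical figures: a slim 120 mm radiator ~10 W/K, a thick 360 mm ~30 W/K
small_rad = steady_state_water_temp(250, 10.0)  # 250 W GPU -> 50.0 C coolant
big_rad = steady_state_water_temp(250, 30.0)    # same GPU  -> ~33.3 C coolant
```

Even the small radiator holds the coolant around 50 °C, and a big GPU die sheds 250 W into 50 °C water without much trouble; a small, hot CPU die is the part that benefits from the extra differential a big radiator buys.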

BIG HEADLINE posted:

Would the Hybrid dump heat better? Yes. Initially. Watercooling gives you great numbers, but the caveat is that liquid takes a lot longer to cool back down than metal does. You're also adding another potential point of failure to your system, so if you do go with an EVGA Hybrid SKU, maybe consider dropping extra money on an extended warranty. EVGA lets you cover a card for 5 or 10 years and the cost is contingent on what you paid for the card.

Not sure what you mean here? Watercooling gives better numbers forever assuming you're not overloading it (which is tough to do). Sure, if you've heat-soaked the liquid by running full-tilt for a bit, that water is going to take a little to cool back down to ambient afterwards, but that doesn't do much in terms of the GPU/CPU, and if the GPU hangs out for an extra minute at ~40C before dropping back to ~32C, so what? The radiator will be dumping heat out of the case more efficiently than an open-fan cooler will in almost all cases.

The warranty thing is potentially worth it on two conditions: (1) you are the type to actually keep a card more than the standard 3 year warranty, and (2) you don't have a Citibank or similar card that automatically gives you a 2-year warranty extension. Most AIOs don't recommend using them for more than 5 years, so I wouldn't plan on keeping it much past that point regardless.

DrDork fucked around with this message at 14:28 on Aug 6, 2020

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Rinkles posted:

Could someone explain how DLSS works beyond machine learning black box technology, cause it sounds like magic. I'd think it was too good to be true if not for the evidence. Maybe an example of some artifacting mistake might show what it's trying to do.

Normal TAA works by accumulating sub-pixel samples across multiple frames. The problem is that not all the samples produce a “good” output, it just sort of generically mixes them and hopes for the best.

With dlss 2.0, nvidia applies a neural net model to determine weights, essentially choosing which samples to use in real-time based on the game data.

v1ld
Apr 16, 2012

Whiskey A Go Go! posted:

If I am being realistic and waiting a few months for the first-gen card issues to shake out, would New Year 2021 be a good time to get a 3000/Big Navi 2 series card without having to stake out websites waiting for stock? I am thinking of giving my 1080 Ti to my girlfriend to help keep her costs down on her new build and getting a current-gen card. I'm also looking at a new monitor, as the sleep issues on my TN Dell monitor are getting worse by the month; it has not failed yet, but monitor prices are still "pants on head" stupid at the moment in Canada.

Dunno about card pricing and availability in the future, but Thanksgiving to Christmas sales are absolutely a great time to buy a monitor.

E: When my old TN died, I bought a $90 1080p IPS (with 48-75Hz FreeSync over HDMI, woo hoo) to tide me over for 3 months till the end of year sales dropped prices on the 3440x1440p monitor I wanted. Came out way ahead.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Rinkles posted:

Remarkable



(pardon the imgur compression)

I know I've said it before, but it's a complete turn-around from DLSS 1, where it was a straight up joke. Nobody's joking about DLSS 2. What a seriously worthwhile piece of performance enhancing tech.

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

Zedsdeadbaby posted:

DLSS is the closest thing to actual magic we will get in a very long time. Two years ago the idea of a rendering technique that takes images lower than native resolution and makes them look even better and run even faster would have been dismissed as impossible and ludicrous.


That applies to both consoles. It's an Nvidia technology; Xbox and PlayStation use AMD GPUs. Sony has a patent for something that's similar to DLSS, but it didn't seem to appear in any of the PS5 footage shown so far.

image superresolution techniques using DL have been around for ages, it's only recently that it's become reasonable to implement them in real time. dlss goes further by including taa data like motion vectors and multiple previous frames. almost like an inverse of video compression

Malcolm XML fucked around with this message at 17:01 on Aug 6, 2020

repiv
Aug 13, 2009

Malcolm XML posted:

image superresolution techniques using DL have been around for ages, it's only recently that it's become reasonable to implement them in real time.

The offline super-resolution techniques are arguably still not reasonable in real time, that's what DLSS1 attempted to do and failed miserably.

DLSS1 was DL-guided TAA followed by DL upscaling that attempted to imagine extra details, DLSS2 is just DL-guided TAAU.

shrike82
Jun 11, 2005

Without seeing the implementation details, you can guess that it's some kind of convolutional auto-encoder where the training involves using high-resolution image + vectors deliberately downscaled and then reconstructed - for a DL problem, it's convenient in that you don't need to generate labelled samples.

The more interesting question is their claim of DLSS2 (vs. 1.) not having to be trained on a per-game basis. The question is whether their network was simply trained on a diverse enough universe of game images that the network can handle games it has never "seen" before. Or whether they've just streamlined the training process such that they can "fine-tune" the network on a per-game basis quickly. The former would be fairly unusual in that there tends to be a drop in reconstruction accuracy which would imply they've determined that there's an acceptable loss in fidelity which isn't actually that noticeable (e.g., some of the artifacting observed in DS).

And contrary to what's been said, there's no reason to think the next-gen consoles (esp. Microsoft's) won't be able to implement some form of DLSS in their games - perhaps through DirectML. Inferencing through Nvidia's tensor cores has some benefits but there's no reason why they can't be done through generic GPU cores albeit with some loss of performance.
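The "no labelled samples needed" property described above can be sketched in a few lines: any high-resolution frame becomes a training pair just by deliberately downscaling it. A numpy box filter stands in here for whatever filtering is actually used, and the image is single-channel for brevity:

```python
import numpy as np

def make_training_pair(hi_res, factor=2):
    """Self-supervised super-resolution pair: the input is the image
    deliberately downscaled by `factor`, the target is the original.
    Any rendered frame is a sample; no hand-labelling involved."""
    h, w = hi_res.shape[:2]
    h, w = h - h % factor, w - w % factor   # crop to a multiple of factor
    target = hi_res[:h, :w]
    # Box-filter downscale: average each factor x factor block
    low = target.reshape(h // factor, factor,
                         w // factor, factor).mean(axis=(1, 3))
    return low, target
```

A network trained on such pairs learns to invert the downscaling, which is why the open question is only whether the training set of games was diverse enough to generalize, not where the labels come from.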

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
DLSS2 rendering better than native resolution just makes me more sure than ever that we're in a simulation on some alien child's holo-desktop and he's controlling trump just to gently caress with us like some dude playing Crusader Kings.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

shrike82 posted:

Without seeing the implementation details, you can guess that it's some kind of convolutional auto-encoder where the training involves using high-resolution image + vectors deliberately downscaled and then reconstructed - for a DL problem, it's convenient in that you don't need to generate labelled samples.

The more interesting question is their claim of DLSS2 (vs. 1.) not having to be trained on a per-game basis. The question is whether their network was simply trained on a diverse enough universe of game images that the network can handle games it has never "seen" before. Or whether they've just streamlined the training process such that they can "fine-tune" the network on a per-game basis quickly. The former would be fairly unusual in that there tends to be a drop in reconstruction accuracy which would imply they've determined that there's an acceptable loss in fidelity which isn't actually that noticeable (e.g., some of the artifacting observed in DS).

And contrary to what's been said, there's no reason to think the next-gen consoles (esp. Microsoft's) won't be able to implement some form of DLSS in their games - perhaps through DirectML. Inferencing through Nvidia's tensor cores has some benefits but there's no reason why they can't be done through generic GPU cores albeit with some loss of performance.

Read the presentation linked earlier. Specifically the "SPATIAL-TEMPORAL SUPER SAMPLING" part.

Basically, it works differently from before, where it would actually auto-encode the output image directly. The reason DLSS 2.0 is game-agnostic is because they've moved to a traditional TAA style upsampling algorithm but using the neural network to weight the samples and discard the invalid samples - so what they are deep learning is now a filter for the input to a TAA-style reconstruction algorithm. So the filter looks at the motion data for the samples and so on and chooses which of them to input to the TAA, and is specifically designed to be temporally stable to avoid shimmering/etc.

shrike82
Jun 11, 2005

That's interesting but they don't detail (whether in that presentation or anywhere else) some metric of reconstruction accuracy/loss.
Like I said, there's probably some loss of fidelity in reconstruction but as humans, we tend not to notice it - as long as the stream of output images are consistent across time. That's why I tend to find DF-style sampling of videos of DLSS on/off limited and subjective.

It's not just out of academic interest as a data scientist - them discussing reconstruction losses would be interesting in:
1) quantifying the trade-offs between the different SS factors (540P, 1080P etc.)
2) identifying games/scene-types which DLSS does better or worse with
3) quantifying future improvements in accuracy with future versions of DLSS etc.

We're likely going to see researchers make further use of human deficiencies in perception to improve game performance (e.g., foveation for VR) which is neat.
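One of the standard per-frame reconstruction metrics being asked for here is PSNR; a minimal version, for single-channel float images, looks like this:

```python
import numpy as np

def psnr(reference, reconstruction, max_val=1.0):
    """Peak signal-to-noise ratio in dB; higher means the
    reconstruction is closer to the reference image."""
    mse = np.mean((np.asarray(reference) - np.asarray(reconstruction)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(max_val ** 2 / mse)
```

As noted above, a per-frame number like this misses temporal stability entirely: every frame can score well individually while the sequence still shimmers, which is why frame-to-frame error on static content would be a useful companion metric.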

shrike82 fucked around with this message at 19:01 on Aug 6, 2020

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

Taima posted:

DLSS2 rendering better than native resolution just makes me more sure than ever that we're in a simulation on some alien child's holo-desktop and he's controlling trump just to gently caress with us like some dude playing Crusader Kings.

2020 is proof that the simulation is breaking down. Nothing matters anymore. Embrace anarcho-nihilism before they reboot the system

repiv
Aug 13, 2009

yikes, so much for Death Stranding's port boding well for Horizon Zero Dawn

it's pretty clear they were ported independently of each other

https://www.youtube.com/watch?v=7w-pZm7Zay4

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
Worth keeping in mind that Death Stranding was developed with the PC version in mind from the beginning.

And maybe I'm wrong, but in most situations, Horizon seems like a more demanding game. Not to say the port couldn't have been better.

repiv
Aug 13, 2009

yeah horizon has more going on but that doesn't excuse frame pacing and stuttering issues, animations running at 30fps regardless of framerate, higher ambient occlusion settings looking worse, etc

pyrotek
May 21, 2004



repiv posted:

yikes, so much for Death Stranding's port boding well for Horizon Zero Dawn

it's pretty clear they were ported independently of each other

https://www.youtube.com/watch?v=7w-pZm7Zay4

I was wondering why this game didn't have DLSS support like Death Stranding, and I think this video hints at why. I have to imagine all the parts that are limited to 30 FPS keep the temporal reconstruction from working correctly.

Mercrom
Jul 17, 2009
It seems much more likely the reason is that they didn't want to spend any more money or time on the pc port than they absolutely needed to.

shrike82
Jun 11, 2005

I’m looking forward to the PC port of Halo Infinite - that should be another clusterfuck.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Rinkles posted:

Worth keeping in mind that Death Stranding was developed with the PC version in mind from the beginning.

Yeah, this is probably a big part of it. H:ZD came out three years ago on the PS4 as an exclusive, so that it's taken this long to get to the PC is a pretty solid indication that it wasn't at all designed with a PC port in mind. As close as consoles are to PCs these days, console-exclusives often do all sorts of funky stuff to make them run better on the limited hardware which almost invariably ends up in a sub-optimal port.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

repiv posted:

yikes, so much for death strandings port boding well for horizon zero dawn

it's pretty clear they were ported independently of each other

https://www.youtube.com/watch?v=7w-pZm7Zay4

I wonder if this was simply because Nvidia has such an aggressive and well-funded partner team, they were able to lend a lot more help in terms of making the game run as well as possible. Of course that's speculation, but this is a borderline disaster... no DLSS and a lovely port. drat, how disappointing.

If anything, Nvidia is probably tripling down on their partner program because they are going to want every game to have DLSS.


Zedsdeadbaby posted:

2020 is proof that the simulation is breaking down. Nothing matters anymore. Embrace anarcho-nihilism before they reboot the system

I just texted a buddy last night that if the world ends before I can get this 3080Ti, I'm going to be pretty salty about it. Which in retrospect, wow are my priorities hosed up.

repiv
Aug 13, 2009

DrDork posted:

As close as consoles are to PCs these days, console-exclusives often do all sorts of funky stuff to make them run better on the limited hardware which almost invariably ends up in a sub-optimal port.

Yeah, the heavy PCIe traffic might be due to some system(s) that were designed assuming they would always run on a shared-memory architecture.

ufarn
May 30, 2009
Another win for team "never preorder". Man, I was looking so much forward to playing this upscaled at 144+ FPS.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

Taima posted:

I just texted a buddy last night that if the world ends before I can get this 3080Ti, I'm going to be pretty salty about it. Which in retrospect, wow are my priorities hosed up.

That's actually a healthy coping mechanism

I'm personally jonesing for an entirely new system altogether at the end of the year, gimme that juicy zen3 and ampere goodness


DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Zedsdeadbaby posted:

I'm personally jonesing for an entirely new system altogether at the end of the year, gimme that juicy zen3 and ampere goodness

I'm pretty evenly divided between "jump on the Zen3 train" and "ride the 5820k 'till DDR5 is reasonable."
