Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

MagusDraco posted:

The GTX 1660 Ti reviews dropped. It's pretty much a ~$280 GTX 1070 in performance, with 6 GB of GDDR6 RAM and no RTX features.

edit: Anandtech review
Gamers Nexus

edit: also it's pretty short. but still a 2 slot card.
That EVGA is a 3-slot design though, isn't it? Rather bizarre choice from EVGA, as that nixes it as an option for most mini-ITX builds when in terms of power/thermals it should be ideal.

edit: Nice to see the overclocking numbers though; my hope with the smaller chip was that it would overclock well, and it seems you get a healthy jump.

Happy_Misanthrope fucked around with this message at 16:10 on Feb 22, 2019

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

Happy_Misanthrope posted:

Why do you think MS started buying up studios left and right? Because they had the same idea - exclusivity is dead - then Sony lapped them with the PS4. God of War has sold over 6 million, Spider-Man has sold over 9 million. How do you think The Last of Us 2 will do? Perhaps 'fewer' exclusives than in years past, but they still matter a fuckton.

Only reason I bought a PS4 Pro was their exclusives - Sony has a bunch of great ones.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

B-Mac posted:

Only reason I bought a PS4 Pro was their exclusives - Sony has a bunch of great ones.

Only reason I turn mine on anymore. In its first two years, sure - there wasn't much there, and what exclusives it had were relatively weak. But there's a ton now, enough to justify at least the purchase of a Slim.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

I sold my PS4 because I never used it, but I wouldn't mind a Blu-ray player, maybe...


E: otoh I already have like 3 xbone controllers for my PCs

MagusDraco
Nov 11, 2011

even speedwagon was trolled

Happy_Misanthrope posted:

That EVGA is a 3-slot design though, isn't it? Rather bizarre choice from EVGA, as that nixes it as an option for most mini-ITX builds when in terms of power/thermals it should be ideal.

edit: Nice to see the overclocking numbers though; my hope with the smaller chip was that it would overclock well, and it seems you get a healthy jump.

Oh welp yeah. It's a 2.75 slot card.

edit: I keep having to fight the urge to upgrade to a PS4 Pro now that I have a really nice TV and still need to play most of the great Sony exclusives. Not really worth it for me to do that ~2 years before the new consoles, though.

MagusDraco fucked around with this message at 16:31 on Feb 22, 2019

headcase
Sep 28, 2001

what are the odds that game/driver devs start using RTX chips for raw processing power to improve other aspects of game performance in future iterations?

Inept
Jul 8, 2003

MagusDraco posted:

The GTX 1660 Ti reviews dropped. It's pretty much a ~$280 GTX 1070 in performance, with 6 GB of GDDR6 RAM and no RTX features.

edit: Anandtech review
Gamers Nexus

edit: also it's pretty short. but still a 2 slot card. 2.75 slot my mistake

The lack of competition for Nvidia is really terrible. It's $100 cheaper than the comparable card that came out almost 3 years ago, but it still has 2 GB less RAM.

eames
May 9, 2009

headcase posted:

what are the odds that game/driver devs start using RTX chips for raw processing power to improve other aspects of game performance in future iterations?

DLSS is their best attempt at that

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

Inept posted:

The lack of competition for Nvidia is really terrible. It's $100 cheaper than the comparable card that came out almost 3 years ago, but it still has 2 GB less RAM.

Turing is one perf/$ horror show after another.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

repiv posted:

Back to DLSS, they've already patched the Metro implementation:




https://www.reddit.com/r/nvidia/comments/at6wrg/metro_exodus_dlss_improvements/

Maybe they can salvage this after all.

Ah, a lot better. My first question whenever I see these is not how it compares to 1440p, but how it compares to an equivalent res that matches the DLSS performance impact - and it seems the Reddit guy did this, and DLSS comes out on top now:



I was all set to pick up a 1660 Ti, but if this can be the quality of DLSS going forward... dunno, 'close enough' to native 4K is pretty high on my list of priorities since I game on a 55" 4K TV.

headcase
Sep 28, 2001

So the answer is yes. There could be meaningful innovations that make good use of those chips in your run-of-the-mill competitive game.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

headcase posted:

So the answer is yes. There could be meaningful innovations that make good use of those chips in your run-of-the-mill competitive game.

Bear in mind DLSS is limited to a max of 60 fps, so 'competitive' will vary depending on what kind of display you own. If you're a BFV player on a 144Hz monitor, any significant DLSS improvement won't apply to you. It's perfect for a single-player game like Metro, though.

orcane
Jun 13, 2012

Fun Shoe
It completely depends on the game's specific implementation; also, having to train the algorithm per game + RTX mode + resolution + target framerate means it will be nothing but a gimmick for a long time.

Craptacular!
Jul 9, 2001

Fuck the DH
I still have a yikes moment when I look at the telephone pole and see how un-straightened it looks compared to the original. It's like a high-resolution version of PS2 jaggies or something.

I mean, yeah, you get more frames, but it has to look nicer than simply running at 1440p and getting your frames that way, and I don't think it achieves that.

headcase
Sep 28, 2001

Happy_Misanthrope posted:

Bear in mind DLSS is limited to a max of 60 fps, so 'competitive' will vary depending on what kind of display you own. If you're a BFV player on a 144Hz monitor, any significant DLSS improvement won't apply to you. It's perfect for a single-player game like Metro, though.

I get that this is not that thing, but there could be undiscovered innovations that do work at high frame rates (or just increase frame rates/reduce lag).

e: they have to know that is what we are paying for here.

headcase fucked around with this message at 19:26 on Feb 22, 2019

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Craptacular! posted:

I still have a yikes moment when I look at the telephone pole and see how un-straightened it looks compared to the original. It's like a high-resolution version of PS2 jaggies or something.

I mean, yeah, you get more frames, but it has to look nicer than simply running at 1440p and getting your frames that way, and I don't think it achieves that.

Yeah, those are the kind of artifacts you might sometimes see. The tree branches look quite good though, so one rendering quirk doesn't mean every element with a similar structure will share those artifacts.

Those images are showing comparisons with native 4K though, not 1440p.

edit: Here's a OneDrive link from a Reddit user showing Exodus at 4K with 80% scaling vs. 4K DLSS (upscaled from 1440p). I think the DLSS image looks noticeably better and closer to native 4K. Native 1440p is not remotely comparable now, whereas before this patch it was superior.

Happy_Misanthrope fucked around with this message at 19:35 on Feb 22, 2019

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
That they're even able to offer competitive quality at this point in the game, though, is actually pretty impressive. Yeah, it isn't perfect, yeah, it has limitations, and no, it's not as good as native. But if they can iterate on this for another 6-9 months while finishing off the 21-series, then DLSS 2.0 or whatever they'll release alongside the next round of cards might actually be pretty compelling.

lllllllllllllllllll
Feb 28, 2010

Now the scene's lighting is perfect!
Hm, maybe next year or so, when they hopefully sweeten the deal with an HDMI 2.1 port.

Stickman
Feb 1, 2004

headcase posted:

I get that this is not that thing, but there could be undiscovered innovations that do work at high frame rates (or just increase frame rates/reduce lag).

The options are either a) significantly increase the number of Tensor Cores, or b) significantly decrease the size of the neural network. (b) might be possible with a lot more work, but considering how much trouble they're having getting NNs out the door for titles, I suspect it's not going to be feasible any time soon, at least not without serious quality degradation. (a) is a possibility for future card generations if they get 7nm working properly.

Since DLSS operates on the final framebuffer using discrete hardware (and I don't believe it uses additional inputs like motion vectors), you could also presumably c) pipeline the process. The GPU could start rendering the next frame while the Tensor Cores do their magic on the current frame. This would effectively remove the frame rate hit of DLSS over the input resolution, so long as the DLSS computation is less than half of the total frame time. You'd still have the same frametime lag as the reduced frame rate, though, since it would still take [scene computation time] + [DLSS computation time] to get each frame out the door.
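
Back-of-the-envelope sketch of that pipelining argument, with made-up stage timings (purely illustrative placeholders, not measurements from any card):

code:
# Hypothetical stage timings in milliseconds -- placeholders, not benchmarks.
scene_ms = 12.0  # render one frame at the reduced input resolution
dlss_ms = 4.0    # Tensor Core upscale of that frame

# Serial: every frame pays both costs before the next frame starts.
serial_ms = scene_ms + dlss_ms
print(f"serial:    {1000 / serial_ms:.1f} fps, {serial_ms:.0f} ms to display")

# Pipelined: shaders start frame N+1 while the Tensor Cores finish frame N,
# so throughput is set by the slower stage -- but each frame still takes
# scene + DLSS time to reach the screen, so the lag stays.
pipelined_ms = max(scene_ms, dlss_ms)
print(f"pipelined: {1000 / pipelined_ms:.1f} fps, {serial_ms:.0f} ms to display")
With those placeholder numbers you go from 62.5 to 83.3 fps of throughput while latency stays at 16 ms - frame rate hit gone, frametime lag unchanged.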

E: Since it's primarily included to offset the raytracing performance hit, it makes sense that they didn't go with (a)'s significant cost increase. (b) is probably just technically infeasible. I'm honestly not sure why they didn't pipeline the processing, though. Maybe I'm missing something about how DLSS works? Presumably it can't require loading/unloading from VRAM for every frame or it'd be much more of a performance hit than it already is.

Stickman fucked around with this message at 19:50 on Feb 22, 2019

headcase
Sep 28, 2001

Surely all this sweet sweet vector processing could help with non-raytraced non-predictive "traditional" rendering too?

Stickman
Feb 1, 2004

headcase posted:

Surely all this sweet sweet vector processing could help with non-raytraced non-predictive "traditional" rendering too?

It can if you're running 4K and your framerate is sub-60 (which is pretty much the case at Ultra on any card except a 2080 Ti).

Stickman fucked around with this message at 20:11 on Feb 22, 2019

Stickman
Feb 1, 2004

Happy_Misanthrope posted:

Yeah, those are the kind of artifacts you might sometimes see. The tree branches look quite good though, so one rendering quirk doesn't mean every element with a similar structure will share those artifacts.

Those images are showing comparisons with native 4K though, not 1440p.

edit: Here's a OneDrive link from a Reddit user showing Exodus at 4K with 80% scaling vs. 4K DLSS (upscaled from 1440p). I think the DLSS image looks noticeably better and closer to native 4K. Native 1440p is not remotely comparable now, whereas before this patch it was superior.

Thanks for posting this!

For me, the biggest problem with the DLSS in the FFXV videos was the shimmering effect caused by aliasing of distant thin objects (or even just edges of small objects). You can see some indication of that in the still shots:

4K 80% vs. 4K DLSS


The wires and pipe/beam edges look blurry but smoother in the 80% rendering. The trade-off is that foreground objects look much better:

4K 80% vs. 4K DLSS


Obviously we'd need to see it in motion to determine how much the aliasing causes noticeable temporal effects, but I suspect there's still a tradeoff. On the other hand, all of our rendering techniques have some artifacts, so it's just a matter of which ones bother you the least!

Stickman fucked around with this message at 20:12 on Feb 22, 2019

SwissArmyDruid
Feb 14, 2014

by sebmojo
The rumor mill grinds apace:

http://www.jeuxvideo.com/news/1006598/projet-scarlet-les-deux-prochaines-xbox-devoilees-a-l-e3-2019.htm

quote:

The Xbox brand is quietly charting its future. In addition to its partnership with Nintendo, Xbox is expected to present two new machines at E3 2019, already known as "Lockhart" and "Anaconda".

[...]

Over the course of our investigation, we got confirmation that the data sheets that leaked back in January were not far from the truth, including SSDs in both machines.

[...]

The release date is scheduled for holiday 2020, at the same time as the long-awaited Halo Infinite, which will be a launch title.

So, that's when we should expect to see Navi, then.

(Man, Google Translate has gotten a lot better for French over the past year. I remember when I was translating Canard PC articles about the Zen leaks a few years back. Now I can just copy-paste the whole thing straight out of Google Translate and it comes out largely intelligible.)

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

Happy_Misanthrope posted:

Bear in mind DLSS is limited to a max of 60 fps, so 'competitive' will vary depending on what kind of display you own. If you're a BFV player on a 144Hz monitor, any significant DLSS improvement won't apply to you. It's perfect for a single-player game like Metro, though.

This is one of the things that gives me a negative outlook on PC gaming, by the way. Not that it will die entirely or anything, but it's funny how PC gaming has become this hodgepodge of niche use cases on the high end. It seems to work for now, though.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."
Here are some more DLSS comparison shots between native 1440p and DLSS 4K. Note this is somewhat unfair, as DLSS 4K doesn't deliver 1440p performance - it's closer to 1800p - but it shows that, at the very least, it's a huge improvement over its native starting res.

1440p upscaled normally to 4K:



4K DLSS:



edit: Credit to user Dictator at Beyond3d.com for these.

Happy_Misanthrope fucked around with this message at 20:35 on Feb 22, 2019

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Taima posted:

This is one of the things that gives me a negative outlook on PC gaming, by the way. Not that it will die entirely or anything, but it's funny how PC gaming has become this hodgepodge of niche use cases on the high end. It seems to work for now, though.

Well yeah, that's also my concern, as I stated earlier - the PC's advantages over consoles are becoming less overt and more niche, and require a greater hardware investment to really see. It's an issue when Moore's Law is a speck in the rear-view mirror; also bear in mind we're in the midst of a huge res jump for most people (1080p/1440p -> 4K), so we're asking a lot more too. Have to expect some compromises to keep it affordable.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
Excuse my ignorance, but I just got into the RTX game on my gaming PC... so DLSS has to be specifically implemented, right? So it won't ever be a universal thing? I love the idea, because my 2080 doesn't guarantee 4K/60fps gaming like the 2080 Ti does, so if this could help guarantee 4K gaming on my end into the future it would be a relief. I'm already having trouble with some specific, even older titles like Rise of the Tomb Raider.

Taima fucked around with this message at 20:36 on Feb 22, 2019

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
Correct. DLSS relies on tuned neural nets to work, and that tuning more or less has to be done by Nvidia. So you can probably expect to see DLSS profiles for most popular games, but your random indie game is unlikely to get one. Then again, most indie games aren't nearly as complex as mainstream AAA titles, and thus usually need it less to begin with.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Taima posted:

Excuse my ignorance, but I just got into the RTX game on my gaming PC... so DLSS has to be specifically implemented, right? So it won't ever be a universal thing? I love the idea, because my 2080 doesn't guarantee 4K/60fps gaming like the 2080 Ti does, so if this could help guarantee 4K gaming on my end into the future it would be a relief.

Yeah, unfortunately it does require some intervention from the developer. If it can work this well on the whole, I'd love to see it as an option in most games, but that's unlikely to happen if it requires developers to patch old code.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Taima posted:

Excuse my ignorance, but I just got into the RTX game on my gaming PC... so DLSS has to be specifically implemented, right? So it won't ever be a universal thing? I love the idea, because my 2080 doesn't guarantee 4K/60fps gaming like the 2080 Ti does, so if this could help guarantee 4K gaming on my end into the future it would be a relief.

Right now you get better performance and cleaner results by simply using the render scale slider in most modern games. That doesn't require anything special; it's just sub-native rendering. But this new Metro implementation could be something to watch.
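
For what it's worth, the arithmetic on why the render scale slider buys back so many frames (the 80%-at-4K numbers are just an example setting, not benchmarks):

code:
# Illustrative only: what an 80% render scale means in raw pixel counts.
native_w, native_h = 3840, 2160   # native 4K output
scale = 0.80                      # render scale slider setting

render_w = int(native_w * scale)  # 3072
render_h = int(native_h * scale)  # 1728
fraction = (render_w * render_h) / (native_w * native_h)
print(f"{render_w}x{render_h} -> {fraction:.0%} of native pixels shaded")
Because the scale applies to both axes, 80% shades only ~64% of the pixels, which is where most of the extra performance comes from.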

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

HalloKitty posted:

Right now you get better performance and cleaner results by simply using the render scale slider in most modern games. That doesn't require anything special; it's just sub-native rendering. As far as I know, DLSS in actual games has not yet been proven to be better in any way than just... running at a lower resolution.

You bring up a good point. The issue is that I game on a 75-inch TV, and the honest truth is that even 1440p tends to look relatively terrible compared to native 4K at that size. But I really hope to find a solution to that. I think if I had a smaller TV, turning the resolution down would be a fine solution; I'm just not sure it's for me based on what I've experienced with my rig.

A big display has a way of making everything look like poo poo as far as I can tell, unless you're running native. I am worried that by upgrading my rig and TV, I've entered this inescapable rabbit hole where nothing is ever going to look right again. But when it works, it's incredible - Titanfall 2 at 4K/60 on a 75-inch screen is glorious, for instance. It's just not the norm.

Taima fucked around with this message at 20:42 on Feb 22, 2019

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Taima posted:

You bring up a good point. The issue is that I game on a 75-inch TV, and the honest truth is that even 1440p tends to look relatively terrible compared to native 4K at that size. But I really hope to find a solution to that. I think if I had a smaller TV, turning the resolution down would be a fine solution; I'm just not sure it's for me based on what I've experienced with my rig.

If the game doesn't have a scaler and you have performance to spare at 1440p with the quality settings you use, just create a custom res. This is how I game on my 4K TV with my piddly 3GB 1060: I use an intermediate resolution like 3200x1800 when I can get the performance I want. It looks noticeably better than 1440p.

quote:

A big display has a way of making everything look like poo poo as far as I can tell, unless you're running native. I am worried that by upgrading my rig and TV, I've entered this inescapable rabbit hole where nothing is ever going to look right again. But when it works, it's incredible - Titanfall 2 at 4K/60 on a 75-inch screen is glorious, for instance. It's just not the norm.

That's exactly what happened to me, and I only have 55". I don't play enough to run out and get a $1,000 (CAD) 2080, but I'll sooner accept 30fps (depending on the game) than run a game at 1080p, which looks like poo poo. It doesn't help that even for evenly divisible resolutions, neither AMD nor Nvidia offers a nearest-neighbour upscaling method, which would help somewhat - both the TV's and Nvidia's scaling suck for resolutions that low.
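
(Nearest-neighbour integer scaling is conceptually trivial - each low-res pixel just becomes an N x N block. Here's a toy NumPy sketch of the idea; to be clear, this has nothing to do with either vendor's actual driver code:)

code:
import numpy as np

def nearest_neighbour_upscale(frame: np.ndarray, factor: int) -> np.ndarray:
    """Duplicate each source pixel into a factor x factor block.
    For evenly divisible resolutions (1080p -> 4K with factor=2) this keeps
    pixels crisp instead of smearing them like bilinear/bicubic scaling."""
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)  # H x W x RGB
frame_4k = nearest_neighbour_upscale(frame_1080p, 2)
print(frame_4k.shape)  # (2160, 3840, 3)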

Seamonster
Apr 30, 2007

IMMER SIEGREICH
For all the bigass TV folks, how do you set your antialiasing?

I know you don't really need maxed-out AA on smaller 4K monitors since the pixel density is so high, but you guys have... viewing distance?

Mr.PayDay
Jan 2, 2004
life is short - play hard

Truga posted:

I wanted to run games well at 2560x1600@72Hz and also be ready for VR, which is why I bought a 980 Ti.

It's gonna be 4 years old today and still works fine for those requirements. There are some VR games where it has issues, but in those games even a 2080 Ti has the same issues, so I don't see any reason to upgrade. I'll probably buy a Radeon on the cheap off eBay later this year when my Zen 3 Linux box is ready, though.

I went from a 980 Ti to a 2080 Ti at 1440p and saw average fps jumps - depending on the game/engine - of 100-200%.
I went from 65 fps in BF5 to 140+, for example. Assassin's Creed was like 40 fps and is now at 80 fps on the same Ultra settings. Forza Horizon 4 with 8x MSAA and several Extreme settings allows 90-100 fps instead of 40+, and the speed with those brilliant graphics feels amazing.
It is so much more fun, for me.

Every game in my library showed me, with every single additional fps, why that investment was great.
Metro Exodus with RTX features is visually incredible.

Of course, if you care less about visual immersion and don't find joy in that, you won't find incentives to upgrade a 4-year-old GPU.

Enthusiasts like me don't have a rational approach. I now realize how much fun the 980 Ti was holding back from me, now that I get double or triple the fps.
And RTX stuff on top of that.
And I paid the price from disposable income.

PC gaming and hardware isn't an expensive hobby compared to cars, traveling, house+garden, or even some sports.

I find it absurd how gamers will diss and mock other gamers for spending 3K every 3 years, tbh.

Stickman
Feb 1, 2004

Mr.PayDay posted:

I find it absurd how gamers will diss and mock other gamers for spending 3K every 3 years, tbh.

Phew username/post combo :v:

I think at that point it's because for most people the marginal gains are pretty tiny over spending 1500 every six years (and maybe upgrading the GPU at 3-year intervals). It's all relative, of course, but I guess that's just the price you pay :downsrim:

Seamonster
Apr 30, 2007

IMMER SIEGREICH
The 2K I will spend this year will get me:

At least quadruple the thread count
DDR4 (and twice as much RAM)
NVMe storage (again, twice as much)
PCIe 3.0/4.0?

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Seamonster posted:

For all the bigass TV folks, how do you set your antialiasing?

I know you don't really need maxed-out AA on smaller 4K monitors since the pixel density is so high, but you guys have... viewing distance?

Simple post-AA is more than fine, but even at 4K that depends on the game. 4K doesn't get rid of subpixel aliasing by any means, though. Rise of the Tomb Raider is an extreme case: even at 4K there's still quite a bit of shimmering. It's a game that benefits massively from higher res, but it still needs better AA. I was thinking it would be an ideal case for DLSS 2X, but I don't know if the game has to use TAA before DLSS can work (not that it would get patched regardless).

Stickman
Feb 1, 2004

Happy_Misanthrope posted:

Simple post-AA is more than fine, but even at 4K that depends on the game. 4K doesn't get rid of subpixel aliasing by any means, though. Rise of the Tomb Raider is an extreme case: even at 4K there's still quite a bit of shimmering. It's a game that benefits massively from higher res, but it still needs better AA. I was thinking it would be an ideal case for DLSS 2X, but I don't know if the game has to use TAA before DLSS can work (not that it would get patched regardless).

Since DLSS upscaling doesn't use a TAA pass, I suspect 2X won't either!

repiv
Aug 13, 2009

The DLSS blurb on Nvidia's developer site originally said it requires motion vectors and the previous frame as input, à la TAA, but the current version of the page no longer says that. Maybe they ditched the temporal element at some point.

Stickman
Feb 1, 2004

repiv posted:

The DLSS blurb on Nvidia's developer site originally said it requires motion vectors and the previous frame as input, à la TAA, but the current version of the page no longer says that. Maybe they ditched the temporal element at some point.

Requiring motion vectors is different from a TAA pass, though. That's just extra input.

E: Since you're on now, what do you think about pipelining DLSS? Since it just uses completed framebuffers (and maybe a motion vector buffer) as input, it seems like it should be relatively easy to start the next frame while DLSS finishes the first, effectively eliminating the frame rate hit over the input resolution (but not the frame time hit). It seems like they definitely didn't do this, because DLSS is a bit of a frame rate hit over 1440p, but is there a good reason not to, besides maybe requiring a small amount of extra hardware?

Stickman fucked around with this message at 21:46 on Feb 22, 2019
