|
MagusDraco posted:The GTX 1660 Ti reviews dropped. It's pretty much a ~$280 GTX 1070 in performance with 6 gigs of GDDR6 RAM and no RTX features. edit: Nice to see the overclocking #'s though, that was my hope with the smaller chip - that it would overclock well - and it seems you get a healthy jump. Happy_Misanthrope fucked around with this message at 16:10 on Feb 22, 2019 |
# ? Feb 22, 2019 16:07 |
|
|
Happy_Misanthrope posted:Why do you think MS started buying up studios left and right? Because they had the same idea - exclusivity is dead - then Sony lapped them with the PS4. God of War has sold over 6 mil, Spiderman has sold over 9 million. How do you think The Last of Us 2 will do? Perhaps 'fewer' exclusives than in years past, but they still matter a fuckton. Only reason I bought a PS4 Pro was because of their exclusives, a bunch of great Sony exclusives.
|
# ? Feb 22, 2019 16:13 |
|
B-Mac posted:Only reason I bought a PS4 Pro was because of their exclusives, a bunch of great Sony exclusives. Only reason I turn mine on anymore. Early in its first two years, sure - not much there and what exclusives they had were relatively weak. But there's a ton now to justify at least the purchase of a slim.
|
# ? Feb 22, 2019 16:18 |
|
I sold my PS4 because I never used it, but I wouldn't mind a Blu-ray player, maybe. E: otoh I already have like 3 xbone controllers for my PCs
|
# ? Feb 22, 2019 16:19 |
|
Happy_Misanthrope posted:That Evga is a 3-slot design isn't it though? Rather bizarre choice from Evga as that nixes it from most mini-itx systems as an option when in terms of power/thermals it should be ideal. Oh welp yeah. It's a 2.75 slot card. edit: I keep having to fight the urge to upgrade to a ps4 pro now that I have a real nice TV and still need to play most of the great sony exclusives. Not really worth it for me to do that 2ish years before the new consoles though MagusDraco fucked around with this message at 16:31 on Feb 22, 2019 |
# ? Feb 22, 2019 16:20 |
|
what are the odds that game/driver devs start using RTX chips for raw processing power to improve other aspects of game performance in future iterations?
|
# ? Feb 22, 2019 16:29 |
|
MagusDraco posted:The GTX 1660 Ti reviews dropped. It's pretty much a ~$280 GTX 1070 in performance with 6 gigs of GDDR6 RAM and no RTX features. The lack of competition for Nvidia is really terrible. It's $100 cheaper than the comparable card that came out almost 3 years ago, but it still has 2GB less RAM.
|
# ? Feb 22, 2019 16:33 |
|
headcase posted:what are the odds that game/driver devs start using RTX chips for raw processing power to improve other aspects of game performance in future iterations? DLSS is their best attempt at that
|
# ? Feb 22, 2019 16:37 |
|
Inept posted:The lack of competition for nvidia is really terrible. It's $100 cheaper than the comparable card that came out almost 3 years ago, but it still has 2GB less RAM. Turing is one big perf/$ horror show after another.
|
# ? Feb 22, 2019 16:42 |
|
repiv posted:Back to DLSS, they've already patched the Metro implementation: Ah, a lot better. My first question whenever seeing these now is not how it compares to 1440p, but how it compares to an equivalent res that matches the DLSS performance impact, and it seems the Reddit guy did this and DLSS comes out on top now: Was all set to pick up a 1660 Ti, but if this can be the quality of DLSS going forward...dunno, 'close enough' to native 4K is pretty high on my list of priorities as I game on a 55" 4K TV.
|
# ? Feb 22, 2019 16:54 |
|
So the answer is yes. There could be meaningful innovations that make good use of those chips in your run-of-the-mill competitive game.
|
# ? Feb 22, 2019 18:48 |
|
headcase posted:So the answer is yes. There could be meaningful innovations that make good use of those chips in your run of the mill competitive game. Bear in mind DLSS is limited to max of 60fps, so 'competitive' will vary depending on what kind of display you own. If you're a BFV player on a 144hz monitor, any significant DLSS improvement won't apply to you. It's perfect for a single player game like Metro though.
|
# ? Feb 22, 2019 19:05 |
|
It completely depends on the game's specific implementation, also having to train the algorithm for game + RTX mode + resolution + target framerate means it will not be anything but a gimmick for a long time.
|
# ? Feb 22, 2019 19:16 |
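The training-cost point above can be made concrete with a back-of-the-envelope count. Every number here is hypothetical (NVIDIA hasn't published profile counts); it just shows how the per-game combinations multiply:

```python
# Hypothetical back-of-the-envelope count of separately trained DLSS
# network profiles. None of these numbers come from NVIDIA; they only
# illustrate how the game + mode + resolution combinations multiply.
games = 25             # titles with announced DLSS support (assumed)
rtx_modes = 2          # RTX on / RTX off
resolutions = 3        # e.g. 1080p, 1440p, 4K output targets (assumed)
target_framerates = 2  # e.g. 60 fps cap vs. uncapped tuning (assumed)

profiles = games * rtx_modes * resolutions * target_framerates
print(profiles)  # 300 distinct training targets under these assumptions
```

Even with small numbers per axis, the product grows fast, which is one way to read the "gimmick for a long time" prediction.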
|
I still have a yikes moment when I look at the telephone pole and see how un-straightened it looks compared to the original. It's like a high resolution version of PS2 jaggies or something. I mean, yeah, there's more framerate, but it has to look nicer than simply running in 1440p and getting your frames that way, and I don't think it achieves that.
|
# ? Feb 22, 2019 19:18 |
|
Happy_Misanthrope posted:Bear in mind DLSS is limited to max of 60fps, so 'competitive' will vary depending on what kind of display you own. If you're a BFV player on a 144hz monitor, any significant DLSS improvement won't apply to you. It's perfect for a single player game like Metro though. I get that this is not that thing, but there could be undiscovered innovations that do work at high frame rates (or just increase frame rates/reduce lag). e: they have to know that is what we are paying for here. headcase fucked around with this message at 19:26 on Feb 22, 2019 |
# ? Feb 22, 2019 19:18 |
|
Craptacular! posted:I still have a yikes moment when I look at the telephone pole and see how un-straightened it looks compared to the original. It's like a high resolution version of PS2 jaggies or something. Those images are showing comparisons with native 4k though, not 1440p. edit: Here's a Onedrive link from a Reddit user showing Exodus with 4k at 80% scaling, and 4K DLSS (upscaled from 1440p). I think the DLSS image looks noticeably better and closer to native 4K. Native 1440p is not remotely comparable now, where before this patch it was superior. Happy_Misanthrope fucked around with this message at 19:35 on Feb 22, 2019 |
# ? Feb 22, 2019 19:29 |
|
That they're even able to offer competitive quality at this point in the game, though, is actually pretty impressive. Yeah, it isn't perfect, yeah, it has limitations, and no, it's not as good as native. But if they can iterate on this for another 6-9 months while finishing off the 21-series, then DLSS 2.0 or whatever they'll release alongside the next round of cards might actually be pretty compelling.
|
# ? Feb 22, 2019 19:38 |
|
Hm, maybe next year or so, when they hopefully sweeten the deal with a HDMI 2.1 port.
|
# ? Feb 22, 2019 19:45 |
|
headcase posted:I get that this is not that thing, but there could be undiscovered innovations that do work at high frame rates (or just increase frame rates/reduce lag). It's either a) significantly increase the number of Tensor Cores, or b) significantly decrease the size of the neural network. (b) might be possible with a lot more work, but considering how much trouble they're having getting NNs out the door for titles, I suspect that it's not going to be feasible any time soon, at least not without serious quality degradation. (a) is a possibility for future card generations if they get 7nm working properly. Since DLSS operates on the final framebuffer using discrete hardware (and I don't believe uses additional inputs like movement vectors), you could also presumably c) pipeline the process. The GPU could start rendering the next frame while the Tensor Cores do their magic on the current frame. This would let you effectively remove the frame rate hit of DLSS over the target resolution, so long as DLSS computations are less than half of the total frame time. You'd also still have the same frametime lag as the reduced frame rate, since it would still take [scene computation time]+[DLSS computation time] to get the frame out the door. E: Since it's primarily included to offset the raytracing performance hit, it makes sense that they didn't go with (a)'s significant cost increase. (b) is probably just technically infeasible. I'm honestly not sure why they didn't pipeline the processing, though. Maybe I'm missing something about how DLSS works? Presumably it can't require loading/unloading from VRAM for every frame or it'd be much more of a performance hit than it already is. Stickman fucked around with this message at 19:50 on Feb 22, 2019 |
# ? Feb 22, 2019 19:47 |
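The pipelining argument in that post can be sketched with a toy timing model. The millisecond figures below are made up for illustration, not measured DLSS costs:

```python
# Toy model of pipelining DLSS: the Tensor Cores process frame N while
# the shaders render frame N+1. All timings are hypothetical.

def frame_model(render_ms, dlss_ms):
    """Return (sequential_ms, pipelined_ms, latency_ms) per frame."""
    sequential = render_ms + dlss_ms     # render, then DLSS, then present
    pipelined = max(render_ms, dlss_ms)  # throughput limited by the slower stage
    latency = render_ms + dlss_ms        # every frame still passes both stages
    return sequential, pipelined, latency

# Example: 1440p render at 10 ms/frame (100 fps) plus a 4 ms DLSS pass.
seq, pipe, lat = frame_model(10.0, 4.0)
print(1000 / seq)   # ~71 fps without pipelining
print(1000 / pipe)  # 100.0 fps pipelined: the frame rate hit disappears
print(lat)          # 14.0 ms of frametime lag either way
```

Under these assumed numbers, the pipelined frame rate matches plain 1440p rendering whenever the DLSS pass is shorter than the render pass, while the frametime lag stays at the full sequential cost - exactly the throughput/latency trade-off the post describes.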
|
Surely all this sweet sweet vector processing could help with non-raytraced non-predictive "traditional" rendering too?
|
# ? Feb 22, 2019 19:58 |
|
headcase posted:Surely all this sweet sweet vector processing could help with non-raytraced non-predictive "traditional" rendering too? It can if you're running 4k and your framerates are sub-60 (which is pretty much Ultra on any card except a 2080 Ti). Stickman fucked around with this message at 20:11 on Feb 22, 2019 |
# ? Feb 22, 2019 20:02 |
|
Happy_Misanthrope posted:Yeah those are the kind of artifacts you might sometimes see. The tree branches look quite good though, so one rendering quirk does not mean every element that shares a similar structure will also share those artifacts. Thanks for posting this! For me, the biggest problem with the DLSS in the FFXV videos was the shimmering effect caused by aliasing of distant thin objects (or even just edges of small objects). You can see some indication of that in the still shots: 4k 80% vs. 4k DLSS The wires and pipe/beam edges look blurry but smoother in the 80% rendering. The trade-off is that foreground objects look much better: 4k 80% vs. 4k DLSS Obviously we'd need to see it in motion to determine how much the aliasing causes noticeable temporal effects, but I suspect there's still a tradeoff. On the other hand, all of our rendering techniques have some artifacts, so it's just a matter of which ones bother you the least! Stickman fucked around with this message at 20:12 on Feb 22, 2019 |
# ? Feb 22, 2019 20:09 |
|
The rumor mill grinds apace: http://www.jeuxvideo.com/news/1006598/projet-scarlet-les-deux-prochaines-xbox-devoilees-a-l-e3-2019.htm quote:The Xbox brand is quietly charting its future going forward. In addition to its partnership with Nintendo, Xbox is expected to present two new machines at E3 2019, already known as "Lockhart" and "Anaconda" So, that's when we should expect to see Navi, then. (Man, Google Translate has gotten a lot better for French over the past year. I remember when I was translating Canard PC articles talking about the Zen leaks a few years back. Now, I could just copy-paste the whole thing straight out of Google Translate and it would be largely intelligible.)
|
# ? Feb 22, 2019 20:15 |
|
Happy_Misanthrope posted:Bear in mind DLSS is limited to max of 60fps, so 'competitive' will vary depending on what kind of display you own. If you're a BFV player on a 144hz monitor, any significant DLSS improvement won't apply to you. It's perfect for a single player game like Metro though. This is one of the things that makes me have a negative outlook on pc gaming, by the way. Not that it will die entirely or anything, but it's funny how PC gaming has become this hodgepodge of niche use cases on the high end. It seems to work for now though.
|
# ? Feb 22, 2019 20:28 |
|
Here's some more DLSS comparison shots between native 1440p and DLSS 4k. Note this is somewhat unfair, as 4K DLSS doesn't deliver 1440p performance - it's closer to 1800p - but it just shows that, at the very least, it's a huge improvement over its native starting res. 1440p upscaled normally to 4k: 4k DLSS: edit: Credit to user Dictator at Beyond3d.com for these. Happy_Misanthrope fucked around with this message at 20:35 on Feb 22, 2019 |
# ? Feb 22, 2019 20:31 |
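A quick pixel-count calculation helps put the "closer to 1800p than 1440p" framing in perspective. This is plain arithmetic, not benchmark data:

```python
# Rendered pixel counts for the resolutions being compared in this thread.
def pixels(width, height, scale=1.0):
    return int(width * scale) * int(height * scale)

print(f"native 4K:    {pixels(3840, 2160):,}")       # 8,294,400
print(f"4K at 80%:    {pixels(3840, 2160, 0.8):,}")  # 5,308,416
print(f"native 1800p: {pixels(3200, 1800):,}")       # 5,760,000
print(f"native 1440p: {pixels(2560, 1440):,}")       # 3,686,400
```

DLSS renders internally at 1440p and then spends extra tensor-core time on the upscale, so its effective cost lands somewhere between the 1440p and 1800p pixel budgets rather than at either endpoint, which is why like-for-like comparisons in these screenshots are tricky.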
|
Taima posted:This is one of the things that makes me have a negative outlook on pc gaming, by the way. Not that it will die entirely or anything, but it's funny how PC gaming has become this hodgepodge of niche use cases on the high end. It seems to work for now though.
|
# ? Feb 22, 2019 20:33 |
|
Excuse my ignorance but I just got into the RTX game on my gaming PC... so DLSS has to be specifically implemented right? So it won't be a universal thing ever? I love the idea, because my 2080 doesn't guarantee 4k/60fps gaming like the 2080 TI does, so if this could help guarantee 4K gaming on my end into the future it would be a relief. I'm already having trouble with some specific, even older titles like Rise of the Tomb Raider.
Taima fucked around with this message at 20:36 on Feb 22, 2019 |
# ? Feb 22, 2019 20:34 |
|
Correct. DLSS relies on tuned neural nets to work, and that tuning more or less has to be done by NVidia. So you can probably expect to see DLSS profiles for most popular games, but your random indie game is unlikely to get one. Then again, most indie games aren't nearly as complex as mainstream AAA titles, and thus usually need it less to begin with.
|
# ? Feb 22, 2019 20:36 |
|
Taima posted:Excuse my ignorance but I just got into the RTX game on my gaming PC... so DLSS has to be specifically implemented right? So it won't be a universal thing ever? I love the idea, because my 2080 doesn't guarantee 4k/60fps gaming like the 2080 TI does, so if this could help guarantee 4K gaming on my end into the future it would be a relief.
|
# ? Feb 22, 2019 20:37 |
|
Taima posted:Excuse my ignorance but I just got into the RTX game on my gaming PC... so DLSS has to be specifically implemented right? So it won't be a universal thing ever? I love the idea, because my 2080 doesn't guarantee 4k/60fps gaming like the 2080 TI does, so if this could help guarantee 4K gaming on my end into the future it would be a relief. Right now you get better performance and cleaner results by simply using the render scale slider in most modern games. That doesn't require anything, it's just sub-native rendering. But this new Metro implementation could be something to watch out for.
|
# ? Feb 22, 2019 20:37 |
|
HalloKitty posted:Right now you get better performance and cleaner results by simply using the render scale slider in most modern games. That doesn't require anything, it's just sub-native rendering. As far as I know, DLSS in actual games has currently not been proven to be better in any way than just... running at a lower resolution. You bring up a good point. The issue is that I game on a 75 inch TV. And the honest truth is that even 1440p tends to look relatively terrible compared to native 4K at that size. But I really hope to find a solution to that. I think if I had a smaller TV, turning the resolution down would be a fine solution, just not sure it's for me based on what I've experienced with my rig. A big display has a way of making everything look like poo poo as far as I can tell, unless you're running native. I am worried that by upgrading my rig and TV, I've entered this inescapable rabbit hole where nothing is ever going to look right again. But when it works, it's incredible. Titanfall 2 at 4K/60 on a 75 inch monitor is glorious, for instance. It's just not the norm. Taima fucked around with this message at 20:42 on Feb 22, 2019 |
# ? Feb 22, 2019 20:40 |
|
Taima posted:You bring up a good point. The issue is that I game on a 75 inch TV. And the honest truth is that even 1440p tends to look relatively terrible compared to native 4K at that resolution. But I really hope to a find a solution to that. I think if I had a smaller TV, turning the resolution down would be a fine solution, just not sure it's for me based on what I've experienced with my rig. quote:A big display has a way of making everything look like poo poo as far as I can tell, unless you're running native. I am worried that by upgrading my rig and TV, I've entered this inescapable rabbit hole where nothing is ever going to look right again. But when it works, it's incredible. Titanfall 2 at 4K/60 on a 75 inch monitor is glorious, for instance. It's just not the norm.
|
# ? Feb 22, 2019 20:56 |
|
For all the bigass TV folks, how do you set your antialiasing? I know you don't really need maxed out AA on smaller 4k monitors since the pixel density is so high but you guys have...viewing distance?
|
# ? Feb 22, 2019 20:58 |
|
Truga posted:I wanted to run games well on 2560x1600@72hz, also be ready for VR, which is why I bought a 980Ti. I went from a 980Ti to 2080Ti on 1440p and had an average fps jump - depending on the game/engine - of 100-200%. I went from 65 fps in BF5 to 140+ for example. Assassins Creed was like 40 fps and is now at 80 fps on the same Ultra settings. Forza Horizon 4 with MSAA8 and several Extreme settings allows 90-100 fps instead of 40+, and the speed with those brilliant graphics feels amazing. It is so much more fun, for me. Every game in my libraries showed me reasons, with every single additional fps, why that investment was great. Metro Exodus with RTX features is visually incredible. Of course, if you care less about visual immersion and don’t find joy in that, you won’t find incentives to upgrade a 4-year-old GPU. Enthusiasts like me don’t have a rational approach. I now realize how the 980Ti held significant aspects of fun back from me, now that I have double or triple the fps. And RTX stuff on top of that. And I paid the price from disposable income. PC gaming and hardware is not an expensive hobby compared to cars, traveling, house+garden or even some sports. I find it absurd how gamers gonna diss and mock other gamers for spending 3K every 3 years tbh.
|
# ? Feb 22, 2019 21:01 |
|
Mr.PayDay posted:I find it absurd how gamers gonna diss and mock other gamers for spending 3K every 3 years tbh. Phew username/post combo I think at that point it's because for most people the marginal gains are pretty tiny over spending 1500 every six years (and maybe upgrading the GPU at 3-year intervals). It's all relative, of course, but I guess that's just the price you pay.
|
# ? Feb 22, 2019 21:14 |
|
The 2k I will spend this year will get me:
At least quadruple the thread count
DDR4 (and twice as much)
NVMe storage (again twice as much)
PCIe 3.0/4.0?
|
# ? Feb 22, 2019 21:19 |
|
Seamonster posted:For all the bigass TV folks, how do you set your antialiasing? Simple post-AA is more than fine enough, but even at 4k that depends on the game. 4K doesn't get rid of subpixel aliasing by any means though. Rise of the Tomb Raider is an extreme case but even at 4K there's still quite a bit of shimmering, it's a game that benefits massively from higher res but it still needs better AA. Was thinking that would be an ideal case for DLSS 2X, but don't know if the game has to use TAA first before DLSS can work (not like it would be patched regardless).
|
# ? Feb 22, 2019 21:24 |
|
Happy_Misanthrope posted:Simple post-AA is more than fine enough, but even at 4k that depends on the game. 4K doesn't get rid of subpixel aliasing by any means though. Rise of the Tomb Raider is an extreme case but even at 4K there's still quite a bit of shimmering, it's a game that benefits massively from higher res but it still needs better AA. Was thinking that would be an ideal case for DLSS 2X, but don't know if the game has to use TAA first before DLSS can work (not like it would be patched regardless). Since DLSS upscaling doesn't use a TAA pass, I suspect 2X won't as well!
|
# ? Feb 22, 2019 21:29 |
|
The DLSS blurb on Nvidia's developer site originally said it requires motion vectors and the previous frame as input, à la TAA, but the current version of the page no longer says that. Maybe they ditched the temporal element at some point.
|
# ? Feb 22, 2019 21:36 |
|
|
repiv posted:The DLSS blurb on Nvidia's developer site originally said it requires motion vectors and the previous frame as input à-la TAA, but the current version of the page no longer says that. Maybe they ditched the temporal element at some point. Requiring motion vectors is different from a TAA pass, though. That's just extra input. E: Since you're on now, what do you think about pipelining DLSS? Since it just uses completed framebuffers (and maybe a motion vector buffer) as input, it seems like it should be relatively easy to start the next frame while DLSS finishes the first, effectively eliminating the frame rate hit over the input resolution (but not the frame time hit). It seems like they definitely didn't do this, because DLSS still takes a frame rate hit over 1440p, but is there a good reason not to besides maybe requiring a small amount of extra hardware? Stickman fucked around with this message at 21:46 on Feb 22, 2019 |
# ? Feb 22, 2019 21:42 |