|
shrike82 posted:Nvidia still isn't a software company? They've been trying to bind the ML stack to Nvidia hardware but Google successfully pushing researchers and ML frameworks towards their TPU solution is a good example of that. Hmm, and yet OpenCL has yet to achieve more than trivial traction in the GPGPU ecosystem, ROCm is just as much of a failure as HIP and Bolt (have fun even googling that one, it's that forgotten) before it, popular ML frameworks still don't really support OpenCL as a first-class backend, and AMD has been left behind yet again on software integration with DLSS. In comparison, how many years did AMD spend playing with their dick before Mantle/Vulkan/etc caught any traction? How many years did they spend "building" the Adaptive Sync ecosystem with partners before NVIDIA finally took pity and did it properly for them, assuming control of the market in the process? How long did it take before they managed to write good drivers for Navi, or Vega, or Fury X before it? NVIDIA is absolutely correct, they're a software company. The best way to get people to use your hardware is to write good software for it. It's the icc/mkl/etc approach. Paul MaudDib fucked around with this message at 07:12 on Jun 22, 2020
# ? Jun 22, 2020 07:06 |
|
I don't get why AMD is relevant. Google gets to charge an hourly rate to TPU users while Nvidia just sells hardware one-off.
|
# ? Jun 22, 2020 07:09 |
|
shrike82 posted:I don't get why AMD is relevant. AMD is what happens when you don't think you're a software company. Google is a software company as well - poo poo, they won't even sell you their TPU hardware, you just run their software and pay by the tensor-flop.
|
# ? Jun 22, 2020 07:09 |
|
Nvidia is a hardware company that got framework developers onboard because their hardware is good. Google is a services and compute company that built TPUs because they didn't want to pay the Nvidia tax. Guess who's going to win the compute war.
|
# ? Jun 22, 2020 07:12 |
|
shrike82 posted:Nvidia is a hardware company that got framework developers onboard because their hardware is good. Wrongo, NVIDIA is a hardware company that paid to write good software for their hardware. Like I said, it's the icc/mkl/etc model. Their developers did that, just like they paid to write DLSS and VRR and other stuff. And they dumped a shitload of money into the edu pipeline as well. That's why they're a software company. NVIDIA knows that nobody is going to pay developers to learn your hardware except you. That can be in-house library coders, that can be driver wizards, that can be people in the edu pipeline, that can be people at studios (ideally all of the above), but at the end of the day if you don't foot the bill nobody writes code that runs well on your hardware. AMD has a captive audience in the console market but that only goes so far; it took them roughly 7 years before Mantle had any kind of traction, despite the PS4 and Xbone being written around it. And that's why they've got good hardware when someone else is writing software for it (Linux/consoles/etc), but they completely poo poo the bed on Windows drivers for a year after launch on all their new products, every generation. Paul MaudDib fucked around with this message at 07:19 on Jun 22, 2020
# ? Jun 22, 2020 07:16 |
|
Yah I'm not sure why you keep on making like AMD is the competition to Nvidia. Google is. I wouldn't be surprised if we see Amazon and Microsoft roll out their own compute hardware a la TPUs. Bye bye Nvidia in the compute market.
|
# ? Jun 22, 2020 07:18 |
|
shrike82 posted:Yah I'm not sure why you keep on making like AMD is the competition to Nvidia. lol ok, AMD and NVIDIA play in no similar fields whatsoever, you said it champ. And NVIDIA has nothing whatsoever to do with software! quote:Google is. I wouldn't be surprised if we see Amazon and Microsoft roll out their own compute hardware a la TPUs. When did Google launch their thing? 2017 or something? Hasn't happened yet. Hasn't even seen traction yet. Hasn't seen non-trivial adoption yet. People like hardware that they can develop on and run demo programs on with their $200 gaming GPU, and test on a rig with four 1080 Tis. People like software that isn't restricted to a single cloud vendor's metered hourly servers, paying by the flop. But yeah, Google writing the software to tie TensorFlow to their hardware is certainly better than... the AMD model of hoping someone else does it for you. And that's why Google is a software company. Paul MaudDib fucked around with this message at 07:29 on Jun 22, 2020
# ? Jun 22, 2020 07:21 |
|
You're pretty out of touch with the compute market, especially on the ML side. Google has an edu-prosumer-professional stack of cloud services ranging from Colab to GCP TPUs. A TPU v3 is faster than the A100. ML on a gaming GPU as an entry point for students isn't feasible anymore when contemporary models require more memory than is available on the Titan SKU. The thing that really struck me about how the market's changed is looking at recent ML papers and seeing how more and more experiments are being run on TPU pods rather than Nvidia GPU clusters.
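For a rough sense of why gaming cards fall off: a back-of-the-envelope sketch of the training memory needed for weights plus Adam optimizer state alone, before activations even enter the picture. The 16 bytes/param figure is a common rule of thumb (fp32 weights + gradients + two Adam moments), not an official spec, and the model sizes are just illustrative.

```python
# Back-of-the-envelope VRAM estimate for training, ignoring activations.
# Rule of thumb: fp32 weights + gradients + two Adam moments = 16 bytes/param.

def training_vram_gb(params_billions, bytes_per_param=16):
    """Approximate GB needed just for weights + optimizer state."""
    return params_billions * 1e9 * bytes_per_param / 1024**3

for name, params_b in [("345M params", 0.345),
                       ("1.5B params", 1.5),
                       ("11B params", 11.0)]:
    print(f"{name}: ~{training_vram_gb(params_b):.0f} GB")
```

Even the 1.5B case blows past most gaming cards, and the 11B case doesn't fit on a 24 GB Titan RTX at all without model parallelism or offloading tricks.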
|
# ? Jun 22, 2020 07:30 |
|
Given that a lot of AAA titles are built for consoles which will be AMD powered, do you think that will slow the uptake of DLSS? As it will only affect a portion of a portion of the total user base?
|
# ? Jun 22, 2020 12:36 |
|
More than if that weren't the case, but as something approaching a low-effort magic bullet for getting your game running well at 4K, I think devs and publishers would see it as an easy win.
|
# ? Jun 22, 2020 13:31 |
|
https://videocardz.com/newz/nvidia-announces-a100-pcie-accelerator Nvidia announced the PCIe variant of the A100 and as usual it's a 250W part, in spite of the SXM variant being 400W. It's still possible that the consumer cards will go past that line but I doubt it.
|
# ? Jun 22, 2020 13:40 |
|
GTO posted:Given that a lot of AAA titles are built for consoles which will be AMD powered, do you think that will slow the uptake of DLSS? As it will only affect a portion of a portion of the total user base? If you're doing a PC port anyway and it's easy to implement, I don't think that would slow adoption much.
|
# ? Jun 22, 2020 16:10 |
|
Dogen posted:If you're doing a PC port anyway and it's easy to implement, I don't think that would slow adoption much. Yeah, in terms of return on dev time and risk, implementing DLSS has to be near the top of the list, especially because any AAA studio can just whisper “I wonder why this isn’t working” and have NVIDIA devrel rappel in with debug drivers and editors in hand.
|
# ? Jun 22, 2020 16:28 |
|
Looks like rogame's backdoor into getting unlisted 3dmark results is closed now. https://twitter.com/_rogame/status/1275093907240599552
|
# ? Jun 22, 2020 16:58 |
|
pyrotek posted:I think it looks better in some ways, worse in other ways than native rendering. There's also this video by DF, in particular this section which shows that in motion, DLSS can have significantly superior detail compared to TAA. Remember that even 'native' is compromised to some degree by how TAA works.
|
# ? Jun 22, 2020 18:17 |
|
I just think it's funny that nVidia has chosen this point in time to release RTX Voice and DLSS 2.0, which are features that would be perfect to build into a new console generation. Maybe nVidia could release a standalone device that just does RTX voice, and make that compatible with console headsets.
|
# ? Jun 22, 2020 18:59 |
|
Nvidia had no chance of getting into the new consoles anyway; Sony/MS had no choice but to stick with AMD to maintain near-full backwards compatibility and keep the cost benefit of a single-chip solution. AMD are the only company that can supply a fast CPU and fast GPU combination - NV have fast GPUs but slow CPUs, and Intel have fast CPUs but slow GPUs (for now at least). Maybe the Switch Pro will have DLSS though...
|
# ? Jun 22, 2020 19:04 |
|
Isn't there bad blood between MS and Nvidia because of the original Xbox?
|
# ? Jun 22, 2020 19:40 |
|
Zedsdeadbaby posted:Isn't there bad blood between MS and Nvidia because of the original Xbox? It was Intel, not nVidia IIRC. Intel wanted a flat rate per processor sold, guaranteed for the full lifespan of the console. This is one of the bigger reasons the original Xbox had a shortened lifespan. But NV has pissed off a lot of other companies big and small so it could just as well be them too.
|
# ? Jun 22, 2020 19:42 |
|
As far as bumpgate goes, I think it's not realistic to expect your suppliers to eat the cost of the first generation of a novel material flaking out and causing unexpected problems. I'm sure NVIDIA would have been happy to ship it with a lead-based solder but the client wanted RoHS compliance. This wasn't an NVIDIA-specific problem, this was the industry not really having a handle on how to use RoHS-compliant solders properly, and it affected everyone. This is really when the trend of baking your GPU got started, and that affected cards from all brands. An analogy: if you buy an early OLED panel when the technology is not mature and it gets burn-in, well, sucks to be you. If you wait until the technology is mature and it suddenly regresses to way below the expected standard - that is a faulty product and a cause for action. Or - let's say there is something wrong with a TSMC risk production node and all your early chips have a fault in a layer that will cause them to die in 2 years. Does AMD/NVIDIA have any recourse against TSMC in this situation? Not really, they're gonna tell you to get hosed. This is more about Apple throwing their weight around and bullying their suppliers into paying for the downsides of the engineering risks they took. They are a big client worth a lot of money, but they are also a pretty demanding client by all accounts. They want the premium bins of silicon, they want custom products engineered just for them, and they want minimum prices. Paul MaudDib fucked around with this message at 20:10 on Jun 22, 2020
# ? Jun 22, 2020 19:54 |
|
Penisaurus Sex posted:It was Intel, not nVidia IIRC. Intel wanted a flat rate per processor sold, guaranteed for the full lifespan of the console. This is one of the bigger reasons the original Xbox had a shortened lifespan. It was Nvidia, and this was quoted as a major reason for why they took more direct "ownership" of the GPU/SoC in their contracts with AMD since the Xbox 360, so that couldn't happen again.
|
# ? Jun 22, 2020 20:13 |
|
repiv posted:https://videocardz.com/newz/nvidia-announces-a100-pcie-accelerator I'm in the market for a replacement for my RTX Titan with more VRAM so hoping the Titan SKU is 40GB as well.
|
# ? Jun 22, 2020 23:01 |
|
Happy_Misanthrope posted:There's also this video by DF, in particular this section which shows that in motion, DLSS can have significantly superior detail compared to TAA. Remember that even 'native' is compromised to some degree by how TAA works. The specific part that looks worse to me is the halos they point out from the sharpening. Halos on edges are one of those things that really bother me (lots of old Blu-rays had that problem from over-sharpening too) and I can see them in motion in any DLSS 2.0 game. It's probably the worst in Minecraft. Ignoring that issue, I prefer the DLSS 2.0 image to native resolution. Hopefully future games add DLSS options for things such as sharpening.
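For anyone curious what those halos actually are at the signal level: here's a minimal sketch of classic unsharp-mask sharpening on a 1D edge. This is a toy example of over-sharpening in general, not NVIDIA's actual sharpening pass, but it shows the overshoot on both sides of an edge that reads as a dark/bright ring.

```python
import numpy as np

# A clean step edge: dark region meeting a bright region.
signal = np.array([0.2] * 8 + [0.8] * 8)

# Unsharp mask: sharpened = original + amount * (original - blurred)
kernel = np.ones(3) / 3.0
padded = np.pad(signal, 1, mode="edge")        # edge-pad to avoid border artifacts
blurred = np.convolve(padded, kernel, mode="valid")
sharpened = signal + 1.5 * (signal - blurred)

# The sharpened edge overshoots the original extremes on both sides:
# a dark ring on the dark side, a bright ring on the bright side.
print("range before:", signal.min(), signal.max())        # 0.2 .. 0.8
print("range after: ", sharpened.min(), sharpened.max())  # ~-0.1 .. ~1.1
```

Crank the `1.5` amount higher and the overshoot grows, which is exactly what an aggressive sharpen filter does to every edge in the frame.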
|
# ? Jun 23, 2020 06:05 |
|
Again, watch this video, in particular the part starting around 8:00. I thought DLSS 2.0 looked over-sharpened at first, but the reality is that it's just extremely faithful to a 32x supersampled image, which is obviously more correct. The reason they can do this is because DLSS 2.0 is doing a really good job of using motion data to supersample across multiple frames. There is some artifacting, especially under particularly chaotic and significant motion, but even then it looks pretty good. I think it was DF that showed some examples of hard camera cuts still showing surprisingly good quality, which in that circumstance I guess is just the ML doing its thing.
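The multi-frame accumulation idea is easy to demo in miniature. This is just a generic temporal-accumulation sketch (an exponential moving average over noisy frames, assuming a static camera so reprojection is the identity), not DLSS's actual network, but it shows why per-pixel quality converges well past what any single frame contains:

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground truth "scene": a smooth 1D gradient the renderer is sampling.
truth = np.linspace(0.0, 1.0, 64)

def render_noisy(frame_truth):
    # Each frame is an undersampled, jittered observation of the scene.
    return frame_truth + rng.normal(0.0, 0.1, size=frame_truth.shape)

# Temporal accumulation: blend (reprojected) history with the current frame.
alpha = 0.1  # weight of the new frame; history carries the rest
history = render_noisy(truth)
for _ in range(100):
    current = render_noisy(truth)
    history = (1 - alpha) * history + alpha * current

single_err = np.abs(render_noisy(truth) - truth).mean()
accum_err = np.abs(history - truth).mean()
print(f"single frame error: {single_err:.3f}, accumulated: {accum_err:.3f}")
```

In a real renderer the history buffer is reprojected with per-pixel motion vectors before blending, and the blend weight is adjusted where reprojection fails (disocclusions, camera cuts), which is where the artifacting under chaotic motion comes from.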
|
# ? Jun 23, 2020 06:20 |
|
I'm about as curious about all manner of scaling/encoding performance and overhead in Ampere as I am anything else. In the age of capture cards and display setups it's kind of wild there isn't an All-in-Wonder again that can do streaming/routing/capture beyond just a dock or digital outputs.
|
# ? Jun 23, 2020 09:35 |
|
I wonder if Cyberpunk will be the game that comes bundled with the new cards.
|
# ? Jun 23, 2020 16:20 |
|
It's gonna sell crazyshitballs, what's in it for CDPR / why would Nvidia need to juice the 3k series like that?
|
# ? Jun 23, 2020 16:23 |
|
By the time it actually comes out, how many people are gonna be left that haven't pre-ordered it, anyhow?
|
# ? Jun 23, 2020 17:08 |
|
DrDork posted:By the time it actually comes out, how many people are gonna be left that haven't pre-ordered it, anyhow? Imagine loving preordering a game in 2020
|
# ? Jun 23, 2020 17:17 |
|
Zero VGS posted:Imagine loving preordering a game in 2020 I know you mean that sarcastically, but TW3 had well over 1.5M pre-orders, and Cyberpunk 2077 has been noted as having "way higher" pre-orders, soo...
|
# ? Jun 23, 2020 17:29 |
|
I think the current list is:
- Control (post-updates)
- Wolfenstein Youngblood (got DLSS 2.0 early before it was announced)
- MechWarrior 5
- Deliver Us The Moon
- Minecraft RTX Beta
- AMID EVIL
Control is the best showcase because it has tons of raytracing effects that are nicely offset by the DLSS perf boost
|
# ? Jun 23, 2020 18:22 |
|
Control on my current non-RTX GPU can't even run the normal reflection effects without glitches, but looks rad even with them turned off. I'd love to see it on RTX hardware.
|
# ? Jun 23, 2020 21:04 |
|
repiv posted:I think the current list is The annoying part is that only Wolfenstein Youngblood has a goddamn benchmark tool (maybe amid evil does? I haven’t booted that on my RTX machine yet). The fact I can’t test different settings in a standardized way with clear results drives me up a wall.
|
# ? Jun 23, 2020 21:07 |
|
I take it GPGPU performance isn't really affected much by driver updates, right? It's mostly the D3D and GL API paths that get optimized?
|
# ? Jun 23, 2020 22:33 |
|
repiv posted:I think the current list is I would love to see RTX effects added to Observation but it'll never happen.
|
# ? Jun 23, 2020 22:37 |
|
Since Apple is going ARM on Macs and designs its own GPUs for its products that use A-chips, what are the odds it effectively becomes a 4th horse in the GPU race?
|
# ? Jun 23, 2020 22:51 |
|
Zero.
|
# ? Jun 23, 2020 23:06 |
|
Gyrotica posted:Since Apple is going ARM on Macs and designs its own GPUs for its products that use A-chips, what are the odds it effectively becomes a 4th horse in the GPU race? dGPU race? Very slim. Bigger and more powerful GPUs for new "desktop class" SoC parts? Incredibly likely. Comically, the (barely connected) lineage for the GPU in the Apple products is the PowerVR Kyro/Kyro 2, and even further back the weird original PowerVR video cards for PCs.
|
# ? Jun 24, 2020 00:19 |
|
They're going to kill it with iGPUs on their laptops given how lovely current solutions are.
|
# ? Jun 24, 2020 00:37 |