Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

shrike82 posted:

Nvidia still isn't a software company. They've been trying to bind the ML stack to Nvidia hardware, but Google successfully pushing researchers and ML frameworks towards their TPU solution is a good example of the limits of that approach.

If anything, Nvidia's services/software efforts have been pretty bad - GeForce Now, GeForce Experience, their cloud compute platform, etc. It's one reason I'm skeptical they'll be able to make DLSS a universal supersampling solution for games.

Hmm, and yet OpenCL has yet to achieve more than trivial traction in the GPGPU ecosystem, ROCm is just as much a failure as the HIP and Bolt frameworks before it (have fun even googling Bolt, it's that forgotten), popular ML frameworks still don't support OpenCL as a first-class backend, and AMD has been left behind yet again on software integration with DLSS.

In comparison, how many years did AMD spend playing with their dick before Mantle/Vulkan/etc caught any traction? How many years did they spend "building" the Adaptive Sync ecosystem with partners before NVIDIA finally took pity and did it properly for them, assuming control of the market in the process? How long did it take before they managed to write good drivers for Navi, or Vega, or Fury X before it?

NVIDIA is absolutely correct, they're a software company. The best way to get people to use your hardware is to write good software for it. It's the icc/mkl/etc approach.
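
To make the icc/mkl point concrete: the value of the CUDA stack is that a developer can go from zero to a working GPU kernel in minutes on whatever GeForce they already own. A minimal sketch using Numba's CUDA bindings (my illustration of the ecosystem, not anything NVIDIA ships in particular):

```python
# Illustrative sketch: a complete GPU kernel via Numba's CUDA bindings.
# Note how little boilerplate stands between a Python programmer and the hardware.
import numpy as np
from numba import cuda

@cuda.jit
def vector_add(a, b, out):
    i = cuda.grid(1)        # global thread index across the whole launch
    if i < out.size:        # guard: the grid may be larger than the array
        out[i] = a[i] + b[i]

n = 1_000_000
a = np.random.rand(n).astype(np.float32)
b = np.random.rand(n).astype(np.float32)
out = np.zeros_like(a)

threads_per_block = 256
blocks = (n + threads_per_block - 1) // threads_per_block
vector_add[blocks, threads_per_block](a, b, out)  # Numba handles host<->device copies

assert np.allclose(out, a + b)
```

The OpenCL equivalent means hand-written kernel strings, platform/device enumeration, and manual buffer management before a single add runs, which is a big part of why it never caught on with researchers.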

Paul MaudDib fucked around with this message at 07:12 on Jun 22, 2020


shrike82
Jun 11, 2005

I don't get why AMD is relevant.
Google gets to charge an hourly rate to TPU users while Nvidia just sells hardware one-off.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

shrike82 posted:

I don't get why AMD is relevant.
Google gets to charge an hourly rate to TPU users while Nvidia just sells hardware one-off.

AMD is what happens when you don't think you're a software company.

Google is a software company as well - poo poo, they won't even sell you their TPU hardware, you just run their software and pay by the tensor-flop.

shrike82
Jun 11, 2005

Nvidia is a hardware company that got framework developers onboard because their hardware is good.

Google is a services and compute company that built TPUs because they didn't want to pay the Nvidia tax.

Guess who's going to win the compute war.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

shrike82 posted:

Nvidia is a hardware company that got framework developers onboard because their hardware is good.

Wrongo, NVIDIA is a hardware company that paid to write good software for their hardware. Like I said, it's the icc/mkl/etc model. NVIDIA paid their own developers to do that, just like they paid for DLSS and VRR and other stuff. And they dumped a shitload of money into the edu pipeline as well. That's why they're a software company.

NVIDIA knows that nobody is going to pay developers to learn your hardware except you. That can be in-house library coders, that can be driver wizards, that can be people in the edu pipeline, that can be people at studios (ideally all of the above), but at the end of the day if you don't foot the bill nobody writes code that runs well on your hardware.

AMD has a captive audience in the console market, but that only goes so far; it took roughly seven years before Mantle had any kind of traction, despite the PS4 and Xbone being written around it.

And that's why they've got good hardware when someone else is writing software for it (Linux/consoles/etc), but they completely poo poo the bed on Windows drivers for a year after launch on all their new products, every generation.

Paul MaudDib fucked around with this message at 07:19 on Jun 22, 2020

shrike82
Jun 11, 2005

Yah I'm not sure why you keep on making like AMD is the competition to Nvidia.

Google is. I wouldn't be surprised if we see Amazon and Microsoft roll out their own compute hardware a la TPUs.
Bye bye Nvidia in the compute market.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

shrike82 posted:

Yah I'm not sure why you keep on making like AMD is the competition to Nvidia.

lol ok, AMD and NVIDIA play in no similar fields whatsoever, you said it champ. And nvidia has nothing whatsoever to do with software!

quote:

Google is. I wouldn't be surprised if we see Amazon and Microsoft roll out their own compute hardware a la TPUs.
Bye bye Nvidia in the compute market.

When did Google launch their thing? 2017 or something?

Hasn't happened yet. Hasn't even seen traction yet. Hasn't seen non-trivial adoption yet.

People like hardware they can develop for and run demo programs on with their $200 gaming GPU, then test on a rig full of four 1080 Tis. People like software that isn't restricted to a single cloud vendor's metered hourly servers, paying by the flop.

But yeah, Google writing the software to tie TensorFlow to their hardware is certainly better than... the AMD model of hoping someone else does it for you. And that's why Google is a software company.

Paul MaudDib fucked around with this message at 07:29 on Jun 22, 2020

shrike82
Jun 11, 2005

You're pretty out of touch with the compute market, especially on the ML side.

Google has an edu-prosumer-professional stack of cloud services ranging from Colab to GCP TPUs. A TPU v3 is faster than the A100.
ML on a gaming GPU as an entry point for students isn't feasible anymore when contemporary models require more memory than is available on the Titan SKU.

The thing that really struck me about how the market's changed is looking at recent ML papers and seeing how more and more experiments are being run on TPU pods rather than Nvidia GPU clusters.
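
For what it's worth, the developer-facing side of that stack is genuinely low-friction now. A hedged sketch of the standard TensorFlow 2.x TPU setup (real APIs, but the exact resolver argument depends on whether you're on Colab or GCP):

```python
# Sketch of attaching a TF2 training job to a TPU runtime (e.g. in Colab).
# The TPUClusterResolver argument is environment-dependent; "" auto-detects
# the Colab-provided TPU.
import tensorflow as tf

resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="")
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)
strategy = tf.distribute.TPUStrategy(resolver)

# Everything created under the strategy scope is replicated across TPU cores.
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(10),
    ])
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    )
# model.fit(...) then runs on the TPU with no CUDA anywhere in the stack.
```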

avantgardener
Sep 16, 2003

Given that a lot of AAA titles are built for consoles, which will be AMD-powered, do you think that will slow the uptake of DLSS, since it will only affect a portion of a portion of the total user base?

Carecat
Apr 27, 2004

Buglord
More than if that weren't the case, but as something approaching a low-effort magic bullet for getting your game running well at 4K, I think devs and publishers would see it as an easy win.

repiv
Aug 13, 2009

https://videocardz.com/newz/nvidia-announces-a100-pcie-accelerator

Nvidia announced the PCIe variant of the A100 and, as usual, it's a 250W part, in spite of the SXM variant being 400W.

It's still possible that the consumer cards will go past that line, but I doubt it.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

avantgardener posted:

Given that a lot of AAA titles are built for consoles, which will be AMD-powered, do you think that will slow the uptake of DLSS, since it will only affect a portion of a portion of the total user base?

If you're doing a PC port anyway and it's easy to implement, I don't think that would slow adoption much.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Dogen posted:

If you're doing a PC port anyway and it's easy to implement, I don't think that would slow adoption much.

Yeah, in terms of return on dev time and risk, implementing DLSS has to be near the top of the list, especially because any AAA studio can just whisper “I wonder why this isn’t working” and have NVIDIA devrel rappel in with debug drivers and editors in hand.

DOOMocrat
Oct 2, 2003

Looks like rogame's backdoor into getting unlisted 3dmark results is closed now.

https://twitter.com/_rogame/status/1275093907240599552

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

pyrotek posted:

I think it looks better in some ways, worse in other ways than native rendering.

https://www.youtube.com/watch?v=tMtMneugt0A

There's also this video by DF, in particular this section which shows that in motion, DLSS can have significantly superior detail compared to TAA. Remember that even 'native' is compromised to some degree by how TAA works.

Maxwell Adams
Oct 21, 2000

T E E F S
I just think it's funny that nVidia has chosen this point in time to release RTX Voice and DLSS 2.0, which are features that would be perfect to build into a new console generation.

Maybe nVidia could release a standalone device that just does RTX voice, and make that compatible with console headsets.

repiv
Aug 13, 2009

Nvidia had no chance of getting into the new consoles anyway; Sony/MS had no choice but to stick with AMD to maintain near-full backwards compatibility and keep the cost benefit of a single-chip solution.

AMD are the only company that can supply a fast CPU and fast GPU combination - NV have fast GPUs but slow CPUs, and Intel have fast CPUs but slow GPUs (for now at least).

Maybe the Switch Pro will have DLSS though...

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
Isn't there bad blood between MS and Nvidia because of the original Xbox?

Penisaurus Sex
Feb 3, 2009

asdfghjklpoiuyt

Zedsdeadbaby posted:

Isn't there bad blood between MS and Nvidia because of the original Xbox?

It was Intel, not nVidia IIRC. Intel wanted a flat rate per processor sold, guaranteed for the full lifespan of the console. This is one of the bigger reasons the original Xbox had a shortened lifespan.

But NV has pissed off a lot of other companies big and small so it could just as well be them too.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
as far as bumpgate goes, I think it's not realistic to expect your suppliers to eat the cost of the first generation of a novel material flaking out and causing unexpected problems. I'm sure NVIDIA would have been happy to ship it with a lead-based solder but the client wanted RoHS compliance.

this wasn't an NVIDIA-specific problem, this was the industry not really having a handle on how to use RoHS-compliant solders properly, and it affected everyone. This is really when the trend of baking your GPU got started, and that affected cards from all brands.

an analogy: if you buy an early OLED panel when the technology is not mature and it gets burn-in, well, sucks to be you. If you wait until the technology is mature and it suddenly regresses to way below the expected standard - that is a faulty product and a cause for action.

or - let's say there is something wrong with a TSMC risk production node and all your early chips have a fault in a layer that will cause them to die in 2 years. Does AMD/NVIDIA have a recourse against TSMC in this situation? Not really, they're gonna tell you to get hosed.

this is more about Apple throwing their weight around and bullying their suppliers into paying for the downsides of the engineering risks they took. They are a big client worth a lot of money, but they are also a pretty demanding client by all accounts. They want the premium bins of silicon, they want custom products engineered just for them, and they want minimum prices.

Paul MaudDib fucked around with this message at 20:10 on Jun 22, 2020

orcane
Jun 13, 2012

Fun Shoe

Penisaurus Sex posted:

It was Intel, not nVidia IIRC. Intel wanted a flat rate per processor sold, guaranteed for the full lifespan of the console. This is one of the bigger reasons the original Xbox had a shortened lifespan.

But NV has pissed off a lot of other companies big and small so it could just as well be them too.
https://www.eetimes.com/microsoft-takes-nvidia-to-arbitration-over-pricing-of-xbox-processors/

It was Nvidia, and this was cited as a major reason why Microsoft took more direct "ownership" of the GPU/SoC in their contracts with AMD from the Xbox 360 onwards, so that couldn't happen again.

shrike82
Jun 11, 2005

repiv posted:

https://videocardz.com/newz/nvidia-announces-a100-pcie-accelerator

Nvidia announced the PCIe variant of the A100 and, as usual, it's a 250W part, in spite of the SXM variant being 400W.

It's still possible that the consumer cards will go past that line, but I doubt it.

I'm in the market for a replacement for my RTX Titan with more VRAM, so I'm hoping the Titan SKU is 40GB as well.

pyrotek
May 21, 2004



Happy_Misanthrope posted:

There's also this video by DF, in particular this section which shows that in motion, DLSS can have significantly superior detail compared to TAA. Remember that even 'native' is compromised to some degree by how TAA works.

The specific part that looks worse to me is the halos they point out from the sharpening. Halos on edges are one of those things that really bother me (lots of old Blu-rays had that problem from over-sharpening too) and I can see them in motion in any DLSS 2.0 game. It's probably worst in Minecraft.

Ignoring that issue, I prefer the DLSS 2.0 image to native resolution. Hopefully future games add DLSS options for things such as sharpening.
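
For anyone wondering what those halos actually are: sharpening is typically some variant of unsharp masking, which boosts contrast at edges and overshoots the original value range when pushed too hard. A tiny sketch (my illustration, not DLSS's actual sharpening pass):

```python
# Demonstrates the overshoot ("halo") produced by aggressive unsharp masking.
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(img, radius=2.0, amount=2.0):
    """Classic sharpen: original + amount * (original - blurred)."""
    blurred = gaussian_filter(img, sigma=radius)
    return img + amount * (img - blurred)

# A hard edge: dark half, bright half, values in [0, 1].
edge = np.zeros((1, 64), dtype=np.float32)
edge[:, 32:] = 1.0

sharpened = unsharp_mask(edge)
# Values now dip below 0 and overshoot above 1 on either side of the edge;
# those out-of-range rings are exactly the bright/dark halos.
print(sharpened.min(), sharpened.max())
```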

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Again, watch this video, in particular the part starting around 8:00

I thought DLSS 2.0 looked over-sharpened at first, but the reality is that it's just hewing extremely closely to the 32x supersampled reference image, which is obviously more correct. The reason it can do this is that DLSS 2.0 does a really good job of using motion data to supersample across multiple frames. There is some artifacting, especially under particularly chaotic and significant motion, but even then it looks pretty good. I think it was DF that showed some examples of hard camera cuts still coming out at surprisingly good quality, which in that circumstance I guess is just the ML doing its thing.
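
The "supersample across multiple frames" part is easier to see in its classical form: reproject the accumulated history buffer along the motion vectors, then blend in the new frame. A bare-bones sketch (DLSS 2.0 replaces the fixed blend and rejection heuristics with a trained network; this is just the skeleton):

```python
# Minimal temporal accumulation with motion-vector reprojection
# (nearest-neighbour for brevity; real TAA/DLSS resamples bilinearly
# and rejects stale history to avoid ghosting).
import numpy as np

def temporal_accumulate(history, current, motion, alpha=0.1):
    """Blend a new frame into the motion-reprojected history buffer.

    history: (H, W) accumulated result from previous frames
    current: (H, W) newly rendered low-sample-count frame
    motion:  (H, W, 2) per-pixel motion vectors in pixels (dy, dx)
    alpha:   weight of the new frame; small alpha = long accumulation
    """
    h, w = current.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # For each pixel, look up where it was in the previous frame.
    prev_y = np.clip(np.rint(ys - motion[..., 0]).astype(int), 0, h - 1)
    prev_x = np.clip(np.rint(xs - motion[..., 1]).astype(int), 0, w - 1)
    reprojected = history[prev_y, prev_x]
    return alpha * current + (1.0 - alpha) * reprojected
```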

DOOMocrat
Oct 2, 2003

I'm about as curious about all manner of scaling/encoding performance and overhead in Ampere as I am anything else.

In the age of capture cards and display setups it's kind of wild there isn't an All-in-Wonder again that can do streaming/routing/capture beyond just a dock or digital outputs.

Cactus
Jun 24, 2006

I wonder if Cyberpunk will be the game that comes bundled with the new cards.

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

It's gonna sell crazyshitballs, so what's in it for CDPR / why would Nvidia need to juice the 3k series like that?

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
By the time it actually comes out, how many people are gonna be left that haven't pre-ordered it, anyhow?

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

DrDork posted:

By the time it actually comes out, how many people are gonna be left that haven't pre-ordered it, anyhow?

Imagine loving preordering a game in 2020

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Zero VGS posted:

Imagine loving preordering a game in 2020

I know you mean that sarcastically, but TW3 had well over 1.5M pre-orders, and Cyberpunk 2077 has been noted as having "way higher" pre-orders, soo... :shrug:

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
I was just looking for a list of games with DLSS 2.0 support, to try it out and see it with my own eyes. Saints Row 2 is on it. What? Why? Nevermind, that was for Geforce Now... :doh:

repiv
Aug 13, 2009

I think the current list is

Control (post-updates)
Wolfenstein Youngblood (got DLSS 2.0 early before it was announced)
MechWarrior 5
Deliver Us The Moon
Minecraft RTX Beta
AMID EVIL

Control is the best showcase because it has tons of raytracing effects that are nicely offset by the DLSS perf boost

sean10mm
Jun 29, 2005

It's a Mad, Mad, Mad, MAD-2R World
Control on my current non-RTX GPU can't even run the normal reflection effects without glitches, but looks rad even with them turned off. I'd love to see it on RTX hardware.

Ugly In The Morning
Jul 1, 2010
Pillbug

repiv posted:

I think the current list is

Control (post-updates)
Wolfenstein Youngblood (got DLSS 2.0 early before it was announced)
MechWarrior 5
Deliver Us The Moon
Minecraft RTX Beta
AMID EVIL

Control is the best showcase because it has tons of raytracing effects that are nicely offset by the DLSS perf boost

The annoying part is that only Wolfenstein Youngblood has a goddamn benchmark tool (maybe AMID EVIL does? I haven't booted that on my RTX machine yet). The fact that I can't test different settings in a standardized way with clear results drives me up a wall.

Shaocaholica
Oct 29, 2002

Fig. 5E
I take it GPGPU performance isn't really affected much by driver updates, right? Driver optimizations are mostly about D3D and GL API calls?

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva

repiv posted:

I think the current list is

Control (post-updates)
Wolfenstein Youngblood (got DLSS 2.0 early before it was announced)
MechWarrior 5
Deliver Us The Moon
Minecraft RTX Beta
AMID EVIL

Control is the best showcase because it has tons of raytracing effects that are nicely offset by the DLSS perf boost
Deliver Us The Moon used RTX really well, and turning on DLSS gave an immediate speed jump with no noticeable image degradation. Control and Metro were both really impressive, but Deliver Us The Moon is the one that actually sold me on ray tracing.
I would love to see RTX effects added to Observation but it'll never happen.

Gyrotica
Nov 26, 2012

Grafted to machines your builders did not understand.
Since Apple is going ARM on Macs and designs its own GPUs for its products that use A-chips, what are the odds it effectively becomes a 4th horse in the GPU race?

ConanTheLibrarian
Aug 13, 2004


dis buch is late
Fallen Rib
Zero.

Cygni
Nov 12, 2005

raring to post

Gyrotica posted:

Since Apple is going ARM on Macs and designs its own GPUs for its products that use A-chips, what are the odds it effectively becomes a 4th horse in the GPU race?

dGPU race? Very slim. Bigger and more powerful GPUs for new "desktop class" SoC parts? Incredibly likely.

Comically, the (barely connected) lineage of the GPU in Apple's products is the PowerVR Kyro/Kyro 2, and even further back the weird original PowerVR video cards for PCs.


shrike82
Jun 11, 2005

They're going to kill it with iGPUs on their laptops given how lovely current solutions are.
