Hemish
Jan 25, 2005

Cygni posted:

RAGE MODE is all in caps in the official documents so please refer to RAGE MODE as RAGE MODE and not the pathetic, weak, sickly "rage mode" when referring to RAGE MODE.



RAGE MODE.


Thanks, now I'll always shout that in my head when I read that term...


Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

shrike82 posted:

AI-based upsampling is an active area of research so don't assume nvidia's approach is the most efficient way to go, or that you need the speed benefit of the tensor cores to have an effective solution.

even if such a technique is discovered (one that doesn't completely poo poo up quality to the extent you're better off just upscaling directly, like early iterations of DLSS did), NVIDIA still comes out on top because they've got the hardware acceleration for it, while AMD will still be running it on general-purpose hardware. In that scenario RT on AMD becomes a bit less useless, but NVIDIA benefits even more, plus the drastic speedup from hardware-accelerated upscaling means NVIDIA's advantage stops being "a few large AAA titles" and starts being just "NVIDIA wins everywhere".

There is not a scenario where buying a $649 card without tensor cores (or whatever AMD calls their AI acceleration units) is going to be worth it vs just gritting your teeth and paying the extra $50.

And no, AMD stock is not going to be great either. Set your alarm and get your bot ready.

Paul MaudDib fucked around with this message at 01:10 on Oct 29, 2020

Truga
May 4, 2014
Lipstick Apathy

Cygni posted:

RAGE MODE is all in caps in the official documents so please refer to RAGE MODE as RAGE MODE and not the pathetic, weak, sickly "rage mode" when referring to RAGE MODE.



RAGE MODE.


THREADRIPPER
RAGE MODE

i don't even care anymore, amd is the best now

shrike82
Jun 11, 2005

dude stop talking about ML - the inferencing is going to be done on GPU even if it's a console or desktop AMD card so it's all hardware accelerated.
the tensor cores are a specific optimization and like i said, it's not clear that an efficient upsampling technique requires it.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
The only RAGE for me is ATI RAGE

Tyro
Nov 10, 2009
Best Buy has 3070s up on their website now as "Coming Soon" for tomorrow morning. The FE, two Gigabyte models, the EVGA XC3, and two PNYs.

Truga
May 4, 2014
Lipstick Apathy
i just realised that what i'm saying is, the way RTX is currently used in non-quake2/minecraft games is basically hairworks, thanks thread for that association lol

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

shrike82 posted:

dude stop talking about ML - the inferencing is going to be done on GPU even if it's a console or desktop AMD card so it's all hardware accelerated.
the tensor cores are a specific optimization and like i said, it's not clear that an efficient upsampling technique requires it.

No, there is a distinct difference in performance between running neural nets on general-purpose hardware and having tensor cores to accelerate them. That's why tensor cores exist in the first place, and why they're on the consumer cards. And yes, you can verify from profiling tools that the tensor cores are being used; it's not just for show.

No, you're incorrect that "just inference" is quick enough to run without the tensor cores. The inference already takes ~2-3ms to run, and that would be around 5-6x worse without the tensor cores. An extra 15ms or so of latency in the render pipeline cannot be easily hidden. It just can't.

Not sure why you're trying to minimize the impact of tensor cores; they will make a big difference in AI-upscale performance. You keep slinging around accusations that I don't understand what tensor cores do, but I really think you don't understand them very well yourself.
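
For a rough sense of scale, here's a back-of-the-envelope frame-budget check. The ~2-3ms cost and the 5-6x slowdown are the figures from the post above, not measurements; everything else is just arithmetic.

code:
# Frame-budget sketch using the post's figures (~2-3 ms with tensor cores,
# ~5-6x slower without). Illustrative only -- nothing here is profiled.
def extra_latency_ms(cost_with_tensor_ms, slowdown_without):
    cost_without = cost_with_tensor_ms * slowdown_without
    return cost_without - cost_with_tensor_ms

for cost, slowdown in [(2.0, 5.0), (3.0, 6.0)]:
    extra = extra_latency_ms(cost, slowdown)
    print(f"{cost:.0f} ms at {slowdown:.0f}x slower -> +{extra:.1f} ms per frame")

# For context: the whole frame budget is ~16.7 ms at 60 fps and ~8.3 ms at
# 120 fps, so an extra ~8-15 ms is most or all of it -- the "cannot be
# easily hidden" point.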

Paul MaudDib fucked around with this message at 01:19 on Oct 29, 2020

PC LOAD LETTER
May 23, 2005
WTF?!
If they knocked the price down $50 on the 6800XT and ~$80 on the 6800 they'd probably sell a heap of them easily, and people, even here, would probably be OK with that given the lack of a DLSS equivalent + ~30% slower RT.

Like others have said they're probably taking advantage of NV's supply issues with these prices.

I was wrong on power (they pumped clocks up higher than I expected too) and that 128MB cache (which still seems nuts), but otherwise it performs aboooout as expected for regular raster stuff, which nearly everything, including most new games over the next few years, will be focused on. I think if you're in love with RT or DLSS it'll be a disappointment at these prices, but otherwise it's a generally good card.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Zero VGS posted:

The only RAGE for me is ATI RAGE

shrike82
Jun 11, 2005

being able to offer a bundled CPU + GPU package is going to let them sell out whatever supply they have of the cards for the foreseeable future.
lmao wanna see an all AMD gaming laptop go head-to-head with a Koduri-powered Intel XE gaming laptop

Alchenar
Apr 9, 2008

The War Thunder devs have said that DLSS is coming to their game, which I take as a particularly optimistic sign because their engine is a proprietary ancient patchwork quilt of cobbled-together systems and rewritten code. It's like the worst-case scenario for trying to patch in some new feature that they're not writing themselves, and if they can implement DLSS then anyone can.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

shrike82 posted:

dude stop talking about ML - the inferencing is going to be done on GPU even if it's a console or desktop AMD card so it's all hardware accelerated.
the tensor cores are a specific optimization and like i said, it's not clear that an efficient upsampling technique requires it.

It's possible that a solution exists that doesn't require something like tensor cores. Problem is, no such solution has been demonstrated yet.

I mean, we can discuss what AMD might come up with all day long, but it doesn't mean anything until they actually do it.

If you care about RT, this generation goes to NVidia because AMD doesn't have any viable way of dealing with the performance hit right now or in the near future, simple as that. Next gen might be different--we can reassess then.

PC LOAD LETTER posted:

If they knocked the price down $50 on the 6800XT and ~$80 on the 6800 they'd probably sell a heap of them easily, and people, even here, would probably be OK with that given the lack of a DLSS equivalent + ~30% slower RT.

Like others have said they're probably taking advantage of NV's supply issues with these prices.

Yeah, that sounds to me about where the pricing "should" be: A $600 6800XT and a $500 (at best) 6800, if not a $450 6800 (which should also really not have 16GB VRAM, but I guess the memory bandwidth off-card is already bad enough they can't afford not to).

But why bother with "good" pricing when you can take advantage of sky high demand and sell at a markup for a while? You can always cut prices later--you can't really raise them. They don't even need to bother selling bundles--their Zen 3 pricing already has shown they don't give a gently caress about being the "good deal" company anymore (and on the CPU front, they absolutely don't need to be, either).

Truga
May 4, 2014
Lipstick Apathy

DrDork posted:

But why bother with "good" pricing when you can take advantage of sky high demand and sell at a markup for a while? You can always cut prices later--you can't really raise them. They don't even need to bother selling bundles--their Zen 3 pricing already has shown they don't give a gently caress about being the "good deal" company anymore (and on the CPU front, they absolutely don't need to be, either).

amd does this with their cpus too, even zen2 was pretty expensive at launch, and zen3 especially, but after a few months when supply stabilizes they'll 100% lower prices to sell more hardware if they can afford it, like they did with zen/+/2

PC LOAD LETTER
May 23, 2005
WTF?!
AMD has never been shy about pricing their stuff higher if they think they can get away with it and I've said as much in the AMD thread before multiple times.

Their willingness to keep prices a fair bit lower than the competition's, usually by more than the performance difference was worth, was how they got a rep for value vs Intel or (at one time) even NV.

Kraftwerk
Aug 13, 2011
i do not have 10,000 bircoins, please stop asking

So to save me from scrolling through 350 posts, could someone please sum up what the AMD cards are supposed to be and how they compare to the NVIDIA ones? Has AMD actually found a way to beat them?

Cygni
Nov 12, 2005

raring to post

PC LOAD LETTER posted:

If they knocked the price down $50 on the 6800XT and ~$80 on the 6800 they'd probably sell a heap of them easily, and people, even here, would probably be OK with that given the lack of a DLSS equivalent + ~30% slower RT.

Like others have said they're probably taking advantage of NV's supply issues with these prices.

AMD has their own supply issues with the lack of 7nm wafers, and i think profit/wafer calcs are going to mean that there isnt going to be a glut of Big Navi hanging around if Zen 3 is a hit.

~what follows is just back-of-the-envelope math and no i am not considering yields, or the BOM of other parts involved, etc relax~

Each Zen3 die is ~74mm sq and is being sold for $449 (5800X).
Each Navi 21 is ~536mm sq and is being sold for $649 (6800XT).

If I am wafer constrained, i aint gonna run many Navi 21 wafers as long as there is Zen 3 demand.
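
For anyone who wants to redo the napkin math, here's a quick sketch of the dies-per-wafer and revenue-per-wafer comparison. Die sizes and prices are from the post above; the 300mm wafer, the edge-loss approximation, and ignoring yield, the IO die, packaging, memory, etc. are all simplifying assumptions.

code:
import math

# Rough gross dies per 300 mm wafer: usable area minus an edge-loss term.
# Ignores yield, scribe lines, and defect density, same as the post above.
def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

for name, area_mm2, price in [("Zen 3 CCD (5800X)", 74, 449),
                              ("Navi 21 (6800 XT)", 536, 649)]:
    n = dies_per_wafer(area_mm2)
    print(f"{name}: ~{n} dies/wafer, ~${n * price:,} of product per wafer")

# The CPU side comes out to several times the gross revenue per wafer, before
# yield is even considered (and yield favors the small die even further).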

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Kraftwerk posted:

So to save me from scrolling through 350 posts, could someone please sum up what the AMD cards are supposed to be and how they compare to the NVIDIA ones? Has AMD actually found a way to beat them?

If you don't care at all about RT now or (probably) in the future, the 6800XT is very possibly a better buy than the 3080, particularly if you're combining it with a Zen 3 CPU. The 6800 is probably too expensive at $580 to make it a good deal compared to a $500 3070. The 6900XT seems like a good option if you want the best DX11 experience.

If you care about RT, think DLSS is cool, or play mostly DX12 games, NVidia is still the better buy.

Whether AMD actually has cards to sell is an entirely open question--they are acting confident on Twitter, but there are numerous reasons to think that they will also be largely a paper launch.

PC LOAD LETTER
May 23, 2005
WTF?!
AMD roughly doubled their supply of TSMC 7nm wafer buys for Zen3 and RDNA2 vs Zen2. I think they have around 30% of TSMC's total 7nm supply bought up until around mid 2021 or so.

What that exactly means for GPU supply is unknown, because they're also launching Zen3 on that same batch of supply, but I don't think you can be assured that their supply will be as bad as NV's or even just kinda bad at this point.

Cygni posted:

If I am wafer constrained, i aint gonna run many Navi 21 wafers as long as there is Zen 3 demand.

Absolutely true, but they also probably know they're not gonna have the demand NV will + they want to improve their brand after the mess with RDNA1. And these higher-end gaming dGPUs are all niche anyways. Look how many people are still on 1080s, 1080 Tis, 2070s, 2080s, etc. in the Steam hardware surveys. The market is tiny.

edit: the dGPU die is a big deal but memory costs can rival it or even be higher, and that is without using HBM + an interposer. GDDR6 and GDDR6X are both expensive, though I think GDDR6 has come down a bit in price since it's been around for a couple of years now. Exact costs aren't published so all you can find are somewhat speculative estimates though.\/\/\/\/\/

PC LOAD LETTER fucked around with this message at 01:52 on Oct 29, 2020

shrike82
Jun 11, 2005

Cygni posted:

AMD has their own supply issues with the lack of 7nm wafers, and i think profit/wafer calcs are going to mean that there isnt going to be a glut of Big Navi hanging around if Zen 3 is a hit.

~what follows is just back-of-the-envelope math and no i am not considering yields, or the BOM of other parts involved, etc relax~

Each Zen3 die is ~74mm sq and is being sold for $449 (5800X).
Each Navi 21 is ~536mm sq and is being sold for $649 (6800XT)

If I am wafer constrained, i aint gonna run many Navi 21 wafers as long as there is Zen 3 demand.

is die size the key driver of costs for these chips? seems like CPUs would be a lot more profitable than GPUs then

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

PC LOAD LETTER posted:

What that exactly means for GPU supply is unknown, because they're also launching Zen3 on that same batch of supply, but I don't think you can be assured that their supply will be as bad as NV's or even just kinda bad at this point.

Yeah, nothing's a done deal at this point. But AMD has slow rolled some of their CPU parts (Renoir, for example), which would suggest that they're still having to pick and choose what gets produced, rather than having a comfortable amount of production capacity. But for all we know, they might have paused Renoir because they didn't think it would sell super well and shifted that capacity over to producing a gently caress ton of Big Navi. We'll find out in a month, I guess.

shrike82 posted:

is die size the key driver of costs for these chips? seems like CPUs would be a lot more profitable than GPUs then

It is a big chunk, yeah. They never publish actual numbers, but some people have done ballpark calculations and suggested the 2080 Ti die, for example, cost ~$150. TSMC's 7nm node is not known to be a "budget" node, either, unlike Samsung's 8nm node, with some reports putting it at upwards of twice the price. Rumors, true, but in AMD's case their CPUs absolutely carry a considerably higher profit margin than their GPUs do.
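
For what it's worth, those ballpark die-cost figures are usually produced by dividing an assumed wafer price by an estimate of good dies per wafer. Here's a sketch of the shape of that calculation; the wafer price and defect density below are placeholders, not TSMC's (or anyone's) actual numbers, so don't expect it to land exactly on the ~$150 figure.

code:
import math

# Cost-per-good-die sketch: gross dies per 300 mm wafer, a simple Poisson
# yield model, then wafer price divided by yielded dies. All inputs are
# illustrative assumptions.
def yielded_dies(die_area_mm2, wafer_diameter_mm=300, defects_per_cm2=0.1):
    r = wafer_diameter_mm / 2
    gross = (math.pi * r**2 / die_area_mm2
             - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))
    yield_fraction = math.exp(-defects_per_cm2 * die_area_mm2 / 100)
    return gross * yield_fraction

def cost_per_die(die_area_mm2, wafer_price_usd):
    return wafer_price_usd / yielded_dies(die_area_mm2)

# A ~750 mm^2 die (2080 Ti class) on a hypothetical $6,000 wafer:
print(f"~${cost_per_die(750, 6000):.0f} per good die")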

DrDork fucked around with this message at 01:41 on Oct 29, 2020

Truga
May 4, 2014
Lipstick Apathy

Kraftwerk posted:

So to save me from scrolling through 350 posts, could someone please sum up what the AMD cards are supposed to be and how they compare to the NVIDIA ones? Has AMD actually found a way to beat them?

their card is $50 cheaper, has 6 gigs of extra vram if that matters to you at all, and performs on par in games that don't have dlss/rtx.

e;fb

Truga fucked around with this message at 01:42 on Oct 29, 2020

repiv
Aug 13, 2009

https://www.youtube.com/watch?v=haAPtu06eYI&t=1055s

AMD told Steve that their super-resolution thing will have "broader application" than DLSS, whatever that means

Some kind of driver hook like RIS? I don't see how that could get close to DLSS quality without engine hooks for LOD biasing, motion vectors, etc

also someone tell steve that dlss doesn't require per-game training anymore

PC LOAD LETTER
May 23, 2005
WTF?!

DrDork posted:

But AMD has slow rolled some of their CPU parts (Renoir, for example),

They've been slow rolling their APUs for a long time (too long) now though. I don't know if you can use them as an example.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

repiv posted:

AMD told Steve that their super-resolution thing will have "broader application" than DLSS whatever that means

Some kind of driver hook like RIS? I don't see how that could get close to DLSS quality without engine hooks for LOD biasing, motion vectors, etc

FidelityFX also has a "broader application" than DLSS, but is also total crap compared to DLSS. It wouldn't surprise me if their super-resolution thing ends up being just an adaptation of conventional super-sampling with some add-ons to try to punch it up a little.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Kraftwerk posted:

So to save me from scrolling through 350 posts, could someone please sum up what the AMD cards are supposed to be and how they compare to the NVIDIA ones? Has AMD actually found a way to beat them?

The 6800 competes with the 3070, but is more expensive (579 vs 499)

The 6800XT competes with the 3080, and is a little cheaper (649 vs 700)

The 6900 competes with the 3090, and is significantly cheaper (1000 vs 1500)

When I say competes, that comes with some heavy caveats:

* their tests included the use of RAGE MODE
* their tests also included this bespoke set-up where using a Ryzen processor combined with a 6000-series card is supposed to provide some kind of performance boost
* there was very little mention of ray-tracing performance, and what few technical notes they did offer suggest that NVidia is still better at ray-tracing
* since AMD also doesn't have DLSS, it's practically guaranteed that any game that does RT+DLSS is going to be faster on an NVidia card, and so any competitiveness AMD has relies on the game not having DLSS
* and that's on top of the general caveat that you shouldn't necessarily trust performance numbers that aren't being provided by third-party independent reviewers anyway
* the pricing of the 6800 is awkward at best and just straight up too high at worst. Even if you worked through all of these caveats the better deal (assuming availability is set aside) is still either the 6800XT or the 6900
* again, again, again, don't make decisions until reviews come out

buglord
Jul 31, 2010

Cheating at a raffle? I sentence you to 1 year in jail! No! Two years! Three! Four! Five years! Ah! Ah! Ah! Ah!

Buglord
Has hairworks ever looked nicer than default hair? I've ticked it on for a few games and even though it's all physics-based, it looks kinda dorky because it's sparse 3D noodles over a flat hair texture. Also (and not sure if hairworks causes it) the weird bouncy hair effect that sometimes happens during dialogue is goofy as heck too.

PC LOAD LETTER
May 23, 2005
WTF?!

repiv posted:

Some kind of driver hook like RIS? I don't see how that could get close to DLSS quality without engine hooks for LOD biasing, motion vectors, etc

Obviously meeting or beating DLSS with some sort of equivalent would be ideal for AMD (and everyone), but realistically, if they even get halfway there (in terms of overall performance benefit with similar image quality) and it works with minimal developer effort, that'll be a big deal and, for now, would be "good enough" from a brand standpoint.

Price wise they could probably leave things as they are if they pull that off BUUUT that is a big ol' but which would depend on them actually pulling it off + it working well. I wouldn't go buying AMD's stuff (or NV's) based off future promises that amount to "at some point in the future we'll do x or y or feature z will now be supported by everyone" because usually that never pans out.

Cygni
Nov 12, 2005

raring to post

PC LOAD LETTER posted:

AMD roughly doubled their supply of TSMC 7nm wafer buys for Zen3 and RDNA2 vs Zen2. I think they have around 30% of TSMC's total 7nm supply bought up until around mid 2021 or so.

What that exactly means for GPU supply is unknown, because they're also launching Zen3 on that same batch of supply, but I don't think you can be assured that their supply will be as bad as NV's or even just kinda bad at this point.

Considering AMDs laptop partners have reportedly told investors that they aren't shipping as many AMD laptops as they have orders due to AMD supply issues, the skip on AMD APUs for desktop supposedly because of shortages, the rumored shortages for Rome server parts due to big iron demand, the rumors that AMD/Intel/Nvidia have been bidding on every single wafer TSMC frees up, and that Big Navi was itself delayed about 6 months, I am fairly confident that they are capacity constrained. Maybe I'm wrong, I dont know poo poo. I'm just talking.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

buglord posted:

Has hairworks ever looked nicer than default hair? I've ticked it on for a few games and even though it's all physics-based, it looks kinda dorky because it's sparse 3D noodles over a flat hair texture. Also (and not sure if hairworks causes it) the weird bouncy hair effect that sometimes happens during dialogue is goofy as heck too.

TW3's implementation looked ok I thought, though sometimes awkward like you say. I thought it actually looked a lot better on animals, so maybe FurWorks would have been a more honest name.

Killed performance regardless, though.

PC LOAD LETTER
May 23, 2005
WTF?!

Cygni posted:

*shortages stuff*

Yeah, that is with their current supply from TSMC. They've bought more for Zen3 and RDNA2. Lots more.

Again they've doubled it from where it was.

Obviously no one other than AMD knows what they can actually deliver at this point, but generally doubling wafer supply should improve things. The gotcha in there, which I've already brought up, is Zen3, and it looks like they'll be able to sell every one of those they can make.

edit: we'll see what happens. I didn't expect them to put a 128MB cache on RDNA2 either. They might be able to pull a surprise off again here.\/\/\/\/

PC LOAD LETTER fucked around with this message at 02:08 on Oct 29, 2020

repiv
Aug 13, 2009

PC LOAD LETTER posted:

Obviously meeting or beating DLSS with some sort of equivalent would be ideal for AMD (and everyone), but realistically, if they even get halfway there (in terms of overall performance benefit with similar image quality) and it works with minimal developer effort, that'll be a big deal and, for now, would be "good enough" from a brand standpoint.

It would be a reasonable trade-off if they could get almost-as-good quality but without engine hooks, but I can't see how they could make anything worthwhile without those hooks

Even the less impressive modern upscalers like checkerboard rendering need motion vectors to work and there's no generic way to get those without developer intervention

Working without motion vectors basically sets you back to FXAA/SMAA type algorithms that just squint at the undersampled input and guess what it's supposed to look like
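
To make the contrast concrete, here's a toy sketch of the reprojection step that motion vectors enable: warp last frame's accumulated image along per-pixel motion, then blend in the new undersampled frame. This is purely illustrative NumPy, not how DLSS or any shipping upscaler is actually implemented; without the motion field, the reprojection step is impossible and you're back to guessing from the current frame alone.

code:
import numpy as np

# Toy temporal accumulation. Real upscalers add history clamping/rejection,
# subpixel jitter, and run on the GPU -- this only shows why motion vectors
# are needed at all.
def temporal_accumulate(history, current, motion, alpha=0.1):
    """history, current: (H, W) images; motion: (H, W, 2) pixel offsets (x, y)."""
    h, w = current.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Where was this pixel last frame? Follow the motion vector backwards.
    src_y = np.clip(np.rint(ys - motion[..., 1]).astype(int), 0, h - 1)
    src_x = np.clip(np.rint(xs - motion[..., 0]).astype(int), 0, w - 1)
    reprojected = history[src_y, src_x]
    # Exponential blend: mostly reprojected history, refreshed by new samples.
    return (1 - alpha) * reprojected + alpha * current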

Fuzz
Jun 2, 2003

Avatar brought to you by the TG Sanity fund
So who has actually gotten a 3080 step up card, and when did you sign up?

MadFriarAvelyn
Sep 25, 2007

As someone who got a 3080 FE on launch day I wish my card was capable of RAGE MODE.

shrike82
Jun 11, 2005

tbh, people should be following Microsoft more than AMD for a viable DLSS alternative -
this is what they had to say about their console's RDNA2 GPU

quote:

Through close collaboration and partnership between Xbox and AMD, not only have we delivered on this promise, we have gone even further introducing additional next-generation innovation such as hardware accelerated Machine Learning capabilities for better NPC intelligence, more lifelike animation, and improved visual quality via techniques such as ML powered super resolution.

Truga
May 4, 2014
Lipstick Apathy
give me supersampling or give me death.

*runs quake 3 at 4x supersample* see, runs at 700fps, works fine!

ufarn
May 30, 2009
Don't Xbox also use ML for their fancy tonemapping feature?

repiv
Aug 13, 2009

Truga posted:

give me supersampling or give me death.

*runs quake 3 at 4x supersample* see, runs at 700fps, works fine!

All this temporal anti-aliasing/upscaling fuckery basically is a roundabout way to do supersampling without setting the GPU on fire

To varying degrees of success depending on the implementation

ufarn posted:

Don't Xbox also use ML for their fancy tonemapping feature?

Yes, that's the only ML thing they've shipped so far I think

MeruFM
Jul 27, 2010
also HDR for older games, microsoft is way into the ML for gaming stuff.


buglord
Jul 31, 2010

Cheating at a raffle? I sentence you to 1 year in jail! No! Two years! Three! Four! Five years! Ah! Ah! Ah! Ah!

Buglord
NVIDIA just sent an email saying that the 3070s go on sale at 6AM PST. Figure most of y'all already know that, but for those who are catching up/just dropping in
