K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Sure, you can do anything on GPGPU that you can do with tensor units, or with RT units for that matter. But the fact that Nvidia is spending more on RT and dunking all over AMD in that area, and that both Nvidia and Intel have achieved superior image quality with the same or lower performance penalties than AMD has without tensor accelerators, is fairly strong evidence they are worth the die space.

So is the fact that AMD used to go pretty much back and forth with Nvidia on software features, but for the past 4 years they've been left in the dust.

There's probably more tensor acceleration than is necessary, but we are a solid four years into this and it's pretty clear which path has been the correct one.

Yudo
May 15, 2003

Subjunctive posted:

Is it NVIDIA who has made people believe that, or is it that AMD hasn’t shown it being done without tensor cores? What does Intel call the XeSS-helping circuits in Arc?

On their datacenter-oriented products, they have the "matrix engine", but I can't recall what they call it on consumer chips. And yes, Nvidia has done some excellent marketing.

Arc has more matrix acceleration cores as a function of transistor budget than Lovelace and far more than RDNA3. That's good if the goal is to recycle the design for pro applications. The problem with that approach is it means more silicon does nothing 99% of the time when playing games. As die real estate becomes more and more precious, this approach may become problematic. It also makes Intel look silly given that, unlike Ada, Arc isn't very good.

Yudo
May 15, 2003

K8.0 posted:

Sure, you can do anything on GPGPU that you can do with tensor units, or with RT units for that matter. But the fact that Nvidia is spending more on RT and dunking all over AMD in that area, and that both Nvidia and Intel have achieved superior image quality with the same or lower performance penalties than AMD has without tensor accelerators, is fairly strong evidence they are worth the die space.

So is the fact that AMD used to go pretty much back and forth with Nvidia on software features, but for the past 4 years they've been left in the dust.

There's probably more tensor acceleration than is necessary, but we are a solid four years into this and it's pretty clear which path has been the correct one.

AMD cards have matrix acceleration. Tensor cores spend most of their life doing absolutely nothing in games. RT and matrix acceleration are different things.

shrike82
Jun 11, 2005

lol ac: mirage is an intel joint so only xess supported (at least at launch).
you still have them recommending an nvidia or amd card for best performance

https://twitter.com/assassinscreed/status/1704170994846364117?s=20

Yudo
May 15, 2003

shrike82 posted:

lol ac: mirage is an intel joint so only xess supported (at least at launch).
you still have them recommending an nvidia or amd card for best performance

https://twitter.com/assassinscreed/status/1704170994846364117?s=20

I wonder if this will inspire the same foot stamping and gnashing of teeth as Starfield, or if omitting DLSS is no longer considered a crime against humanity.

Dr. Video Games 0031
Jul 17, 2004

I think the latest DP4a versions of XeSS actually look better than FSR2, so it wouldn't be as big of a loss if that turns out to be true. It doesn't look like a very demanding game either, so I'm guessing most people will be able to get by just fine without upscaling.

pyrotek
May 21, 2004



shrike82 posted:

lol ac: mirage is an intel joint so only xess supported (at least at launch).
you still have them recommending an nvidia or amd card for best performance

https://twitter.com/assassinscreed/status/1704170994846364117?s=20

I'll never understand CPU recommendations like these. Why would you need to go from an 8700K/3600 to get 60FPS at 1080p/high to an 11600K/5600X to get 60FPS at 2160p/high? Is UHD really that much more CPU-demanding at the same detail setting?

Yudo posted:

I wonder if this will inspire the same foot stamping and gnashing of teeth as Starfield, or if omitting DLSS is no longer considered a crime against humanity.

Of course it is awful, why wouldn't it be? It should have FSR2 as well.

Yudo
May 15, 2003

Dr. Video Games 0031 posted:

I think the latest DP4a versions of XeSS actually look better than FSR2, so it wouldn't be as big of a loss if that turns out to be true. It doesn't look like a very demanding game either, so I'm guessing most people will be able to get by just fine without upscaling.

It does look better, but on my PC in Cyberpunk it is at least 10fps slower than FSR. I don't recall what version of XeSS Cyberpunk is using, though, so my point may be moot.

pyrotek posted:

Of course it is awful, why wouldn't it be? It should have FSR2 as well.

I like the way you think.

Dr. Video Games 0031
Jul 17, 2004

pyrotek posted:

I'll never understand CPU recommendations like these. Why would you need to go from an 8700K/3600 to get 60FPS at 1080p/high to an 11600K/5600X to get 60FPS at 2160p/high? Is UHD really that much more CPU-demanding at the same detail setting?

Of course it is awful, why wouldn't it be? It should have FSR2 as well.

to be fair, they're also going from high to ultra, which may have a more cpu-heavy shadow setting or something. the weirdest part is the bump from 6 to 8-core cpus when going from 1080p high to 1440p high. you cannot convince me that this game is even capable of using more than six cores.

and i think people are jumping to conclusions a little by assuming it won't have dlss and fsr2. they don't mention it on their site, but that may just be intel not wanting them to promote the presence of competing technologies while still being fine with them being in the game. also, i'm a bit of a pragmatist when it comes to these things. i'm still not happy about the lack of dlss, but it's less offensive when the alternative is xess. i don't want to be forced to use amd's technologies when those technologies are just worse than everyone else's.

Yudo
May 15, 2003

ITT people jumped to the conclusion that Bethesda and AMD had crafted a dastardly plan to do grievous ill upon father Nvidia and his virtuous flock. It is what we do: get mad about poo poo that doesn't matter.

pyrotek
May 21, 2004



Dr. Video Games 0031 posted:

to be fair, they're also going from high to ultra, which may have a more cpu-heavy shadow setting or something. the weirdest part is the small bump to minimum specs when going from 1080p high to 1440p high.

Somehow I missed it jumping from high to ultra for UHD, which is pretty dumb since they literally have it labeled as ultra. Your point is what I should have said.

Dr. Video Games 0031 posted:

i'm still not happy about the lack of dlss, but it's less offensive when the alternative is xess. i don't want to be forced to use amd's technologies when those technologies are just worse than everyone else's.

Including XeSS but not DLSS would be the exact same scenario as Starfield, just with a hopefully smaller quality difference. I don't want to be forced to use XeSS any more than FSR.

Edit: the biggest difference here is that I wouldn't be surprised if, unlike with Starfield, you are unable to inject other upscalers with the way Ubisoft goes overboard on the DRM.

pyrotek fucked around with this message at 01:48 on Sep 20, 2023

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
CPU requirements have always seemed fuzzy to me, in that it's a combination of whatever the devs had available to test against, plus a marketing pull of "the higher tiers obviously need to mention a better CPU"

shrike82
Jun 11, 2005

lol let's see if puredark comes out with a (paid) FG mod for AC: mirage

Dr. Video Games 0031
Jul 17, 2004

Profanity posted:

DF put up an hour long roundtable discussion with Nvidia and CDPR about DLSS 3.5/RR, pretty interesting so far.

lol at Nvidia guy throwing a bit of shade at Starfield's optimisation.


https://www.youtube.com/watch?v=Qv9SLtojkTU

I'm watching this now and why on earth did they invite this reddit bozo to this roundtable.

pyrotek
May 21, 2004



Yudo posted:

On their datacenter-oriented products, they have the "matrix engine", but I can't recall what they call it on consumer chips. And yes, Nvidia has done some excellent marketing.

Arc has more matrix acceleration cores as a function of transistor budget than Lovelace and far more than RDNA3. That's good if the goal is to recycle the design for pro applications. The problem with that approach is it means more silicon does nothing 99% of the time when playing games. As die real estate becomes more and more precious, this approach may become problematic. It also makes Intel look silly given that, unlike Ada, Arc isn't very good.

Per the leaks today, Microsoft is looking to add an NPU in the next-generation Xbox in 2028, partially for ML-based super resolution.

buglord
Jul 31, 2010

Cheating at a raffle? I sentence you to 1 year in jail! No! Two years! Three! Four! Five years! Ah! Ah! Ah! Ah!

Buglord
Is FSR3 supposed to look nicer than FSR2 or is it just framegen stuff that’s being added? I’m playing my second FSR2 game and in motion sometimes FSR looks rough with ghosting/shimmering.

BurritoJustice
Oct 9, 2012

Dr. Video Games 0031 posted:

I'm watching this now and why on earth did they invite this reddit bozo to this roundtable.

I had to skip over any of his sections due to sheer cringe. PCMR is the worst PC subreddit and I hate that they're validating it.

At least his "but isn't DLSS just an excuse for bad optimisation" gotcha question led to both the Starfield joke and a decent discussion on the history of performance improvements in rendering techniques.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

buglord posted:

Is FSR3 supposed to look nicer than FSR2 or is it just framegen stuff that’s being added? I’m playing my second FSR2 game and in motion sometimes FSR looks rough with ghosting/shimmering.

FSR3 is frame gen.

pyrotek
May 21, 2004



BurritoJustice posted:

I had to skip over any of his sections due to sheer cringe. PCMR is the worst PC subreddit and I hate that they're validating it.

I don't understand how people haven't learned their lesson about "ironic" Nazi references by this point.

BurritoJustice
Oct 9, 2012

gradenko_2000 posted:

FSR3 is frame gen.

It's unclear if there will be a super resolution update coming alongside FSR3. AMD has just said "The latest version of Super Resolution", but it's hard to tell if that means 2.2.2 or a newer version that hasn't launched yet.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Dr. Video Games 0031 posted:

I think the latest DP4a versions of XeSS actually look better than FSR2, so it wouldn't be as big of a loss if that turns out to be true. It doesn't look like a very demanding game either, so I'm guessing most people will be able to get by just fine without upscaling.

Is the comment about the latest DP4a based on your own testing or separate analysis? Just curious, since my understanding is that even DP4a leans on Arc’s XMX cores, so it wouldn’t look as good on non-Intel hardware.

pyrotek posted:

Somehow I missed it jumping from high to ultra for UHD, which is pretty dumb since they literally have it labeled as ultra. Your point is what I should have said.

Including XeSS but not DLSS would be the exact same scenario as Starfield, just with a hopefully smaller quality difference. I don't want to be forced to use XeSS any more than FSR.

Edit: the biggest difference here is that I wouldn't be surprised if, unlike with Starfield, you are unable to inject other upscalers with the way Ubisoft goes overboard on the DRM.

I think what rubs me the wrong way at times is that, regardless of whether one finds one solution or the other superior, this type of exclusivity is bad whether it's AMD, Intel, or nVidia doing it.

And to try and justify away or minimize it is just textbook hypocrisy.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
There’s also the fact that probably a whole lot more people were interested in Starfield than Mirage, which isn’t lighting the world on fire even among AC fans.

I have been frustrated by the lack of upscaling options in AC games, though. They are demanding games, but none of them (till now at least) had even FSR2 support, although Ubisoft did include their own TAAU solution, which they obfuscated by labeling it as varying qualities of anti-aliasing.

Kazinsal
Dec 13, 2011

SourKraut posted:

Is the comment about the latest DP4a based on your own testing or separate analysis? Just curious, since my understanding is that even DP4a leans on Arc’s XMX cores, so it wouldn’t look as good on non-Intel hardware.

DP4a works fine on GTX 1060s and newer, and Vega 20 and Navi 12 and newer. It's just an instruction that does ([int8,int8,int8,int8] · [int8,int8,int8,int8] → int32) + int32 → int32.
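For the curious, here's a scalar C sketch of what that packed instruction computes. This is purely illustrative of the semantics, not any vendor's actual intrinsic (CUDA exposes the operation as __dp4a; other vendors surface it through their own shader instructions):

```c
/* Scalar reference for DP4a: multiply four packed int8 lanes pairwise
   and add the widened products into a 32-bit accumulator. */
#include <stdint.h>
#include <stdio.h>

int32_t dp4a_ref(uint32_t a_packed, uint32_t b_packed, int32_t acc)
{
    for (int lane = 0; lane < 4; lane++) {
        int8_t a = (int8_t)(a_packed >> (8 * lane)); /* reinterpret lane as signed */
        int8_t b = (int8_t)(b_packed >> (8 * lane));
        acc += (int32_t)a * (int32_t)b;              /* widen, multiply, accumulate */
    }
    return acc;
}

int main(void)
{
    /* lanes a = {1, 2, 3, 4} and b = {10, 20, 30, 40}, lowest lane in the low byte */
    printf("%d\n", dp4a_ref(0x04030201u, 0x281E140Au, 0)); /* 1*10 + 2*20 + 3*30 + 4*40 = 300 */
    return 0;
}
```

The point is that the hardware does all four multiplies and the adds in a single instruction, which is why the DP4a path of XeSS can run on GPUs going back to Pascal.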

Kramjacks
Jul 5, 2007

Someone was talking about wanting factory liquid-cooled 40X0 cards. Well, here ya go!


https://www.youtube.com/watch?v=eT765FI8Jqg


It's actually not that small, especially with the big rad, and the tubes come out the back making it a bit longer in practice than the specs would suggest.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

Kazinsal posted:

DP4a works fine on GTX 1060s and newer, and Vega 20 and Navi 12 and newer. It's just an instruction that does ([int8,int8,int8,int8] · [int8,int8,int8,int8] → int32) + int32 → int32.

Iirc XeSS on my brother’s 1070 wasn’t gaining him performance in FH5.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Kazinsal posted:

DP4a works fine on GTX 1060s and newer, and Vega 20 and Navi 12 and newer. It's just an instruction that does ([int8,int8,int8,int8] · [int8,int8,int8,int8] → int32) + int32 → int32.

Yeah, I wasn’t disputing that DP4a works fine on non-Intel cards; more that I thought the upscaling is significantly better on an Arc card.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
my understanding of what's to come with FSR is:

1. they've already released HYPR-RX, which is supposed to be their latency reducing tech to compete with NVidia Reflex. It allows a game to run Radeon Super Resolution (driver-level FSR1 upscaling), Radeon Anti-Lag (frame queue management) and Radeon Boost (dynamic resolution scaling) all at the same time, in an effort to keep frame rates up, and input latency low, during busy scenes. These three features already exist in the Adrenalin drivers, but they can't all be used at the same time, except under HYPR-RX. Notably, HYPR-RX is only supported by RX 7000 cards (and later).

2. FSR3 is frame-gen, and it's going to work on most cards, even old and non-AMD ones, if the game officially supports it.

3. there's going to be a driver-level frame-gen implementation, but it's only going to be for RX 7000 cards (and later).

there's still a level of, shall we say, "gatekeeping" here, since even though AMD is bragging about their frame gen being available to """everyone""", the latency reduction tech that's supposed to make it more bearable is only going to be on their latest cards.

As well, the frame-gen that can be used on "every game" (which not even DLSS frame gen tries to do) is also only going to be available on their latest cards. That said, I personally think this one is less of a big deal, if only because you probably wouldn't want to throw frame gen on everything anyway.

Yudo
May 15, 2003

pyrotek posted:

Per the leaks today, Microsoft is looking to add an NPU in the next-generation Xbox in 2028, partially for ML-based super resolution.

2028? Microsoft must get input into RDNA4/RDNA5: they could just ask for more MAC throughput. Anyways, AI acceleration could look very different by then. I think it would be cool, though. I miss the days of consoles having exotic hardware. I think Sony floated the idea of making their own ray tracing accelerator, but then the same argument applies (i.e. bitch at AMD as Sony has even more pull).

The constraints around console APUs are restrictive: the power and transistor budgets are limited. I don't know if the idea is to offload functions or if that would even work given the demands of software, but if they are going to include dedicated ML stuff, they better have a very good reason and a plan to keep them utilized. Tensor cores are great because they handle mixed precision extremely well (not because other GPUs can't accelerate MAC), which AFAIK is not available on RDNA but is on CDNA. It makes sense: mixed precision has little to do with games.
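To make the mixed-precision bit concrete: per element, the pattern tensor cores accelerate is a narrow multiply feeding a wide accumulator, so the running sum keeps full precision. A minimal scalar sketch, assuming a compiler with the _Float16 extension (GCC/Clang on supported targets); the function name is just for illustration, and real tensor cores apply this across whole matrix tiles per instruction:

```c
/* Mixed-precision MAC in scalar form: fp16 operands, fp32 accumulator.
   Illustrative only, not any vendor's API. */
#include <stdio.h>

float dot_fp16_fp32(const _Float16 *a, const _Float16 *b, int n)
{
    float acc = 0.0f;                     /* wide (fp32) accumulator */
    for (int i = 0; i < n; i++)
        acc += (float)a[i] * (float)b[i]; /* narrow (fp16) inputs */
    return acc;
}

int main(void)
{
    _Float16 a[4] = {1.5f, 2.0f, -0.25f, 4.0f};
    _Float16 b[4] = {2.0f, 0.5f, 8.0f, 1.0f};
    printf("%f\n", dot_fp16_fp32(a, b, 4)); /* 3 + 1 - 2 + 4 = 6 */
    return 0;
}
```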

Josh Lyman
May 24, 2009


Taima posted:

I guess that's fair. My experience with silicon valley has been that the more corporate that entity is, the less it benefits from individual employees being passionate, and the less passionate those employees tend to be. If that is not the case with MS, fair enough I guess, but I would find it surprising.

There is an undeniable vitality to the small companies that end up succeeding in the valley, and in tech generally, which causes those companies to punch above their weight on a per-person average. That is usually in no small part because the hours being worked in these smaller companies are greater, though of course there are many factors.

That structural advantage is eventually lost by 99% of companies, but is very commonly made up for by economies of scale. For the vast majority of tech firms, the average worker hours declines as the company matures. That is part and parcel of the normal maturation cycle. They can then utilize other benefits of maturity to make up for that.

It's not about how corporate an entity is or even necessarily about its size. It's about the comp.

The reason why small companies have vitality, as you put it, is because those employees tend to be on the cap table. You join a startup, in large part, because of the promise of a multi-million dollar payday. Sure, people will talk about a dynamic work environment, and I'm sure for some people that's the overriding factor, but in general, it's about the buyout or IPO.

If you're "just" making L5 salary, of course you're not going to put in 60 hour weeks.

quote:

That's why imo Microsoft would get great ROI out of porting to ARM; it's a largely administrative task (vs creating the game) that you can mostly throw money at to solve. That highly benefits the economies of scale that exist for companies like MS, and I would argue that in general, large companies are very happy to accommodate such ventures because they have the muscle to do so. This is in stark contrast to creative work, which even a company as big as MS can fail at, even with its huge structural resources at its back.

Porting Windows to ARM is insufficient because all your software needs to be ported as well. See: Apple's transition from x86 to ARM.

Josh Lyman fucked around with this message at 08:36 on Sep 20, 2023

BurritoJustice
Oct 9, 2012

gradenko_2000 posted:

my understanding of what's to come with FSR is:

1. they've already released HYPR-RX, which is supposed to be their latency reducing tech to compete with NVidia Reflex. It allows a game to run Radeon Super Resolution (driver-level FSR1 upscaling), Radeon Anti-Lag (frame queue management) and Radeon Boost (dynamic resolution scaling) all at the same time, in an effort to keep frame rates up, and input latency low, during busy scenes. These three features already exist in the Adrenalin drivers, but they can't all be used at the same time, except under HYPR-RX. Notably, HYPR-RX is only supported by RX 7000 cards (and later).

There are already modded drivers that let you enable HYPR-RX with any AMD GPU, but it remains to be seen if they'll let you enable the driver-level frame gen (AMD FMF).

Dr. Video Games 0031
Jul 17, 2004

SourKraut posted:

Yeah, I wasn’t disputing that DP4a works fine on non-Intel cards; more that I thought the upscaling is significantly better on an Arc card.

It is significantly better on an Arc card, almost as good as DLSS. DP4a is not a huge downgrade though, or at least not as big of one as FSR2. This is what I've noticed when toying around with Remnant 2 and Ratchet and Clank. I haven't really paid attention to the performance cost though, so maybe that's an advantage for FSR2.

BurritoJustice posted:

There are already modded drivers that let you enable HYPR-RX with any AMD GPU, but it remains to be seen if they'll let you enable the driver level frame gen (AMD FMF).

Good. At least with DLSS FG, there's the excuse (which seems to be valid) that the algorithm only works well on cards with the upgraded optical flow accelerators in Ada. I don't see any valid reason why the HYPR-RX features should be locked to RDNA3, though. Especially Anti-Lag+.

Dr. Video Games 0031 fucked around with this message at 11:16 on Sep 20, 2023

pyrotek
May 21, 2004



Yudo posted:

2028? Microsoft must get input into RDNA4/RDNA5: they could just ask for more MAC throughput. Anyways, AI acceleration could look very different by then. I think it would be cool, though. I miss the days of consoles having exotic hardware. I think Sony floated the idea of making their own ray tracing accelerator, but then the same argument applies (i.e. bitch at AMD as Sony has even more pull).

The constraints around console APUs are restrictive: the power and transistor budgets are limited. I don't know if the idea is to offload functions or if that would even work given the demands of software, but if they are going to include dedicated ML stuff, they better have a very good reason and a plan to keep them utilized. Tensor cores are great because they handle mixed precision extremely well (not because other GPUs can't accelerate MAC), which AFAIK is not available on RDNA but is on CDNA. It makes sense: mixed precision has little to do with games.

You can get the slides here. Most of what they think they can achieve with ML feels like fanciful bullshit to me, but we shall see.

If you look at the decision to add an NPU strictly from a gaming perspective, it seems questionable, but I wouldn't be surprised if the gaming division was being pushed in that direction as part of Microsoft's larger push into AI. They are still talking about utilizing the cloud for parts of games, which was a big push as far back as the Xbox One launch but has amounted to nothing more than the destructible environment in Crackdown 3's Wrecking Zone mode. They've invested resources into the cloud and AI, so I guess they want them integrated into the Xbox whether it makes sense or not.

kliras
Mar 27, 2021
speaking of, it looks like apple's metalfx upscaling only works on later chips, based both on flatpanelshd reporting and some ... very confused developers on the official forums

cool for the nerds who buy new iphones every year, but not as cool as dlss 2, xess, and fsr 2

goes to show how fragmented something like "ios" and tvos still are

surely there has to be a website that has all the info somewhere

MarcusSA
Sep 23, 2007

99% of the time for Apple it's: new feature? New chips.

wargames
Mar 16, 2008

official yospos cat censor

K8.0 posted:


So is the fact that AMD used to go pretty much back and forth with Nvidia on software features, but for the past 4 years they've been left in the dust.


What is something that Nvidia has currently that AMD doesn't, other than frame gen?

MarcusSA
Sep 23, 2007

wargames posted:

What is something that Nvidia has currently that AMD doesn't, other than frame gen?

Market share

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

wargames posted:

What is something that Nvidia has currently that AMD doesn't, other than frame gen?

NVidia Broadcast

MarcusSA
Sep 23, 2007

gradenko_2000 posted:

NVidia Broadcast

AMD has a version of their own. I don't think it works as well, though.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

wargames posted:

What is something that nvidia has currently that AMD doesn't other then frame gen?

DLSS is significantly better at more aggressive upscaling than FSR is. When you run the "performance" mode it's not even close in terms of image quality.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Does AMD have a shadowplay-alike?
