Inept
Jul 8, 2003

Dr. Video Games 0031 posted:

Also, I'm not sure how much encoding latency factors into the wider latency picture for cloud gaming, but I wonder if this could help make that more responsive.

I don't see cloud gaming moving beyond HEVC for a few years. It'll take a while for clients to support AV1, and unlike stored media on Youtube, there's no existing media to transcode from AV1 to HEVC/H264, just raw game output.


Kibner
Oct 21, 2008

Acguy Supremacy

Cygni posted:

It's for Twitch. Like, the company. Not the users on either end.

It's also for users that are simulcasting to Twitch, Youtube, and whatever other streaming service while also recording a higher quality version on their local machine/network.

kliras
Mar 27, 2021
how many pci lanes do you need to even set aside for that beast

Cygni
Nov 12, 2005

raring to post

Kibner posted:

It's also for users that are simulcasting to Twitch, Youtube, and whatever other streaming service while also recording a higher quality version on their local machine/network.

The previous generation card, the Alveo U30, doesn't have Windows drivers and you can't buy it at retail. This isn't a consumer product. From the AT article:

quote:

The target market for the card is, like its predecessor, the data center market. AMD’s principal clients are live streaming services and other interactive video services (think Twitch, cloud gaming, video conferencing, etc), all of whom need to encode large numbers of video streams in real-time in a server environment. So like AMD’s EPYC processors, this is very much a server part aimed at a select group of businesses.

AMD's presentation made it pretty clear that it was targeted at:


orcane
Jun 13, 2012

Fun Shoe
Yeah the ServeTheHome article on the announcement also emphasizes this in their last paragraph:

quote:

We asked AMD and this is not the type of card meant to be purchased one or two at a time. Instead, this is meant to be sold to AMD’s large streaming partners.

Shipon
Nov 7, 2005
Was able to put in an order for the 7800X3D at Central Computers, but I guess their shipment was delayed a bit, so it should arrive this weekend. Will be the first time I've had an AMD CPU since the very first computer I built with an Athlon XP 1700+ way back in 2003.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
I think we need to put "bandwidth costs" into context: it's more a matter of bandwidth limits on the client side, similar to why there's such a focus on power efficiency on the hyperscaler side. The hyperscalers have already hit the legal limits on power in their various municipalities, so they generally can't scale horizontally by consuming more electricity (think of it as all the land being bought up, so the only option left is building denser apartments). For consumers in the US, mass-market adoption of a lot of technologies depends on the simply horrible cellular infrastructure, combined with the lack of upward economic mobility among younger consumers - so for many of these users it's going to be easier to improve encoding standards than to expect their bandwidth to keep going up the way it did years ago.

BlankSystemDaemon
Mar 13, 2009



Speaking of Zen5, I'm assuming it's getting a new socket instead of reusing AM5?

Combat Pretzel posted:

I was sorta gleaning at the post you quoted.

I remember that P67 chipset and its loving SATA flaw. I had issues returning my P8P67 from Asus for a Rev. 3 version. The dipshits took it to the back of the shop and miraculously found bent pins in the socket, that I'm fairly certain weren't there when I put the plastic lid back in. Caused some in-store drama.
Yeah, my workstation's SATAII ports went the way of the dodo a few months after I got it, but luckily I only ever needed two SATAIII ports in the system.

I held off on making use of the recall because of that, and now I'm happy I did.

Cygni posted:

It's for Twitch. Like, the company. Not the users on either end.
I can absolutely see streamers buying it if they can get their hands on it, considering they have no problem spending $2000 for a GPU to do the encoding, and this unit takes up less space and uses less energy.

Khorne
May 1, 2002

BlankSystemDaemon posted:

Speaking of Zen5, I'm assuming it's getting a new socket instead of reusing AM5?
It will use AM5.

BlankSystemDaemon
Mar 13, 2009



Khorne posted:

It will use AM5.
Oh, so a natural upgrade path consisting of used processors for all the people getting 7800X3Ds? :allears:

Shipon
Nov 7, 2005

BlankSystemDaemon posted:

Oh, so a natural upgrade path consisting of used processors for all the people getting 7800X3Ds? :allears:

I have yet to actually remove an LGA processor from its socket once installed because lol Intel's socket changes

BlankSystemDaemon
Mar 13, 2009



Shipon posted:

I have yet to actually remove an LGA processor from its socket once installed because lol Intel's socket changes
Surely part of that was that any inter-generational IPC uplift was, at one point, basically guaranteed to be around 7% - because Intel didn't have any competition forcing them to innovate.
So everyone ended up skipping a few generations each time, meaning a socket change would be inevitable.

Shipon
Nov 7, 2005

BlankSystemDaemon posted:

Surely part of that was that any inter-generational IPC uplift was, at one point, basically guaranteed to be around 7% - because Intel didn't have any competition forcing them to innovate.
So everyone ended up skipping a few generations each time, meaning a socket change would be inevitable.

Yeah I pretty much always skipped like 3-4 generations

BlankSystemDaemon
Mar 13, 2009



Shipon posted:

Yeah I pretty much always skipped like 3-4 generations
I ended up using my Sandy Bridge between 2011 and 2015, and was about to jump on to Skylake at the end of that year.
Then in early 2016 I got cancer (which is in remission now), and I'm only just now jumping back onto the wagon of new hardware.

So I would've ended up following that pattern too, I think - before then, I was on a roughly 2-year upgrade cycle up until around the end of Dennard scaling.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

BlankSystemDaemon posted:

I heard of the Alveo U30, but the thing is that unless you were a hyperscaler, you genuinely couldn't buy it, no matter how much money you threw at Xilinx - so it never seemed worth looking into.

there are some weird products in this niche that aren't likely to cross consumer radar, assuming they're even purchasable at all, like this one that was basically 3 skylake-R CPUs (with socketed SODIMM memory and crystal well eDRAM side cache)

https://www.youtube.com/watch?v=wnf6NwTgPZ0

but yeah, I'm also extremely interested in the new AMD thing. It would be cool for doing VDI with accelerated streams as well... although the things AMD is advertising this for are, uh, going to be considerably less popular:

Cygni posted:

The previous generation card, the Alveo U30, doesn't have Windows drivers and you can't buy it at retail. This isn't a consumer product. From the AT article:

AMD's presentation made it pretty clear that it was targeted at:

you missed the best slide:



so, basically, the immediate application is gambling and home shopping network type stuff lol. maybe camgirl porn?

but "micro-transaction revenue" is how they're framing it.

BlankSystemDaemon
Mar 13, 2009



Paul MaudDib posted:

there are some weird products in this niche that aren't likely to cross consumer radar, assuming they're even purchasable at all, like this one that was basically 3 skylake-R CPUs (with socketed SODIMM memory and crystal well eDRAM side cache)

https://www.youtube.com/watch?v=wnf6NwTgPZ0

but yeah, I'm also extremely interested in the new AMD thing. It would be cool for doing VDI with accelerated streams as well... although the things AMD is advertising this for are, uh, going to be considerably less popular:
That's a loving hilarious daughterboard; I bet Intel were thinking "this is how we're gonna make big bucks from our Scalable Video Technology"

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

BlankSystemDaemon posted:

That's a loving hilarious daughterboard; I bet Intel were thinking "this is how we're gonna make big bucks from our Scalable Video Technology"

the choice of crystal well is kind of weird at first, but I was looking through phoronix and video encoding (x264 and x265) heavily benefits from v-cache, so I bet encoding a bunch of streams in parallel needs a pretty decent working set even with iGPU-based encoding. video encoding is a lot more intense than people give it credit for, and the idea of CPU encoding your stream while you game has always been a little... dubious
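To put rough numbers on that working-set hunch - the frame math below is just 8-bit 4:2:0 arithmetic, but the reference-frame and stream counts are assumptions, not measurements:

code:
# Rough working-set estimate for encoding several 1080p streams in parallel.
# Reference-frame and stream counts are assumed, purely for illustration.
width, height = 1920, 1080
bytes_per_pixel = 1.5                     # 8-bit 4:2:0 (YUV420)
frame_bytes = width * height * bytes_per_pixel
reference_frames = 4                      # assumed motion-search window per stream
streams = 8                               # assumed parallel encodes

per_stream = frame_bytes * (1 + reference_frames)   # current frame + references
total = per_stream * streams
print(f"per stream: {per_stream / 2**20:.1f} MiB, total: {total / 2**20:.1f} MiB")
# ~15 MiB per stream and ~120 MiB across 8 streams - comfortably past a 32 MiB L3,
# which would explain why extra cache helps x264/x265 throughput.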

Anime Schoolgirl
Nov 28, 2002

Shipon posted:

I have yet to actually remove an LGA processor from its socket once installed because lol Intel's socket changes
otoh the "drop cpu corner first right into the LGA pins" special now costs $200-300 instead of $50-80

($120-200 if you buy intel)

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

BlankSystemDaemon posted:

That's a loving hilarious daughterboard; I bet Intel were thinking "this is how we're gonna make big bucks from our Scalable Video Technology"

SVT is software encoding; I think the idea here would be to turn the CPUs into iGPU video encode offload accelerators, literally using CPUs just for the GPU perf (and throwing a bunch of side cache at it for a dedicated scratch space). Then again, a stack of client/laptop quad-cores is not gonna be that fast - probably within reach of a config you could build with normal server CPUs (albeit probably a more expensive server, but I doubt that would have been a cheap product either).

It's somewhat interesting how many companies have made a run at the hardware video encoding market over the years. Rabb.it had custom servers with a bunch of jetson tk1s inside a 1U with some NUCs controlling them (arise, methenar). Intel made this video encode board. Google reportedly commissioned AMD to make RDNA1 HEVC really good for Stadia, because they had the same business need too. Now Xilinx/AMD have another take on it (without any attached GPU compute this time, to drop costs per stream).

I guess it's a very common need to just encode some video quickly for whatever use-case - video-encode-as-a-service tailored to some segment. I kinda thought that had already been fairly democratized; NVIDIA has been addressing it for a while. the Grid series was basically designed for these sorts of things (VDI, teleconferencing, etc) even 10 years ago, and more recently a t4 card is a very cheap buy-in for a hardware nvenc encoder with turing quality. And hell, every phone SoC (raspberry pi etc) has come with video encode for a long time. It's interesting that, apparently, a quality-optimized encoder running modern codecs at the lowest possible cost ($1600 is fairly cheap for a server part given what it does) makes a relatively modern ASIC on probably a decent node (6nm? 5nm?) a viable proposition.

I guess AMD see a big future in AV1 content delivery/encoding, and in all fairness it could result in tangible savings for companies like Google (your youtube upload never watched? now it's squished to av1 for space savings), although of course at a certain scale they can also commission whatever ASICs they want (custom 16nm or even 6nm isn't all that insane if you're a hyperscaler). I wonder what market penetration looks like for AV1 decode these days.

Based on the slide Cygni linked, it sounds like maybe it's also grid compute stuff, video encoders for your PS Now and stadia and other streaming services. But again the question there is what you're attaching them to that doesn't already have an encoder - consoles do, dGPUs do, phones do, almost every PC does. Is your quality enough better to justify the cost over just having the client's laptop do it (if they're cheap, then welp quality sucks, sucks to be you)? If it's just pure VDI, isn't having the dGPU portion of the card an advantage to accelerate the desktop? Twitch/onlyfans datacenter re-encoding, or youtube archival data compression - sure I guess, but how big is that market?

It's a real cool product, I'd love to have one for turbonerd reasons, but it's a mildly perplexing product in terms of market fit, like who does this product fit that wouldn't just buy a bunch of quadro A4000s or T4s instead? twitch and teleconferencing I guess?

I guess the answer may just be that this was developed before Xilinx was acquired and these are the pieces of the product they had in 2019 or 2020 or whatever.

Paul MaudDib fucked around with this message at 02:44 on Apr 8, 2023

Kazinsal
Dec 13, 2011
AV1 is pretty cool and I'm a bit upset that the RTX 30 series doesn't have an AV1 encoder now that YouTube supports it fully for streaming and Twitch is rolling it out slowly iirc. It's great at all scales of real-time encoding from what I've seen of it and if I ended up streaming again I'd probably want to take advantage of it just for the quality improvements at any given bitrate target.

It's also making some pretty big waves rollout wise -- Cisco's gone all in for it on Webex and it's done wonders for making screensharing more viable for people with poo poo connections and on mobile. One of my project managers lives in the boonies and until recently was on a connection that barely passed as ADSL and it was literally impossible for her to do screen shares and video at the same time. When the AV1 update rolled out screensharing bandwidth got cut down to like, a quarter of what it was previously and she could actually reliably share her screen and have her camera on at the same time in meetings.

New Zealand can eat me
Aug 29, 2008

:matters:


Dr. Video Games 0031 posted:

(35W for 32 simultaneous 1080p60 decodes!)

It's encodes not decodes, which is much more impressive.

Paul MaudDib posted:

It's a real cool product, I'd love to have one for turbonerd reasons, but it's a mildly perplexing product in terms of market fit, like who does this product fit that wouldn't just buy a bunch of quadro A4000s or T4s instead? twitch and teleconferencing I guess?

If it's actually good it will be huge for the segment between "$16,000 REMI bridges" and prosumer. Doubly so if the AI enhancement actually manages to get more out of 6500kbps twitch than any other workflow.

New Zealand can eat me fucked around with this message at 09:47 on Apr 8, 2023

kliras
Mar 27, 2021

Kazinsal posted:

AV1 is pretty cool and I'm a bit upset that the RTX 30 series doesn't have an AV1 encoder now that YouTube supports it fully for streaming and Twitch is rolling it out slowly iirc. It's great at all scales of real-time encoding from what I've seen of it and if I ended up streaming again I'd probably want to take advantage of it just for the quality improvements at any given bitrate target.
technically you're not streaming av1 on youtube yet afaik; they just accept it as ingest and transcode it to the usual vp9/h264 (avc). basically the same way hevc is handled, except av1 is actually served for uploads (ie not streams)

but there's always the chance that they'll transcode it to av1 if it's a popular vod or something i guess
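If you want to see which codecs YouTube is actually serving for a given video, a quick sketch using yt-dlp's format listing will show it (the URL is a placeholder; av01 rows are AV1, vp09 are VP9, avc1 are H.264):

code:
# List the formats YouTube serves for a video; the codec column shows whether
# AV1 (av01), VP9 (vp09), or H.264 (avc1) transcodes exist for it.
import subprocess

subprocess.run(
    ["yt-dlp", "-F", "https://www.youtube.com/watch?v=XXXXXXXXXXX"],  # placeholder URL
    check=True,
)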

BlankSystemDaemon
Mar 13, 2009



Paul MaudDib posted:

SVT is software encoding; I think the idea here would be to turn the CPUs into iGPU video encode offload accelerators, literally using CPUs just for the GPU perf (and throwing a bunch of side cache at it for a dedicated scratch space). Then again, a stack of client/laptop quad-cores is not gonna be that fast - probably within reach of a config you could build with normal server CPUs (albeit probably a more expensive server, but I doubt that would have been a cheap product either).

I guess the answer may just be that this was developed before Xilinx was acquired and these are the pieces of the product they had in 2019 or 2020 or whatever.
Intel Quick Sync is comparable to x264's superfast preset in terms of picture quality, so I'm not sure if that's the way they're going, considering that's pretty piss-poor quality.

The Xilinx acquisition was by AMD, not Intel - so I'm not sure what you mean by the last sentence.
There's very sparse information on the kinds of FPGAs available for encoding video - according to the 2021 comparison there were only 5 available to test. Maybe by October or November, we'll have the FPGA report for 2022.

EDIT: The Xilinx FPGA product dates back to 2018 at least.

BlankSystemDaemon fucked around with this message at 13:04 on Apr 8, 2023

kliras
Mar 27, 2021
i just hope the encoding quality trickles down to consumer gpus, because amd's av1 encoding quality was pretty rear end when eposvox benchmarked the 7000 series

kliras fucked around with this message at 13:32 on Apr 8, 2023

Kibner
Oct 21, 2008

Acguy Supremacy

kliras posted:

i just hope the encoding quality trickles down to consumer gpus, because amd's av1 encoding quality was pretty rear end when eposvox benchmarked the 7000 series

I thought he said that was because of some bug that he's waiting for amd to fix before going whole hog on testing it?

redeyes
Sep 14, 2002

by Fluffdaddy
We could have a cpu buying boom if AMD or Intel put AV1 hardware encoders on the CPUs themselves.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.
From what I've seen AV1 software encoders are already really good, and Facebook reported that they're able to encode faster than x264 "faster" or x265 "ultrafast" speed settings while producing better looking output: https://engineering.fb.com/2023/02/21/video-engineering/av1-codec-facebook-instagram-reels/

Why does the AV1 encoder have to be hardware? Why not just buy a CPU with more cores than games can use anyway and do the encoding in software? Software encoders look significantly better than any of the hardware encoders if you can throw enough CPU at them. Software encoders also get better and better over time.
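For what it's worth, a minimal sketch of what that looks like in practice, assuming a reasonably recent ffmpeg build with the SVT-AV1 wrapper (libsvtav1); the preset and CRF values are illustrative, not recommendations:

code:
# Software AV1 encode on spare CPU cores via ffmpeg's SVT-AV1 wrapper.
# Lower presets trade more CPU time for better quality per bit.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "gameplay.mkv",   # hypothetical capture file
    "-c:v", "libsvtav1",              # SVT-AV1 software encoder
    "-preset", "8",                   # higher preset = faster, lighter on the CPU
    "-crf", "35",                     # quality target
    "-c:a", "copy",
    "out_av1.mkv",
], check=True)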

kliras
Mar 27, 2021

Twerk from Home posted:

Why does the AV1 encoder have to be hardware? Why not just buy a CPU with more cores than games can use anyway and do the encoding in software? Software encoders look significantly better than any of the hardware encoders if you can throw enough CPU at them. Software encoders also get better and better over time.
cpu encoding doesn't cap in a straightforward way like nvenc does. with nvenc-style encoding, first of all you don't have to basically figure out how ffmpeg works, but second, the likelihood of dropped frames and bottlenecked performance depending on the individual game is much lower. pretty much the same exact reason people get dedicated streaming and recording pcs

a dedicated encoding gpu like intel arc was compelling for the same reason: simple without being fiddly, but unfortunately the driver situation made it moot
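For contrast, the fixed-function route looks roughly like this - a sketch assuming a recent ffmpeg built with NVENC support, with made-up streaming-style settings; the encode runs on the GPU's dedicated block, so the game keeps its CPU cores:

code:
# Offload the encode to NVENC instead of the CPU. Assumes an ffmpeg build
# with NVENC enabled; bitrate and preset are illustrative only.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "gameplay.mkv",
    "-c:v", "h264_nvenc",             # fixed-function encoder on the GPU
    "-preset", "p5",                  # NVENC speed/quality preset
    "-rc", "cbr", "-b:v", "6M",       # constant bitrate, streaming-style
    "-c:a", "copy",
    "out_nvenc.mkv",
], check=True)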

Indiana_Krom
Jun 18, 2007
Net Slacker

Twerk from Home posted:

Why does the AV1 encoder have to be hardware? Why not just buy a CPU with more cores than games can use anyway and do the encoding in software? Software encoders look significantly better than any of the hardware encoders if you can throw enough CPU at them. Software encoders also get better and better over time.

Power. An ASIC will do it more reliably for 1000x less energy cost; quality is secondary.
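A rough sanity check on the scale of that gap, using the 35 W / 32-stream figure quoted earlier in the thread for the card, and an assumed (not measured) package power for one real-time software encode:

code:
# Back-of-the-envelope per-stream power. The 35 W / 32 streams figure is the
# one quoted for the card earlier in the thread; the CPU number is an assumption.
asic_watts, asic_streams = 35, 32
cpu_watts_per_stream = 65             # assumed package power for one software encode

asic_per_stream = asic_watts / asic_streams
print(f"ASIC: ~{asic_per_stream:.1f} W/stream, CPU: ~{cpu_watts_per_stream} W/stream")
print(f"ratio: ~{cpu_watts_per_stream / asic_per_stream:.0f}x")
# Exact ratios depend on codec, preset, and resolution; the point is that the
# fixed-function block wins by a large multiple, not a few percent.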

BlankSystemDaemon
Mar 13, 2009



redeyes posted:

We could have a cpu buying boom if AMD or Intel put AV1 hardware encoders on the CPUs themselves.
Well, general-purpose CPUs aren't really very good at handling the processing required, which is why we invented GPUs, and then general-purpose GPU compute to offload a lot of the work. Intel iGPUs have supported AV1 decode since the 12th gen (hardware encode only arrived with Arc), and AV1 encode is part of RDNA3/Navi3x (the iGPU in the Ryzen 7000-series is RDNA2/Navi2x, so by the time you upgrade to a Zen5 CPU using the same AM5 socket, you might get it for free if you pick the right CPU).
I don't know that there's a whole lot of room for an ASIC chip on the processor package, even if there was thermal room for it.

EDIT: Intel was also working on the single-slot, half-height, passively cooled Arctic Sound GPU for datacenters, which can do AV1 encoding, as an equivalent of the AMD Alveo MA35D. I don't know if it ever launched.

BlankSystemDaemon fucked around with this message at 16:55 on Apr 8, 2023
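For anyone curious what their own box exposes, a small sketch that asks ffmpeg which AV1 encoders it was built with - this only shows what the build includes (libsvtav1, av1_nvenc, av1_qsv, av1_amf, etc), not whether the installed GPU can actually run them:

code:
# List the AV1 encoders a local ffmpeg build knows about. Seeing a hardware
# wrapper here means ffmpeg was built with it, not that the GPU supports it.
import subprocess

out = subprocess.run(["ffmpeg", "-hide_banner", "-encoders"],
                     capture_output=True, text=True, check=True).stdout
for line in out.splitlines():
    if "av1" in line.lower():
        print(line.strip())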

SwissArmyDruid
Feb 14, 2014

by sebmojo
Leaked bench hints that the next GPD device may use an R7 7840U

New Zealand can eat me
Aug 29, 2008

:matters:


FWIW YouTube's AV1 and HEVC transcodes are poo poo unless you give them 4k+ to start. My 4k/2880p HEVC uploads look great, but anything I've tried in 1080 has come out looking less than stellar (even if it's 100MB/s). If you're just using Adrenalin's record & stream poo poo, using AVC (h264) will give you the best results still.

BlankSystemDaemon
Mar 13, 2009



All of youtube's compression options are poo poo.

New Zealand can eat me
Aug 29, 2008

:matters:


This looks perfectly fine in 4k, it could be better but for 'free' that's pretty awesome. https://www.youtube.com/watch?v=7dOu0hmvVug, and it's not like DiRT is an easy encode.

New Zealand can eat me fucked around with this message at 13:28 on Apr 9, 2023

BlankSystemDaemon
Mar 13, 2009



New Zealand can eat me posted:

This looks perfectly fine in 4k, it could be better but for 'free' that's pretty awesome. https://www.youtube.com/watch?v=7dOu0hmvVug, and it's not like DiRT is an easy encode.
It's not free, it's a loss-leader for Alphabet to make money using Google AdSense.

Kibner
Oct 21, 2008

Acguy Supremacy

BlankSystemDaemon posted:

It's not free, it's a loss-leader for Alphabet to make money using Google AdSense.

Pretty sure that's why "free" was in quotes.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.
What's the support situation like for 48GB and 96GB kits on Zen 4? They are looking really appealing: https://www.tomshardware.com/news/teamgroup-ddr5-8000-48gb-96gb

DDR5-8000, now we are talking.

BlankSystemDaemon
Mar 13, 2009



Twerk from Home posted:

What's the support situation like for 48GB and 96GB kits on Zen 4? They are looking really appealing: https://www.tomshardware.com/news/teamgroup-ddr5-8000-48gb-96gb

DDR5-8000, now we are talking.
I think AMD said that 6000MT is where it's at.

Kibner
Oct 21, 2008

Acguy Supremacy

BlankSystemDaemon posted:

I think AMD said that 6000MT is where it's at.

8000 would also work well, as it can be evenly divided by the max 2000 MHz fabric clock or whatever it is (I obviously forgot the details). There's just a big gap between 6000 and 8000 where the ratios would be uneven and performance likely lower.

e: v yeah, there is that lol :v:

Kibner fucked around with this message at 18:34 on Apr 9, 2023


SlapActionJackson
Jul 27, 2006

You aren't going to get the memory controller up to 8gig
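A hedged sketch of the ratio arithmetic behind both of those posts - the 1:1 UCLK ceiling and the ~2000 MHz FCLK below are assumed typical Zen 4 values, not spec-sheet numbers:

code:
# Rough model of the Zen 4 clock ratios behind the "6000 sweet spot" idea.
# The 1:1 UCLK limit and the fixed ~2000 MHz FCLK are assumptions.
def clocks(ddr_rate_mts, uclk_1to1_limit=3000, fclk=2000):
    mclk = ddr_rate_mts / 2                    # DDR: two transfers per memory clock
    uclk = mclk if mclk <= uclk_1to1_limit else mclk / 2   # drops to 1:2 past the limit
    return mclk, uclk, fclk

for rate in (6000, 6800, 8000):
    mclk, uclk, fclk = clocks(rate)
    print(f"DDR5-{rate}: MCLK {mclk:.0f}, UCLK {uclk:.0f}, FCLK {fclk}, "
          f"UCLK:FCLK = {uclk / fclk:.2f}")
# DDR5-6000 keeps UCLK 1:1 at 3000 (a clean 3:2 against FCLK); DDR5-8000 forces
# UCLK down to 2000, which lines up 1:1 with FCLK - the "evenly divided" case -
# while rates in between land on awkward ratios.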
