Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Oh, when you said “just more lower powerish cores” I thought you meant with the same ISA


Cygni
Nov 12, 2005

raring to post

Subjunctive posted:

I wonder if the NPUs were originally targeted at upscaling, if the design goes that far back. I guess Apple had been doing the Neural Engine thing for a while by then

My understanding is that they were initially for doing computational photography tricks, which is how phones get the “pictures” from the tiny sensor to look good to people. Behind the scenes, it’s stitching together multiple pictures and applying filters to them in real time to make a hybrid monster image that people like.

Supposedly it also gets used for FaceID and those Animoji/memoji you haven’t seen in years. Ultimately it has some real uses, and the chip makers have been hoping to get traction on the “AI” buzzword for them for like 8 years… they just finally succeeded in the wake of the successive Crypto/Metaverse/NFT scams.

Dr. Video Games 0031
Jul 17, 2004

Cygni posted:

My understanding is that they were initially for doing computational photography tricks, which is how phones get the “pictures” from the tiny sensor to look good to people. Behind the scenes, it’s stitching together multiple pictures and applying filters to them in real time to make a hybrid monster image that people like.

This sounds surprisingly like TAAU/DLSS. I never knew how phone cameras worked with their tiny sensors, but this makes a lot of sense. Are they taking multiple photos in quick succession while using natural hand shaking as the "jitter" and reconstructing a super high-res image by sampling all of the images?

Llamadeus
Dec 20, 2005

Dr. Video Games 0031 posted:

This sounds surprisingly like TAAU/DLSS. I never knew how phone cameras worked with their tiny sensors, but this makes a lot of sense. Are they taking multiple photos in quick succession while using natural hand shaking as the "jitter" and reconstructing a super high-res image by sampling all of the images?

Exactly that; here's a 2019 paper from Google on their method: https://arxiv.org/abs/1905.03277

Some stills cameras can also do it by using the sensor stabiliser to provide the jitter.
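
For a feel of the core trick in that paper, here's a toy numpy sketch. To be clear, this is not Google's actual method: it assumes the per-frame sub-pixel offsets are already known, while the real pipeline's hard parts are estimating alignment per-tile and rejecting scene motion. It just shows the shift-and-add idea of splatting jittered low-res samples onto a finer grid:

import numpy as np

def superres_stack(frames, offsets, scale=2):
    # frames: list of HxW float arrays from a burst
    # offsets: per-frame (dy, dx) sub-pixel shifts, in low-res pixels
    h, w = frames[0].shape
    acc = np.zeros((h * scale, w * scale))
    hits = np.zeros_like(acc)
    for frame, (dy, dx) in zip(frames, offsets):
        # each low-res sample lands at its jittered spot on the fine grid
        ys = np.clip(np.round((np.arange(h)[:, None] + dy) * scale), 0, h * scale - 1).astype(int)
        xs = np.clip(np.round((np.arange(w)[None, :] + dx) * scale), 0, w * scale - 1).astype(int)
        np.add.at(acc, (ys, xs), frame)   # unbuffered add, safe for repeated targets
        np.add.at(hits, (ys, xs), 1.0)
    hits[hits == 0] = 1.0                 # never-hit pixels just stay zero
    return acc / hits

With enough frames, the random hand-shake offsets fill in the fine grid.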

Cygni
Nov 12, 2005

raring to post

MKBHD has done a few videos on the question "what is a photo?", discussing the processing pipeline, the computation involved, etc., which I find pretty interesting even though I'm not personally a super camera guy:

https://www.youtube.com/watch?v=88kd9tVwkH8

the funniest one is Huawei/Samsung getting busted faking the moon in their pictures. Huawei was literally superimposing a canned image of the moon onto your photos when the software recognized it, and Samsung was adding fake details from a "reference image":

https://www.youtube.com/watch?v=1afpDuTb-P0&t=78s

Dr. Video Games 0031
Jul 17, 2004

Does this mean that technically a phone mounted on a stable tripod will take worse photos than a handheld phone?

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Cygni posted:

My understanding is that they were initially for doing computational photography tricks

Sorry, I meant the AMD NPUs!

hobbesmaster
Jan 28, 2008

Subjunctive posted:

Sorry, I meant the AMD NPUs!

I actually just asked some AMD guys this earlier today: their origins are all in the “traditional” VLIW DSP “cores”, and they more or less function like that from a programmer’s perspective.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

hobbesmaster posted:

I actually just asked some AMD guys this earlier today: their origins are all in the “traditional” VLIW DSP “cores”, and they more or less function like that from a programmer’s perspective.

ah, cool—thanks!

ryanrs
Jul 12, 2011

Llamadeus posted:

Some stills cameras can also do it by using the sensor stabiliser to provide the jitter.

lmao

BobHoward
Feb 13, 2012


Cygni posted:

My understanding is that they were initially for doing computational photography tricks, which is how phones get the “pictures” from the tiny sensor to look good to people. Behind the scenes, it’s stitching together multiple pictures and applying filters to them in real time to make a hybrid monster image that people like.

Supposedly it also gets used for FaceID and those Animoji/memoji you haven’t seen in years.

Nah, the computational photog stuff in Apple's SoCs is handled by a dedicated block, the Image Signal Processor (ISP). It's existed in their chips a lot longer than the Apple Neural Engine (ANE) has. Apple's far from the only company with an ISP; all cellphone SoCs have had one for ages.

The ANE and similar "AI" engines are coprocessors heavily specialized to accelerate matrix math, as that's the root of so-called "AI". I don't think I've ever heard much about what's in a typical ISP but my guess would be DSP cores for some programmability and possibly some image filter engines that are slightly less programmable.

Apple has recently started borrowing the ANE for some camera functions - an example being that on Apple Silicon MacBooks, the ANE is used to do advanced "AI" image enhancement on the webcam's output. However, as far as I know, the ANE postprocesses the ISP's output rather than taking over the whole pipeline.

The first ANE was in 2017's A11 Bionic, used in the iPhone X, which was the first iPhone with FaceID - so yeah, at the time, FaceID was the ANE's headline feature.
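
(To make the "heavily specialized to accelerate matrix math" point concrete: a neural-net layer is basically one big matmul plus a cheap elementwise op, so an engine that only does matmuls fast covers almost all of the work. A toy sketch with made-up shapes, nothing Apple-specific:)

import numpy as np

def mlp_forward(x, layers):
    # virtually all the FLOPs here are the x @ W matmuls -- exactly the
    # operation an NPU / matrix engine is built to accelerate
    for W, b in layers:
        x = np.maximum(x @ W + b, 0.0)   # matmul, bias, ReLU
    return x

rng = np.random.default_rng(0)
layers = [(0.05 * rng.standard_normal((256, 256)), np.zeros(256)) for _ in range(3)]
out = mlp_forward(rng.standard_normal((8, 256)), layers)
print(out.shape)   # (8, 256)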

Nolgthorn
Jan 30, 2001

The pendulum of the mind alternates between sense and nonsense

Cygni posted:

the funniest one is Huawei/Samsung getting busted faking the moon in their pictures.

It's one of those things that, in retrospect, was obviously going to happen.

Cygni
Nov 12, 2005

raring to post

BobHoward posted:

Nah, the computational photog stuff in Apple's SoCs is handled by a dedicated block, the Image Signal Processor (ISP). It's existed in their chips a lot longer than the Apple Neural Engine (ANE) has. Apple's far from the only company with an ISP; all cellphone SoCs have had one for ages.

The ANE and similar "AI" engines are coprocessors heavily specialized to accelerate matrix math, as that's the root of so-called "AI". I don't think I've ever heard much about what's in a typical ISP but my guess would be DSP cores for some programmability and possibly some image filter engines that are slightly less programmable.

My understanding from articles back at the time was that the NPU was an offshoot of the ISPs to allow them to do different computational photography tricks than the ISPs were doing, stuff like subject recognition etc. I might fully be wrong though, not really a camera guy :shrug:

BobHoward
Feb 13, 2012


Cygni posted:

My understanding from articles back at the time was that the NPU was an offshoot of the ISPs to allow them to do different computational photography tricks than the ISPs were doing, stuff like subject recognition etc. I might fully be wrong though, not really a camera guy :shrug:

I guess those articles weren't wrong, but also not quite right? The wrong part is the origin story: the ANE was a new block designed from the start to accelerate inference, not an offshoot of the ISP. The right part is that one of the things you can do with spicy matrix math is, as you mentioned, the new kinds of image processing made possible by building and training a model. So it sometimes does have a role to play in work that was formerly ISP-only. But the ISP is still there to this day; it wasn't made redundant by the NPU.

movax
Aug 30, 2008

ISPs, I would think, would be 'new' to the desktop space -- on mobiles / cameras they have been around for decades(?) to do HW de-Bayering and to interface with the raw sensor. If anything, I'd guess there's been some internal alignment where ISP/DSP blocks have converged, so the ISP block on a modern SoC is a derivative of whatever generic DSP cores exist. Or not, since validation is $$$ and you don't gently caress with what works, I guess.

Cygni posted:

My understanding from articles back at the time was that the NPU was an offshoot of the ISPs to allow them to do different computational photography tricks than the ISPs were doing, stuff like subject recognition etc. I might fully be wrong though, not really a camera guy :shrug:

This would make more sense; IMO the ISP sits in the pipeline as a block that ingests raw sensor data, processes it (many steps to get from raw sensor data to an x-by-y-pixel frame), and outputs frames downstream to whatever is next, NPU or otherwise.
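
(For concreteness, a minimal sketch of the de-Bayering step mentioned above: bilinear demosaic of an RGGB mosaic. A real ISP layers denoising, white balance, lens correction, tone mapping, and much more on top; this is only the first of those "many steps":)

import numpy as np
from scipy.ndimage import convolve

def debayer_rggb(raw):
    # raw: 2D float array straight off an RGGB Bayer sensor (even H and W).
    # Scatter the known samples into per-channel planes, then fill the holes
    # by averaging neighbors with the classic bilinear kernels.
    h, w = raw.shape
    r, g, b = (np.zeros((h, w)) for _ in range(3))
    r[0::2, 0::2] = raw[0::2, 0::2]   # R on even rows, even cols
    g[0::2, 1::2] = raw[0::2, 1::2]   # G on even rows, odd cols
    g[1::2, 0::2] = raw[1::2, 0::2]   # G on odd rows, even cols
    b[1::2, 1::2] = raw[1::2, 1::2]   # B on odd rows, odd cols
    k_g = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0
    return np.dstack([convolve(r, k_rb), convolve(g, k_g), convolve(b, k_rb)])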

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
https://twitter.com/VideoCardz/status/1790086418984816810?t=jmjSYjBKaFQxhArYEkVIkA&s=19

Dr. Video Games 0031
Jul 17, 2004

Worth stressing that this will have less cache than the 7000-series desktop parts on top of the lower clock speeds, so the 8700F will be considerably slower than the 7700 (X or not). Great naming, AMD, very clear.

hobbesmaster
Jan 28, 2008

Dr. Video Games 0031 posted:

Great naming, AMD, very clear.

Xilinx guys: thanks!

Anime Schoolgirl
Nov 28, 2002

the benefit of APUs, or chips made from them, is that they do support faster RAM than the chiplet CPUs and have an asynchronous IF clock, but this has little practical use outside of very specific applications

Cygni
Nov 12, 2005

raring to post

e: now that AMD has posted the product pages, i was wrong! i thought the 8400F was an 8500G die without graphics, but it's actually an 8600G without graphics.

so:

8700G with no graphics = 8700F
8600G with no graphics = 8400F(??)

i assumed the names would mean something, and that was silly of me. i know better.


Harik
Sep 9, 2001

From the hard streets of Moscow
First dog to touch the stars


Plaster Town Cop
stupid question but cross-check me on this: Is there any difference between a 7302 and a 7302P aside from multi-processor support? I thought the P just had that fused off for segmentation reasons, but I don't want to discover that half the PCIe lanes are unusable on a single-chip motherboard because I have the "wrong" one.

I've got a week to return it for the "right" processor while waiting on the rest of the parts to arrive.

Anime Schoolgirl
Nov 28, 2002

this article is for epyc 1 but on the non-P cpu and board STH tested, the 64 lanes that would be used for IF on a dual socket board are routed properly to devices on the board. don't think they'd change this behavior going forward, especially as some boards were upgradeable to epyc 2. you could of course change the CPU if you wanted to be sure.

https://www.servethehome.com/single-socket-amd-epyc-7000-faq-answers-common-questions/

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.
I’m annoyed you can’t get the pro models in retail boxes. I want one for a NAS build! Integrated graphics and ECC? Nice.

Anime Schoolgirl
Nov 28, 2002

priznat posted:

I’m annoyed you can’t get the pro models in retail boxes. I want one for a NAS build! Integrated graphics and ECC? Nice.
on alibaba & associates there's a lot of loose used ryzen pro 4650gs floating around from decommissioned dells and HPs; depending on the upgrade cadence of said offices, we might be seeing used pro 5650gs or pro 8600gs within the next 5 years

Harik
Sep 9, 2001

From the hard streets of Moscow
First dog to touch the stars


Plaster Town Cop

Anime Schoolgirl posted:

this article is for epyc 1 but on the non-P cpu and board STH tested, the 64 lanes that would be used for IF on a dual socket board are routed properly to devices on the board. don't think they'd change this behavior going forward, especially as some boards were upgradeable to epyc 2. you could of course change the CPU if you wanted to be sure.

https://www.servethehome.com/single-socket-amd-epyc-7000-faq-answers-common-questions/

thanks, that checks with what I was seeing as well. It makes sense since there aren't -P variants of most of the lineup, and I don't see them selling their top-of-the-line chip only in pairs.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
Is there a thread discussing MS’s new ARM push? (Asking here cause it’s CPU-centric)

https://www.theverge.com/2024/5/20/24160486/microsoft-copilot-plus-ai-arm-chips-pc-surface-event

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
So anyhow, here's the dumbest question of the week: if Apple can make such a super-efficient CPU, why can't AMD or Intel? Surely it can't be just the packaged RAM on Mx or the bigger µOp decoder on x86.

Arivia
Mar 17, 2011

Combat Pretzel posted:

So anyhow, here's the dumbest question of the week: if Apple can make such a super-efficient CPU, why can't AMD or Intel? Surely it can't be just the packaged RAM on Mx or the bigger µOp decoder on x86.

helps when you don't need to give remotely a poo poo about backwards compatibility and can make a CPU do EXACTLY what you want it to do and dictate that all the software match you, instead of being stuck with 40 years of standards

repiv
Aug 13, 2009

also helps if it doesn't matter how much the chip costs to make because it will only ever be packaged with an entire system with enormous margins

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Arivia posted:

helps when you don't need to give remotely a poo poo about backwards compatibility and can make a CPU do EXACTLY what you want it to do and dictate that all the software match you, instead of being stuck with 40 years of standards

Apple’s ARM stuff has a bunch of things specifically for backward compatibility with the x86 memory model, as it happens. This is one of the reasons that Rosetta 2 works so well.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
So that x86S stuff Intel's planning could eventually result in some performance advantages?

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.
You can't overlook economies of scale for Apple, either. The iPhone sells more units annually than the entire global laptop market, which helps because of the substantial design sharing across all of Apple's processors.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Combat Pretzel posted:

So that x86S stuff Intel's planning could eventually result in some performance advantages?

I doubt it; they’re pretty trivial simplifications. Might let them take a few gates off a cut-down chip, but nothing in there is going to make x86 as fast to decode as ARM, AFAICT.
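
(The decode gap in toy form, using a made-up "ISA" where each instruction's first byte gives its length: fixed-width instruction starts are known up front, so a wide decoder can attack them all in parallel, while variable-length starts form a serial chain -- each one is only known after the previous instruction is at least partially decoded:)

def starts_fixed(code, width=4):
    # every instruction boundary known immediately -> trivially parallel
    return list(range(0, len(code), width))

def starts_variable(code, length_of):
    # each boundary depends on the previous instruction -> serial scan
    starts, pc = [], 0
    while pc < len(code):
        starts.append(pc)
        pc += length_of(code[pc])   # must inspect bytes to find the next start
    return starts

blob = bytes([3, 0, 0, 1, 2, 0, 4, 0, 0, 0])       # length byte first, always >= 1
print(starts_fixed(blob, 2))                        # [0, 2, 4, 6, 8]
print(starts_variable(blob, lambda first: first))   # [0, 3, 4, 6]

(Real x86 decoders do this boundary-finding speculatively in hardware, but it's a big part of why wide x86 decode costs more than wide fixed-width ARM decode.)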

repiv
Aug 13, 2009

like, not to undersell Apple's silicon engineering, because it's good, but they have something of a head start when part of their strategy is just to give the world's leading foundry a quadrillion dollars to completely buy out their latest, most efficient nodes for their first year

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Intel could have its foundry do whatever it wanted…

VorpalFish
Mar 22, 2007
reasonably awesometm

Subjunctive posted:

Intel could have its foundry do whatever it wanted…

Except be good at things.

karoshi
Nov 4, 2008

"Can somebody mspaint eyes on the steaming packages? TIA" yeah well fuck you too buddy, this is the best you're gonna get. Is this even "work-safe"? Let's find out!
Remember when Intel had a 2-year manufacturing lead on the whole world? :allears: And then they decided that was just a cost center.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
It's definitely a combination of a bunch of things. Apple hired very good people to design efficient processors for mobile devices on the most advanced nodes possible. They aren't trying to compete with Intel and AMD, they're trying to dominate the mobile device world, and frankly in terms of CPUs they are.

AMD and Intel have a different target. Laptops are in it, but so are desktops and servers. Performance, broad spectrums of hardware acceleration, and hardware backcompat matter more to them. Apple isn't competitive with them in most of their applications, and they aren't trying to be. Their entire hardware and software stack is basically bespoke, they can optimize for their goals and say "lol it's not supposed to do that" to anything else.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



repiv posted:

like, not to undersell Apple's silicon engineering, because it's good, but they have something of a head start when part of their strategy is just to give the world's leading foundry a quadrillion dollars to completely buy out their latest, most efficient nodes for their first year

Pays to be the world’s richest tech company


Klyith
Aug 3, 2007

GBS Pledge Week
The M chips can't add an extra 0.25GHz if you shove 200 extra watts into them; sounds like a CPU for wusses. Over here in x86 land we have Chad CPUs!

haha electrons go brrrrrrr
