LRADIKAL
Jun 10, 2001

Fun Shoe
It's impossible to know if Paul is being willfully obtuse or just lacking a certain type of perception.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Dr. Video Games 0031 posted:

How does the switch to DDR5 actually affect performance? Does it do anything to be faster than DDR4 at the same speeds and timings, or are they directly comparable to one another? What I mean is, is DDR5-4800 at those timings as terrible as it looks on paper, or is there a secret sauce that makes it good?

really good DDR4 is about 10ns of latency and so far DDR5 seems to be about 15ns, although we'll probably only really see what it can do once the alder lake reviews hit.

on the other hand ddr5 moves from one 64-bit channel per stick to two 32-bit channels per stick - 2 sticks is notionally quad-channel RAM, just with narrower channels, which means more, narrower requests can be in-flight at any given time. The burst length (the number of transfers you get per read command) is also doubled, so each access still delivers the same amount of data in total, meaning you incur the CAS penalty less often. So basically it's built around the idea of a lot more requests being in-flight at a given time: narrower but longer chunks, and you read more chunks in parallel.

https://www.rambus.com/blogs/get-ready-for-ddr5-dimm-chipsets/
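
back-of-the-envelope version of those numbers if you want to play with it - the DDR4-3200 CL14 and DDR5-4800 CL40 combos below are illustrative assumptions, not measurements of anything:

code:

# Rough first-word latency and per-burst payload for DDR4 vs DDR5.
# Speed grades and CAS latencies here are assumed/illustrative values.

def cas_latency_ns(transfer_rate_mt_s: float, cas_cycles: int) -> float:
    """CAS latency in ns = CL cycles / memory clock (half the transfer rate)."""
    memory_clock_mhz = transfer_rate_mt_s / 2  # DDR: two transfers per clock
    return cas_cycles / memory_clock_mhz * 1000

print(f"DDR4-3200 CL14: {cas_latency_ns(3200, 14):.1f} ns")  # ~8.8 ns
print(f"DDR5-4800 CL40: {cas_latency_ns(4800, 40):.1f} ns")  # ~16.7 ns

# Payload per access: channel width (bits) * burst length / 8 = bytes
ddr4_bytes = 64 * 8 // 8    # one 64-bit channel, BL8   -> 64 bytes
ddr5_bytes = 32 * 16 // 8   # each 32-bit channel, BL16 -> 64 bytes
print(f"DDR4 per access: {ddr4_bytes} B, DDR5 per channel access: {ddr5_bytes} B")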

I'm really terribly curious about the alder lake reviews because that's going to be our first real look at how all of this actually shakes out. Obviously the next gen will be better both in memory controllers and the available RAM, but it's not just "ddr4 but faster" like ddr4 was relative to ddr3.

DrDork posted:

I don't really think you can worry about that without sliding into "Intel subscription-loving consumers!" slope arguments: the only market segments that they're likely to target with this stuff are megacorps where internet access will always be a given and the only resale will be far enough down the line that they'd only be interesting to the types of people who today are thinking that buying a Dell 710 rack is a good idea.

Even if it did work its way down that far, I'd assume the answer would be something like it has to phone home every 7 or 30 days or whatever to keep the DLC going, so you don't have problems just because your network dropped for 5 minutes or whatever. Or it just reverts to the non-DLC mode and you get to chug along without those extra cores until you get your internet back.

For resale, though, yeah that'd be an interesting question: does the DLC permanently unlock the features via some writable bit within the CPU? Or are you now trying to sell a CPU + intel.com account to go along with it? Worries for another day, at least.

is this the part where I bring up how AMD is exploring the exciting innovation of using kill-bits to lock processors to a specific motherboard to prevent resale?

because yeah that's literally the route AMD is going down: at first boot the processor ties itself to the motherboard, ostensibly to prevent someone sneaky from yoinking the processor or some other BS thing that doesn't actually happen. Most likely actual reason imo is because AMD will offer the chip to vendors at a discount if they make sure the chip won't enter the resale market - in hindsight I'm seriously wondering if those HP Epyc upgrade kits that were under actual AMD tray pricing were already pre-locked to HP hardware.

To be fair it just means you can still sell it, you just have to sell it with that board, but it's still a bullshit path that AMD is going down
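
purely as a conceptual sketch of what "ties itself to the motherboard at first boot" could mean - made-up names and flow, not AMD's actual Platform Secure Boot implementation:

code:

# Conceptual sketch of a first-boot vendor lock via one-time-programmable fuses.
# Everything here (names, hash choice, flow) is a hypothetical illustration only,
# not AMD's real implementation.
import hashlib

class Cpu:
    def __init__(self):
        self.oem_key_fuse = None  # one-time-programmable: written exactly once

    def first_boot_bind(self, board_oem_public_key: bytes) -> None:
        if self.oem_key_fuse is None:
            # Burn a hash of the board vendor's signing key into the CPU's fuses.
            self.oem_key_fuse = hashlib.sha256(board_oem_public_key).digest()

    def boot_allowed(self, board_oem_public_key: bytes) -> bool:
        if self.oem_key_fuse is None:
            return True  # never bound: boots on any board
        return self.oem_key_fuse == hashlib.sha256(board_oem_public_key).digest()

cpu = Cpu()
cpu.first_boot_bind(b"vendor-A-signing-key")
print(cpu.boot_allowed(b"vendor-A-signing-key"))  # True: same vendor's boards still work
print(cpu.boot_allowed(b"vendor-B-signing-key"))  # False: refuses another vendor's board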

Paul MaudDib fucked around with this message at 04:07 on Oct 13, 2021

hobbesmaster
Jan 28, 2008

Paul MaudDib posted:

really good DDR4 is about 10ns of latency and so far DDR5 seems to be about 15ns, although we'll probably only really see what it can do once the alder lake reviews hit.

on the other hand ddr5 moves from one 64-bit channel per stick to two 32-bit channels per stick - 2 sticks is notionally quad-channel RAM, just with narrower channels, which means more, narrower requests can be in-flight at any given time. The burst length (the number of transfers you get per read command) is also doubled, so each access still delivers the same amount of data in total, meaning you incur the CAS penalty less often. So basically it's built around the idea of a lot more requests being in-flight at a given time: smaller chunks, but you read more chunks in parallel, and they can be different chunks.

https://www.rambus.com/blogs/get-ready-for-ddr5-dimm-chipsets/

I'm really terribly curious about the alder lake reviews because that's going to be our first real look at how all of this actually shakes out. Obviously the next gen will be better both in memory controllers and the available RAM, but it's not just "ddr4 but faster" like ddr4 was relative to ddr3.

is this the part where I bring up how AMD is exploring the exciting innovation of using kill-bits to lock processors to a specific motherboard to prevent resale?

because yeah that's literally the route AMD is going down: at first boot the processor ties itself to the motherboard, ostensibly to prevent someone sneaky from yoinking the processor or some other BS thing that doesn't actually happen. Most likely actual reason imo is because AMD will offer the chip to vendors at a discount if they make sure the chip won't enter the resale market.

To be fair it just means you can still sell it, you just have to sell it with that board, but it's still a bullshit path that AMD is going down

OTOH this is the only way they can do a secure boot in the sense that say an iPhone has.

If you have national security customers that believe that TSMC is about to be invaded or whatever then maybe that’s a good product to have to sell.

hobbesmaster fucked around with this message at 02:32 on Oct 13, 2021

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

hobbesmaster posted:

OTOH this is the only way they can do a secure boot in the sense that say an iPhone has.

If you have national security customers that believe that TSMC is about to be invaded or whatever then maybe that’s a good product to have to sell.

Wouldn't the security usage be the other way around--a kill-bit on the motherboard? Like, you yank a CPU out and...you get nothing from it by plopping it into a different (hostile) motherboard unless cache contents stay around way longer than I think they do. And even then, you'd have to be able to go from a running system storing something interesting in the CPU cache to your hostile system pretty quick--selling it on eBay or whatever is not gonna let you recover anything from that CPU. And you'd have to manage to keep the CPU from clearing the cache on boot. Seems...excessively difficult. Having the CPU say "this ain't my motherboard" doesn't seem to provide any meaningful security advantage.

A motherboard I could see being a bit more interesting if you want to go super far down the state-sponsored attack vector pathway: something like a hostile actor yanking a CPU out of a system and replacing it with a hostile CPU. Having the motherboard say "no, this ain't the CPU it should be" could be useful in that context.

Locking a CPU to a motherboard seems to only prevent it from entering the resale market. Locking a motherboard to a CPU would seem to be more in line with Secure Boot and other paranoid security concerns, unless I'm missing something here.

e; although a place I worked at did have a disgruntled employee walk into an office on a Friday afternoon during COVID when no one else was there and yank out the RAM, CPUs, and SSDs from 20+ workstations and then start posting them on eBay. Not sure how that one ended, but obviously the police got involved pretty quick.

DrDork fucked around with this message at 05:34 on Oct 13, 2021

WhyteRyce
Dec 30, 2001

Let’s talk about freeze spraying dram

Perplx
Jun 26, 2004


Best viewed on Orgasma Plasma
Lipstick Apathy
Isn’t the tpm on modern cpus now? So it would be the root of trust. Also it should be harder to extract keys from a 5nm cpu than a 28nm or whatever tpm chip.

BlankSystemDaemon
Mar 13, 2009



Wait, what if they put racing stripes on the memory? Surely that'll make it go faster!

Perplx posted:

Isn’t the tpm on modern cpus now? So it would be the root of trust. Also it should be harder to extract keys from a 5nm cpu than a 28nm or whatever tpm chip.
AMD tends to come with fTPM - i.e. a TPM emulated in the firmware.
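
if you want to see whether a TPM (firmware or discrete) is actually exposed to your OS, on Linux it shows up under /sys/class/tpm - quick sketch, Linux assumed:

code:

# Check whether the OS sees any TPM device (firmware or discrete). Linux-only:
# if fTPM is disabled in the BIOS, /sys/class/tpm will simply be empty.
from pathlib import Path

def list_tpms() -> list:
    tpm_class = Path("/sys/class/tpm")
    if not tpm_class.is_dir():
        return []
    return sorted(p.name for p in tpm_class.iterdir())

devices = list_tpms()
if devices:
    print("TPM device(s) visible to the OS:", ", ".join(devices))
else:
    print("no TPM exposed - disabled in firmware, or not present")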

kliras
Mar 27, 2021

BlankSystemDaemon posted:

Wait, what if they put racing stripes on the memory? Surely that'll make it go faster!

AMD tends to come with fTPM - i.e. a TPM emulated in the firmware.
It tends to be off by default. Wonder when that's going to be flipped so the average consumer will use it.

repiv
Aug 13, 2009

kliras posted:

It tends to be off by default. Wonder when that's going to be flipped so the average consumer will use it.

A good chunk of motherboards have BIOS updates that enable the relevant bits for Windows 11 by default now, so presumably new boards will have that out of the box already

repiv
Aug 13, 2009

https://www.intel.com/content/www/us/en/developer/articles/guide/alder-lake-developer-guide.html

Apparently it is possible to enable AVX512 on Alder Lake after all, provided the smol cores are disabled

e: or maybe not, this developer guide is dated from April so maybe plans changed since then
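
one way to sanity-check what actually ships, at least on Linux, is just to look for the avx512f flag in /proc/cpuinfo after toggling the E-cores off in the BIOS - a quick sketch, Linux assumed:

code:

# Check whether the kernel reports AVX-512 foundation support on this CPU.
# Linux-only: parses the flags line of /proc/cpuinfo.

def cpu_flags() -> set:
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

flags = cpu_flags()
print("avx2    :", "avx2" in flags)
print("avx512f :", "avx512f" in flags)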

repiv fucked around with this message at 18:04 on Oct 15, 2021

hobbesmaster
Jan 28, 2008

repiv posted:

https://www.intel.com/content/www/us/en/developer/articles/guide/alder-lake-developer-guide.html

Apparently it is possible to enable AVX512 on Alder Lake after all, provided the smol cores are disabled

e: or maybe not, this developer guide is dated from April so maybe plans changed since then

Interesting, but is it going to burn the chip to a crisp like 256-bit AVX already does?

mmkay
Oct 21, 2010

Jesus, I wouldn't want to be part of the team that needs to write a scheduler. I thought it was finicky enough before ('on Bulldozer we have shared resources here, on Intel cpus there, on zen the caches behave badly when you do that'), but this is another level. At least there isn't that fuckup where Samsung (I think?) allowed for different instruction sets between cores due to twiddling in the kernel.
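
(if/when the scheduler does guess wrong, you can at least pin things by hand - a Linux-only sketch using os.sched_setaffinity, where "logical CPUs 0-7 are the fast cores" is purely a made-up assumption; real P/E-core numbering depends on the platform:)

code:

# Manual workaround for scheduler weirdness on hybrid CPUs: pin this process
# to a chosen set of logical CPUs. Linux-only; the P-core IDs are assumed.
import os

ASSUMED_P_CORES = {0, 1, 2, 3, 4, 5, 6, 7}  # hypothetical numbering

def pin_to(cpus: set) -> None:
    available = os.sched_getaffinity(0)   # what this process may currently run on
    target = cpus & available
    if target:
        os.sched_setaffinity(0, target)   # 0 = the calling process

pin_to(ASSUMED_P_CORES)
print("now restricted to logical CPUs:", sorted(os.sched_getaffinity(0)))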

Arzachel
May 12, 2012

mmkay posted:

Jesus, I wouldn't want to be part of the team that needs to write a scheduler. I thought it was finicky enough before ('on Bulldozer we have shared resources here, on Intel cpus there, on zen the caches behave badly when you do that'), but this is another level. At least there isn't that fuckup where Samsung (I think?) allowed for different instruction sets between cores due to twiddling in the kernel.

Best part was that Samsung themselves patched the kernel to disable the instruction check.

mmkay
Oct 21, 2010

repiv posted:

https://www.intel.com/content/www/us/en/developer/articles/guide/alder-lake-developer-guide.html

e: or maybe not, this developer guide is dated from April so maybe plans changed since then

US dating system, so the 4th of October (you can compare the dates here; the linked github changes are from 10 days ago as well).

repiv
Aug 13, 2009

bamboozled by nonsensical american dates yet again :argh:

repiv
Aug 13, 2009

:rip:

https://twitter.com/IanCutress/status/1449080507229216769

It must have been planned at some point to have made it into that guide

Maybe in 13th gen

AARP LARPer
Feb 19, 2005

THE DARK SIDE OF SCIENCE BREEDS A WEAPON OF WAR

Buglord
please be aware that I’m p dumb but what is the point of offering avx on chips that can’t handle the thermals of running said instructions?

Kazinsal
Dec 13, 2011



Freedom Trails posted:

please be aware that I’m p dumb but what is the point of offering avx on chips that can’t handle the thermals of running said instructions?

Marketing checkbox

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Freedom Trails posted:

please be aware that I’m p dumb but what is the point of offering avx on chips that can’t handle the thermals of running said instructions?

higher Cinebench scores

isndl
May 2, 2012
I WON A CONTEST IN TG AND ALL I GOT WAS THIS CUSTOM TITLE

Freedom Trails posted:

please be aware that I’m p dumb but what is the point of offering avx on chips that can’t handle the thermals of running said instructions?

As far as Intel is concerned, the thermals are the OEM's problem. Not Intel's fault if your laptop doesn't have the cooling to run AVX without melting.

repiv
Aug 13, 2009

Freedom Trails posted:

please be aware that I’m p dumb but what is the point of offering avx on chips that can’t handle the thermals of running said instructions?

downclocking when using the instructions that double math throughput per clock is still a win, as long as you downclock by less than 50%
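
the arithmetic, with made-up but plausible clocks:

code:

# Why a sub-50% downclock still wins when the vector width doubles.
# Clock speeds are illustrative assumptions, not Alder Lake specs.
avx2_clock_ghz   = 4.8   # assumed clock while running 256-bit AVX2
avx512_clock_ghz = 4.0   # assumed (lower) clock while running 512-bit AVX-512

avx2_rate   = 256 * avx2_clock_ghz    # relative throughput: width x clock
avx512_rate = 512 * avx512_clock_ghz

print(f"AVX2    : {avx2_rate:.0f}")                 # 1229
print(f"AVX-512 : {avx512_rate:.0f}")               # 2048
print(f"speedup : {avx512_rate / avx2_rate:.2f}x")  # ~1.67x despite the downclock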

TOOT BOOT
May 25, 2010

https://www.tomshardware.com/news/intel-alder-lake-cpus-may-not-work-with-older-games

Apparently there's going to be compatibility breakage with older DRM.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Dr. Video Games 0031 posted:

How does the switch to DDR5 actually affect performance? Does it do anything to be faster than DDR4 at the same speeds and timings, or are they directly comparable to one another? What I mean is, is DDR5-4800 at those timings as terrible as it looks on paper, or is there a secret sauce that makes it good?

charlie demerjian actually put out a good article that explains a lot of this in design terms - although of course the stuff he's talking about is server-focused (eg "DDR4 is 3200" is obviously not true for gaming RAM but that's where server stuff is)

quote:

In the end DDR5 will have a lot of tricks to play vs DDR4. It starts out 50% faster, 4800MT/s vs 3200MT/s, than DDR4 and runs at a lower voltage for more efficiency. Toss in the dual channel party tricks and you add a lot more flexibility to the mix, plus reliability is said to be at least as good as the older parts. With the RCDs and DBs in production now for the first gen DDR5 and well underway for the second, it looks like the module side has things covered. Now we wait for the CPUs
https://www.semiaccurate.com/2021/10/13/rambus-releases-2nd-gen-ddr5-rcds/

WhyteRyce
Dec 30, 2001

Less than a day after Pat says he wants to win Apple back by out competing them, Apple figuratively craps all over them in the Mac event

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

WhyteRyce posted:

Less than a day after Pat says he wants to win Apple back by out competing them, Apple figuratively craps all over them in the Mac event

OK Apple, now make your next magic trick the running of Linux and Windows on M1 Pro & Max

admittedly, one of those is probably going to be easier than the other

Sidesaddle Cavalry fucked around with this message at 22:45 on Oct 18, 2021

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Sidesaddle Cavalry posted:

OK Apple, now make your next magic trick the running of Linux and Windows on M1 Pro & Max

admittedly, one of those is probably going to be easier than the other

That'll happen much sooner than later, regardless.

The real magic trick would be getting over their butt-hurt from like a decade ago vis a vis NVidia and a batch of bad laptop GPU solder joints and finally putting some graphical hardware into their machines that isn't terrible.

trilobite terror
Oct 20, 2007
BUT MY LIVELIHOOD DEPENDS ON THE FORUMS!

DrDork posted:

finally putting some graphical hardware into their machines that isn't terrible.

they may have already done that today

Sidesaddle Cavalry posted:

OK Apple, now make your next magic trick the running of Linux and Windows on M1 Pro & Max

you can already do Linux VMs on M1

Gwaihir
Dec 8, 2009
Hair Elf

DrDork posted:

That'll happen much sooner than later, regardless.

The real magic trick would be getting over their butt-hurt from like a decade ago vis a vis NVidia and a batch of bad laptop GPU solder joints and finally putting some graphical hardware into their machines that isn't terrible.

The top end chip released today is literally a tiny CPU wrapped around a gigantic GPU. It's got performance stats on par with a full size desktop level RTX 2080. But a side effect is that the CPU also gets memory bandwidth 4-10x higher than any existing desktop or server CPU.
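
rough sense of scale, taking Apple's advertised 512-bit LPDDR5-6400 setup for the M1 Max and assuming plain dual-channel DDR4-3200 on the desktop side:

code:

# Peak memory bandwidth, back of the envelope. M1 Max bus width/speed are
# Apple's advertised figures; the desktop comparison assumes dual-channel DDR4-3200.

def bandwidth_gb_s(bus_width_bits: int, transfer_rate_mt_s: int) -> float:
    return bus_width_bits / 8 * transfer_rate_mt_s / 1000

m1_max  = bandwidth_gb_s(512, 6400)   # ~409.6 GB/s
desktop = bandwidth_gb_s(128, 3200)   # ~51.2 GB/s (2 x 64-bit channels)

print(f"M1 Max : {m1_max:.1f} GB/s")
print(f"desktop: {desktop:.1f} GB/s ({m1_max / desktop:.0f}x difference)")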

trilobite terror
Oct 20, 2007
BUT MY LIVELIHOOD DEPENDS ON THE FORUMS!

Gwaihir posted:

The top end chip released today is literally a tiny CPU wrapped around a gigantic GPU. It's got performance stats on par with a full size desktop level RTX 2080. But a side effect is that the CPU also gets memory bandwidth 4-10x higher than any existing desktop or server CPU.

veering too much into GPUchat but isn’t the 3060Ti basically on par with the 2080?

I bought a big-rear end desktop tower in April with a 3060Ti + 11700K in it and the idea that there’s a $1200-1900 Mac Mini on the horizon that will literally match it, and possibly beat it, on several fronts fills me with feelings

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Ok Comboomer posted:

veering too much into GPUchat but isn’t the 3060Ti basically on par with the 2080?

2080 Super, even.

trilobite terror
Oct 20, 2007
BUT MY LIVELIHOOD DEPENDS ON THE FORUMS!
brb gonna spec out a $3600 14” ultraportable notebook that can go ten rounds with my $2300 gaming tower

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


Ok Comboomer posted:

veering too much into GPUchat but isn’t the 3060Ti basically on par with the 2080?

I bought a big-rear end desktop tower in April with a 3060Ti + 11700K in it and the idea that there’s a $1200-1900 Mac Mini on the horizon that will literally match it, and possibly beat it, on several fronts fills me with feelings

I guess the question is "beat it in what?"

If you bought that thing to do video or photography workflows, then yeah. If that's a gaming box, then *shrug*.

That's the way I view these notebooks. They are extremely powerful, energy-efficient machines that shame the gently caress out of intel. They also offer me zero utility over what I already have.

There's nothing I'm doing on my XPS 13 2:1 with an i7-1065G7 that would be better on one of these devices and I enjoy the XPS form factor better. My desktop is pretty much exclusively for gaming so no real use case there. Ditto with my gaming notebook.

The one place where I could use the extra performance and it would make my day-to-day better is work. But that's also a place where I have zero choice in what device I get to use, and it will be ages, if ever, before an M1-based Mac is on the list of equipment. So I get to chug along with the T480 POS and its crappy 8th gen i7, fans going full tilt and unable to drive both my 1440p monitors without using DisplayLink.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

bull3964 posted:

I guess the question is "beat it in what?"

You're right about most of this, and I find myself in the same situation: I just don't need a mobile video-editing station, and even mobile gaming is still better done with a Windows-based machine.

Buuuuuut hot drat the battery life. Right now I often travel with a Chromebook because of the silly battery life that little guy gets compared to my X1E, but these things should be able to go even longer and have considerably more horsepower.

I just...you know...don't have much of a personal need for it, and absolutely cannot justify a $2500+ purchase for it. Others, though, I'm sure can and will.

Does make me wonder if AMD has anything even vaguely similar in the works based off their previous SoC work. I mean, RISC vs CISC aside, going for a giant gently caress-off SoC that lets you shove everything together like that obviously has considerable advantages for a laptop / SFF type box where you are soldering everything to the motherboard anyhow. I'd wonder if Intel was considering it, too, but I don't see any evidence they've even considered it.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

DrDork posted:

Does make me wonder if AMD has anything even vaguely similar in the works based off their previous SoC work. I mean, RISC vs CISC aside, going for a giant gently caress-off SoC that lets you shove everything together like that obviously has considerable advantages for a laptop / SFF type box where you are soldering everything to the motherboard anyhow. I'd wonder if Intel was considering it, too, but I don't see any evidence they've even considered it.

I think the financial aspects make it non-viable for anyone except Apple. AMD/Intel would have to get paid, and then someone else would have to make a device that could be sold at a market-viable price.

Fair market price for that SOC is probably $4k-8k, and Apple is giving you the laptop for free.

That's a somewhat shocking concept but think about it - that's server tier hardware right there, in design terms. It's twice as many transistors as a 3090 that sells for $2k. It's on TSMC 5nm, it's got 64 GB of stacked RAM. If it were DDR4 it'd be an octochannel part - undeniably server territory.
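
the back-of-envelope behind those two claims, taking the publicly quoted transistor counts (57B for M1 Max, 28.3B for GA102) as given:

code:

# "twice a 3090's transistors" and "octochannel in DDR4 terms", roughly.
# Transistor counts are the publicly quoted figures, treated as assumptions here.
m1_max_transistors = 57e9     # Apple's quoted M1 Max figure
ga102_transistors  = 28.3e9   # NVIDIA GA102 (RTX 3090)
print(f"transistor ratio: {m1_max_transistors / ga102_transistors:.1f}x")  # ~2.0x

bus_width_bits    = 512   # M1 Max memory bus
ddr4_channel_bits = 64    # one DDR4 channel
print(f"equivalent DDR4 channels: {bus_width_bits // ddr4_channel_bits}")  # 8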

Since the M1, Apple has actually been giving you a fairly incredible amount of performance for the price - it's actually one of the best-value systems on the market, if you can live with non-x86, macOS, and the Apple hardware design mentality. But the math literally only works because Apple both makes and sells the chip; that's what gets them the economics here - they can eliminate the x86 tax and even the ARM tax and just give you at-cost SoCs.

Oh yeah and you still get ripped for storage/RAM of course... although with stacked RAM there's at least an argument that it's different from normal soldered/socketed RAM.

Paul MaudDib fucked around with this message at 17:01 on Oct 19, 2021

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


DrDork posted:

You're right about most of this, and I find myself in the same situation: I just don't need a mobile video-editing station, and even mobile gaming is still better done with a Windows-based machine.



Yeah, my PERSONAL computing really is all about the browser and I don't really need a machine to do much. I honestly spend most of my time on a tablet or my phone, pulling out the XPS 13 or my Pixelbook if I really need a keyboard and desktop browser for something.

I'm just a boring middle aged person who works from home. I'm not a creator. Most of my internet time is spent shitposting here or reading about other aspirational devices that I have no clear use case for.

The battery life is nice, sure, but I'm not using devices like this away from a wall outlet very long and everything charges via USB-C now anyways.

Again, the big place where all of this stuff would come together is if I could use something like this for work. But I can't. It would actually be overkill for work too, but at least I wouldn't have to listen to fans cycling on and off all day.

Really feeling like the average person is lacking a use case for all the compute power we have at our fingertips now. We need some revolution in software/services that elevates things. It feels like the whole market is driven either by gaming or by video editing, and there has got to be some other, more accessible use case out there that could benefit from this much power.

It's like, Apple could come out with a device tomorrow and say "it can do anything you do on a computer instantly!" and I think I would still shrug. Anything I do on a computer is pretty much instant as it is right now. I need a use case that taxes the systems I have for that to be of value for me.

It's obviously valuable to people who make their living on stuff taking less time, but that's not the whole of the market.

bull3964 fucked around with this message at 17:11 on Oct 19, 2021

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
It is an interesting question, yeah. I agree with you that the M1 Pro certainly leans more towards "server style specs," but at the same time, the x86/ARM tax isn't $1500+, either.

But obviously the only way to make it profitable even for Apple is to do a rather limited number of SKUs so you can really crank volume on the selected parts. No idea how much uptake AMD/Intel would get if they went to OEMs and went "Ok, we can give you a SoC that is 25% more performant and 50% better at power but you only get to pick from 4 configurations."

Though if Apple keeps making GBS threads on them with M-series chips, at some point they may have to start accepting those sorts of trades to stay competitive with anything over the $1000 price point.

bull3964 posted:

Really feeling like the average person is lacking a use case for all the compute power we have at our fingertips now. ... It feels like the whole market is driven either to play games or to edit video and there has got to be something else out there that could benefit from this much power that's a more accessible use case.

This has arguably been the case for years already. And the push for PaaS and doing the heavy lifting in the Cloud is only going to accelerate it. Content creation certainly benefits from the sort of performance the M1 series brings, but yeah, content creators can't be more than like 1-2% of users I'd imagine. Everyone else just needs a reasonably speedy web interface and/or gaming box.

DrDork fucked around with this message at 17:13 on Oct 19, 2021

repiv
Aug 13, 2009

DrDork posted:

But obviously the only way to make it profitable even for Apple is to do a rather limited number of SKUs so you can really crank volume on the selected parts.

I saw it pointed out that the M1 Pro is just the top half of an M1 Max (assuming the official die shots are accurate)



I wonder if there's some funky manufacturing optimization going on where they use the same masks for both chips

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

repiv posted:

I wonder if there's some funky manufacturing optimization going on where they use the same masks for both chips

I have no idea, but it'd certainly make a lot of logical sense to have one mask that you can really iterate heavily on to get right and then add another segment for the Max, vs two entirely separate masks. But it's been a very, very long time since I've done any sort of chip design, so who knows.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

bull3964 posted:

Most of my internet time is spent shitposting here or reading about other aspirational devices that I have no clear use case for.

I mean I get it, the "aspirational devices" part is nice. I don't even really NEED to upgrade to Alder Lake or to look for an 18-core Haswell but playing with this stuff or just knowing that you have the horsepower underneath the hood is cool.

Walked
Apr 14, 2003

gradenko_2000 posted:

I mean I get it, the "aspirational devices" part is nice. I don't even really NEED to upgrade to Alder Lake or to look for an 18-core Haswell but playing with this stuff or just knowing that you have the horsepower underneath the hood is cool.

ngl 80% of my computer upgrade cycles these days are "more power would be cool" and nothing more
me: work in tech; 98% of my day is in terminal, vscode, or a browser
