|
Hello Spaceman posted:i wonder if the 16gb ram limitation for the m1 macs is a repeat of the limitation where apple only offered macbooks with up to 16gb because the intel chipsets didn't support low power ddr ram?

Hi, I am a silicon weenie, idk about wizard.

It's very expensive to design, validate, and start manufacturing a new chip design. On the other hand, once all those non-recurring costs are paid, the incremental cost to make each chip is relatively low. So, to save money on setup, you often want to design a multirole chip: identify two or more products or markets that are close enough, and give the chip all the features and performance it needs to serve any of them. You're wasting something by having so-called "dark silicon" in each product (stuff that's needed for other products but is not active in this one), but if your finance and marketing guys successfully predict sales of all the variants, and you make good decisions about how much die area to spend on each feature (you don't want to overreach), it's a net win.

This is an example. By all appearances, the M1 will lead a double life as the A14X in an eventual iPad Pro refresh. In that application, it'll be programmed with lower thermal limits, an iPad probably won't have two USB4/Thunderbolt ports, you probably won't see 16GB RAM in an iPad, yada yada yada. (And on the Mac side, there's unused stuff too - the touchscreen controller, for example.)

That's where the M1's limitations come from. Its specs resulted from an Apple review of "which Macs can we power with a slightly upfeatured iPad Pro chip?" The answer they came up with is all the low-end Macs: everything which previously already had a limitation of 16GB RAM, or two USB/Thunderbolt ports, or an i3. Note that they've kept the high-end versions of the Intel mini and 13" MBP around. Only the MacBook Air got fully replaced with M1 models, because no Intel MBA had more than 16GB RAM, a giant SSD, or two USB/TB ports.
To make bigger Macs, they're going to need bigger and more capable M series chips. These will not get to share a design with a high volume iOS device. Apple probably delayed them relative to M1 for two basic reasons: getting M1 out the door helps them ship the iPad Pro refresh too, and it's most important to transition the low end Macs first because they're the most popular models. Low end Macs also should get the most uplift from Apple Silicon, in relative terms.
|
# ? Nov 13, 2020 21:44 |
|
|
Is the reason the M1 Geekbench Multi-Core results aren't nearly as dominant as Single Core (compared to similar core count Intel chips) due to lack of Hyper Threading, or is there some other reason the M1 is much better at single core benches?
|
# ? Nov 13, 2020 22:01 |
|
The M1 only has 4 fast cores; the other 4 are efficiency cores meant to save power. A demanding app like Geekbench will effectively run on the 4 fast cores, which is why the multi-thread score is only about 4.4x the single-thread score.
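That 4.4x figure is consistent with a back-of-envelope model where the four efficiency cores each contribute only a small fraction of a performance core. A quick sketch (the per-E-core factors below are illustrative assumptions, not measured numbers):

```python
# Back-of-envelope: expected multicore/singlecore ratio for a chip with
# 4 performance (P) cores and 4 efficiency (E) cores, assuming the
# multicore run scales perfectly across the P cores.

def mc_sc_ratio(p_cores, e_cores, e_factor):
    """Expected MC/SC ratio if each E core delivers e_factor of a P core."""
    return p_cores + e_cores * e_factor

# If the E cores contributed nothing at all: pure 4-core scaling.
print(mc_sc_ratio(4, 4, 0.0))    # 4.0
# If each E core were worth a quarter of a P core:
print(mc_sc_ratio(4, 4, 0.25))   # 5.0
# The observed ~4.4x would imply roughly 0.1x per E core
# (or some clock drop on the P cores under all-core load):
print(mc_sc_ratio(4, 4, 0.1))    # 4.4
```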
|
# ? Nov 13, 2020 22:29 |
|
Binary Badger posted:I wonder if, with the new GPU, and Rosetta 2 being supposedly fantabulous, and Apple is incredibly keeping its moribund OpenGL 1.4 support and OpenCL 1.2 support, will older games run like lightning on the new M1 Macs?

Lots of them won't run at all, because 32-bit lol.

quote:Betchoo Apple kept that old graphics framework around because all the people who do molecular imaging and biotech would scream bloody murder if their software, which also happens to lean heavily on those two older APIs, broke on the new hardware.

As far as I know, Apple's GL and CL were transformed into Metal API wrappers years ago. Keeping them around and porting them to ARM is mostly free. There's no low-level driver code to worry about, everything is very stable; all the GL and CL wrappers need is low-intensity maintenance. Fix bugs, tweak as needed to keep up with Metal API evolution.

Fame Douglas posted:Lacking backwards compatibility aside, considering all the games they showed in their keynote didn't look to be running all that well, I don't think they'll "run like lightning". We're still talking about integrated graphics, those are never all that great.

This is not your dad's integrated graphics. But seriously:

M1 GPU: 2.6 teraflops, 82 gigatexel/s fill rate
AMD Radeon Pro 5600M: 5.3 teraflops, 165 gigatexel/s fill rate

The 5600M is the best GPU Apple offers in the 16" MBP. Having about half its raw computational throughput in such a low-power chip (Navi 12 there is about a 50W chip all on its own) is huge. And because the TBDR architecture of Apple's GPU is a lot more efficient at using that raw throughput, it might punch a bit above its weight when rendering graphics.
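For what it's worth, those teraflop figures fall straight out of ALU count × clock × 2 (one fused multiply-add, i.e. 2 flops, per ALU per cycle). A sketch using the commonly reported shader configurations — 8 GPU cores × 128 ALUs for the M1 and 40 CUs × 64 lanes for the 5600M — with the clock speeds as approximations:

```python
def fp32_tflops(alus, clock_ghz):
    """Peak FP32 throughput: each ALU retires one FMA (2 flops) per cycle."""
    return alus * 2 * clock_ghz / 1000.0

m1    = fp32_tflops(8 * 128, 1.278)   # 8 GPU cores x 128 ALUs, ~1.28 GHz
r5600 = fp32_tflops(40 * 64, 1.035)   # 40 CUs x 64 lanes, ~1.04 GHz boost

print(f"M1 GPU: {m1:.2f} TFLOPS")     # ~2.6
print(f"5600M:  {r5600:.2f} TFLOPS")  # ~5.3
```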
|
# ? Nov 13, 2020 22:33 |
|
Fame Douglas posted:I didn't take that as "they'll run faster under emulation". I took that as a "our new CPUs are so much faster than the old ones even emulated these applications will run faster on the new machines". I'm not completely sure which claim you're talking about, but if it's the one about games, I think it amounted to games, specifically, running faster on the new machines even under emulation. Which is plausible, because old low end Macs all had such anemic Intel GPUs that most games were completely GPU-limited, not CPU-limited.
|
# ? Nov 13, 2020 22:38 |
|
It's because games written in Metal run natively as Metal on Apple Silicon: the game logic has to be emulated, but the graphics calls are the same and can just be passed straight to the GPU.
|
# ? Nov 13, 2020 22:43 |
|
Combat Pretzel posted:Yeah, right, 37% higher score over a 10700K at 3.8GHz.

None of the above. It's really that good. Some of it is that the 10700K is Yet Another 14nm Skylake Rehash. Here's some Anandtech test data against Intel's latest 10nm Willow Cove core:

https://www.anandtech.com/show/16226/apple-silicon-m1-a14-deep-dive/4

That's a ~5W iPhone chip hanging right with Intel and AMD's best. Note that the Intel design is a 28W mobile chip and the AMD is a 105W desktop gaming chip. The M1 has the same CPU core as the A14, with higher thermal limits.
|
# ? Nov 13, 2020 22:54 |
|
BobHoward posted:Lots of them won't run at all, because 32-bit lol.

Yeah, this. In this forum post Octane talks about getting GPU-accelerated rendering set up for Metal (to support AMD and Apple Silicon GPUs). They were able to run their benchmark (OB is OctaneBench) on an A13, where it performs similarly to the integrated Intel GPU.
|
# ? Nov 13, 2020 22:54 |
|
People don’t (can’t) use Apple laptops for gaming, so I dunno how meaningful the M1’s graphics performance is.

Even with Apple Arcade, it’s been disappointing how little Apple cares about gaming, given they could have a mobile platform that plays better than a Switch.
|
# ? Nov 13, 2020 22:56 |
|
shrike82 posted:People don’t (can’t) use Apple laptops for gaming so I dunno how meaningful the M1’s graphics performance is Wait for the Zoom effects marketplace!
|
# ? Nov 13, 2020 23:06 |
|
BobHoward posted:None of the above. It's really that good.

The thing I’m curious about is how an integrated package can stand up to some of the bigger, badder discrete packages. Unless they make those M1X dies absolutely massive and damage their yields, idk what sorcery they are going to perform in order to stand toe to toe with a chip that is 251mm² and dedicated just to graphics (the 5600M is Navi 12, a close relative of the 5700 XT's Navi 10).
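One way to frame that question is compute density. Using numbers already in the thread (2.6 vs 5.3 TFLOPS, ~251 mm² and ~50 W for the 5600M) plus two assumptions on my part (~120 mm² for the whole M1 die and ~10 W for its GPU under load — both guesses for illustration), the integrated part wins big on efficiency even while losing on absolute throughput:

```python
# Rough compute-density comparison. M1 die area and GPU power are
# assumptions for illustration; the 5600M figures come from the thread.
parts = {
    # name: (tflops, watts, mm^2)
    "M1 GPU (assumed)":           (2.6, 10.0, 120.0),
    "Radeon Pro 5600M (Navi 12)": (5.3, 50.0, 251.0),
}

for name, (tflops, watts, area) in parts.items():
    print(f"{name}: {tflops / watts * 1000:.0f} GFLOPS/W, "
          f"{tflops / area * 1000:.1f} GFLOPS/mm^2")
# M1 comes out around 260 GFLOPS/W vs ~106 for the 5600M; per-area
# they land in the same ballpark (~21 GFLOPS/mm^2), and the M1 figure
# charges the GPU for the entire die, CPU and all.
```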
|
# ? Nov 13, 2020 23:07 |
|
Thinking about replacing my 2015 5k iMac with a Mini but can't quite pull the trigger because the only external monitor I have is a 24" 1920x1200 Dell monitor. Otherwise the M1 benchmarks absolutely murder both the CPU and GPU on my 5 year old iMac.
|
# ? Nov 13, 2020 23:10 |
|
gret posted:Thinking about replacing my 2015 5k iMac with a Mini but can't quite pull the trigger because the only external monitor I have is a 24" 1920x1200 Dell monitor. Otherwise the M1 benchmarks absolutely murder both the CPU and GPU on my 5 year old iMac.

get a 43” 4K TV for $250-$450 depending on which model you get / what you prioritize. or a 4K desktop display I guess, but you won’t spend any less, and where’s the fun in that?

if you decide to dump it for an AS iMac down the road, the screen is a good size to put in a home office or bedroom, or possibly a kitchen depending on your living situation.

trilobite terror fucked around with this message at 23:16 on Nov 13, 2020 |
# ? Nov 13, 2020 23:14 |
|
shrike82 posted:People don’t (can’t) use Apple laptops for gaming so I dunno how meaningful the M1’s graphics performance is

They get a 30% cut from all the games on the App Store; they're doing fine on mobile gaming.
|
# ? Nov 13, 2020 23:32 |
|
gret posted:Thinking about replacing my 2015 5k iMac with a Mini but can't quite pull the trigger because the only external monitor I have is a 24" 1920x1200 Dell monitor. Otherwise the M1 benchmarks absolutely murder both the CPU and GPU on my 5 year old iMac. If you like your iMac just wait till the iMac gets refreshed next year.
|
# ? Nov 13, 2020 23:34 |
|
I'm currently playing around with editing some 4K 10-bit files from a GH5 on my iPhone 12 Pro Max in Lumafusion. Playback, scrubbing and export speeds are not much slower than my i9-9900K with Vega 56 graphics. It’s insane.
|
# ? Nov 13, 2020 23:40 |
|
ptier posted:If you like your iMac just wait till the iMac gets refreshed next year.

Yeah, my iMac once in a while develops graphical glitches and then freezes up, so it's probably on its last legs. Hope new ARM iMacs are announced sooner rather than later.
|
# ? Nov 13, 2020 23:48 |
|
gret posted:Thinking about replacing my 2015 5k iMac with a Mini but can't quite pull the trigger because the only external monitor I have is a 24" 1920x1200 Dell monitor. Otherwise the M1 benchmarks absolutely murder both the CPU and GPU on my 5 year old iMac. I want to hear how this goes once people do it
|
# ? Nov 14, 2020 00:11 |
|
somewhat related to my earlier post, looks like I may be taking the plunge and getting a new MacBook, replacing a mid-2012 non-Retina MBP. I mostly use my Mac for recording in Logic Pro X, plus a little lightweight video editing. My current MBP is solid with Logic as long as I don't use too many MIDI tracks, and iMovie is fine except exporting anything over 720p takes forever. I'd also like to be able to Zoom or live stream without smoke pouring out of the vents.

One thing I'm concerned about is RAM in the new M1 Macs. I put 16GB in my machine back in March; am I going to feel the cut to 8GB? Seems like maybe I won't, judging by the Geekbench scores posted earlier?
|
# ? Nov 14, 2020 00:15 |
|
I'd wait for reviews especially for anyone using their laptop for more than web-browsing, light data entry stuff.
|
# ? Nov 14, 2020 00:18 |
|
Puppy Galaxy posted:One thing I'm concerned about is ram in the new M1 Macs. I put 16GB in my machine back in March, am I going to feel the cut to 8GB? Seems like maybe I won't with the geekbench specs posted earlier?

I'm not sure how CPU Geekbench scores will tell you how much you'll hate having 8GB of RAM. That said, it is a tough call. But considering how long you've kept your 2012 machine, and that you upgraded it to keep it relevant long term, I'd get the 16GB, even if 16GB doesn't sound like a lot nowadays.
|
# ? Nov 14, 2020 00:26 |
|
shrike82 posted:I'd wait for reviews especially for anyone using their laptop for more than web-browsing, light data entry stuff.

It seems pretty clear that they’re going to kick rear end at small-to-medium-scale development, given how hard they leaned on integer perf and what looks to be high memory bandwidth. I wouldn’t try to build Firefox on the M1BA, probably, but for most mobile and desktop app development I think it would be pretty fabulous. I suspect it sucks more for server development when one of your containers gets relegated to the LITTLE cores, but maybe not.

When PyTorch and TF learn to talk to the Neural Cores or whatever, we’ll usher in a new wave of “predicts better on my machine”.
|
# ? Nov 14, 2020 01:50 |
|
No one's going to be running inference on Apple laptops or desktops. If you're interested in custom silicon for ML inferencing, Amazon's shifting away from Nvidia cards to using their own ASICs for Alexa AI queries on the cloud.
|
# ? Nov 14, 2020 01:54 |
|
shrike82 posted:No one's going to be running inference on Apple laptops or desktops. If you're interested in custom silicon for ML inferencing, Amazon's shifting away from Nvidia cards to using their own ASICs for Alexa AI queries on the cloud. I don’t know about that. I could see lots of apps doing light inference over a given personal corpus like photos or emails or browser history or hand-written annotations or whatever and using the assist hardware for it. Most will be wasting their time, because they won’t be better than manually engineered state machines, but they’ll try it.
|
# ? Nov 14, 2020 02:02 |
|
oddly enough, the examples you mentioned are being done in the cloud, including Apple's implementations
|
# ? Nov 14, 2020 02:03 |
|
Subjunctive posted:It seems pretty clear that they’re going to kick rear end at small-to-medium-scale development given how they leaned so hard on integer perf and the memory bandwidth that looks to be high. I wouldn’t try to build Firefox on the M1BA, probably, but for most mobile and desktop app development I think it would be pretty fabulous. I suspect it sucks more for server development when one of your containers gets relegated to the LITTLE cores, but maybe not.

What's actually going to suck is that you can't run your x64 Docker containers; ARM machines are unsuitable for a whole lot of tasks.
|
# ? Nov 14, 2020 02:21 |
|
shrike82 posted:oddly enough the examples you mentioned are being done on the cloud including Apple's implementations I thought Apple was using neural engine for photos?
|
# ? Nov 14, 2020 05:27 |
|
gret posted:Yeah my iMac once in a while develops graphical glitches and then freezes up, so it's probably near its last legs. Hope new ARM iMacs are announced sooner rather than later. The scuttlebutt is Q1 2021, fwiw.
|
# ? Nov 14, 2020 06:20 |
|
It's about time we get a redesigned iMac without the huge chin, too, and height adjustment would be nice.
|
# ? Nov 14, 2020 06:25 |
|
Also, in terms of the performance of the M1/A14 versus x86, I thought Anandtech did a decent discussion of it, and in particular I found the following segment interesting:

Anandtech posted:What really defines Apple’s Firestorm CPU core from other designs in the industry is just the sheer width of the microarchitecture. Featuring an 8-wide decode block, Apple’s Firestorm is by far the current widest commercialized design in the industry. IBM’s upcoming P10 core in the POWER10 is the only other official design that’s expected to come to market with such a wide decoder, following Samsung’s cancellation of their own M6 core, which was also described as being designed with such a wide design.
|
# ? Nov 14, 2020 06:35 |
|
SourKraut posted:Also, in terms of the performance of M1/A14 and x86/etc., I thought Anandtech did a decent discussion on it, and in particular, the following segments I found interesting:

This is cool poo poo, and based on this and the launch stuff I’m super hyped for the late 2021 pro line.

E: question from a non-hardware person though: everything reads like Apple just lapped Intel/AMD completely in perf per watt, but this is essentially an iPad chipset with more room, right? I’m confused how just now everyone’s going holy poo poo?

squirrelzipper fucked around with this message at 06:43 on Nov 14, 2020 |
# ? Nov 14, 2020 06:39 |
|
what kind of heavy lifting do goons do on their macs?
|
# ? Nov 14, 2020 06:44 |
|
shrike82 posted:what kind of heavy lifting do goons do on their macs?

Hell, after a few years, casual web browsing becomes a heavy-lift task when your computer’s thermal solution was designed by folks who apparently live in a world without dust.

Mostly Blender rendering on a mid-2012 MBP. Video editing too, but I stick to HD footage so that’s typically not much of a problem.

Baronash fucked around with this message at 06:56 on Nov 14, 2020 |
# ? Nov 14, 2020 06:53 |
|
Mu Zeta posted:It's about time we get a redesigned iMac too without the huge chin and height adjustment would be nice They're going to keep the chin and add a notch
|
# ? Nov 14, 2020 07:07 |
|
shrike82 posted:what kind of heavy lifting do goons do on their macs? Software Development and Illustration
|
# ? Nov 14, 2020 07:49 |
|
shrike82 posted:what kind of heavy lifting do goons do on their macs? Carrying it to the couch to live post The Mandalorian
|
# ? Nov 14, 2020 08:12 |
|
Someone should compile a native binary for Zoom, because as far as I can tell it’s written in R or something.

Although, speaking of R, the dog poo poo language that is like 1,000,000 packages that are not internally consistent and don’t conform to any guidelines: the poo poo relies on some weird Fortran compiler that can’t be ported to Apple Silicon or something. loving dog poo poo language, die. Hey, gotta write this simple thing in R, I know how to do it, just need five min to write it... and then four hours on Stack Overflow figuring out why it doesn’t work.

Everyone: multithreading. R: lol

cowofwar fucked around with this message at 08:49 on Nov 14, 2020 |
# ? Nov 14, 2020 08:45 |
|
shrike82 posted:what kind of heavy lifting do goons do on their macs?

4K compositing is prolly the most taxing. Photoshop can get fucky too if it’s large print-res files, like billboards or vinyls. The usual.

Mister Facetious posted:Carrying it to the couch to live post The Mandalorian

Fool. That’s what the iPad is for.

(Wait, they’re how much faster?!?)

squirrelzipper fucked around with this message at 09:08 on Nov 14, 2020 |
# ? Nov 14, 2020 09:04 |
|
cowofwar posted:Someone should compile a binary for zoom because as far as I can tell it’s written in R or something. %>% go brrr
|
# ? Nov 14, 2020 09:08 |
|
|
shrike82 posted:what kind of heavy lifting do goons do on their macs? Games.
|
# ? Nov 14, 2020 09:35 |