|
I'll let slip what I can that isn't PII, as I have it. We didn't even know the dev kit was a thing until it was announced, even if we had suspicions.
|
# ? Jun 22, 2020 20:01 |
|
I would hope that they're sandbagging with today's demos and the A12Z DTK, and will release some unheard-of 16-core monster SoC in a MBA form factor later this year. It's going to take a very significant performance increase over x86 hardware to get me to switch away from Mojave to a potentially even more locked-down platform, though I admit I haven't heard anything in that direction today. Are apps outside the App Store still going to be a thing? Do we know what's going to happen to package managers like brew and MacPorts? With all the iOS-ification going on it wouldn't surprise me to see those go away, with Apple pointing at virtualization as the replacement. That might turn out OK (see WSL2) or a total nightmare (see current Docker on macOS).
|
# ? Jun 22, 2020 20:10 |
|
Splinter posted:Yeah I imagine the ARM Macs will initially be great as basic web/email/streaming machines, but I'd bet it'll be years before most professional/creative/content creation software has been ported to run on ARM natively.

I mean, yeah you’re probably right, but also a big part of the demo was them showing off how MS Office, Adobe CS, Final Cut/Logic, and Maya were already “ready to go” for ARM, and the implication seemed to be that this wasn’t Rosetta 2-based, but that they were native. And then they also showed Shadow of the Tomb Raider (lol, Apple’s only contemporary AAA game) running via Rosetta 2 to show how good the emulation was.

frogbs posted:Looks like it's $500, and you have to return it to Apple:

Is it normal for companies to charge for access to a temporary devkit like that? I imagine it’s more a show of good faith/to weed out potential leakers and flippers than a serious attempt at getting money out of developers, right?
|
# ? Jun 22, 2020 20:12 |
|
Splinter posted:Yeah I imagine the ARM Macs will initially be great as basic web/email/streaming machines, but I'd bet it'll be years before most professional/creative/content creation software has been ported to run on ARM natively.

They demoed Word, Excel, PowerPoint, Final Cut Pro, Lightroom, and Photoshop running natively. They had Maya running under Rosetta 2. There will be roadblocks for sure, but there will also be plenty of people who can move over seamlessly. The really interesting thing here is going to be the market of people who need to run Windows - and legacy x86 software - through Boot Camp or Parallels. Big tech buys a lot of MacBooks because they're good laptops that can run any OS. Unless they've got more magic in the pipeline to run emulated x86 Windows on an ARM Mac, that's going to go away.
|
# ? Jun 22, 2020 20:13 |
|
I would definitely not buy the first gen ARM Mac. Maybe not even the second gen. I wouldn’t be surprised if, when you ship the DTK back, you get sent a consumer release version. They wouldn’t want to advertise that fact, so that people don’t buy one just to get a $500 Mac mini. In any case, I don’t need one because all my stuff uses standard system UI components that will transition very easily. I imagine it’s more for Unity, Unreal, Adobe, etc. - cross-platform software.
|
# ? Jun 22, 2020 20:17 |
|
The first gen Intel Macs were a disaster. A goon here took his apart and found like 3 tablespoons of thermal grease splattered on there like a porno.
|
# ? Jun 22, 2020 20:19 |
|
Mu Zeta posted:The first gen Intel Macs were a disaster. A goon here took his apart and found like 3 tablespoons of thermal grease splattered on there like a porno. ....and then Apple learned their lesson and never invested in thermal grease again....
|
# ? Jun 22, 2020 20:23 |
|
Ok Comboomer posted:Is it normal for companies to charge for access to a temporary Devkit like that? I imagine it’s more a show of good faith/to weed out potential leakers and flippers than a serious attempt at getting money out of developers right?

Yes - Apple did exactly this for the Intel transition:

"Transitions make or break when it comes to the details, and Apple had a lot of them at WWDC 2005. After admitting that every major version of Mac OS X has been compiled for both PowerPC and Intel chips, Jobs outlined Rosetta, which would allow PowerPC applications to run on Intel Macs without a hitch. Xcode was updated to version 2.1 to allow developers to build “universal” apps that would run natively on old and new Macs alike. To help with this work, Jobs then introduced a new computer. An Intel Mac inside a Power Mac G5 body, the Apple Developer Transition Kit (or DTK) was made as a way for developers to work on their x86 applications before the first Intel Mac shipped to customers. The machine came with a 3.6 GHz Pentium 4, a CPU that would never end up in a shipping Mac, as Apple launched with the Core Duo line of processors. It ran an Intel version of Mac OS X Tiger. 10.4.1 came on these machines; Apple was up to 10.4.4 by the time the first MacBook Pro and Intel iMac shipped in January 2006.

Jobs was quick to explain that the DTK was not a Mac made for customers: 'This is a development platform only. This is not a product; this will never be shipped as a product. It’s just for you guys to get started in development. You actually have to return them by the end of 2006. We don’t want them floating around out there. These are not products.'

To borrow one of these systems ran $999, and they shipped a few weeks after WWDC. Once they were in the world, things got interesting as people started poking around inside."

https://www.macstories.net/stories/this-is-not-a-product-the-apple-developer-transition-kit/
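Those "universal" apps the article mentions are just fat Mach-O files: a tiny big-endian header listing one slice per CPU architecture, with the actual per-architecture binaries stored at the listed offsets. As a rough illustration of the on-disk format (a sketch based on the public fat_header/fat_arch layout, not Apple tooling; the offsets and sizes below are made-up toy values):

```python
import struct

FAT_MAGIC = 0xCAFEBABE  # big-endian magic for a fat (universal) Mach-O
CPU_NAMES = {0x01000007: "x86_64", 0x0100000C: "arm64"}  # CPU_TYPE_* values

def fat_architectures(blob: bytes) -> list:
    """Return the architecture slices listed in a fat Mach-O header."""
    magic, nfat = struct.unpack_from(">II", blob, 0)
    if magic != FAT_MAGIC:
        raise ValueError("not a fat binary")
    archs = []
    for i in range(nfat):
        # each fat_arch entry: cputype, cpusubtype, offset, size, align (5 x uint32)
        cputype, _sub, _off, _size, _align = struct.unpack_from(">5I", blob, 8 + 20 * i)
        archs.append(CPU_NAMES.get(cputype, hex(cputype)))
    return archs

# Build a toy two-slice header (header only, no real code) to demo the parse:
hdr = struct.pack(">II", FAT_MAGIC, 2)
hdr += struct.pack(">5I", 0x01000007, 3, 0x4000, 0x100, 14)  # x86_64 slice
hdr += struct.pack(">5I", 0x0100000C, 0, 0x8000, 0x100, 14)  # arm64 slice
print(fat_architectures(hdr))  # ['x86_64', 'arm64']
```

On an actual Mac, `lipo -archs <binary>` or `file <binary>` reports the same information for a shipping universal app.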
|
# ? Jun 22, 2020 20:24 |
|
Ok Comboomer posted:I mean, yeah you’re probably right but also a big part of the demo was them showing off how MS Office, Adobe CS, Final Cut/Logic, and Maya were already “ready to go” for ARM, and the implication seemed to be that this wasn’t Rosetta 2-based, but that they were native.

I wouldn't be surprised if they have big players like MS and Adobe on board out of the gate, or soon thereafter. I was more thinking about how long it'll be until most software has been ported so that you're not significantly limited in your software choices by going ARM. Like sure they'll have Logic, but how long until a smaller shop like Ableton can port Live (or all the popular 3rd party plugin makers)? They'll have PS/LR, but how long until alternatives like Capture One/Affinity are available? Etc. I also haven't been following the announcement, so I could be off base here.
|
# ? Jun 22, 2020 20:25 |
|
I don't know if I'd call an Apple Ax CPU-based hardware product a "first generation" product, given they've been shipping those chips since the early iPhone days. I don't think you're going to find a lot of flaws in the chip or logic board design.
|
# ? Jun 22, 2020 20:26 |
|
I use enough obscure, old/legacy, small developer software to know that using an ARM based Mac would be a huge shift and probably a miserable experience for me. Gonna ride it out with an Intel machine and hope maybe there's some different app options for the software I currently use in a few years.
|
# ? Jun 22, 2020 20:33 |
|
Splinter posted:Like sure they'll have Logic, but how long until a smaller shop like Ableton can port Live (or all the popular 3rd party plugin makers)? It’ll never happen.
|
# ? Jun 22, 2020 20:34 |
|
59.82 USD +0.21 (0.34%) Intel stock didn't really care.
|
# ? Jun 22, 2020 20:41 |
|
I wouldn't be surprised to see some instruments on ARM Macs fairly soon, given that there's a fairly strong iOS ecosystem. But yeah, I wonder how quickly DAWs with tons of legacy code (Ableton, Cubase) and VSTs that are decades old (Waves) can get ported over.
|
# ? Jun 22, 2020 20:44 |
|
Bob Morales posted:59.82 USD +0.21 (0.34%)

It won’t until AMD/Qualcomm make significant headway into the datacenter.

krysmopompas posted:There are still a ton of Protools shops running ppc macs, others never upgraded to the trashmac, and the vast majority of VST plugins are 2.0 (which was deprecated in 2010 & the sdk was wiped from steinberg’s site in 2018)

Reading a little bit on Rosetta 2, it seems to be very similar to how MS handles x86 emulation on Windows on ARM - a hybrid approach that essentially generates and caches ARM-native code for an app's x86 instructions the first time you run it, then goes back to that cached code every subsequent time. Not ideal, but it seems way less painful than how Rosetta 1 worked/classic instruction-interpretation emulation. If they don’t remove it eventually (lol) you might run a lot of old VSTs that way and never really notice the difference.
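That translate-once-then-cache idea can be sketched in a few lines (purely hypothetical, nothing to do with Rosetta's actual internals - the "translation" here is a stand-in, and the content-hash cache key is my own assumption):

```python
import hashlib

class TranslationCache:
    """Toy model of translate-on-first-launch, run-from-cache-forever."""
    def __init__(self):
        self._cache = {}       # content hash -> translated artifact
        self.translations = 0  # how many times the slow path actually ran

    def _translate(self, x86_code: bytes) -> bytes:
        # Stand-in for the expensive x86 -> ARM64 translation pass.
        self.translations += 1
        return b"ARM64:" + x86_code[::-1]

    def run(self, x86_code: bytes) -> bytes:
        key = hashlib.sha256(x86_code).hexdigest()
        if key not in self._cache:        # first launch: translate and cache
            self._cache[key] = self._translate(x86_code)
        return self._cache[key]           # later launches: cache hit, no retranslation

cache = TranslationCache()
app = b"\x55\x48\x89\xe5"  # pretend these are an app's x86 prologue bytes
cache.run(app)
cache.run(app)
print(cache.translations)  # 1 -- translated once, the second run hit the cache
```

The first launch pays the translation cost; every run after that is just a lookup, which is why the approach feels so much faster than interpret-every-instruction emulation.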
|
# ? Jun 22, 2020 20:51 |
|
Ok Comboomer posted:Is it normal for companies to charge for access to a temporary Devkit like that? I imagine it’s more a show of good faith/to weed out potential leakers and flippers than a serious attempt at getting money out of developers right? I think I read that the last time Apple did this, they let developers use the money for the devkit towards a new mac if they wanted to.
|
# ? Jun 22, 2020 20:54 |
|
japtor posted:For the MBA you can get quad/16GB/256GB for $1300, the trick is to select the dual core one and upgrade from there to save on the storage. But yeah it's close enough to the MBP that you kinda have to consider that too. The price points are like that for a reason

thanks. I ordered it just now. Hopefully this one lasts 10 years as well. The 11” is just painfully slow at this point.
|
# ? Jun 22, 2020 20:57 |
|
I doubt I'd switch to an ARM Mac during the first generation, but I couldn't care less about some random obscure legacy software if it means I can run all my iOS/iPad apps on my ARM-based MacBook. Plus it looks like Rosetta 2 will handle any legacy software without much inconvenience, so it's a win/win. Calling it right now: touchscreens on ARM MacBooks.
|
# ? Jun 22, 2020 21:03 |
|
Given the ARM transition, my original plan for this summer, which was "Buy a new top of the line MBP that will last me 8 years like my current 15" MBP has lasted 8 years," suddenly seems like a not great idea. (Suddenly very glad I waited for both the summer BTS promo, and for our long national keyboard nightmare to be over.)

I'm generally okay with the idea of moving a little down the performance ladder, but ditching the dGPU by going from 15 to 13 inch is still loving with my head, and I can't find any sort of performance comparisons that will reassure me, since they're all either years out of date or very much in the "MACS CAN'T PLAY GAMES LOOK AT THIS 20 FPS poo poo" framework of discussion, and don't really give me any ability to actually understand how a new Mac performs compared to my old one.

Basically, all I really want to know is that even with the integrated graphics on the new 13", I'm still going to see an upgrade playing the sort of not-overly-intensive isometric RPGs that make up the only real gaming I do on my laptop. In particular, I'd like to know that I'd see that upgrade even running in Boot Camp. My current machine has a GT 750M. It was good enough for me to play The Witcher 2 and Fallout 4 at lovely settings, but for some reason Pathfinder: Kingmaker is the white whale I could never get to run well on it. (Disco Elysium, for something more similar to that, ran... okay-ish.)

I don't care all that much about whether I can play Cyberpunk 2077 on the thing in 6 months, I just want some reassurance that I'll still be able to play the stuff I've played on my current machine on a new one that no longer has the discrete GPU, and that the upgrade will be, y'know, an upgrade.
|
# ? Jun 22, 2020 21:13 |
|
CaptainPsyko posted:Given the ARM transition, my original plan for this summer, which was "Buy a new top of the line MBP that will last me 8 years like my current 15" MBP has lasted 8 years" suddenly seems like a not great idea. (Suddenly very glad I waited for both the summer BTS promo, and for our long national keyboard nightmare to be over). Graphically the current 13” seems like more of a side-grade to the 2013 15” MBP w/750m that you and I apparently both own than a straight upgrade or downgrade. That might be totally worth it to you given the benefits of moving to a smaller/lighter computer with better eGPU compatibility, or you might want to move up to the 16” and get something with a 5300M/5500M.
|
# ? Jun 22, 2020 21:22 |
|
Ok Comboomer posted:If they don’t remove it eventually (lol) you might run a lot of old VSTs that way and never really notice the difference. I’m almost glad they did this at the head of a global recession - we won’t be buying new hardware for years, so there’s plenty of opportunity to evaluate whether or not we want to keep paying the apple tax.
|
# ? Jun 22, 2020 21:22 |
|
So is Apple going to be licensing Thunderbolt from Intel to use in their custom Apple SoCs? I mean, even AMD hasn't really been able to get Intel to hand that stuff over properly.
|
# ? Jun 22, 2020 21:22 |
|
FireWire but usb-c.
|
# ? Jun 22, 2020 21:24 |
|
krysmopompas posted:I bet they remove it faster than rosetta 1. still haven’t found a better alternative—at least with portables

I’m very optimistic and excited about this. I know there are a lot of very good reasons to be skeptical, concerned, or apprehensive, but I personally have been waiting for this rumor to materialize for like 5 years. This is the most interesting thing Apple has done with the Mac, and with the industry, since the Air came out and made ultrabooks a thing.
|
# ? Jun 22, 2020 21:27 |
|
xgalaxy posted:So is Apple going to be licensing the thunderbolt technology from Intel to use in their custom Apple SoCs? Thunderbolt is getting folded into USB-4 (which is supposed to be either indistinguishable from TB3 or better or maybe possibly worse depending on where you look?) and Intel open-sourced it, afaik.
|
# ? Jun 22, 2020 21:29 |
|
One thing I liked about the keynote is that Apple pretty much confirmed Macs will get world-class GPU performance, as the GPU from iOS devices will be brought over to the Mac side - possibly an even heftier version, thanks to the higher thermal and power envelopes on laptops/desktops as opposed to tablets and phones. We'll be getting Apple GPU technology on that SoC, derived and refined from the old PowerVR/Imagination Technologies GPUs. I do wonder where that leaves AMD though... making the GPUs for the Intel Macs yet to come, I guess. Interesting how this comes as Intel is actually making concerted efforts to improve their integrated GPUs; they're still nowhere close to dGPUs, but they're not the trash pieces of junk they used to be, like the old GMA 950 that was ubiquitous in Minis and laptops for years. Binary Badger fucked around with this message at 21:32 on Jun 22, 2020 |
# ? Jun 22, 2020 21:29 |
|
I dunno. I think Intel is kind of hosed. They've clearly screwed up in their long term planning and rested too long on their laurels.
|
# ? Jun 22, 2020 21:31 |
|
Intel's got shitballs of money and are literally selling every chip they can produce. They're not in a good spot, but they've got time.
|
# ? Jun 22, 2020 21:47 |
|
Ok Comboomer posted:Graphically the current 13” seems like more of a side-grade to the 2013 15” MBP w/750m that you and I apparently both own than a straight upgrade or downgrade.

This, and ~600 dollars saved, is basically where I'm at. I feel like throwing my 13" savings into an eGPU is a better play than the 16" in the worst case scenario, given that (from what I understand) eGPUs + Boot Camp only play nice if you don't have a dGPU built in? (I have done only the slightest bit more than 0 research on the eGPU side of things, but the safety blanket of 'this will work' is a definite factor in my decision making process.)
|
# ? Jun 22, 2020 21:47 |
|
Oh, look.. https://www.apple.com/macos/big-sur-preview/

Apple lists the following Macs as being Big Sur compatible:

- MacBook - 2015 and later
- MacBook Air - 2013 and later
- MacBook Pro - Late 2013 and later (gently caress you Early 2013 owners)
- Mac Mini - 2014 and later
- iMac - 2014 and later (We're not supporting those POS 2013 iMacs, they all used lovely nVidia GPUs)
- iMac Pro - 2017 and later (LOL there haven't BEEN any later ones.. yet)
- Mac Pro - 2013 and later

Binary Badger fucked around with this message at 21:57 on Jun 22, 2020 |
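For fun, that cutoff list is simple enough to express as a lookup table. A toy check (model families and years transcribed from the list above; note it ignores the Early/Late 2013 MacBook Pro split, and real compatibility checks key off model identifiers like `MacBookPro11,1`, not bare years):

```python
# Minimum model year per family for Big Sur, per Apple's preview page.
BIG_SUR_MINIMUM_YEAR = {
    "MacBook": 2015,
    "MacBook Air": 2013,
    "MacBook Pro": 2013,  # really "Late 2013"; year alone can't capture that
    "Mac Mini": 2014,
    "iMac": 2014,
    "iMac Pro": 2017,
    "Mac Pro": 2013,
}

def supports_big_sur(family: str, year: int) -> bool:
    """Rough Big Sur eligibility check: family must be listed, year at/after cutoff."""
    minimum = BIG_SUR_MINIMUM_YEAR.get(family)
    return minimum is not None and year >= minimum

print(supports_big_sur("MacBook Pro", 2012))  # False
print(supports_big_sur("iMac", 2014))         # True
```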
# ? Jun 22, 2020 21:49 |
|
I'm actually really, really excited for this. I think Apple is gonna kick some major rear end with their hardware with this move.
|
# ? Jun 22, 2020 21:50 |
|
So, for Big Sur, you need at minimum:

- Intel HD Graphics 5000 GPU or later, -or-
- AMD Radeon 5XX GPU or later

The D300/500/700s on the TrashCan Pro are a weird animal. I bet Apple had to pay AMD some $$$ on the side to update their Tahiti drivers for Big Sur. Late 2013 iMacs had nothing but nVidia GPUs across the board; other than that, there's nothing that would prevent Big Sur running on them except that Apple will likely not bundle any nVidia drivers with Big Sur like they did for Catalina.

Also noted: they ran the Safari energy tests on MacBook Pros with 8 GB of RAM, so that bodes well for surfing longevity.

Binary Badger fucked around with this message at 22:15 on Jun 22, 2020 |
# ? Jun 22, 2020 21:57 |
|
Binary Badger posted:One thing I liked about the keynote is that Apple pretty much confirms Macs will get world class GPU performance as the GPU from iOS devices will be brought over to the Mac side, or even possibly a more hefty version thanks to the higher thermal and power envelopes on laptops / desktops as opposed to tablets and phones. The iPad Pro GPU is very powerful for a lower-power integrated part, but it's not in the same class as a high-end GPU with dedicated memory and a big power budget. Apple will probably use its own GPU as the default integrated option, replacing Intel's on-chip graphics, and still offer dedicated AMD (or maybe Nvidia) parts in higher-end systems.
|
# ? Jun 22, 2020 22:02 |
|
Binary Badger posted:
this is a pleasant surprise, after just getting my MBP repaired and seeing this announcement i assumed that i'd now own two "legacy" apple daily drivers (a 2011 iMac + late-2014 15" MBP). sounds like i was in the same spot as some of y'all; trying to see if i can get another year or two out of this before upgrading. good to know that isn't going to be expedited because apple decided they don't want to support my equipment anymore
|
# ? Jun 22, 2020 22:04 |
|
Space Gopher posted:Apple will probably use its own GPU as the default integrated option, replacing Intel's on-chip graphics, and still offer dedicated AMD (or maybe Nvidia) parts in higher-end systems.

As far as Apple is concerned, the nVidia ship has sailed, been torpedoed, and its rotting corpse is regularly jostled by depth charges launched with malice for years. Binary Badger fucked around with this message at 22:09 on Jun 22, 2020 |
# ? Jun 22, 2020 22:06 |
|
xgalaxy posted:I dunno. I think Intel is kind of hosed. They've clearly screwed up in their long term planning and rested too long on their laurels.

Come to the IntelChat thread. It’s possible, but Apple revenue is a drop in the bucket for Intel (a high-profile, high-value, high-prestige/PR drop, but a drop nonetheless), as is stuff like DIY PC and retail-packaged chips, where AMD is currently killing it. Once you see big data center/server clients and builders moving strongly to ARM/AMD (and, less critically, OEM desktop/notebook manufacturers), then you might start to see Intel’s stock price really move.

Really it’s an issue of AMD being 1/10th the size of Intel and simply not having the scale to compete on delivering the kinds of orders that companies like Dell/HP put in for their entire product lines across enterprise/home user/server. And AMD has greatly improved their ability to do integration and consulting and boutique poo poo for manufacturers, but it’s totally dwarfed by what Intel does as their bread and butter. Idk if it was this thread, but Intel CPU staffers arguably have as much a hand in the design of Apple motherboards as Apple’s own engineers do.

And if Intel’s future looks legitimately threatened by an AMD/ARM incursion, then you can imagine the kinds of aggressive pricing/partnership maneuvers they could take. The day AMD makes a credible play for the data center is the day Intel starts offering OEMs Xeons at a heavy discount, or possibly even for free, if they promise to stick around. They’ll offer free engineering support and motherboard integration, and all of the other perks that have nothing to do with chip performance. Give the partners an offer they can’t afford to refuse, no matter how good the competition. Intel’s biggest advantage by far is their size and central status to the industry.

If push comes to shove, they’ll take a temporary 25% revenue cut or whatever to undercut AMD badly enough that they either back off or die outright. AMD knows this, which is why they continue to hedge their bets, shore up dominance in the places where they’re currently succeeding, and focus more on challenging NVidia industry-wide than on Intel. If they didn’t have GPUs or retail chips or the upcoming consoles, and they were putting all of their eggs into challenging Intel at the OEM level right now, they’d get bled to death in no time. Doesn’t matter how good the chips are.

Binary Badger posted:One thing I liked about the keynote is that Apple pretty much confirms Macs will get world class GPU performance as the GPU from iOS devices will be brought over to the Mac side, or even possibly a more hefty version thanks to the higher thermal and power envelopes on laptops / desktops as opposed to tablets and phones.

I remember firing up TF2 on my 2011 MBA and being amazed at how well it ran (before the computer turned into a jet plane and took off). I imagine there will still be a place for dGPUs at the high end. Unless Apple’s about to start shipping their own ARM-based PCIe cards, and unless Apple plans to totally scrap the new Mac Pro that they just released in two years for something radically different that doesn’t use MPX or dGPUs or any of that stuff.

trilobite terror fucked around with this message at 22:20 on Jun 22, 2020 |
# ? Jun 22, 2020 22:08 |
|
Ok Comboomer posted:unless Apple plans to totally scrap the new Mac Pro that they just released in two years for something radically different that doesn’t use MPX or dGPUs or any of that stuff. This, at least, would be an extremely Apple thing to do.
|
# ? Jun 22, 2020 22:12 |
|
CaptainPsyko posted:This, at least, would be an extremely Apple thing to do. Well they would have to start working on it right now then, because otherwise we aren’t seeing it for at least four years.
|
# ? Jun 22, 2020 22:18 |
|
Ok Comboomer posted:...unless Apple plans to totally scrap the new Mac Pro that they just released in two years for something radically different that doesn’t use MPX or dGPUs or any of that stuff. They'd just turn around and alter the logic board for an Apple Silicon™ Heavy 32-core processor rather than a 28 core Xeon and keep everything else, wheels optional as usual.
|
# ? Jun 22, 2020 22:19 |
|
Binary Badger posted:They'd just turn around and alter the logic board for an Apple Silicon™ Heavy 32-core processor rather than a 28 core Xeon and keep everything else, wheels optional as usual.

That’s exactly what I expect them to do. (I actually have no clue how GPUs work wrt x86 vs another architecture. Like, do the GPUs themselves need to be redesigned, or could it just be on the CPU/mobo side? Like, could Apple make a CPU/board that works ootb with existing Radeons? A drop-in replacement board for owners of existing Mac Pros?)
|
# ? Jun 22, 2020 22:24 |