
leman

macrumors Core
Oct 14, 2008
19,521
19,678
What are the current consensus predictions for chipset models/no. high power and low power cores/clockspeeds/GPU/memory? [I figure you've probably been able to keep better track of this than I have.]

I don't know about the consensus, but the rumors point to a 4+4 CPU (A14X) and an 8+4 CPU ("A14T"?) config. No mention of GPU specs. If we consider historical precedent and common sense, the A14X would have 7 or 8 GPU cores and the "A14T", who knows, maybe 12 or 16? We'll learn soon enough.


Neither Intel nor AMD stays within their TDP under load, and they haven't for years. Just getting an SoC for the MBP that actually stayed under 45W would be a huge win. My 16" loves to spike over 90W just for the CPU package, and will frequently sit above 60W under load.

Spiking above the TDP is not a problem — TDP is just a figure for long-term sustained thermal dissipation anyway, and it depends on what a vendor means by it (Apple doesn't publish or discuss TDP for its SoCs in any shape or form). The problem with Intel is that it has to make these absolutely ridiculous power spikes to offer good performance.


I don't expect them tomorrow, but eventually: multi-processor machines. If they threw two A14s in there, that would be 4 high-performance cores (and 8 high-efficiency cores) total. Throw in four A14s and that's 8 high-performance cores total. (But it probably won't be A14s, because they would have to be modified to support multi-processor use — though I guess it's possible they are already designed for it. But that would mean wasted transistors in all those iPhone 12s and iPad Airs that have only a single A14.)

As others have already pointed out, that is a sure way to kill performance. Multi-processor works fine for cluster-style machines that specialize in running multiple (independent) tasks in parallel, such as web servers or scientific supercomputers. It doesn't work for general purpose computing. There is NUMA, as pointed out by @casperes1996 , which is used for large-RAM machines (I programmed a crappy large-data algorithm for our 4TB-RAM supercomputer, an interesting experience), but it's certainly not a general purpose programming domain. Incidentally, I think that Apple is likely to adopt a NUMA-style architecture for their Mac Pro, but for a different reason than you state: I think some sort of NUMA architecture is inevitable if they want to keep unified memory in a pro-level workstation.


Sounds to me like a rather uninformed article. They utterly fail to notice that Apple has been developing their own chips for years — and that these chips are significantly ahead of the curve in terms of energy required to deliver the same performance. They also seem to have no clue about what a software transition like this entails or how porting from x86 to ARM works. It's OK to be skeptical — it is an investor-oriented article, after all — but at least get your facts straight.

But the story is vastly different if your critical software is written in K&R style C with manual pointer arithmetic, making assumptions about memory offsets and alignments, and using compiler intrinsics for AVX instructions, which btw are not translated by Rosetta.
Though I will caveat this by saying that if you write your C in a good style, using constructs like sizeof instead of assuming the memory size of types and such, you're not in much trouble there either - so it is only potentially painful in an extremely small set of situations. Just want to point out it isn't always just hitting build :)

Memory offsets, alignments etc. are identical between x86-64 and AArch64 — ARM developed the entire 64-bit instruction set with a lot of foresight (and I have read rumors that Apple apparently had a hand in it). Low-level compiler intrinsics sound like a problem, until you discover things like these. Overall, if your code is correct C/C++, it will run on AArch64. I mean, didn't Apple mention that it took a single person under a week to port Photoshop — the stereotype of legacy software — to Apple Silicon? In other news, a single person with a DTK, working in their spare time, made Zig compatible with Apple Silicon targets.

There will be roadblocks of course. If you have inline assembly, well, you are mostly out of luck. If you have hidden hardware assumptions (page size, hardware register granularity, using the CPUID trick to synchronize CPU barriers etc.), you are pretty much screwed. The biggest pitfall is multi-threaded programming, because x86 and AArch64 have different memory ordering guarantees. But even in the latter case, if your code is correct C/C++, it will work correctly on AArch64 — the only problem is that many people ship buggy code without realizing it, since the bugs won't show on x86. To be fair though, this will mostly affect low-level threading libraries and things like allocators — and popular stuff is already tested and packaged for ARM CPUs.


Apple already sucker-punched devs with the whole codesigning debacle. Which followed the hard cutoff forcing apps to move to 64-bit. Which followed a laundry list of other requirements that'd make this post too long... Developing for macOS is now a big pain in the butt for small-to-medium devs. And now they'll be asking developers to support not just a new architecture, but a new one plus the previous one.

I really don't share your opinion. Developing for Apple is much simpler than developing for any other platform. Forcing the move to 64-bit was painful if you had to maintain badly designed legacy software, but it makes things so much better for everyone in the long run that it had to be done. Codesigning is literally one command, and if you use Xcode, it will take care of it for you; if you don't use Xcode, the linker will do it for you in the background (you only need to deal with it if you intend to distribute). Finally, if your program is competently engineered (which is really not that hard to do), it will support both x86-64 and AArch64 as compilation targets.

I agree however that the requirement of being enrolled in the Developer Program sucks for open source; Apple really ought to offer free codesigning identities to verified individuals.

IMO you can't really sell people on thermals. "Look how much cooler the chassis is!", while valid praise in an enthusiast review that nerds like some of us pay attention to, does not seem like the kind of thing you'd put in a keynote.

That leaves battery life and processing performance. They'll both need to be awesome; otherwise either the rumors are decoys and the form factors are changing (which is unlikely), or Apple will be in trouble. And let's not forget that the laptops rumored to be updated are the ones that would normally get Tiger Lake, which itself is finally a good upgrade for the ultrabook class. Otherwise why not wait another 6 months?

I think you are missing the fact that the 4-5 watt dual-core iPhone CPU already has performance comparable to that of a 15W Tiger Lake. A slightly higher clocked quad-core A14 will have single-core performance on par with, or better than, AMD's newly released desktop CPUs, and multi-threaded performance close to that of the Intel i9 in the 16" model. If that is not something to woo enthusiasts, I don't know what will. And of course, let's not forget about the GPUs. The iPhone 12(!!) offers similar performance to Nvidia's MX350 (a 25 watt Pascal GPU), so you can expect entry-level Mac laptops with 8 GPU clusters to outperform an Nvidia MX450.

Personally, I desperately need more performance. Apple Silicon will make Macs the choice for everyone like me — people who need state of the art performance in a mobile package. Heck, I am contemplating switching my 16" i9 for a 13" with an A14X, because it will most likely run my R code much faster.
 

blindpcguy

macrumors 6502
Mar 4, 2016
422
93
Bald Knob Arkansas
Personally I just hope that they show how the OS integrates with the hardware. I figure signed apps will matter even more with iOS support coming. So I just wanna see the switching between macOS versions and how the new power-button-based boot picker works.
 

krell100

macrumors 6502
Jul 7, 2007
466
723
Melbourne, Australia
I wasn't really thinking about getting a laptop but if these new offerings are really strong then a low(er) end machine may be a nice addition and a first toe-dip in to the AppleChip ecosystem. Intrigued...
 

thenewperson

macrumors 6502a
Mar 27, 2011
992
912
Sounds to me like a rather uninformed article. They utterly fail to notice that Apple has been developing their own chips for years — and that these chips are significantly ahead of the curve in terms of energy required to deliver the same performance. They also seem to have no clue about what a software transition like this entails or how porting from x86 to ARM works. It's OK to be skeptical — it is an investor-oriented article, after all — but at least get your facts straight.
Is it just me, or did that article not tell me why Microsoft may benefit at all?
 

casperes1996

macrumors 604
Jan 26, 2014
7,599
5,770
Horsens, Denmark
Memory offsets, alignments etc. are identical between x86-64 and AArch64 — ARM developed the entire 64-bit instruction set with a lot of foresight (and I have read rumors that Apple apparently had a hand in it). Low-level compiler intrinsics sound like a problem, until you discover things like these. Overall, if your code is correct C/C++, it will run on AArch64. I mean, didn't Apple mention that it took a single person under a week to port Photoshop — the stereotype of legacy software — to Apple Silicon? In other news, a single person with a DTK, working in their spare time, made Zig compatible with Apple Silicon targets.

There will be roadblocks of course. If you have inline assembly, well, you are mostly out of luck. If you have hidden hardware assumptions (page size, hardware register granularity, using the CPUID trick to synchronize CPU barriers etc.), you are pretty much screwed. The biggest pitfall is multi-threaded programming, because x86 and AArch64 have different memory ordering guarantees. But even in the latter case, if your code is correct C/C++, it will work correctly on AArch64 — the only problem is that many people ship buggy code without realizing it, since the bugs won't show on x86. To be fair though, this will mostly affect low-level threading libraries and things like allocators — and popular stuff is already tested and packaged for ARM CPUs.

Thanks for that - I should've also clarified that when I mentioned things like pointer size, memory offset and alignment, those were more examples of things that could potentially differ between architectures - I know x86, I don't know AArch64 very well at all. Though I do know it has more consistent register naming than x86 :p - I've ranted a lot about Intel's register naming in x86. I think AMD's contribution to the naming made much more sense, except it was added onto the registers Intel had already named, so it became a weird mix of rax, rdi, and suddenly r9, r10, r11, etc. Anyway, that rant aside, I appreciate your detailed post there. Good read - Sounds good with Zig compatibility that fast - What tier was it? Since Zig works in compatibility tiers, and tier 4 is basically "it barely works", it could be anything from super duper impressive to "just" impressive - But for a compiler that seems great.
But yeah, I very much expect all my code, from C to Swift to OCaml and all, to basically just work - as long as my 3rd party dependencies can get upgraded :)
 

HiRez

macrumors 603
Jan 6, 2004
6,265
2,630
Western US
Bring back the 11" MacBook Air as a 12" by completely eliminating the bezels, I mean all the way edge to edge. With no fan, reduced weight, increased performance, and increased battery life. This would be a huge winner.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,678
Good read - Sounds good with Zig compatibility that fast - What tier was it? Since Zig works in compatibility tiers, and tier 4 is basically "it barely works", it could be anything from super duper impressive to "just" impressive - But for a compiler that seems great.

Not sure, they didn't say. I'd expect Tier 2, same as macOS on Intel. They mention some linker issues; I would guess it has to do with the new obligatory codesigning. But I'd expect LLD to be patched rather quickly. Overall, I'd be surprised if there is any major open-source software not running excellently on ARM Macs after February 2021.
 
  • Like
Reactions: casperes1996

Synna

macrumors member
Apr 17, 2020
94
23
So, there's not really conclusive evidence about the appearance of an ARM 16" in the event, right?

On the one hand, the "Boot Camp leak" is pointing towards an Intel machine, on the other hand, Mark Gurman mentioned that an ARM 16" might appear. Is there any chance, they introduce new 16" with both ARM and Intel?

I really need a new 16", but if they only introduce an Intel machine, I feel like this now really is "dead on arrival".
 

leman

macrumors Core
Oct 14, 2008
19,521
19,678
So, there's not really conclusive evidence about the appearance of an ARM 16" in the event, right?

I don't think there is any conclusive evidence about anything, really, except the fact that we will see some ARM Macs. Then again, the event is in 10 hours, you can surely wait this long with your purchasing decision :)
 
  • Like
Reactions: Synna

smoking monkey

macrumors 68020
Mar 5, 2008
2,363
1,508
I HUNGER
...the "Boot Camp leak" is pointing towards an Intel machine, on the other hand, Mark Gurman mentioned that an ARM 16" might appear. Is there any chance, they introduce new 16" with both ARM and Intel?

I really need a new 16", but if they only introduce an Intel machine, I feel like this now really is "dead on arrival".
Agree. A newly released Intel machine seems like a really strange step, and there is no way they will announce it tomorrow unless it's got some AS inside. It would be a silent refresh at best.

Rumors stated that 16s are in an earlier stage of production than the 13s, so I honestly believe we will see them before the end of January. Feb at the absolute latest.

There is no way I'm buying Intel. I feel sorry for the vast majority of people (not those who need Windows or particular software) who have picked up Intel 16s in the past 3 months, because their machine will be destroyed tomorrow in every possible way. And even if they don't release a 16, the 13 is gonna kick sand in its face and walk off with the 16's choc top nutty ice cream.
 

MyopicPaideia

macrumors 68020
Mar 19, 2011
2,155
980
Sweden
Yeah, this is a good point - Tons and tons of the most popular and most often used iOS/iPadOS apps/games that people will want to run will just not be available because developers have either decided they don't want those apps to be on a Mac and want them to stay mobile, or simply will just not click on the opt-in option in Xcode. Yes, developers have to actively opt in to allow an app on the Mac. Apple has made it an opt-in process, not an opt-out one.
Hehe, yeah - For a long time you couldn't even statically link the standard library on Linux! It's only just recently been introduced. That one was a bit of a shocker to me when I first ran into it. Validated that I could statically link my little cronx tool (executes lines from a crontab file at will) for use with Ubuntu, so finally that's sorted at least.

It really isn't. Developing for macOS is a much smoother experience than for example developing for Windows. Codesigning is a distribution thing not a development thing, and it's not really any different to websites getting TLS encryption (the little padlock in the browser) - I hope you don't think we should just get rid of that.
I don't know how much development work you've done, but for actively maintained codebases, none of what Apple has done has been that major a hurdle. macOS is still an excellent development platform with great toolchains, using Apple's Xcode IDE as well as traditional Unix development facilities and anything else one might want.
I've had a lot of university courses now that only officially condone the use of Linux and macOS, and often using Windows has been entirely impossible, while on macOS setting up the toolchain is as easy as installing a few packages through Fink, Homebrew, MacPorts, or whatever packaging solution you prefer, and you're ready to go. I've legitimately never felt hampered by code signing because, as mentioned, it's a distribution problem, not a development problem. Executables run from Terminal or by simply clicking "run anyway" can bypass the code signing block anyway, just like you can choose to visit websites with an untrusted certificate. And when you want to distribute, code sign with Apple's provided tool, done.
The problem here is that stuff like Homebrew is confirmed not there yet on ASi - so I would double-check all of these and get in contact with the development community for these tools and repositories to see what their roadmap looks like for getting them up and running on ASi. Many will need help to get stable releases out within a good timeline.

On Homebrew here is a good list of the formulae status, and the repository itself
https://github.com/Homebrew/brew/issues/7857

I don't disagree, which is why I don't think it will be A14 SoCs but something else. Something designed for interprocessor communication and cache coherency. (I'm not sure what you're saying about software optimization, unless you're talking about keeping the cores busy.)
Apple could have a killer "Pro" machine this way. I mean they could have one anyway, but there's a limit on what they can do with a single chip/package.
Still a hugely inefficient way of doing it. It's way more efficient, and much less limiting, to design one SoC than to try to get multiple lesser SoCs to work together efficiently. One SoC with double the high-power cores and the same number of efficiency cores is hugely more efficient than two SoCs with the same number of high-power cores and double the efficiency cores - you would be leaving so many transistors on the table for normal use.

There are of course specialized use cases where multi-SoC parallel computing could benefit, but those are niche and not in the realm of consumer or corporate use; they are more for research/scientific/academic applications.
 

HiRez

macrumors 603
Jan 6, 2004
6,265
2,630
Western US
Battery life has not been very impressive on MacBooks for years; using their own custom ARM design should yield massive battery life improvements, as well as lower heat and noise, or even the elimination of the fan. I think these will end up being much more significant than increased performance.

Not saying it'll happen tomorrow, but I won't be surprised if we start seeing 16+ hour battery life on some of the lower performance models (i.e. the Air).
 

torncanvas

macrumors regular
Feb 14, 2006
121
73
I didn't realize my and my colleagues' development experience was so outside the norm (I've been a game developer for 17 years). It has been a couple of years since our studio has compiled for Mac, but my colleagues have wasted a full week on codesigning technical rabbit holes. I'm active on a Discord for a multiplayer framework, and just 3-4 days ago someone spent hours trying to get the plugin to work. The problem? It wasn't codesigned properly and the devs had to issue a hotfix.

That’s just one issue mind you, every platform has its 100 things you need to fix to ship a sizable commercial product. The problem comes when a platform has 10% sales tops and the OEM of that platform keeps adding more hoops to jump through. Will you spend $10k to update your app if you’ll maybe get $10k in sales from it over the first couple months? It’s not an automatic yes when that $10k could be spent on a new feature or more polish.

Anyway my point was mainly Apple needs to hit a home run because fragmenting platform support even more is just another hoop for devs. And userbase is what makes that worth it.

As an end user, I would LOVE to see iPad Pros be able to support macOS and then for Apple’s Blender fork to be released. Blender said updating it to support macOS on ARM was otherwise going to be a pretty large undertaking.
 

johannnn

macrumors 68020
Nov 20, 2009
2,315
2,602
Sweden
  1. Face ID — This sadly doesn't seem possible with this update, even though we got iSight cameras in Apple laptops first during the last processor transition, with the MacBook Pro in 2006.
Do people actually want this? Even though we have a fantastic TouchID nowadays on the laptops?
 
  • Like
Reactions: AaronM5670

leman

macrumors Core
Oct 14, 2008
19,521
19,678
I didn’t realize my and my colleagues development experience was so outside the norm (I’ve been a game developer for 17 years). It has been a couple years since our studio has compiled for Mac, but my colleagues have wasted a full week on codesigning technical rabbit holes. I’m active on a Discord for a multiplayer framework and just 3-4 days ago someone spent hours trying to get the plugin to work. The problem? It wasn’t codesigned properly and the devs had to issue a hotfix.

Ah, OK, I can certainly imagine that once you get into plugins and all their interactions it can become tricky. I don't have any experience with that; I was thinking about regular software architecture. My bad.
Anyway my point was mainly Apple needs to hit a home run because fragmenting platform support even more is just another hoop for devs. And userbase is what makes that worth it.

I think we are seeing just the opposite. The rules become simpler, and the APIs for the platforms converge. The enforcement of the 64-bit architecture makes it much simpler to maintain your software for multiple targets (unless of course you have some very restrictive requirements). Which is mostly a moot point anyway; in three years the Intel Mac platform will be essentially dead. Not to say that there were no hurdles in the past, but the deprecation of 32-bit was not as big a disaster as one would have expected, was it?
 
  • Like
Reactions: Serban55

Serban55

Suspended
Oct 18, 2020
2,153
4,344
Prepare for some little leaks/rumors over the next 3-4 hours.
Again this year, Apple just wants to show side by side how the current 2020 Intel MacBook Air compares to their own AS MacBook Air... and the same thing with the MBP.
From next year, expect redesigned iMacs, some MacBook Pros, etc.
 

torncanvas

macrumors regular
Feb 14, 2006
121
73
I think you are missing the fact that the 4-5 watt dual-core iPhone CPU already has performance comparable to that of a 15W Tiger Lake. A slightly higher clocked quad-core A14 will have single-core performance on par with, or better than, AMD's newly released desktop CPUs, and multi-threaded performance close to that of the Intel i9 in the 16" model. If that is not something to woo enthusiasts, I don't know what will. And of course, let's not forget about the GPUs. The iPhone 12(!!) offers similar performance to Nvidia's MX350 (a 25 watt Pascal GPU), so you can expect entry-level Mac laptops with 8 GPU clusters to outperform an Nvidia MX450.

Personally, I desperately need more performance. Apple Silicon will make Macs the choice for everyone like me — people who need state of the art performance in a mobile package. Heck, I am contemplating switching my 16" i9 for a 13" with an A14X, because it will most likely run my R code much faster.

I'm not up to date on A14 benchmarks, so thank you! Hopefully it's only as simple as bumping clock speeds and adding a couple more cores, as you say!

For me an iPad with >10GB RAM supporting more 3D content creation apps - ideally through macOS - would make me a happy camper. ^_^
 

acidfast7_redux

Suspended
Nov 10, 2020
567
521
uk
Very excited.

My old rMBP just crapped out after having a full life.

late-2012 rMBP bought in Japan on holiday.

3100 battery cycles
4 plastic feet broken
T key missing
E key worn through
first 128 GiB SSD failed in 2014
i replaced with a 240 GiB OWC SSD in 2014
this drive failed last week
at least 30-40 stuck on pixels
dents from dropping on an escalator
dents from being run over by a van on my bicycle
scratches from being in my travel bag while cycling across Myanmar/Burma

great machine ... will look at the price points (in GBP) and likely buy the cheapest MBA (if at £800) or MBP (if at £1000) and run it for 6-8 years unless the SSD fails (which it shouldn't, as it's on the mainboard). tempted by the touchbar just for poops and giggles. if I can get that at £1000, i'll pull the trigger.

i really need a good front camera and that's about it.

typing this on a 24" 2008 iMac that just had the left arrow key break (man, these keyboards are expensive).
 

ascender

macrumors 603
Dec 8, 2005
5,021
2,897
No major leaks for this event, huh. Looks like it may be a mysterious event for once.
The Mac side of things seems far less susceptible to leaks, although part of that will be down to how much info Apple has to release to third parties - with iPhone and iPad they need to share device dimensions in advance of launch.

The leakers do seem to have much better sources for basically everything Apple do other than Macs! Although with the event being pre-recorded, that also increases the chances of a leak.

Before the last redesign, there were leaks of the top case showing the space for the TouchBar. The fact we've seen nothing at all could point to the rumours about no redesign being accurate.

Anyway, I'm pleased that for the first time in a long time we can go into an event knowing none of the details.
 
  • Like
Reactions: Jouls

yurc

macrumors 6502a
Aug 12, 2016
835
1,014
inside your DSDT
The only thing I'm concerned about with Apple Silicon is my existing working gear.

A bunch of Tangent panels, Wacom pen displays/tablets, audio interfaces (some of them still connected through FireWire), a capture card, etc.

I don't need any new pro gear since it all still works fine. Imagine Apple breaking everything with compatibility issues; it would be insane to be forced into buying newer tools just for the sake of maintaining compatibility with the latest Apple-chip-based Macs...
 

tuc

macrumors 6502
Aug 25, 2003
333
67
Still a hugely inefficient way of doing it. Way more efficient to design an SoC and much less limited doing it this way than trying to get multiple lesser SoC's to efficiently work together.
That's what I meant by "probably not A14". I meant specially-designed chips/SoCs/packages for this purpose. (And I hinted that it's possible that Apple had already designed the A14 for this use case and not told anyone yet. But, for reasons already pointed out, that seems extremely unlikely.)
 