
beginnersview

macrumors member
Feb 21, 2006
85
21
SF Bay area
Apple needs to call it something that Joe Consumer can understand.

Joe Consumer isn't going to understand that Core i7 is Intel and A__ is Apple Silicon, so Apple can't differentiate the machines just by citing CPU models.

When they transitioned from PowerPC to Intel, the CPU architecture was prominently mentioned in the product name (PowerBook G4 vs. Intel MacBook).

I would expect that the Apple Silicon moniker (or another proprietary Apple trademarkable name) will be used to denote the new Mac hardware running their in-house silicon.
Since the MacBook Air and MacBook Pro are already listed on third-party seller sites as "Apple MacBook nnn (whatever)," maybe they'll use the Apple corporate name for the chip itself.

Thus we had PowerBook (PowerPC), Intel MacBook, and now the Apple chip ..... "The Apple AppleBook -- In two flavors, the Apple AppleBook Air and the Apple AppleBook Pro."

(Remember you heard/saw it here first!)
 

MallardDuck

macrumors 68000
Jul 21, 2014
1,677
3,222
The author of that opinion clearly didn't pay any attention to the WWDC keynote, based on his comments about transitioning the software from x86 to ARM. He also apparently assumes that Apple just woke up one morning in February and said "let's switch to our own processors!", as he clearly doesn't know that not only has Apple been working on this for several years now, but that the bulk of Apple software (along with Microsoft Office and at least some Adobe apps) has already been recompiled for the new processors. I can attest to the fact that converting an app from x86 to ARM using Xcode is a relatively painless experience, not "anything but straightforward" as the author claims, although there are always some glitches in the matrix from time to time. I was not in any sort of development role or capacity during the PPC-Intel transition, so I cannot use that experience as a guideline for what to expect now.

From Apple's perspective, it would make sense to launch AS with the MacBook product lines, as their big obstacle on the portable side has been Intel for several years now. Assuming that some of the recent benchmarks popping up on Geekbench and elsewhere are legitimate, Apple may have figuratively punched Intel in the face with these first-gen notebooks, which bodes well for their professional machines (e.g., Mac Pro, iMac Pro) going forward. Unlike Intel, HP, etc., the bulk of Apple's revenues comes from consumer products and services, so if they can get consumers on board up front, it becomes an easier sell down the line to businesses and creative professionals who will have already been exposed to the new machines prior to the launch of the business-class hardware.

If there's a market to do it. There's a large number of software packages that were never ported to OS X to begin with, but with virtualization and Boot Camp could be run on a Mac. None of those are going to be ported to macOS/ARM. That's the software they're talking about... things like AutoCAD, SolidWorks, and so on.

As for the pain, if you only use a modern high-level language, it's relatively painless. If you've optimized the application by writing lower-level code (e.g., the apps above, and games), it's a ton more work.
 

Realityck

macrumors G4
Nov 9, 2015
11,421
17,212
Silicon Valley, CA
Just my $0.02, but I think it could be the native Apple and third-party software that steals the show!

Which flagship iOS apps would help Apple best showcase the new Apple Silicon?

I believe Apple's primary (initial) markets are Games & Remote Education, so very likely apps in those two categories.
9to5Mac had this article today:
iOS apps will run on Apple Silicon Macs, but major developers have already opted out of the Mac App Store
9to5Mac found out that some major iOS app developers have already chosen not to offer their apps on the Mac App Store to Apple Silicon Mac owners — at least for now. We were able to check this information through the App Store system that revealed to us which iOS apps will not work on the new Macs with Apple Silicon chip.

If you were expecting to watch YouTube on your new Mac with a native app, you’re out of luck. Google has chosen not to offer most of its apps on the Apple Silicon platform, and this also includes Google Maps, Google Drive, and Gmail apps. On the other hand, the Netflix app for iOS is still on track to be available on new Macs, as well as the HBO Max app.

Facebook has also chosen not to include its iOS apps on the Mac App Store, such as Instagram, WhatsApp, Messenger, and the Facebook app itself. There are some other noteworthy apps that will be missing from the Mac App Store, including Snapchat, Amazon Prime Video, and Disney+.

We also checked some popular games like Candy Crush, Among Us, and Real Racing 3, but they won’t be available on the App Store for Apple Silicon Macs either. However, Sky: Children of the Light, Subway Surfers, and Temple Run might be.

Users will find the following message on the App Store for compatible iOS apps: “Designed for iPad. Not verified for macOS.” If the app is not compatible, the App Store will tell the user to download it on an iPhone or iPad.
How about that "Designed for iPad. Not verified for macOS"?
 

Yebubbleman

macrumors 603
May 20, 2010
6,024
2,617
Los Angeles, CA
Alright, as I post this, it's less than 48 hours to go until the November Apple event for the debut of Apple silicon Macs. Woohoo! So what's rumored right now are...
  • Two 13-inch Apple silicon MacBook models — most likely MacBook Air & MacBook Pro
  • 16-inch Apple silicon MacBook model — tentative "maybe" 🤷‍♂️
...All in the existing design we're all familiar with. That's it. Just swapping out the processor can't be the whole story — or can it? I noticed Apple doesn't capitalize silicon, so "Apple Silicon" isn't going to be the brand name, thank god. Yet for some reason the naming of these "Mac family of processors" as A14 or A14X seems off, like they should be called an X1 or Z1. That would sound more badass, in my opinion.

While I'm excited for the switch, I think Apple may include some other tweaks/creature comforts. Here's a list:
  1. Improved battery life — Seems like a given, supposedly offsetting the Intel processors' power consumption. How much of an improvement can only be guesstimated; I'd like to see 2x what we have now. I can't see Apple reducing battery size as a way of reducing the overall weight of these devices — but who knows?
  2. 802.11ax (WiFi 6) — Seems like a given since it's already on the newest iPhones & iPads.
  3. Thunderbolt 4/USB 4 ports — Seems likely, yes? Since Thunderbolt 4 is royalty-free. I mean, how the hell else was Craig powering that Apple Pro Display XDR in the WWDC lab video?
  4. Brighter displays — Doesn't look like we'll get any real change in display quality yet (i.e., higher resolution, ProMotion, or mini-LED). So we'll get a screen that can go up to 600 to 700 nits? Might help with the iPhone HDR video situation?
  5. Improved FaceTime HD camera — how about something in the 1080p range? I mean 720p is so 2010...
  6. Face ID — This sadly doesn't seem possible with this update, even though the last processor transition brought the first iSight cameras to Apple laptops, with the MacBook Pro in 2006.
What surprises might we see?
  • Like, for example, will they keep the Touch Bar in the MacBook Pro? I'd just as soon jettison it — but that's more of a personal thing than a likelihood at this point.
  • 5G integration — I see this possibility as very remote.
  • Elimination of the 2-Thunderbolt-port model of the 13" MBP, leaving only the 4-port models.
  • 16" MacBook Pro preview — coming in late December or January?
Your thoughts?

Did I mention I'm excited?
Same designs for the incoming Apple Silicon Macs. In the PowerPC to Intel transition, you had the 12" PowerBook G4 and both iBook G4s merge into the MacBook, which was an all-new design, but all other first-gen Intel Macs retained the same or 90% similar designs to their PowerPC equivalents. The same strategy will be employed here. New designs aren't as important as showcasing the difference under the hood. Apple will want to focus on that for most of the new Macs (save for iMacs, which likely couldn't change design until this transition), but do expect second and third rev Apple Silicon Macs to start changing up designs.

I strongly doubt we'll see the 16" MacBook Pro make the jump to Apple Silicon tomorrow or in 2020, especially given that there are Boot Camp notes from within Apple specifically mentioning a 2020 16" MacBook Pro, meaning it's still running Intel (even a native dual-boot with Windows 10 for ARM64 likely wouldn't use "Boot Camp", but rather some other concocted mechanism entirely). I do believe we will see said 16" MacBook Pro get released tomorrow, likely as a ceremonious "here's the final Intel Mac". There's also the fact that Tim Cook said that there were multiple new Intel Macs still in the pipeline (and we've only had one of them released since the transition was announced). There may be another Intel Mac Pro in the pipeline (though I don't know if there are replacement components to warrant such a change).

Otherwise, I'd think that we're seeing the Air (still at 13", in the same exact form factor) rocking an SoC of similar class to that of the iPad Pro, using the A14 microarchitecture, and then the 13" Pro using a higher-class variant of the A14 (figure more cores for the CPU and GPU portions respectively). Despite the same enclosures, I'd say that Face ID is extremely likely, as are better webcams (Apple very likely wanted to wait until this transition to start beefing up the cameras). I think Apple also wanted to wait until the transition to move the Mac to WiFi 6 (which is sort of unfortunate, considering that the Macs that could run Catalina but not Big Sur were cut for not having updated WiFi drivers), though the Intel 16" MacBook Pro update could be a notable exception here.

Thunderbolt 4 is not royalty-free; USB4 (which is basically Thunderbolt 3) is. I'm unsure of what Apple will use here. The current Intel 4-port 13" MacBook Pro is the only Mac where the Thunderbolt 3 controller is integrated into the CPU; every other Mac with Thunderbolt has a discrete Intel Thunderbolt controller chip on its main logic board. Apple will need to keep putting these chips on its Apple Silicon Mac main logic boards for those Macs to have actual Thunderbolt 3 or 4 support. USB4 solves this, but Apple has already said that there would continue to be Thunderbolt support on Apple Silicon Macs (whether 3 or 4 could depend on many factors), so expect Intel Thunderbolt controllers to remain on Apple Silicon Mac main logic boards.

Otherwise, I do believe that the 2-port 13" MacBook Pro will disappear. This Mac was the continuation of the 2010-2017 MacBook Air, whereas the 2018-2020 Air was the continuation of the 12" MacBook. The latter was powered by crappy, slow, modern-era Atom-class Y-series processors which never really had enough power to do anything without heating excessively. Apple will be able to put a very decent SoC into the chassis of the 2020 Air to get it to outperform both the 2020 10th Gen Y-series processors and the 8th Gen lower-powered U-series chips of the 2-port 13" MacBook Pros, thereby returning the MacBook Air to the former glory (and capability) of the 2010-2017 era Airs and removing the need for two different Macs on the low end. What Apple does to the 4-port 13" MacBook Pro is more of a mystery, considering that Apple needs to make it more performant than the 13" Air for it to have a reason to exist in its current form. I think it's more likely that the Air will gain the Touch Bar than that the 13" Pro will drop it; Apple seems to want to keep building on that feature rather than heeding the mass indifference people have shown it.

One final big question mark for me is how they'll handle the outgoing Intel models. In the PowerPC to Intel transition, the first two Intel Mac introductions didn't trigger the immediate discontinuation of their PowerPC equivalents; those stuck around for a month or two longer for buyers who NEEDED to elect for the outgoing architecture. Later PowerPC to Intel jumps (beginning with the Intel Mac mini) didn't have that kind of overlap. Also, one could argue that the iMacs and then-15" MacBook Pros were a critical market segment that might require a more gentle overlap, whereas the target audience for both the Air and the 13" Pro will likely not be as inconvenienced by having to use Rosetta 2 on things that aren't Apple's apps, Microsoft 365 apps, or Adobe apps. But we'll certainly see soon enough.

Personally, my Mac usage has scaled down enough to the point where I'm hoping that a revitalized Air (not sucking like the 2018-2020 Air) is all that I need out of a Mac in the Apple Silicon era. The rest of my computing needs are handled by Windows 10 these days.
 

casperes1996

macrumors 604
Jan 26, 2014
7,599
5,770
Horsens, Denmark
The author of that opinion clearly didn't pay any attention to the WWDC keynote, based on his comments about transitioning the software from x86 to ARM. He also apparently assumes that Apple just woke up one morning in February and said "let's switch to our own processors!", as he clearly doesn't know that not only has Apple been working on this for several years now, but that the bulk of Apple software (along with Microsoft Office and at least some Adobe apps) has already been recompiled for the new processors. I can attest to the fact that converting an app from x86 to ARM using Xcode is a relatively painless experience, not "anything but straightforward" as the author claims, although there are always some glitches in the matrix from time to time. I was not in any sort of development role or capacity during the PPC-Intel transition, so I cannot use that experience as a guideline for what to expect now.

This is mostly true, but it very much depends on how you've written your software. If your software is written in a good style of Swift with the use of Apple's frameworks and libraries, well, you just hit the magic build button and you're practically done.
But the story is vastly different if your critical software is written in K&R style C with manual pointer arithmetic, making assumptions about memory offsets and alignments, and using compiler intrinsics for AVX instructions, which btw are not translated by Rosetta.
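As a side note, a process can ask at runtime whether it's being translated, e.g. to steer around code paths that assume AVX. A minimal Swift sketch, using the sysctl.proc_translated flag Apple documents for this purpose:

Code:
import Darwin

// Returns true when the current process is running translated under Rosetta 2.
// On a native process (or Intel hardware) the sysctl is absent or 0.
func isTranslated() -> Bool {
    var flag: Int32 = 0
    var size = MemoryLayout<Int32>.size
    let ok = sysctlbyname("sysctl.proc_translated", &flag, &size, nil, 0) == 0
    return ok && flag == 1
}

print(isTranslated() ? "translated (Rosetta 2)" : "native")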
Though I will caveat this by saying that if you code C in a good style, using constructs like sizeof instead of assuming the memory size of types and such, you're not in very much trouble there either - so it is only potentially painful in an extremely small set of situations. Just want to point out it isn't always just hitting build :)
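To make the sizeof point concrete, a trivial sketch (the record type is made up; Swift's MemoryLayout plays the role of C's sizeof):

Code:
// Portable: derive sizes from the type rather than hard-coding byte counts.
struct Record {
    var id: UInt32
    var value: Double
}

// stride includes alignment padding, which a hard-coded "4 + 8" would miss.
let recordSize = MemoryLayout<Record>.stride
let bufferBytes = recordSize * 16  // correct on any architecture
print(recordSize, bufferBytes)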
Currently making a compiler for a university course - amidst converting to LLVM-IR. And the existence of LLVM will also simplify the transition, since it means all compiler front ends are basically ready as long as LLVM itself is, so that's great. Thanks for LLVM and Swift, Chris Lattner. And thanks for heading the ship, Tanya Lattner.

I don't disagree, which is why I don't think it will be A14 SoCs but something else. Something designed for interprocessor communication and cache coherency. (I'm not sure what you're saying about software optimization, unless you're talking about keeping the cores busy.)
Writing code for multiple-processor systems, if you want optimal performance - or consistent performance - can be a bit different from writing for monolithic structures. A little bit of this can even be seen with chiplet designs like Zen, or big.LITTLE-type structures where both efficiency and performance cores can run threads simultaneously, but in those instances a lot of the burden is often carried by the OS' thread scheduler. - The scheduler can also help with multi-processor scenarios, but not as much when it's a single process wanting to utilise the performance of the whole system rather than several distinct processes with separate memory spaces.

This concept is called "NUMA" - Non-Uniform Memory Access.

Unless the scheduler basically only uses a single chip for most operations anyway, you might see performance regressions for "everyday" operations even though highly intensive threaded tasks could still see performance increases.
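For what it's worth, the main lever an app has on Apple platforms today is a QoS hint; the scheduler makes the final placement call. A minimal sketch (the queue labels and workloads are placeholders):

Code:
import Dispatch

let group = DispatchGroup()
// Background QoS work is eligible for efficiency cores;
// user-interactive QoS is steered toward performance cores.
let housekeeping = DispatchQueue(label: "example.indexer", qos: .background)
let critical = DispatchQueue(label: "example.render", qos: .userInteractive)

housekeeping.async(group: group) { print("index rebuild (E-core friendly)") }
critical.async(group: group) { print("frame prep (P-core priority)") }
group.wait()  // keep this sketch alive until both placeholder tasks finish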

I don't find it unlikely that Apple will deploy a multi-chip design as well as a chiplet architecture at some stage, but I doubt we'll really see multiple distinct coordinated CPUs akin to some Xeon or EPYC setups. Apple had such options on the old Mac Pro and does not with the current one, even though Intel offers the option. - Scaling a single NUMA node, even though it can potentially have enough divergence internally to almost be considered separate NUMA nodes in a chiplet sense, seems to win out.

But distinct chips like Afterburner, a GPU die, etc., either as their own packages or on the same package as the main die, are very much something I see in the cards.
 

deevey

macrumors 65816
Dec 4, 2004
1,417
1,494
If the Mac Mini DTK was a finished product, you might have a point. However, all indications are that the Mini was just used because the chassis was large enough to fit in the frankensteined parts that went into the DTK. If Apple does release a Mac Mini running AS, it could be half the height of the current Mini and keep the same depth and width (which would also allow the new Minis to fit into the same server racks the current models can use).

They could make it 1/4 the size (Apple TV is actively cooled) so you could fit 4x as many in a slightly modified server rack.

That, to me, would be impressive.
 

casperes1996

macrumors 604
Jan 26, 2014
7,599
5,770
Horsens, Denmark
As for the pain, if you only use a modern high-level language, it's relatively painless. If you've optimized the application by writing lower-level code (e.g., the apps above, and games), it's a ton more work.

Games generally are produced at a fairly high level within an engine IDE that deals with a lot of the lower-level things in a platform-independent way. Well, assuming a platform-independent engine, of course.
Really major game titles will have optimisation tweaks specific to architecture and platform done to the engine for that game specifically, but most of that work doesn't have so much to do with CPU architecture as with optimising for various GPUs, and most of the time that's targeting the Windows GPU driver and interaction model.

The main macOS games provider, Feral, have stated that the majority of the changes they need to make to get games running on macOS are not so much about CPU architecture or even rendering API (Metal v Vulkan v DX), but about other OS system calls and dependencies; though I cannot remember when or where this statement was made :/
 

Mr. TBTSmith

macrumors newbie
Jan 9, 2019
3
2
California
There were 2m cables with Thunderbolt 3. For example

What TBv4 brings is "Universal 2m" cables. That means using 'passive' cables that should also work with USB 3.1, USB 4, and TBv4. They may get cheaper, but not longer.
Universal cables aren't necessarily passive. Passive cables can already handle USB, Thunderbolt, and DisplayPort signaling. Active Thunderbolt 3 cables could only handle Thunderbolt signaling; active Thunderbolt 4 cables can handle DisplayPort or USB signaling in addition. Active cabling is required to go beyond about 0.8m at Thunderbolt 3 speeds. Thunderbolt 4 cables aren't likely to be cheaper than Thunderbolt 3 cables.
 

smoking monkey

macrumors 68020
Mar 5, 2008
2,363
1,508
I HUNGER
Good information, thanks, and that makes sense about the bezels. I'm typing this on an original 15" retina.

Thanks, nice to know who the forum bully is up front.
I thought the Trek reference would have been good enough to show it was just mucking around and having a laugh. Hey, I've got plenty of blindspots as well!

If you're on the original retina, that means 2012, a year older than my kit, meaning we are probably both in the market for a new machine and as such hoping a 16" gets announced.
------------

Not sure this has been mentioned, but the "one more thing" refers to this AS event and the release of these computers... yeah? That's how everybody is seeing it? Or am I totally off on this?
 

Krevnik

macrumors 601
Sep 8, 2003
4,101
1,312
This is mostly true, but it very much depends on how you've written your software. If your software is written in a good style of Swift with the use of Apple's frameworks and libraries, well, you just hit the magic build button and you're practically done.

Swift has done a lot of heavy lifting in keeping architecture details from leaking into the language. About the only real way to be in trouble there is if you:

A) Use UnsafePointer types for purposes other than AppKit/UIKit interop. Even then, at least you won't have endianness issues like with the PPC -> Intel switch, so even this could be fine in many cases as a way to unpack buffers in memory (see the sketch after this list).
B) Use non-Apple C library interop (or a dependency of yours does), and that library is what doesn't compile for ARM.
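As a sketch of the kind of buffer unpacking meant in (A), with a hypothetical wire format: copying bytes out and being explicit about byte order sidesteps both alignment and endianness concerns.

Code:
import Foundation

// Hypothetical wire format: a 4-byte little-endian length prefix.
func readLengthPrefix(from data: Data) -> UInt32? {
    guard data.count >= MemoryLayout<UInt32>.size else { return nil }
    var raw: UInt32 = 0
    // Copy the bytes rather than casting pointers, so alignment never matters.
    withUnsafeMutableBytes(of: &raw) { dest in
        dest.copyBytes(from: data.prefix(MemoryLayout<UInt32>.size))
    }
    // Explicit byte order: a no-op on Intel and Apple Silicon (both
    // little-endian), but it documents intent and survives future ports.
    return UInt32(littleEndian: raw)
}

let packet = Data([0x2A, 0x00, 0x00, 0x00, 0xFF])
print(readLengthPrefix(from: packet) ?? 0)  // 42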

But the story is vastly different if your critical software is written in K&R style C with manual pointer arithmetic, making assumptions about memory offsets and alignments, and using compiler intrinsics for AVX instructions, which btw are not translated by Rosetta.
Though I will caveat this by saying that if you code C in a good style, using constructs like sizeof instead of assuming the memory size of types and such, you're not in very much trouble there either - so it is only potentially painful in an extremely small set of situations. Just want to point out it isn't always just hitting build :)

I worked on a project that has gone through both the PPC -> Intel and the 32 -> 64 bit transitions.

A lot of the sorts of things you mention were already fixed as part of those transitions in legacy apps like the one I worked on. That said, there are still definitely some cases where low-level code exists for one reason or another that will need to be addressed, especially in legacy apps, or cases where developers have started making assumptions again in new code. But if your code already straddles iOS, again, a chunk of that work should be done already. For example: Office, Affinity Photo, maybe Photoshop?

TBH, compilers and similar low-level tooling are the most frequent places I've hit these sorts of issues; ironically, the Swift compiler itself is a good example. The developers of the Swift compiler tend to include assumptions of 64-bit host platforms, which has been a thorn in the side of folks using the Swift compiler on the Raspberry Pi. But now that there's a 64-bit Raspbian, and I've validated my project still works on it with the hardware register access I need for PWM control, I've bailed on 32-bit and use an AArch64 Swift compiler.
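That 64-bit host assumption is easy to demonstrate; a trivial sketch:

Code:
// Int is pointer-sized in Swift: 8 bytes on AArch64, 4 on 32-bit Raspbian.
// Code that serializes an Int assuming 8 bytes breaks on a 32-bit host.
print(MemoryLayout<Int>.size)    // 8 on 64-bit, 4 on 32-bit
print(MemoryLayout<Int32>.size)  // always 4: prefer fixed-width types in formats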
 

torncanvas

macrumors regular
Feb 14, 2006
121
73
Apple already sucker-punched devs with the whole codesigning debacle. Which followed the hard cutoff forcing apps to 64-bit. Which followed a laundry list of other requirements that'd make this post too long... Developing for macOS is now a big pain in the butt for small-to-medium devs. And now they'll be asking developers to support not just a new architecture, but a new one plus the previous one.

That is a huge ask, and obviously they know it. The only way they'll make that work is if people flock to them en masse, and the reason people would do that is that the new machines are obviously superior, along more than one axis.

The weird thing is that the everyday person will do just fine with an iPad level of performance, so why not push them to an iPad? I guess that's where the Air comes in: it's for people with minimal 3rd party software needs who want the classic clamshell form factor.

But why release new MacBook Pros? That's the interesting part. If they are indeed doing that, either they'd need a huge jump in software support since WWDC, or the improvement needs to be simply amazing.

IMO you can't really sell people on thermals. "Look how much cooler the chassis is!", while valid praise in an enthusiast review that nerds like some of us pay attention to, does not seem like the kind of thing you'd put in a keynote.

That leaves battery life and processing performance. They'll both need to be awesome, or the rumors will need to be decoys and the form factors are changing (which is unlikely); otherwise Apple will be in trouble. AND let's not forget the laptops rumored to be updated are ones that would normally get Tiger Lake, which itself is finally a good upgrade for the ultrabook class. Otherwise why not wait another 6 months?

For a company of their size, this is a pretty risky move unless they really have nailed performance-per-watt scaled past iPads, so I'm hopeful.

I also find it interesting that the notable device-fingerprinting and tracking apps are the ones holding back on updating their apps for macOS. It's definitely easier to do that kind of thing on mobile or in a web browser. But hey, maybe big companies don't want to deal with codesigning, either. :p
 

UltimateSyn

macrumors 601
Mar 3, 2008
4,969
9,205
Massachusetts
Since the MacBook Air and MacBook Pro are already listed on third-party seller sites as "Apple MacBook nnn (whatever)," maybe they'll use the Apple corporate name for the chip itself.

Thus we had PowerBook (PowerPC), Intel MacBook, and now the Apple chip ..... "The Apple AppleBook -- In two flavors, the Apple AppleBook Air and the Apple AppleBook Pro."

(Remember you heard/saw it here first!)
I think they should really just nail it home. Now introducing the Apple AppleAppleAppleBook Pro Apple Book Book AppleBook
 

casperes1996

macrumors 604
Jan 26, 2014
7,599
5,770
Horsens, Denmark
TBH, compilers and similar low-level tooling are the most frequent places I've hit these sorts of issues; ironically, the Swift compiler itself is a good example. The developers of the Swift compiler tend to include assumptions of 64-bit host platforms, which has been a thorn in the side of folks using the Swift compiler on the Raspberry Pi. But now that there's a 64-bit Raspbian, and I've validated my project still works on it with the hardware register access I need for PWM control, I've bailed on 32-bit and use an AArch64 Swift compiler.

Hehe, yeah - For a long time you couldn't even statically link the standard library on Linux!!! It's only just recently been introduced. That one was a bit of a shocker to me when I first ran into it. Validated I could statically link my little cronx tool (executes lines from a crontab file at will) for use with Ubuntu, so finally that's sorted at least.

Apple already sucker-punched devs with the whole codesigning debacle. Which followed the hard cutoff forcing apps to 64-bit. Which followed a laundry list of other requirements that'd make this post too long... Developing for macOS is now a big pain in the butt for small-to-medium devs. And now they'll be asking developers to support not just a new architecture, but a new one plus the previous one.

It really isn't. Developing for macOS is a much smoother experience than, for example, developing for Windows. Codesigning is a distribution thing, not a development thing, and it's not really any different to websites getting TLS encryption (the little padlock in the browser) - I hope you don't think we should just get rid of that.
I don't know how much development work you've done, but for actively maintained codebases none of what Apple has done has been that major a hurdle. macOS is still an excellent development platform with great toolchains, both using Apple's Xcode IDE and traditional Unix development facilities and anything else one might want.
I've had a lot of university courses now that only officially condone the use of Linux and macOS, and often using Windows has been entirely impossible, while on macOS setting up the toolchain is as easy as installing a few packages through Fink, Homebrew, MacPorts, or whatever packaging solution you prefer, and you're ready to go. I've legitimately never felt hampered by code signing because, as mentioned, it's a distribution problem, not a development problem. Executables run from Terminal or by simply clicking "run anyway" can bypass the code signing block anyway, just like you can choose to visit websites with an untrusted certificate. And when you want to distribute, code sign with Apple's provided tool, done.
 

theorist9

macrumors 68040
May 28, 2015
3,880
3,060
Neither Intel nor AMD stays within their TDP under load, and hasn't for years. Just getting an SoC for the MBP that actually stayed under 45W would be a huge win. My 16" loves to spike over 90W just for the CPU package, and will frequently sit above 60W under load.

I believe you may have misunderstood my post, perhaps because you didn't catch the context. I wasn't referring to the TDP rating of the processor, or the extent to which Apple would stay within it. Instead, I was referring to the maximum thermal capacities of the systems (laptops, in this case); TDP can also be used in that way.

The poster to whom I was responding opined that, to get maximum performance, Apple would push the maximum thermal envelope (TDP) of its AS laptops, as it now does with its Intel laptops (as all of us are already aware). I.e., the point you raised, while of course valid, was actually the implicit *starting point* of our discussion.

Anyways, he thought the thermal output Apple would allow for its AS CPU/GPU combo would be similar to the thermal output Apple now allows for its Intel or Intel/AMD combos. My response was that Apple would not push it that far if, in so doing, it was moving into the regime of significantly diminishing returns with its CPU/GPU performance.
 

omenatarhuri

macrumors 6502a
Feb 9, 2010
992
1,019
Developing for macOS is now a big pain in the butt for small-to-medium devs. And now they'll be asking developers to support not just a new architecture, but a new one plus the previous one.
In recent years there have been fairly few macOS developers, while there are throngs of iOS developers.

I think what they're betting on is that they can get more apps and developers on macOS by allowing iOS apps to run on the Mac. And I bet they're right.

To your point though, 99% of developers don't even need to care what kind of processor the compiler is compiling the code for. Typically one wouldn't be working so close to the processor that they'd need to worry about that. Looks like in the future Xcode will spit out a binary that includes support for both CPUs, so no biggie.
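The resulting universal ("fat") binary carries an Intel slice and an Apple Silicon slice built from the same source. For the rare code that does care, Swift can branch on architecture at compile time; a trivial sketch:

Code:
// Both slices are built from this same file; only one branch
// is compiled into each slice of the universal binary.
#if arch(arm64)
print("Running the Apple Silicon slice")
#elseif arch(x86_64)
print("Running the Intel slice (possibly under Rosetta 2)")
#endif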
 

trip1ex

macrumors 68040
Jan 10, 2008
3,232
1,900
I could use an official dock for the iPhone that lets you use a big screen and mouse/keyboard.
 

smoking monkey

macrumors 68020
Mar 5, 2008
2,363
1,508
I HUNGER
The new MacBook Air: double the battery life.

Dunno what it is now, but I bet battery life will be front and center. And it will be massive.
 

Krevnik

macrumors 601
Sep 8, 2003
4,101
1,312
I believe you may have misunderstood my post, perhaps because you didn't catch the context. I wasn't referring to the TDP rating of the processor, or the extent to which Apple would stay within it. Instead, I was referring to the maximum thermal capacities of the systems (laptops, in this case); TDP can also be used in that way.

The poster to whom I was responding opined that, to get maximum performance, Apple would push the maximum thermal envelope (TDP) of its AS laptops, as it now does with its Intel laptops (as all of us are already aware). I.e., the point you raised, while of course valid, was actually the implicit *starting point* of our discussion.

I actually agree with you here. Part of the point I was failing to make (and not really directed at you) was that Intel's power consumption was already so excessive, you don't want to be pushing that envelope.
 

Jorbanead

macrumors 65816
Aug 31, 2018
1,209
1,438
Apple needs to call it something that Joe Consumer can understand.

Joe Consumer isn't going to understand that Core i7 is Intel and A__ is Apple Silicon, so Apple can't differentiate the machines just by citing CPU models.

When they transitioned from PowerPC to Intel, the CPU architecture was prominently mentioned in the product name (PowerBook G4 vs. Intel MacBook).

I would expect that the Apple Silicon moniker (or another proprietary Apple trademarkable name) will be used to denote the new Mac hardware running their in-house silicon.

They can still use the term Apple silicon at the event to describe the processors, but it won’t be the actual name of them, and will fade away like it has with iPhone and iPad too. They used to use that term with those products, but now everyone knows it’s Apple silicon inside so they don’t use it anymore.
Nobody is really mentioning the Mac mini. Seems easy for this to be part of the initial release given they've already released one for testing purposes... It's never going to be the headline item given its somewhat niche status, but it should be easy to get an A14X or equivalent in there and launch on day one.

Given working-from-home life, I'd be interested in one if there are no other updates to MBP beyond a processor swap.

There have just been no rumors about a mini, which is why nobody is talking about it. I'm expecting a mini sometime next year (which is why they did an "update" this last spring to tide consumers over).
 

Jacobi

macrumors regular
Aug 8, 2012
116
520
I'll be quite happy if they release the Air in the same design but with a 1080p camera and without the heat / fan noise issues of the current one.
 

Alex 88

macrumors member
Jul 31, 2013
39
21
Mac Mini? They're already in the wild as dev kits... I haven't heard any speculation about it being announced today. I think it will be part of the lineup.
 