
Bodhitree

macrumors 68020
Apr 5, 2021
2,086
2,217
Netherlands
I think the most interesting question is whether Apple can sustain the pace of improvement they have held onto with the A series chips over the years, and bring that uplift to the M series chips for the Macintosh. I doubt whether they will go for yearly cycles of M chips, more likely every two years.

If they can, then Apple will end up outpacing everyone else. Tiger Lake and Alder Lake are good upgrades, but a single year of the A series' 40% performance gains will catch up with that and more. It all depends on how big an effort the M series chips have been for Apple's silicon design team; if it absorbed all the best minds for a while, we can expect the A series to be more staid in its advances.

But even if Apple from this point forward manages only 20% yearly performance gains, that will still outpace the rest of the PC industry, which tends to manage more like 10%-15% yearly.
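The catch-up claim is just compound growth. A quick back-of-envelope sketch (the 40%, 20%, and 12.5% yearly rates are the rough figures tossed around in this thread, not measured data):

```python
# Back-of-envelope: total speedup after N years of a fixed yearly gain.
def compound(rate: float, years: int) -> float:
    """Cumulative speedup factor after `years` of a yearly gain `rate`."""
    return (1 + rate) ** years

# A-series-style vs. typical PC-industry improvement rates (assumed).
print(f"40%/yr over 5 yrs:   {compound(0.40, 5):.2f}x")   # ~5.38x
print(f"20%/yr over 5 yrs:   {compound(0.20, 5):.2f}x")   # ~2.49x
print(f"12.5%/yr over 5 yrs: {compound(0.125, 5):.2f}x")  # ~1.80x
```

Even the pessimistic 20%/yr case pulls well ahead of a typical 12.5%/yr industry cadence within five years, which is the whole argument.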
 

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
I counter with FORTRAN IV.
But seriously, science is also an area where neolithic code and hardware can be found, because it does the job. I would guess that's the case in many if not most businesses, where computers and software are just another tool to get the day's work done.
30% improvements in IPC may be huge news for CPU architecture nerds, but it just doesn't matter in a lot of settings. Legacy code and industry inertia are a huge strength for x86.
Very well said!
 

cocoua

macrumors 65816
May 19, 2014
1,011
628
madrid, spain
A lot of people buy what they use at work, at least in my experience.

Maybe in 20 or 30 years. Normal non-computer businesses move extremely slowly -- the ROI of upgrading to something new just isn't there. Heck, we actually have a PC in the plant that runs DOS. It's a fiber length testing machine, and it never gets updated worldwide because there is no advantage to updating it. There's a lot of software like that!

I'd absolutely LOVE to upgrade all our software to something modern, and I really wouldn't care what hardware it ran on, but it isn't going to happen, there is just no argument for it. And my corporate overlords only add new apps on their side rather than modernizing old ones. You'd laugh if I told you what their main application is, so i wont, but I will say it's all custom code written a very long time ago. It does the job needed.

I like my M1 for the most part, and the next version is going to be better, but there is no way that's going to change things around here. Power is cheap, so chip efficiency means nothing; "does it get the job done without being too annoying" is a low bar for Intel to clear. I'll be buying the 14" MBP when it comes out, but only for home.
yep, and the Perseverance rover on Mars has a PowerPC 750 (quite similar to the Power Mac G3's CPU), but PowerPC is not the main business of IBM or Motorola the way x86 is for Intel…

Large enterprises would take longer to transition, and some areas would move even more slowly because of x86-dedicated code (personal computers would move faster than servers or dedicated machines), but development on the x86 architecture would slow down or be reduced to spare parts. This is not something Intel's or AMD's investors are going to dig.

The thing is, the "Intel Inside" brand would fall out of sight of the general public. And just as teenagers today don't know what IBM is, even though it is still a blue giant, the same could happen to Intel (whose market cap is ~$230B vs. IBM's ~$130B). And Intel's business model is focused on a few assets, while IBM has always been a digital octopus with multiple tentacles…

What is totally true is that everyone is waiting to see what will be inside the MBP/iMac 30" and the Mac Pro in order to better understand the future, and this is an important point in the history of computing (or a bluff).

Because if the M2 (working title) is twice as powerful as the M1 with only slightly higher power consumption, the PC and server markets will start shifting to ARM (each at its respective speed).

If the M2 does not meet expectations, then Intel will be laughing for several years to come.
 
Last edited:

AgentMcGeek

macrumors 6502
Jan 18, 2016
374
305
London, UK
I think the most interesting question is whether Apple can sustain the pace of improvement they have held onto with the A series chips over the years, and bring that uplift to the M series chips for the Macintosh. I doubt whether they will go for yearly cycles of M chips, more likely every two years.

Why wouldn't they? TSMC improves its node on a yearly basis, a bit like Intel was able to do with its Tick-Tock system back in the day: N5 in 2020, N5P in 2021, N4 in 2022, N3 in 2023.
If Apple can bring new core designs every year for its A chips, it can certainly do it on its M line. This + node improvement should bring substantial YoY perf increase, assuming TSMC keeps the pace.
 

cocoua

macrumors 65816
May 19, 2014
1,011
628
madrid, spain
Since you know more than Elon Musk you should tell him to put M1 in Teslas instead of AMD. And, instruct developers to release more than just three native M1 games that are decades old.

https://www.macgamerhq.com/apple-m1/native-mac-m1-games/
well, forget the M1, we are talking about ARM, and ARM for "serious" computing is still to be proven (remember, there is no PRO version yet of the only serious ARM chip in a consumer computer).

It would be impossible even for Apple to put an ARM chip in a Tesla, as they would need to test it in the market before making such an important move. Tesla needs powerful chips, and x86 is the only option nowadays, and was even more so in recent years (the years Tesla's new models took to develop).

So those Teslas would have needed to be tested with chips similar to the Galaxy S10's, and then ship with the latest available in 2020. That is a crazy move not even Microsoft could make. Maybe only Apple or Google could, taking a lot of risk, and for sure without the powerful AMD chips, as there is no comparable ARM version yet.

Anyway, the main reason for the move is that those AAA games are coded for x86, and that is something even slower to change.
 

TiggrToo

macrumors 601
Aug 24, 2017
4,205
8,838
It's not so bad, stable income, in demand, good pay. You should have seen what they paid people for Y2K temp jobs!

I don't use COBOL in my current job, but it's one of the many languages I've used.

Those were the best days of my contractor life.

Customer: "Any chance you can work a couple of hours extra tonight?"
(Thinks - I do want that new laptop and that'll pay for it entirely)
Me: "I'd be more than happy to!"
 
  • Like
Reactions: bobcomer

JMacHack

Suspended
Mar 16, 2017
1,965
2,424
Yup. A friend of mine just changed his job - to maintaining and servicing a COBOL codebase at a bank.
I was horrified.
And more than a little envious when I learned what they paid him for his skills.
I’ve considered learning COBOL just for the income. Headaches be damned, I can afford the aspirin.


Maybe in 20 or 30 years. Normal non-computer businesses move extremely slow -- the ROI of upgrading to something new just isn't there. Heck, we actually have a PC in the plant that runs DOS. It's a fiber length testing machine, and it never gets updated worldwide because there is no advantage to updating it. There's a lot of software like that!
It's true. Granted, I work in a small business, but the livestock sale management program we still use predates my birth. The single programmer we hired (who no longer updates it) used a Windows XP VM to maintain it. It's gotten to the point that Windows has deprecated some of the frameworks it runs on; it runs horribly in W10!

Also, fiber length, that wouldn’t be wool fiber by chance would it?
 

Gudi

Suspended
May 3, 2013
4,590
3,267
Berlin, Berlin
I know refugees who not only have an iPhone (and a Nokia, to pretend to be poor) but plan to buy a new one, because they're running out of photo storage on their iPhone X and the 12 mini is a better size.

This is the future, not a bunch of dudes who learned their ways on a Wintel monopoly PC. Intel isn't even in the market for the most prevalent form of computing these days. How could they possibly survive, even if they manage to cling to their shrinking market?
 
  • Like
Reactions: cocoua

cocoua

macrumors 65816
May 19, 2014
1,011
628
madrid, spain
I think the most interesting question is whether Apple can sustain the pace of improvement they have held onto with the A series chips over the years, and bring that uplift to the M series chips for the Macintosh. I doubt whether they will go for yearly cycles of M chips, more likely every two years.

If they can, then Apple will end up outpacing everyone else. Tiger Lake and Alder Lake are good upgrades, but a single year of the A series' 40% performance gains will catch up with that and more. It all depends on how big an effort the M series chips have been for Apple's silicon design team; if it absorbed all the best minds for a while, we can expect the A series to be more staid in its advances.

But even if Apple from this point forward manages only 20% yearly performance gains, that will still outpace the rest of the PC industry, which tends to manage more like 10%-15% yearly.
That's my main concern here too. I don't know much about this subject, but I understand a lot of the performance gain in the A series comes from smaller process nodes, and I also understand that this has a limit, doesn't it? What happens when they can't go any smaller?

I tried to find info about this some months ago, but it was too technical for my limited knowledge.
 

09872738

Cancelled
Feb 12, 2005
1,270
2,125
Since you know more than Elon Musk you should tell him to put M1 in Teslas instead of AMD. And, instruct developers to release more than just three native M1 games that are decades old.

https://www.macgamerhq.com/apple-m1/native-mac-m1-games/
So you think Elon Musk is the ultimate authority on what the M1 can run? Seriously? OK, that explains a lot…
You know it's not a hardware limitation; it's beyond me why you insist. Of course it is not, and you know it.
 
Last edited:

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
How do you imagine that working? I mean, they will definitely ship workstations, and they will be great workstations, but servers? What kind of software are those servers supposed to run? It's just not Apple's business and I don't see how they would be good at it. They sell products/experiences, but server business is selling tailored platforms for others to build their products upon.

Besides, their chief CPU engineer left Apple to found his own server CPU company because Apple was not interested in that kind of stuff.
That’s not why he left.

And if you're going to start a CPU company (an endeavor my friends are often suggesting), you do servers. Because if you don't do servers, who else are you going to sell your chips to? Only on the server side is there sufficient OS flexibility at the moment. There's not a big desktop/laptop market for non-Windows. You aren't going to sell to Apple.

You could do mobile cpus, and try to get some android design wins, but it’s tough to compete if you don’t also do the baseband and radios, and that’s a patent world of hurt. Mobile chips also aren’t the most fun to design, honestly.
 
  • Like
Reactions: JMacHack

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
Apple has proven that x86 isn't necessary. That's a huge threat to Intel, who hasn't shown they can do anything else.

That is just deeply revisionist history. From 1984 to 2006, 22 years, Macs survived without x86. There was no huge need for proof that it could be done. It had been done. Actually for longer than the Mac has been on x86 (2020 - 2006 = 14, still eight years short of the time spent off x86).

A platform that clings to 30+ year old design decisions was inevitably going to be detached from Apple over time. Apple is just not that dedicated to looking in the rear-view mirror for multiple decades.

UEFI deployed around the time Apple made the transition to x86, and it is still the case that the majority of x86 motherboards ship with a BIOS compatibility boot mode in them. If that is the kind of "neighborhood" this is, then Apple was always likely to move away.

For a while, Windows/Intel was the bigger shared R&D pool to orbit around, the deeper "gravitational well". At this point "iOS/iPadOS + Apple Silicon" is a much closer gravitational well (they own it). Windows and Intel aren't the "800 lb gorilla" they once were. Even Microsoft's and Intel's revenues now depend much more on the cloud for growth.
The shared R&D spend pool around ARM is quite high. Desktops aren't driving the fastest-growing markets.

ARM v9 is dropping 32-bit support in major ways, just like Apple did years ago. It is a shared R&D pool that is far, far more aligned with Apple's basic approach to evolution.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
He's got two interesting quotes from what he claims is an Intel source, one of which we already heard from Bloomberg's Gurman, but I thought I'd post both here:

"Honestly Apple scares us more than AMD at the moment. They aren't sitting still, and we are worried they have far greater ambitions than most people are currently assuming...not to mention they get the latest nodes before AMD!"

I think that isn't just an Apple CPU threat, but also an Apple GPU threat for Intel. Intel was the biggest Mac GPU vendor (not AMD). Apple is out not only to completely wipe out the space that Intel's iGPUs had; they are out to wipe out all of the embedded dGPUs that were there as well. For Intel, trying to get into dGPUs, that is yet another door they cannot even get a foot through to have a shot at competing for a design win.

Intel could have hoped to trade losses on CPU core placement for GPU core placement wins. They had a relatively good working relationship with Apple on the Metal driver partnership.

Intel is buying up "better" nodes than AMD at the moment: 6nm for their Xe-HPG DG2 chips. All those wafers Intel is buying are wafers that Apple and AMD can't use. But generally, yes, the era of Intel being arrogant because they were a process node ahead of everyone else is over.

Apple was also one of the Intel customers who asked for inertia-busting, forward-looking stuff. Apple jumped over to x86 with full support for EFI (Apple didn't want to put loads of resources into BIOS). Bigger iGPU die allocation? Apple asked for that early too. eDRAM-cached iGPUs (Apple was a major buyer). Thunderbolt was a joint project. It wouldn't be surprising if Apple brought up a big.LITTLE future path years ago. (I highly doubt Microsoft was at Intel pounding the table for that change and the bigger scheduler changes needed. Apple already had the schedulers; if Intel could use them on x86 macOS, "that is fine with us.") Is there another Intel customer that might have been suggesting Intel completely dump 32-bit functionality from a future processor because they were nuking all 32-bit code in 2-3 years? Probably not.

[I think there is a Moore's Law is Dead slide that outlines that Lakefield was developed in part last year to give Microsoft solid hardware for a year to get the scheduler issues worked out in preparation for Alder Lake this year. A typically mediocre v1.0 from Microsoft could be problematic. Although it is also a double-edged sword for Intel, because the work is going to enable Windows-on-ARM better as well.]

If Intel ends up with a higher density of "monkey see, monkey do" clients, then they are going to miss that more forward-looking feedback. (Apple also tended to buy higher-average-priced CPUs than those others did.)

How easily Apple wiped out Intel's offering for the iMac 24" should put Intel on notice, though. I think Intel probably thought Apple was going to feel more short-term "pain" in the desktop transition, but they probably aren't. However, that has as much to do with covering the (formerly embedded) dGPUs as it does the CPU cores.


"We also expect Apple's upcoming chip for the Mac Pro to comprise of 32 big cores and 8 little cores. Massive IPC."

If that is coupled with a largish iGPU that is also pulling from the same limited bandwidth, it may not be a hugely scalable win if you look at it myopically from just the performance CPU cores.

But yeah, if Intel was thinking they were going to milk the last dregs of relatively pricey (for the price point) Xeon W-3300 buys for the "big" Mac Pro, that may not happen. The "half sized" Mac Pro would probably work pretty well for folks who are highly CPU-core compute bound more than GPU-core bound.


I think the PC guys underestimated Apple, much like Ed Colligan's infamous quote when he was CEO of Palm, back in the day before the iPhone. Intel and AMD probably won't go extinct, but Apple have shown that the industry doesn't need to keep the x86 shackles to be successful.

Microsoft is dropping 32-bit Windows gradually. Intel has turned off BIOS support in their UEFI reference. Windows on ARM is going to do more to pull the x86_64 world forward than the stuff that Apple is doing. Largely leveraging Qualcomm's cellular-modem-focused SoCs is the bigger problem there.

Microsoft has been trying for a couple of years to go cross-platform. They have just bumbled around doing it. (It typically takes them three tries, v1, v2, v3, to get something out that isn't kneecapped, fumbled, or bungled in some way.)
 

robco74

macrumors 6502a
Nov 22, 2020
509
944
More reason Intel need to be afraid of AMD. Not only does AMD own the PlayStation and Xbox console market but now they're getting into car infotainment system which can now run the latest AAA games but not Apple M1.

https://videocardz.com/newz/elon-mu...nfotainment-system-powered-by-amd-navi-23-gpu
Apple has never chased the AAA gaming market. This has been known for quite some time. If you are a gamer, you should absolutely use a PC, if you can get your hands on a GPU at a reasonable price. AMD is the only vendor that can supply the latest for Sony and MS. However, if Nvidia can pair ARM CPUs with their own GPU IP, they could potentially create a compelling offering for the next generation that would allow for smaller, quieter consoles.

As for car infotainment, Apple seems content with CarPlay for now. I suppose they could enter that market if they decided it were worthwhile. For now, they seem happy to sell a ton of iPhones, iPads, TVs, Macs, AirPods, and AirTags, and to make lots of money in services.

I still wonder why you insist on hanging out here, nitpicking and crapping all over M1 instead of hanging out on gaming, AMD, and/or Windows forums with others who share your interests. Is AMD paying you to be here?
 
  • Like
Reactions: JMacHack

dgdosen

macrumors 68030
Dec 13, 2003
2,817
1,463
Seattle
How do you imagine that working? I mean, they will definitely ship workstations, and they will be great workstations, but servers? What kind of software are those servers supposed to run? It's just not Apple's business and I don't see how they would be good at it. They sell products/experiences, but server business is selling tailored platforms for others to build their products upon.

Besides, their chief CPU engineer left Apple to found his own server CPU company because Apple was not interested in that kind of stuff.
I'm convinced Apple could make an offering in the cloud (beyond iCloud) - where they can leverage their own Arm hardware to offer services akin to AWS/GCP/Azure. Similar but different.

On one level, they could let iCloud use that stack to 'eat their own dog food'. Apple's need for cloud services is only going to grow in the coming years.

On another level, they'd also be able to cultivate providing better support for back end services for iOS/iPadOS apps. (Xcode Cloud V2?)

Those are markets with willing customers in which they can dip their toes. It's not like the 'cloud' business is slowing down. Amazon touts the energy efficiency of their Graviton (Arm) offerings. If I were Apple, I'd take that as a challenge.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
IMO, that market is too juicy for Apple not to bite. Soon.

To buy Ampere Altras for the bulk of their cloud services? Sure.

To run macOS as the baseline for the bulk of their cloud services? That doesn't make any sense at all.

Does Xcode Cloud need a specialized custom server processor? Probably not. Especially given how Apple does highly isolated, temp-build-focused virtualization.

Do they eventually need a top-end workstation processor? Probably. A doubled-up (side by side) racked "half sized" Mac Pro would be sufficient for the Xcode Cloud workloads that Minis couldn't cover. MacStadium has been running a cloud-based continuous integration / continuous build services business for more than several years with off-the-shelf Macs. Apple doesn't need something extra special here to do the exact same thing. (AWS didn't need it, Azure didn't need it. Multiple folks have been doing this with off-the-shelf hardware in proven business models.)

To be highly competitive in the server side market Apple's SoC would have to go over 64 cores. There is little to no incentive for Apple to put that kind of fork into the shared OS scheduler they have across products.

Apple isn't going to go into the AWS/Azure/Google/Oracle generic cloud services business. The rest of Apple's cloud services don't need iOS/macOS. Mail? Nope. Messages backend? Nope. Apple Private Relay? Nope. iCloud storage backend? Nope.

Apple is going to eat more of their own "dog food" with Xcode Cloud, but the overwhelming bulk of their cloud services don't need any special sauce that Apple has been building into CPU/GPU cores over the last several years.

Could they do a custom ARM Neoverse chip, like Amazon, for some of their generic cloud backend as an alternative to buying from Ampere (or Nvidia or some other established server ARM vendor)? Big maybe. But that would be highly tuned not to be a Mac product, and like Amazon, they'd never sell it to anyone else either.
 
Last edited:

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
Until windows and windows software work well enough on arm, they don’t have to worry too much about anyone else except amd. Apple isn’t giving their chip to anyone.

Apple is giving the chips to folks who buy MacBooks and MacBook Pros. I think some of the 'fear' there is that Apple will suck a larger share of the laptop market away from Windows in the short to intermediate term. AMD has deprioritized laptop-focused CPU models (they are doing them, but they come out last in the rollout schedule). AMD aims at the higher-margin stuff first.

It will also make Microsoft put more resources into better Windows-on-ARM as a response. It has been a "hobby project"; if it becomes a main priority for Microsoft, that is more of a problem.

I think these Intel comments are a bit kneejerk. Apple doesn't have access to enough fab capacity to take super large chunks of market share away from Intel. TSMC is rolling out more capacity over time, but Apple isn't getting all of it. Apple gets early access to the most bleeding-edge process, but that also means it is not the highest-throughput process. (E.g., Intel is buying up a substantial chunk of TSMC 6nm capacity to make Xe-HPG DG2 chips with. Those are wafers that aren't going to go to Apple, AMD, or Qualcomm to take share away from Intel CPUs.)
 
  • Like
Reactions: JMacHack

Fomalhaut

macrumors 68000
Oct 6, 2020
1,993
1,724
Intel's biggest problem is that, increasingly, "software" means "website". Almost everything is done over the web now. Other than big software like Final Cut Pro, the Adobe suite, etc., everything is done in the browser, and this trend will only continue. I won't be surprised if soon we get a full-fledged Photoshop online, working in Chrome. In a few years, pretty much nobody will need a strong CPU for personal use. A decent ARM CPU which doesn't need a fan and is very easy on the battery will be all that pretty much anybody needs.
I have seen this trend evolving over the last 2 decades, and the majority of business productivity and communications applications I use now will run on a web-browser - MS Office, Google Docs, Teams, Slack, Zoom, Google Meet, plus nearly all the task and time management tools. Even software IDEs run acceptably well via a browser. Infrastructure management (AWS, MS Azure, GCP, Cisco etc.) all have web interfaces. I *prefer* desktop apps in many cases, but you can do an awful lot with web-based apps these days.

It's just the CPU/storage/bandwidth-intensive apps that need a decent local machine now, such as content creation (photo/video/audio) and graphics-intensive apps (modelling, CAD, games).

We really are now living in a time when very few people need "trucks" and most are happy with a compact car, to use Steve Jobs' analogy.
 
  • Like
Reactions: jeremiah256