
The MBP and desktops would be years away after "Marzipan" has had time to create a base for "universal" apps that can run effectively on ARM CPUs/GPUs. I'd expect the MBP to go first and then the Mini, followed by the iMac.

Marzipan does nothing to change the relatively low-volume problem the desktop Macs have in terms of the custom chips (and range of performance) pragmatically needed for that 'half' of the lineup. It isn't going to substantively increase the number of those systems sold, since the software is in a significantly different dimension. In fact, there is a decent chance that once most of the MBP (13" range) is pulled off, the balance of laptops versus desktops gets worse from the desktop perspective.

Marzipan isn't really an "emulation"/"capability" bridge to keep old software on new systems at all. It is far more about new software on new systems. The apps are "universal" more in the sense that they're a big bundle of iPad/macOS/iPhone apps with slightly different GUIs that share a common foundation and bundle. On the macOS side, it is also going to enable a fair amount of software lineup filling for the apps that are going to get toasted by tossing 32-bit, OpenGL, and OpenCL out the door. Apple will shepherd in more iPad apps so there is no net decrease in the quantity of macOS apps. Apple will also get a few active macOS apps to move down to the iPad (probably a much smaller number, but a net increase in apps there too). It will also probably push more apps into the App Store.

But there will also be something of a Carbon/Cocoa-style split on the macOS side for a while. macOS apps that don't rewrite to the Marzipan foundation will probably stay x86-only over time, since they won't move as fast to switch build modes and libraries. Apple ran with two stacks for macOS apps for a pretty long time; there is no reason they can't do it again. Classic Cocoa will just take up the position of the "old" API and Marzipan the position of the "new" one. [Yet another reason Apple is flushing the "old" legacy layers of Carbon, 32-bit, etc.]

If Apple pushes a much higher percentage of macOS software into the App Store, then delivering x86 or ARM could be something the store does dynamically (a developer uploads a big bundle and the App Store thins it out to what the Mac/iPad making the request actually needs; you don't even really need "fat" binaries). Classic apps that wanted to could still ship fat binaries if distributed outside the App Store. (Apple won't mind at all that their store apps are delivered far less bloated; it's an additional selling point for getting them from the store.)
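To make the "fat"/universal-binary idea above concrete, here is a minimal sketch (my own illustration; the arm64 macOS build target shown is an assumption, not a shipping toolchain option) of how one Swift source could carry both slices while the store or loader picks the right one:

// Hypothetical build steps for a universal binary (x86_64 + arm64):
//   swiftc -target x86_64-apple-macosx10.14 main.swift -o main_x86
//   swiftc -target arm64-apple-macosx10.14 main.swift -o main_arm    <- assumed future target
//   lipo -create main_x86 main_arm -o main_universal
// The App Store (or the loader) then only has to hand out or run the matching slice.

func activeSlice() -> String {
    #if arch(x86_64)
    return "x86_64 slice"
    #elseif arch(arm64)
    return "arm64 slice"
    #else
    return "some other architecture"
    #endif
}

print("This binary is executing its \(activeSlice()).")

The point is that a well-behaved app mostly doesn't care which slice it lands in; the per-architecture divergence gets pushed down into the compiler and the packaging/thinning step.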


In that model, Apple could do mixed-platform Macs for a relatively long time. Apple could keep the A-series far more focused on the low-end, battery-powered machines and leave the rest of the Mac lineup on x86 (presuming Intel and AMD don't both screw up over the long term).


That all doesn't necessarily get Apple a better mix of top-quality apps. Having to do drivers for both x86 and ARM has a decent chance of leading to an even more unsettled 3rd-party GPU driver situation. Some apps will leave for the more stable ecosystem of Windows (at least Windows on x86).


The iMac Pro and Mac Pro will be the final ones to go and I could easily see that not happening before mid-next-decade.

That probably wouldn't happen. Once the iMac goes (which is probably around the threshold of a viable custom implementation run rate), the rest is untenable. There is not enough volume for many developers to put up with the extra work (especially if Apple nukes x86 support on the older MBP, MBA, and MB).

If there was an extension, it would more likely be because those two products were in Rip Van Winkle mode and nothing happened because Apple was doing almost nothing. If Apple is maniacal about ARM-only at that point, there's a decent chance they'll just get dropped (or kept in sleep mode even longer while de facto letting the Vintage/Obsolete clock run). If by mid-next-decade the iMac Pro and Mac Pro account for less than 1% of Mac unit sales, then it would more likely be an "end point," not a transition point. Apple is highly unlikely to do an expensive platform shift for a product that doesn't even show up in a pie chart of the Mac products being sold.

If part of the mainstream market jumps over to ARM desktops too, then perhaps not. But that depends more on what other vendors do than on something primarily driven by Apple.


Right now it seems more likely that Intel and AMD competing hard with each other will save at least the Mac desktops from switching. There is a very decent chance that at least the MBP 15" product class stays in that boat too. (There is a small chance that some ARM implementer/disrupter pops up and hauls a significant chunk of the Windows desktop systems onto ARM. If that happens, then Apple could shift their desktops to that platform too.)

If Apple has to outcompete 2-3 vendors who are all executing well, they will probably punt on trying to beat them. The CPU package isn't the Mac. The Mac is the Mac. If there is a more than reasonable CPU package to buy, then they'll probably just buy it.
 

Anyone remember the DEC Rainbow?
Wikipedia:
"The Rainbow 100 was a microcomputer introduced by Digital Equipment Corporation (DEC) in 1982. This desktop unit had a monitor similar to the VT220 in a dual-CPU box with both 4 MHz Zilog Z80 and 4.81 MHz Intel 8088 CPUs.[1] The Rainbow 100 was a triple-use machine: VT100 mode (industry standard terminal for interacting with DEC's own VAX), 8-bit CP/M mode (using the Z80), and 16-bit CP/M-86 or MS-DOS mode using the 8088".

Apple could design a dual-boot iMac machine, using the AMD "chiplet" design, with both an x86 CPU and an ARM CPU on a single multi-chiplet CPU.
 

Thanks for the memories.

Isn't this too much effort for too little (or no) benefit?
 
I know it's still obviously speculation, but does anybody have a good idea of what ports will be on the next Mac Pro? I'll have to upgrade my external RAID at the same time and would like to start doing some research. Is TB3/USB-C going to still be the standard?
 
It would be nice if all the software/hardware suppliers would catch up with the Mac OS. I upgraded to an iMac Pro in December and it came with Mojave. It is hit and miss on a daily basis. At this rate the software and drivers will be useless with a new Mac Pro for at least 2 years after it finally comes out.
 
I know it's still obviously speculation, but does anybody have a good idea of what ports will be on the next Mac Pro? I'll have to upgrade my external RAID at the same time and would like to start doing some research. Is TB3/USB-C going to still be the standard?

That would be my guess.
TB3 is backwards compatible, as is USB-C (with an adapter)

It might have a couple USB 3.1 ports
 
Thanks for the memories.

Isn't this too much effort for too little (or no) benefit?
It might make sense for a MacBook or MacBook Air - where power consumption and miniaturization are important factors.
I know it's still obviously speculation, but does anybody have a good idea of what ports will be on the next Mac Pro? I'll have to upgrade my external RAID at the same time and would like to start doing some research. Is TB3/USB-C going to still be the standard?
If Apple goes the full modular route, it could have a couple of external PCIe x16 connectors.
 
I know it's still obviously speculation, but does anybody have a good idea of what ports will be on the next Mac Pro? I'll have to upgrade my external RAID at the same time and would like to start doing some research. Is TB3/USB-C going to still be the standard?
Yep. Seems very likely given the rest of their lineup that the prosumer desktops are going to keep Type-A ports for a little while longer at least and USB-C is here to stay.
 
How about a long-term split Mac lineup, if dual binaries are easy (either fat, or split in the Mac App Store):

ARM Macs: MB/MBA (they may get consolidated, or the MB might get even lighter as an ARM machine), Mac Mini, 21" iMac, possibly a home-oriented TV-sized iMac (intended largely for media consumption, but also runs Mac apps from the App Store). ARM Macs are intended for two purposes. The first is road-warrior mobility, where light weight and long battery life are more important than performance. The other is home and student use - enhanced multimedia features at comparatively modest cost. They run apps from the Mac App Store (only). Apple really pressures developers to put stuff in the App Store, and well-behaved Mac apps are easy to port. With this lineup, Apple can use only cores that are also used in the iPad line of the same year.

They'll need a faster chip, but it can be more iPad type cores, perhaps at higher clocks, rather than bigger cores. This is a huge advantage, because rearranging the cores on a chip is relatively easy, but developing a whole new core is many hundreds of millions of dollars. Moving the high-performance Macs to ARM would require a new, bigger core (and a huge design expense that isn't shared with the iPad and even iPhone - iPad cores are "goosed" iPhone cores - I'm sure that's an expense, but it probably isn't a new core level expense).

Intel (possibly including AMD) Macs (Pro line): MacBook Pro, iMac Pro, Mac Mini Pro, Mac Pro. The MacBook Pro stays on its present development path - nobody but Intel offers a high-power laptop chip. The 27" iMac joins the iMac Pro line (the Xeon iMac Pro stays as a high-end model, likely gaining a larger screen). It's possible the iMac Pro line might go AMD (Ryzen for the 27", Threadripper for the Xeon model). Mac Mini Pro is the successor to today's higher-end Mac Mini models - i3 and i5, or Ryzen 3 and 5. Mac Pro is a monster - high-end Xeons (or EPYC)...
 
Once again, the Mac line is in a pitiful situation. I could say that they could do the switch easily now or in the near future, but the Mac mini and MacBook Air were "just" updated, which adds to the confusion.

P.S. This "DON'T BUY" sticker on Mac Pro will earn a place in Guinness book for its longevity.:)
 
Yep. Seems very likely given the rest of their lineup that the prosumer desktops are going to keep Type-A ports for a little while longer at least and USB-C is here to stay.

That presumes the Mac Pro ships within that "little while longer" window. It would be quite silly for them not to ship it in that window, but it also would have been saner not to be in this zombie status now. Slapping a "tail wags dog" constraint of waiting for USB4 onto the next Mac Pro would be yet another highly dubious move.

When USB4 passes and Apple can get their hands on USB4-certified controllers, there is a pretty good chance they will use that to kill off most, if not all, of the Type-A sockets. Once the USB standard itself has moved past Type-A with USB4, Apple would likely start treating the Type-A socket much like the PS/2 keyboard and mouse sockets on the back of mainstream PCs. But if the "all good keyboards/mice are wireless" crowd wins out, they could just put in a second set of Type-C ports that don't provision Thunderbolt or DisplayPort (and folks who still need Type-A just get pointed at new cables). "Type-C is the future, the Mac Pro is the future, therefore the Mac Pro is all Type-C" would be Apple's thinking.

Apple is probably going to be an "everything should move to Type-C" advocate. One of the things still missing, though, is a way to distinguish the different implementations behind a Type-C port that doesn't confuse end users. [Maybe USB-IF is taking another stab at that with USB4. If it is just as messy, or has gotten more confusing... which might happen, then Apple will probably keep the Type-A ports. Physical differences are easy to distinguish.]
That would be my guess.
TB3 is backwards compatible, as is USB-C (with an adapter)

TBv3 is a USB Type-C alternate mode. USB 2.0 has to be present on a Type-C socket; that's the "lowest" USB speed mandated by the standard. USB Type-C is a physical socket type, though: Thunderbolt (TB) isn't required, and USB 3.1 (gen 1 or gen 2) isn't required either.

For now, on Apple Macs, the pragmatic convention is that TBv3 + USB 3.1 gen 2 is the "lowest" common denominator on a Type-C port.

It might have a couple USB 3.1 ports

A couple of USB 3.1 ports could be provisioned via Type-C (or Type-A). It depends upon whether Apple wants to introduce the concept of different grades of Type-C port on a single Mac system or not (i.e., set a new "lowest" floor for that port type).
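Since "Type-C" gets used loosely in this discussion, here is a toy model (my own sketch, not any real Apple or USB-IF API) of the distinction being drawn above: the connector itself only mandates USB 2.0, while Thunderbolt 3, DisplayPort alt mode and USB 3.1 are optional extras a given port may or may not provision.

// Toy model: USB-C is just the connector. USB 2.0 is the only data
// capability the spec mandates; everything else is an optional extra.
enum OptionalCapability: Hashable {
    case usb31Gen1           // 5 Gb/s
    case usb31Gen2           // 10 Gb/s
    case displayPortAltMode
    case thunderbolt3AltMode
}

struct TypeCPort {
    // USB 2.0 is implicit on every compliant port, so it is not listed here.
    let extras: Set<OptionalCapability>
    var supportsThunderbolt: Bool { extras.contains(.thunderbolt3AltMode) }
}

// Apple's current de facto floor on Mac Type-C ports (per the post above):
let macPortToday = TypeCPort(extras: [.usb31Gen2, .displayPortAltMode, .thunderbolt3AltMode])
// A spec-minimal port (USB 2.0 only) is still a legal Type-C port:
let bareMinimumPort = TypeCPort(extras: [])

print(macPortToday.supportsThunderbolt)    // true
print(bareMinimumPort.supportsThunderbolt) // false

Two ports can both be "Type-C" and still provision very different capability sets, which is exactly the end-user confusion problem mentioned a few paragraphs up.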
 
I know it's still obviously speculation, but does anybody have a good idea of what ports will be on the next Mac Pro? I'll have to upgrade my external RAID at the same time and would like to start doing some research. Is TB3/USB-C going to still be the standard?

If you are looking at USB external RAID devices, whether the next Mac Pro has Type-C or Type-A connectors doesn't make any material difference to your selection at all. The next Mac Pro will probably have at least some USB 3.1 gen 2 capable ports.

If you need Thunderbolt-like bandwidth from your external RAID, it is extremely likely the next Mac Pro will have at least Thunderbolt v3 (and that will probably be the best bandwidth possible). There is no implementation of Thunderbolt v3 other than through a Type-C connector (the "/USB-C" doesn't 'add' anything there). The only Mac that does not have Thunderbolt is the MacBook; Apple made it a bit too thin and small to have Thunderbolt (and that Mac product wants to match the iPad Pro's thin-and-light specs). The next Mac Pro will extremely likely not be limited to that narrow a volume constraint, nor is it likely constrained by a need to huddle close to the iPad Pro.

The current Mac Pro has six Thunderbolt ports. Apple going from six to zero would be extremely unlikely. It will probably slide to four, but zero is extremely unmotivated by modern Apple design criteria. That idea is mainly driven by folks with the notion that Apple is laser-focused on building an HP Z4/Z6 or Dell 5000/7000 workstation clone with a pragmatically exact feature set. They probably are not. Nor are they primarily designing the next Mac Pro inside-out, starting with completely off-the-shelf GPU add-in graphics cards. Those are pretty much the only two ways the next Mac Pro would end up with zero Thunderbolt sockets.
 
Anyone remember the DEC Rainbow?
Wikipedia:
"The Rainbow 100 was a microcomputer introduced by Digital Equipment Corporation (DEC) in 1982. This desktop unit had a monitor similar to the VT220 in a dual-CPU box with both 4 MHz Zilog Z80 and 4.81 MHz Intel 8088 CPUs.[1] The Rainbow 100 was a triple-use machine: VT100 mode (industry standard terminal for interacting with DEC's own VAX), 8-bit CP/M mode (using the Z80), and 16-bit CP/M-86 or MS-DOS mode using the 8088".

Apple could design a dual-boot iMac machine, using the AMD "chiplet" design, with both an x86 CPU and an ARM CPU on a single multi-chiplet CPU.

And the Rainbow was so successful for DEC that it was a breakout winning system for them in the '80s... not. So why would Apple want to emulate that outcome?

DEC didn't control CP/M or MS-DOS. Apple is entirely in charge of building/making iOS/macOS. If they wanted a dual-boot system, they could simply make versions of iOS and macOS that booted on the same system CPU. One of the principal drivers for two chips being in the Rainbow was covering two app+OS stacks with rather poor abstraction (neither one had completely 'won' at that point, so DEC was somewhat hedging bets on both). CP/M ran on the 8088 also. The Z80 was a superset of the 8080 with some extensions, and it had leaked out into the embedded market with CP/M apps hooked to those extensions. Running those Z80-specific superset apps was the rabbit hole DEC was chasing down. Chasing those apps was a bad idea: a nice nerdy idea, but one that made very little long-term business sense. (DEC was showing they had cooler engineering in their personal computer than IBM's quickie mash-up... but that wasn't the key issue.)

CP/M and MS-DOS really weren't operating systems in the modern sense. They were closer to what EFI is (with some additional narrow "disk" abstractions layered on top) than to full OSes. That meant many more apps had hardware-specific quirks woven into them; hence the need to mimic the quirks with two processors. That entirely misses the point of a real operating system. (DEC really knew that, but also really didn't want these small systems to come up and replace the VAX or DECSYSTEM-20 class systems. Apple has not demonstrated that degree of hang-up on cannibalization at all.)

In the Mac Pro space, this wouldn't make much sense at all, even if Apple wanted to combine x86 chiplets with a T-series chip. For the core count you'd probably need in a Mac Pro, the number of x86 chiplets would squeeze out the rest of the room on the multi-chip module (MCM) foundation die/board being used. A chiplet means a relatively low core count; if you need a mid-to-high core count you need multiple chiplets, and more chiplets means less free space for extraneous stuff. That is exactly why, in the workstation space, these products will probably continue forward with zero GPU allocation on the die. AMD's chips already have an ARM security core buried inside, and putting the security chip in the same memory space as the CPU is not as helpful as it initially sounds.

For Apple's A-series, chiplets are a 180-degree turn from what they have implemented. Those are more tightly integrated CPU/GPU/memory-controller clusters, not far looser ones. There is a gross mismatch there. The push there is for a smaller (or at least not growing, but better performing) SoC, not bigger MCM mash-ups.

The only MCM path Apple might be on is for a modem, which the Mac Pro also has about zero pressing need for.


These "mash up" of different architectures have largely been busts. Intel pushing to put x86 mode embedded into the initial iterations of Itanium/EPIC was a gross waste of time. IBM's mashup of x86 chips into their Mainframes really didn't help much.


Apple could perhaps claw some system-board space back in a Mac mini-like system with the T2 and x86 on the same, slightly bigger, package. But at iMac sizes and up it probably doesn't buy much at all.
 
No, they don't ....
Yeah you don't know what you're talking about. Fortunately the guy under your ridiculous post set the record straight so I didn't have to go pulling benchmarks on you. Apple is going to blow past Intel and they will be forgotten.
 
How about a long-term split Mac lineup, if dual binaries are easy (either fat, or split in the Mac App Store):

ARM Macs: MB/MBA (they may get consolidated, or the MB might get even lighter as an ARM machine), Mac Mini, 21" iMac, possibly a home-oriented TV-sized iMac (intended largely for media consumption, but also runs Mac apps from the App Store). ARM Macs are intended for two purposes. The first is road-warrior mobility, where light weight and long battery life are more important than performance. The other is home and student use - enhanced multimedia features at comparatively modest cost. They run apps from the Mac App Store (only).

Microsoft tried that whole "apps from the store only and no emulation" approach with Windows RT. It didn't work so well. It is doubtful Apple would go down that same highly restrictive rabbit hole. Making the system "Mac App Store only" would only highlight the differences between the two macOS instances. More risk-averse folks will stay skittish and stay on the 'classic' side of the split. In the case of these Macs, though, Apple really wouldn't present much of a choice, since those on a budget would be hard-tracked into the ARM solutions. Some will go along, but others won't like being herded that hard.

I don't see how ARM brings anything special with regard to multimedia features at all. All of that is mainly 'fixed-function' logic, which is not even close to being behind the curve on Intel's side. Grumble about Intel's iGPUs, but their fixed-function media features have been highly competitive.

If Apple is using purely "hand-me-down" A-series chips for these, then those will probably just be cheaper CPUs. Face ID and Animoji, perhaps, if you cast those as multimedia features, but otherwise mainly just cheaper.

They may be able to get away without shipping a Rosetta-like emulator by just doing the following:

1. Normalize the blocking of much of the older stuff across the whole platform. Dump OpenGL/OpenCL/32-bit apps. Whatever 10.xx is current at the change, just attach lots of change to it.

2. Since there is app-version churn anyway, point folks at fat-binary updates (or, better yet, at the "magically get it from the Mac App Store" option).



Apple really pressures developers to put stuff in the App Store, and well-behaved Mac apps are easy to port. With this lineup, Apple can use only cores that are also used in the iPad line of the same year.

At the low end, riding the volume bow wave of a higher-volume product... yeah, they have done that a couple of times (entry iMacs with MBA processors, the Mini with MacBook/MBP 13" processors, etc.).


Apple isn't really pressuring developers into the Mac App Store as much as tilting the advantages toward smaller vendors who are on the "pay once"/"pay subscription" business-model track. It is tougher to be an unregistered, 'under the radar' developer.
As long as there is a non-Mac-App-Store option, that is also a partial inoculation against being sued. They are not likely to completely drop that for macOS.


They'll need a faster chip, but it can be more iPad type cores, perhaps at higher clocks, rather than bigger cores. This is a huge advantage, because rearranging the cores on a chip is relatively easy, but developing a whole new core is many hundreds of millions of dollars. Moving the high-performance Macs to ARM would require a new, bigger core (and a huge design expense that isn't shared with the iPad and even iPhone - iPad cores are "goosed" iPhone cores - I'm sure that's an expense, but it probably isn't a new core level expense).

If Apple is mainly shooting at "more affordable," they won't need a "faster chip." Similarly if they're also primarily chasing thinner designs and the minimal necessary battery capacity.

The iPad A__X series really isn't "goosed" iPhone cores. It has mainly just been more of the mainstream A-series cores: either more GPU cores (bigger scale at the functional-computation-block level: more cores), or more ARM cores, or a bit of both. The basic units are built to scale, and they just scale more for the X variant. Scaling isn't a problem because all of the iPad models using the 'X' variant use the exact same chip (drive volume by uniformity of the scaled unit).

Macs scale over a much broader range. The I/O and RAM capacities cover a much broader range. The volumes are much more fragmented inside the Mac lineup. That high fragmentation of the feature set is the more salient issue. It is offset in the Mac market largely because the lineup mirrors the much larger general Windows PC market in its functional groupings; those other systems are filling up approximately the same functional-group spaces. As a total Windows+Mac group there is volume.


Intel (possibly including AMD) Macs (Pro line): MacBook Pro, iMac Pro, Mac Mini Pro, Mac Pro. The MacBook Pro stays on its present development path - nobody but Intel offers a high-power laptop chip. The 27" iMac joins the iMac Pro line (the Xeon iMac Pro stays as a high-end model, likely gaining a larger screen). It's possible the iMac Pro line might go AMD (Ryzen for the 27", Threadripper for the Xeon model). Mac Mini Pro is the successor to today's higher-end Mac Mini models - i3 and i5, or Ryzen 3 and 5. Mac Pro is a monster - high-end Xeons (or EPYC)...

Apple could possibly chuck Intel (and AMD) if some ARM implementor showed up that was successfully covering a broad segment of the Windows PC market (for example, pruning off a broad-spectrum 20% of the overall market; Apple throws their 4-7% onto the pile, and that approximately 30% has a decent chance of being at the threshold that would make sense for Apple and the others in that segment).

Right now there is no implementor like that present. Nor is Microsoft talking up huge moves across the whole mainstream Windows target space. The only Windows-on-ARM traction is in the MacBook/MacBook Air space. Later this year there will be Windows systems with the Qualcomm 8cx. Apple 'marking' those makes some sense because they could do it relatively easily with a revised MacBook (a bit limited on ports, like the iPad Pro, and it would need a cell modem if they wanted to match the "always on" capabilities). Intel has some "always on" reference designs they are offering up as options, but something like an A12X/A13X + modem could be better within a MacBook's design constraints. Apple is in a feud with Qualcomm, so the 8cx wouldn't be very viable even if it does manage to outperform the A12X on 'desktop' OS workloads. The better 'marking' option for Apple is using their own.
 
It would be nice if all the software/hardware suppliers would catch up with the Mac OS. I upgraded to an iMac Pro in December and it came with Mojave. It is hit and miss on a daily basis. At this rate the software and drivers will be useless with a new Mac Pro for at least 2 years after it finally comes out.

That's been a big issue for me as well in recent years.
Both regarding current Apple hardware not being compatible with older versions of OSX, and with program, hardware and driver support becoming a problem in anything beyond Mavericks.

I don't think 3rd party manufacturers or software makers are to blame.
Apple has been spewing out new OSX versions like it's iOS and all it has to work with are iApps and wireless headphones.
Along with the decline of Mac offerings and their use in pro and semi-pro environments, it's no surprise the 3rd party support gets cut down.
 
Yeah you don't know what you're talking about. Fortunately the guy under your ridiculous post set the record straight so I didn't have to go pulling benchmarks on you. Apple is going to blow past Intel and they will be forgotten.
I’m not sure I believe Apple.
If you have a product that's better than the competition, you won't mind testing it on equal terms and in real-world conditions.
Apple hides too much.
 
Speculating about a modular stack Pro, vertical stack illustrated top to bottom...

  • Third-party graphics (multiples possible). Standard card in box. Rated "X units"
  • Loopback video module - not a compulsory purchase; includes loopback cables to route GPU ports to video-in ports on that module, which route back to the I/O module's Thunderbolt ports (for the minority of customers who care about display over Thunderbolt). Rated "X units"
  • Standard Apple graphics, with no external display connectors - not a compulsory purchase; headless units don't need any GPU, as per the Mac mini, where the GPU isn't activated without a display adapter plugged in. Rated "X units"
  • Storage module(s) containing multiple blades, with the T-series for the whole system. Rated "X units"
  • I/O module with all the TB/USB current at the time. Rated "X units"
  • Processor & memory - no external I/O at all. Rated "X units" depending on potential max consumption
  • Power supply (options rated in an abstract fashion - supplies X number of "units").
 
Yeah you don't know what you're talking about. Fortunately the guy under your ridiculous post set the record straight so I didn't have to go pulling benchmarks on you. Apple is going to blow past Intel and they will be forgotten.

Apple has "blown past' Intel largely because they were not trying to do what Intel was most actively engaged in. Apple was only after a relatively very narrow subset of the Intel CPU line up. Namely, embedded-to-lower end. Were Atom and Core-M was the edge of the intel CPU product up that was the 100% focus of what Apple has been doing. Intel did more of much broader area in the same time. Apple did more in a much narrower area.

The latter doesn't mean Apple is going to blow past Intel's entire CPU lineup at all. Apple has gone out of their way to avoid doing a broader set of implementations. More affordable iPhones/iPads all get 'old' SoCs. Other product segments get 'old' SoCs too (Apple TV, HomePod, etc.): basically, products that the focused iPhone SoC either already covers in performance or can be 'cut down' to fit (e.g., the T-series).

Those benchmarks also show that the A12X is more than 100% behind the Xeon W in the iMac Pro when it comes to multicore performance. Intel is blowing Apple away there; Apple is not even close (even 20% year-over-year improvements would take 4-5 years to catch up, and the A__X series may have dropped off yearly updates). If we get into total on/off-package bandwidth it is even further behind, at least several hundred percent. Medium-to-large memory capacity handling... again grossly behind.
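(As a rough sanity check on that timeline, using the post's own numbers: if the multicore gap really is a bit over 2x, then 20% compounded annual gains close it in about four years, since 1.2^4 ≈ 2.07. A "several hundred percent" gap, say 4x, takes roughly 7-8 years at that same rate, since 1.2^8 ≈ 4.3.)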

The parts of the Mac lineup that highly prioritize being as thin and light as an iPad Pro have good synergies with Apple's work. The parts of the Mac lineup that plug into a wall and don't have to be as thin as possible do not. Intel and AMD have mainly focused on the latter grouping (not completely, but there is significant weighting of their efforts there; the center point of their weighting has been the higher-power-consumption laptop range, which not coincidentally is where the core of the Windows PC market is too: affordable desktop-replacement laptops (often plugged in) and affordable desktops (always plugged in)).

How much could be left to x86 largely depends upon where the mean/average Mac system sold sits. If it lies in the MBP 13"/iMac cluster of units, then all isn't lost for x86. If the vast bulk is in the MBA/MacBook range, then Intel/AMD have lots of work to do to turn the tide. Apple could kick Intel/AMD out of the Mac altogether, but that would probably mean walking away from the iMac Pro/Mac Pro space over that time as well.
 
That's been a big issue for me as well in recent years.
Both regarding current Apple hardware not being compatible with older versions of OSX, and with program, hardware and driver support becoming a problem in anything beyond Mavericks.

Apple has never explicitly tried to put old versions of macOS onto newer hardware. The narrow corner cases are when the "new" Mac was primarily only a speed-bump upgrade (tweak the processor version slightly and bump the firmware model ID). If there was a significant chipset change and/or component selection change, then there has not been an effort to backport any macOS that is out of first-line support status. With a new macOS comes a slide for the previous macOS into second-line status (it gets fixes and security updates for current configurations, but no new configurations).

Mavericks was about the point where Apple had finished rolling in the first generation of kernel address space randomization (a baby step in Lion (10.7), rolled out more completely in Mountain Lion (10.8), touched up with some tweaks in 10.9 Mavericks). At that point Apple moved on to more security improvements in the kernel (the initial SIP came in around 10.11).

The kernel is not a completely static place. A really good kernel extension carefully minds its own business in its own designated, explicitly requested corner of kernel space. Other stuff basically cuts corners with 'clever' code and sometimes just plain outright hacks. The latter typically fails over time if not maintained and adjusted.


I don't think 3rd party manufacturers or software makers are to blame.

If they haven't produced any substantive updates in the last 2-3 years... yes, they probably are. Even more so if they are in the "race to the bottom" on support costs for their products/business. Cleaning up stuff like Meltdown/Spectre is an ongoing slog. Security updates over an extended period of time bring changes. Better instrumenting of the kernel over time brings changes. Dealing with higher parallelism and new hardware classes over time brings changes.


Apple has been spewing out new OSX versions like it's iOS and all it has to work with are iApps and wireless headphones.

Even at Apple's "old" rate of every 2-2.5 years the 5 year gap from Mavericks would have been at least 2 updates. If gotten basically zero from those vendors since then ... it is them. Not Apple's scheduling.

Windows has shifted from Win7 to 10 in roughly the same period. A completely static development model there at both the kernel and user level? Nope.


Along with the decline of Mac offerings and their use in pro and semi-pro environments, it's no surprise the 3rd party support gets cut down.

It is not a surprise if some software vendors die off, but over time they often do so because of their own decision-making, not the platform's. Some vendors may have bet the whole farm on CUDA. That would be just dumb in a Mac context. In retail, Montgomery Ward was the big dog until Sears was the big dog, until Walmart was the big dog, until Amazon was the big dog. Which companies are able to serve customers well shifts over time, in part according to how well they deal with change. Software is not particularly different. Some of the churn you'll see is because the folks running those products made bad calls and didn't have backstop resources to correct for them.

Anyone who had a Mac software product that was almost entirely dependent upon the number of Mac Pro sales has been in trouble since around 2008-2009. It was only a matter of how much time passed before not correcting for that narrow target caught up with them. When the iMacs got desktop processors there was a shift; if they completely ignored it, that was on them. Not aiming at most of the Mac product lineup is generally a mistake for Mac software vendors. The space as a whole is relatively much smaller than Windows. Aiming at some narrow subset of a subset of a subset is very likely going to be off over the long term. Computing abilities change over time and that narrow target will move.
Cascade Lake next month.

This will line up with Apple's now annual April "Where the hell is the Mac Pro?" announcement.

That article says Cascade-X (socket 2066) is targeting Computex, which is in very late May (perhaps the first day of June). It would not be surprising if the W-series parts came a bit after those. Same socket, just a longer system validation time (and it allows Intel to smooth out the initial demand bubble).

IMHO, it's doubtful that Apple is shooting for socket 3647 solutions for the Mac Pro. I don't think that would throw off an April "preview" discussion, though. If they are using a Cascade variant in the 2066 socket, they certainly would have more than a few samples at this point, enough for a couple of mostly complete working systems that they could point to and talk about, even if they can't tag them with a "fixed in stone" release date. All they'd need to do is 'tap dance' around the specific name and features of the CPU (and possibly the GPU too): some "up to xx cores" and "next-gen xx GHz" dance steps.
[The low-core-count and high-core-count (LCC/HCC) die core counts will be in the 3647 product mix too, just coupled to a different socket, so those aren't super-secret Intel info once the larger-socket versions are fully announced.]
 
Yeah you don't know what you're talking about. Fortunately the guy under your ridiculous post set the record straight so I didn't have to go pulling benchmarks on you. Apple is going to blow past Intel and they will be forgotten.

What color is the sky in your world?
 