If the second anniversary of the mea culpa passes in early April without some announcement - it's time to place your orders for your Z-series.

Or, perhaps, order a (cancellable) Z-series before then so that you'll be ahead in the queue of the people who see the MP7.1 announcement and who say "oh my god no - order a Z right now".

If only HP Australia weren't such slackarses with their repping for Z Series - we get no configuration or customisation options, only the most expensive versions of the standard configs, and the single-option GPUs are a level below the equivalent US supplied machines.

It's basically like buying a 2013 Mac Pro.
 
we get no configuration or customisation options, only the most expensive versions of the standard configs, and the single-option GPUs are a level below the equivalent US supplied machines.

Maybe just call HP sales in the U.S. and explain the situation.
Even if the shipping from the U.S. to Australia costs USD $200.00.
Note: you'd probably want the PC double-boxed with padding between boxes, to be on the safe side.
 
If only HP Australia weren't such slackarses with their repping for Z Series - we get no configuration or customisation options, only the most expensive versions of the standard configs, and the single-option GPUs are a level below the equivalent US supplied machines.

It's basically like buying a 2013 Mac Pro.

You can use www.shipito.com to get anything from the US that a vendor will not send to another country. They may even double-box it for you.
 
Maybe just call HP sales in the U.S. and explain the situation.
Even if the shipping from the U.S. to Australia costs USD $200.00.
Note: you'd probably want the PC double-boxed with padding between boxes, to be on the safe side.

The warranty would almost certainly not be honoured by HP Australia, for one. Same with Apple gear - they only provide international warranty support on portable systems.
 
I'd agree with a lot of that. The dual-GPU 'solution' for the 2013 was a bit ahead of its time in software terms, and fundamentally flawed in hardware. It's now more than viable, thanks to TB3 eGPU takeoff & broad OS & application support. No way are they going to drop Thunderbolt, it's the most flexible and powerful external expansion option out there. And why wouldn't certain customers want two internal GPUs and two eGPUs?
The view I'm taking with the wait.....given all the issues with the 2013, and the length of time that Apple took to acknowledge it, then the time to fix the mess they'd got themselves into...
If they were going to pull out of the workstation market, they would have done that already. Even given the tiny proportion of their business that the MP represents (even when it was the best buy on the market, back in 2006....), they want it to sell. Not merely to all those people who've been patiently waiting, that won't do enough IMHO, but (one would hope) they've considered why people have gone to hacks or jumped platform altogether, and aimed to meet those needs as well. They'll still want a new machine sooner or later....and again, given that wait, and the noise made by media and analysts every time a merely competent or in any way underwhelming update is made, rather than a spectacular one, they can't really afford to get a 'good enough' product out of the door after this time. It's got to be great, or they might as well not have bothered, and they know it. When those supply chain ducks (I think Intel and AMD graphics division, not AMD or ARM CPUs) are all in a row, it'll be ready. They can't hope to please all of us lot, but they've got to give it a go nonetheless, or the rest of us will be jumping ship....
I don’t think the dual GPU stuff was ahead of its time at all.
It works in the Windows world, does it not?
Is it just that Apple's implementation didn't gain traction?
I don't think they are that far off. They've got iPad Pros running faster than 2017 MacBook Pros with GPU performance the same as a fat Xbox One S and that's from last year. The thing is 5.9mm thin! Give them a case even half the size of the old cheese grater Mac Pro and that thing is going to melt our faces off. Imagine a 64-core ARM 5nm chip with active cooling and all the cache and desktop fixins you could ever want. I'm sure they have some crazy stuff cooking in their labs. I think the argument for me is less about whether I need Intel vs. whether I want my machine to be unable to update to the latest version of macOS after a few years. But it's all about the implementation and how backwards compatible everything is and what their upgrade strategy looks like. I know with PPC to Intel, the support period for G5 wasn't great, only like two OS X versions I think.

I just want to buy a desktop that will last me 6-8 years that I can upgrade over time. One that is fast so I don't have to wait much, will work reasonably well with files from future gear that I buy (16-bit 80MP RAWs, 8K60 footage, who knows), is able to Bootcamp into Windows to do some gaming, and supports all the creative and development software that I use. I guess Intel would probably support gaming longer, but I could always just build a PC like I used to back in the day. I've been wanting to get back into PC gaming, but the most important thing is that it work as a Mac and enable me to get my work done quickly.
Whilst you may be right, I don't believe that's an Apples-to-Apples comparison.
They need to run the same OS on each architecture as closely as possible to make it so.
I'm not sure their doing that has been documented?
 
I don’t think the dual GPU stuff was ahead of its time at all.
It works in the Windows world, does it not?
Is it just that Apple's implementation didn't gain traction?
Back then, Apple was clearly betting that OpenCL compute acceleration would take off (and it didn't), that two GPUs would make up for only one CPU (nope), and TB2 wasn't up to the job of filling the gap in the way TB3 can - though a fast GPU on TB2 will still boost performance, which says more about the design corner Apple got into. And the inability of the machine and the Mac OS to use both GPUs for graphics was plain dumb.
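
(To make the "per-app support" point concrete, here's a minimal Swift/Metal sketch of how things look today on macOS. The device names and the print-out are just illustrative; the real point is that the OS only hands an application the list of GPUs - deciding how to split work across them is entirely up to each app, which is exactly what never happened broadly on the 2013 machine.)

[CODE]
import Metal

// Ask macOS for every GPU it can see - on a 6,1 that's the two FirePros,
// plus any eGPUs hanging off Thunderbolt on newer machines.
let gpus = MTLCopyAllDevices()
for gpu in gpus {
    print(gpu.name,
          gpu.isLowPower  ? "(integrated)" : "(discrete)",
          gpu.isRemovable ? "[eGPU]"       : "")
}

// Nothing is shared automatically: each device needs its own command queue,
// and the app has to decide which work goes where.
if let device = gpus.first, let queue = device.makeCommandQueue() {
    print("Submitting work to \(device.name) via \(queue)")
}
[/CODE]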
 
Back then, Apple was clearly betting that OpenCL compute acceleration would take off (and it didn't), that two GPUs would make up for only one CPU (nope), and TB2 wasn't up to the job of filling the gap in the way TB3 can - though a fast GPU on TB2 will still boost performance, which says more about the design corner Apple got into.
I see what you mean. So it was in fact them getting it wrong, as dual GPUs are just fine.
 
The concept of dual GPU of course is fine, in fact why not get more than 2 if you want. The problem with the trashcan was that it trapped 2 mediocre GPUs inside a triangle where one side is a CPU. And the components are sourced from 2 vendors that Apple have no stake in. This was a recipe for running straight into a brick wall.
 
I see what you mean. So it was in fact them getting it wrong, as dual GPUs are just fine.

Well, “just fine” in the sense that some workstation users will benefit from multiple GPU systems. Multiple GPU is meaningless to a large percentage of the workstation market. Perhaps even a majority. Apple’s bad bet was that dual GPU would evolve to become more generally useful and therefore justified making it a requirement for all their customers. That bet obviously did not pan out for them. Just as we see with the Windows workstation market, many buyers continue to not need dual GPU for their workloads.

Nobody is saying that you, personally, are misguided for wanting or using multiple GPU. Just that Apple’s gamble that everyone would want it did not work out.
 
Multiple GPU is meaningless to a large percentage of the workstation market. Perhaps even a majority.

Allow me to disagree. Many of us are looking for powerful GPUs and - as much as I agree it is anti-economical for many of us - adding more than one with an OS that can manage them and scale processing power without requiring specific drivers or software updates, would be interesting to anybody.

Maybe Apple will succeed in doing it so that "it just works" :p and it could be a way to compensate for the lack of Nvidia options. Two AMDs can run faster than one Nvidia. You just need to pay the Apple tax. :confused:
 

The concept of dual GPU of course is fine, in fact why not get more than 2 if you want. The problem with the trashcan was that it trapped 2 mediocre GPUs inside a triangle where one side is a CPU. And the components are sourced from 2 vendors that Apple have no stake in. This was a recipe for running straight into a brick wall.
More than 2 needs a dual CPU or AMD to get the number of PCIe lanes needed.
 
More than 2 needs a dual CPU or AMD to get the number of PCIe lanes needed.
Sometimes - but many applications run fine with switched PCIe lanes. The following box supports 8 to 10 GPUs on a pair of PCIe 3.0 x16 host slots.

[attached image: oss.jpg - the GPU expansion box referenced above]


The key point is that the PCIe switches dynamically share bandwidth - if only one GPU per slot is busy, it gets full x16 bandwidth.

T-Bolt, on the other hand, throttles everything to 4 lanes, and then shares the 4 lanes.
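
(Back-of-the-envelope numbers, if anyone wants them - assuming PCIe 3.0's roughly 0.985 GB/s per lane, and the commonly quoted ~22 Gb/s PCIe-data ceiling for a TB3 link rather than the full 40 Gb/s:)

[CODE]
import Foundation

// PCIe 3.0: 8 GT/s per lane with 128b/130b encoding, roughly 0.985 GB/s per lane.
let perLaneGBs = 0.985

let x16Burst = 16 * perLaneGBs   // ~15.8 GB/s: what one busy GPU behind the switch can burst to
let x4Fixed  =  4 * perLaneGBs   // ~3.9 GB/s: a hard-wired 4-lane connection
let tb3Data  = 22.0 / 8.0        // ~2.75 GB/s: practical TB3 PCIe-data ceiling, shared by the whole chain

print(String(format: "x16 burst %.1f GB/s | x4 %.1f GB/s | TB3 ~%.2f GB/s",
             x16Burst, x4Fixed, tb3Data))
[/CODE]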
 
Back then, Apple was clearly betting that OpenCL compute acceleration would take off (and it didn't),

OpenCL specifically did not take off as well as they hoped. But to claim that GPGPU didn't tremendously increase over the next 6 years is to miss the forest for a single tree.



that two GPUs would make up for only one CPU (nope),

Err, Apple didn't say anything like that. If you look at the expansion of Nvidia's enterprise/pro business over the last 5-6 years, there are lots of customers with "wheelbarrows of money" who pretty much disagree too. Even AMD's computational GPU business ... same thing.


Did GPUs completely obviate multiple CPU socket systems? No. But the number of multiple-socket systems is going down.
"... When I spoke to a large server OEM last year, they said quad socket and eight socket systems are becoming rarer and rarer as each CPU by itself has more cores the need for systems that big just doesn't exist anymore. Back in the days pre-Nehalem, the big eight socket 32-core servers were all the rage, but today not so much, and unless a company is willing to spend $250k+ (before support contracts or DRAM/NAND) on a single 8-socket system, it’s reserved for the big players in town. Today, those are the cloud providers. ... "
https://www.anandtech.com/show/1352...x86-datacenter-class-machines-running-windows

Once both Intel and AMD mainstream desktop processors go to an 8-10 core count on average, that drop is only going to pick up speed.

Even with Intel's gross stall on 14nm, the Xeon E5 1600 (now W) class has picked up about 2 cores every 2 years: 12 in 2013 .... 18 in 2019 (14, 16, 18 over those 6 years).

There are still folks who need high x86 core counts, but that number shrinks as the "VRAM" capacity goes up and the core count on GPUs tends to highly outpace the CPU core count increases. That trend line they didn't get wrong. What they got way wrong was the power consumption along that path (which generally went UP for the GPUs and was more constrained on the CPU path).


and TB2 wasn't up to the job of filling the gap in the way TB3 can - though a fast GPU on TB2 will still boost performance, which says more about the design corner Apple got into.


There doesn't seem to be much to indicate that Apple was thinking very deeply about eGPUs at all in the 2012-2014 time frame. It really wasn't part of the standard until TBv3, and that's mostly because it has to be coordinated with the host operating systems.


And the inability of the machine and the Mac OS to use both GPUs for graphics was plain dumb.

Given the state of Apple's graphics stack and the non-stellar stability even on the Windows side, not chasing SLI/Crossfire wasn't a huge mistake. Actually, it was probably more of a mistake to mess around with that on the Windows side for the Mac Pro rather than put more resources into benching/developing a better insight into how the GPUs they had chosen ran in macOS.
 
They've got iPad Pros running faster than 2017 MacBook Pros with GPU performance the same as a fat Xbox One S and that's from last year.

No, they don't ....

They do and they don't.

"... Apple made some big claims about the A12X during its presentation announcing the product: that it has twice the graphics performance of the A10X; that it has 90 percent faster multi-core performance than its predecessor; that it matches the GPU power of the Xbox One S game console with no fan and at a fraction of the size; ..."
https://arstechnica.com/gadgets/2018/11/apple-walks-ars-through-the-ipad-pros-a12x-system-on-a-chip/

One, single-threaded drag-racing benchmarks..... If you go to the article's Geekbench 4 single-threaded score tables, there is this graph:

[chart: iPad Pro 2018 Geekbench single-core scores vs. laptops]

The 2017 MacBook Pro 15" isn't there, but you can see the Dell 2018 15 at the bottom is outclassed. If you just flip to the MBP 13" models from 2017, there is a pretty good chance they are in the range of that Dell (or worse).

But that's single threaded.

Multiple core is probably a bit of a gimmick...

[chart: iPad Pro 2018 multi-core benchmark comparison]


The gap is probably more so because this is 8 cores (A12X) versus 4, and such a short duration that the small 4 cores in the A12X don't "tap out" before the benchmark finishes. There is a decent chance that Apple has goosed the power management system to keep the low-power cores in the game just long enough to finish this tech-porn synthetic drag race, just so they can get juicy sales-spin hype out of it. That is one reason Specmarks change every several years (because the compiler flags and processors are tuned to the benchmarks).

For the long-duration, multicore workloads that many folks would try to press a Mac Pro into service for, this relatively short, 'turbo'-clock-driven drag race really doesn't count for much. Note also that where the core count balance is evened up (the 2018 MBP 15 model with 6 x86 cores versus 8 ARM cores), the ARM cores get smoked. So no ... Apple does not have anything in the ballpark at all right now.



As far as GPU goes.... that Apple is comparing the GPU to an Xbox One S isn't surprising in the context that the current AppleTV has an A10X in it. The next AppleTV probably will get an A12X. They are both boxes primarily hooked to a TV.
[Not talking about it directly, but also perhaps a tie-in with the AR glasses: https://www.macrumors.com/2019/03/08/apple-ar-glasses-launch-2020-as-iphone-accessory/

If it is a wireless connection, then a future iPad Pro seems like it would work just as well with a future update.
Apple has good enough graphics horsepower in these SoCs to augment reality. One or two process shrinks and the A12X graphics could go into the future mainstream A14-A15.]

The gimmick on the GPU marks, though, is that they are all "offscreen". A drag race that has little to do with driving an actual screen (e.g. dealing with tearing, perhaps a fixed refresh rate, etc.). In other words, not dealing with a TV, the normal deployment environment for an Xbox One S. The time duration of the benchmark ... pragmatically the same issues as the multicore measurement: power management for relatively short durations. Gaming full blast for 15 minutes isn't what is being measured here.
 
The gimmick on the GPU marks, though, is that they are all "offscreen". A drag race that has little to do with driving an actual screen (e.g. dealing with tearing, perhaps a fixed refresh rate, etc.). In other words, not dealing with a TV, the normal deployment environment for an Xbox One S. The time duration of the benchmark ... pragmatically the same issues as the multicore measurement: power management for relatively short durations. Gaming full blast for 15 minutes isn't what is being measured here.

They are measured "offscreen" because they all have to be measured at the same resolution (1080p in this scenario) for comparison's sake. You can't make a comparison if they are all performed at different resolutions. Yes, they are all driving different-resolution screens, but for comparison purposes, it has to be that way.
 
But what else would be taking Apple this long?

Clearly we aren’t getting a 5,1 or even a modified chassis with current hardware offerings. Why put 3 years of R&D into an Intel machine when Apple is heading towards ARM?

Seriously what are the options?

Is Apple heading toward taking the whole Mac line up to ARM or just some subset? Windows is only taking a subset. If Apple mimics that move, there is nothing in the Mac Pro, iMac Pro, iMac, Mini, and possibly MBP 15 space that Apple has over the next 1-3 years that is likely viable at all.

An even thinner laptop with even longer (or perhaps just as long but cheaper) battery life? Sure, they have that coming up on the roadmap. But that is largely just giving the bottom-end Mac laptops a "hand me down" A__X chip that is primarily made for the iPad Pro. Apple could do that just fine. The software stack gets more complicated (two concurrently active macOS ports) and they may lose a few low-volume, high-cost apps off the lowest-end Macs, but they may be willing to put up with that.

Apple can dump Intel at the top end of the line up over the next 2 years if AMD doesn't screw up. In the desktop space, there are two highly competitive x86 solutions at this point. If Apple was jumping in part because there wasn't multiple-vendor competition, that is changing in at least the desktop (and up) space now. Over a 5-7 year term it would probably make sense to consolidate if the "low end" ARM part becomes the dominant volume of the Mac line, but there is not really any good reason for a "big bang" switch across the entire line up in less than 12-18 months at this point.

First, it is highly unlikely Apple would be able to pull that off. They have demonstrated about zero ability to "walk and chew gum at the same time" across the whole Mac line up in a single 12-18 month period for over 3-4 years. Years! The notion that they can do something they have utterly failed at for half a decade is a huge leap in rationality. Last time, they got Transitive to complete Rosetta for them. They don't have a significant chunk of that expertise at this point either. [The notion that the slowdown over the last 2-3 years across the product line is because they have been completely distracted with an Area 51 ARM lineup for the entire Mac line up .... is probably more than wishful thinking. Occam's Razor would put Apple's sputtering incremental product updates on the Mac line up at the feet of all the other new product lines Apple has opened up over the last 4 years: cars, screen tech (for phones), web services, etc.]


Second, the relatively low volumes (versus even the Watch or iPad Pro) for much of the Mac product line up don't at all motivate Apple doing a couple of niche ARM chips. The T-series makes more sense, where every Mac would get one kind of chip. So 12M Macs per year allow those development costs to be spread over an iPad/Watch-sized number of products. Run rates under 1M don't really make much sense; they'd likely run into costs similar to the x86 versions in some cases in 2-3 years.
 
Is Apple heading toward taking the whole Mac line up to ARM or just some subset?
Apple's abandonment of the creatives and other power users over the past 10 years does seem to indicate that it is likely that the OSX computers will be demoted to ARM chips, and generally emasculated as far as power and power apps are concerned.

Killing Nvidia support was step 1 of the process. Maybe step 0 was the cluster-muck of killing OpenCL, OpenGL and adopting a "Metal-only" stance. What a clever way to kill off most of those old cMP systems.

I don't think that "taking the whole Mac line up to ARM" is the right question. Is Apple eliminating the x64 Mac line in favor of a new ARM-based line? Will it be a replacement, or a shift to an iOS based lineup that further abandons power users?

Sometime in the next two years the MP7,1 may be announced, and may give us a clue.
 
Is Apple heading toward taking the whole Mac line up to ARM or just some subset?

If the PPC to Intel transition is used as the model, and I think it will be, Apple will replace the entire lineup with Arm chips within 1 year. Apple may try to cheat and use the T2 chip as a crutch. That means the iMac Pro, MacBook Pro, Mac Mini, and MacBook Air could still be sold for a while longer, and the 2017 iMac and Macbook and MacBook Pro esc all get the shaft.
 
If the PPC to Intel transition is used as the model, and I think it will be, Apple will replace the entire lineup with Arm chips within 1 year. Apple may try to cheat and use the T2 chip as a crutch. That means the iMac Pro, MacBook Pro, Mac Mini, and MacBook Air could still be sold for a while longer, and the 2017 iMac and Macbook and MacBook Pro esc all get the shaft.

I can't see the "pro" models switching that quickly. With PPC to Intel you had Intel chips sufficiently more powerful that they could run PPC code in emulation with minimal performance hits and there was already a significant number of macOS PPC apps that had Windows Intel versions that could be (relatively) quickly ported over.

I don't see the ARM chips being comparable, much less superior, in performance to the top-end Intel products in the "pro" line and while we have iOS versions of a number of "pro" apps like Office and (now) Photoshop, it is nothing like the situation when Apple went PPC to Intel.

IMO, the MacBook will go ARM first, and probably exclusively, because the CPUs in it are low-end enough that an ARM CPU should match or beat them, so running x86 code in emulation could be viable, and I expect its "software base" is one that can be covered by iOS-equivalent apps due to the model not really being suited for significant "pro" work (and yes, I am sure there are people who do actual "pro" work on one, so I am not trying to make a blanket statement here).

Next would be the Air.

The MBP and desktops would be years away after "Marzipan" has had time to create a base for "universal" apps that can run effectively on ARM CPUs/GPUs. I'd expect the MBP to go first and then the Mini, followed by the iMac.

The iMac Pro and Mac Pro will be the final ones to go and I could easily see that not happening before mid-next-decade.
 
If the PPC to Intel transition is used as the model, and I think it will be, Apple will replace the entire lineup with Arm chips within 1 year.

The problem is that it's not a good match at all to the current situation.

First, the PPC was relatively faster than the 68K solutions at the time. Fast enough so that the overhead of the emulation brought the overall speed back to 68K levels, so during the software transition there was no backsliding. Except for some very low-end corner cases and some 64-bit at the very high end, the same thing was true for x86 versus the PPC solutions Apple was coming from. Not only were there better mobile solutions (the Mac was 'stuck' in the PPC world because there were no other customers for that other than Apple), there was a variety of desktop solutions available too.

Second, those moves were to more-than-one-vendor ecosystems. Motorola shelved their 88K "future" path for the 68K by jumping on board with IBM (and Apple) on PPC. What was supposed to happen was that a variety of systems (not just Macs) would use that as a foundation and Apple would share base platform costs with other vendors on the broader platform. The Mac all by itself didn't have the volume to move it forward. Apple holding back on the support chipset (going independent) damaged that over the long term. For example, the mobile version sputtered at the end, post killing the clones, since there was no other mobile (battery-power oriented) consumer of PPC at that point. Apple didn't want to pay the price for someone to do it, so it didn't happen.

Jumping to the broader, far more stable shared ecosystem of x86 was a much more stable picture. Mobile/laptop processors were available because there were 90% more laptops than Macs driving that development. Desktops, same thing: volumes 90% larger than the Mac's driving that development.

Those two factors are entirely at odds with the mantra that Apple wants to drive development based solely upon the volume of the Mac, primarily all by Macs themselves. That is basically a 180-degree turn. [Oh, but the iOS systems should count toward volume? iOS chips are going to skew the development in their own direction, pretty much the same way that IBM skewed PPC up to fill their workstation/server needs (at higher margins) versus trying to chase laptop CPUs, which they didn't have.] If it is a matter of Macs just getting "hand me down" chips from the iOS volume driver, then you are essentially in the same place as the PPC state with just the "fill" category swapped (PPC: have desktop, don't have laptop; versus have low-end laptop, don't have desktop).

Apple made those previous transitions to get to a more rational cost-sharing state, not for "more control just for control's sake".

If a desktop ARM vendor popped up and Macs were not the only desktop system consumer for that product, then that would be a far more plausible path. That would actually match the pattern, as it would be jumping into an ecosystem broader than just the Mac and leveraging shared R&D to offset the relatively low volume that the Mac has. It still wouldn't be a sole Apple venture. For the next 2 years or so that is probably not going to happen. There are 2-3 server-focused parties out there, and ARM is focused on filling that space with reference designs. The N1 and E1 look decent for that task (a simulated N1 is out in front of a current AMD Epyc on multiple benches there), but the skew is toward more cores and slower-than-top-end clocks (workloads for multiple concurrent users, as opposed to single-user-focused apps driven by a GUI; desktop Windows is not a major target audience).


Apple may try to cheat and use the T2 chip as a crutch.

T2 isn't a crutch. It has a role in unifying the securing of the boot process and the handling of highly sensitive user data. There is likely going to be zero move to "move" user-level apps there at all, as that would basically unwind the security if you start to loop arbitrary programs into that environment. The whole point is to have a computational environment that is free of that (fewer holes to be exploited).

T2 isn't trying to fill a primary application processor role. And it doesn't have to.


That means the iMac Pro, MacBook Pro, Mac Mini, and MacBook Air could still be sold for a while longer, and the 2017 iMac and Macbook and MacBook Pro esc all get the shaft.

Apple could probably prune off the MBA relatively soon. It is already on the Core-Y (what was Core-M) processor family that the MacBook is on. The A12X can handle one USB Type-C port now; it wouldn't be hard to extend that to two in a year or two. The MacBook isn't being shafted if the intent is to chase the "even thinner, even lighter" market. The MBA could get sucked into that same black hole. That is the one area where neither Intel nor AMD has an answer, nor are they likely to "get" one over the next 2-3 years either. Intel and AMD are far more strongly placed to have an answer in the iMac, iMac Pro, Mac Pro space. Also in the larger MBP space.

Apple selling a much higher percentage of "old" stuff for even more years than they are now? Chuckle, like that is going to work. If they don't have an ARM solution, then holding the higher-end Macs hostage is even more ridiculous than what they have done with the Mac Pro the last 3-5 years. Putting the Mac Pro into Rip Van Winkle mode has NOT worked for them at all over a broad scope. They may not have lost much money on the Mac Pro, but they have basically damaged a significant fraction of that specific user base. Folks at Apple would have to be drinking gallons of Cupertino kool-aid not to know that the path they have been on has some more-than-serious flaws (e.g., the two "dog ate my homework" meetings during the last two years' Aprils).

If Apple has nothing for the desktop space, then they would need to keep pace with the competing systems until they do. Or just quit the desktop space if they don't think it is a worthwhile space to be in (perhaps swap some of the desktop models for some "desktop replacement" units styled on the Mac laptop weight/thickness restrictions of 10 years ago).
Apple's abandonment of the creatives and other power users over the past 10 years does seem to indicate that it is likely that the OSX computers will be demoted to ARM chips, and generally emasculated as far as power and power apps are concerned.

ARM doesn't necessarily mean "demoted". Intel has been screwing up making substantive progress. AMD has shot themselves in the foot numerous times. For a significant chunk of the Mac product line up both Intel and AMD have questionable stuff in terms of competitiveness. At the iMac/Mac Pro end of the product line up it is basically almost the exact opposite case.

Creatives don't all live in the Mac Pro segment of the Mac product line up. "Apple has done nothing for creatives" is smoke. Some creatives? Yes. All creatives? Not even close.



Killing Nvidia support was step 1 of the process. Maybe step 0 was the cluster-muck of killing OpenCL, OpenGL and adopting a "Metal-only" stance. What a clever way to kill off most of those old cMP systems.

Adding support (firmware updates, software, and validation testing) for the latest OS is an extremely odd way of killing off the old cMP systems. They probably will be "killed off" in the next round of OS upgrades, but that is entirely on track with their over-decade-old Vintage and Obsolete policy (2013 + 5-to-7 = 2018-2020; 2019 is in the middle of that range).

Apple hasn't killed Nvidia support. They have simply not signed the drivers. Apple not signing software typically means that the vendor violated some rules/guidelines that Apple has laid down. Being in compliance with Apple's rules is part of the process. If Nvidia can't/won't do that, they have a very significant role in the root cause. (Nvidia playing an "embrace, extend, extinguish" game with Metal support could be a cause not to sign the drivers. They did it for OpenCL. It wouldn't be surprising at all if they were trying to do it again.)

While macOS has aspects that are independent from most iOS restrictions and strategic objectives, Metal isn't one of those.


I don't think that "taking the whole Mac line up to ARM" is the right question. Is Apple eliminating the x64 Mac line in favor of a new ARM-based line? Will it be a replacement, or a shift to an iOS based lineup that further abandons power users?

ARM as a shift to an iOS-based line up makes absolutely zero sense. Apple already has a highly successful OS on ARM. They do not need macOS volume to be added to make that successful in the slightest. iOS is perfectly capable of subsuming some macOS features with a split implementation stack (e.g., the dock coming to iPads). Apple could slightly diverge iOS on iPhone and iOS on iPad without dragging macOS into the loop at all.

Moving macOS to ARM or not should be about what the x86 solutions can/can't do for where Apple wants to take the Mac products. For "thinner and lighter" and "always network connected" mobile solutions, then yes, there are places the x86 solutions really don't travel to as well. So actually the real question is whether dogma (gotta be ARM everywhere) is driving the Mac CPU assignment decision-making process, or "right tool for the right job" is. That is a core question.

The "why" they are eliminating/switching platform is important. If it is basically irrational ( doing ths primarily just because they can) then it really isn't worth sticking around in the professional workstation space to see what they come up with. If willy-nilly is driving the decision on CPU, then it is also highly likely driving many aspects of the rest of the system too. If they have no long term plan and release a x86 CPU then it highly likely this is another Rip van Winkle system (ship it and go to sleep for another 5 years or more. )

Doing only a subset would be highly indicative that they are looking at "right tool, right job". Obsessive Compulsive Disorder (OCD) uniformity ... not so much.


Sometime in the next two years the MP7,1 may be announced, and may give us a clue.

You are likely not going to be able to clearly ascertain what Apple is doing on the broad scope of the whole Mac product line up by peephole-analyzing just one singular, narrow-market-focused product.

If the Mac Pro takes until 2020 to get out, then you can likely get a clue about the Mac Pro, not the rest of the product line. If it is early 2020, then it is somewhat likely that product will go back into Rip Van Winkle mode. Mid-2020 (or later), then highly likely. It would definitely be on a death spiral in terms of sales.
 