
SilverB

macrumors newbie
Jun 24, 2020
16
12
Damn, this is such a predicament. Resale value is a big factor when unloading thousands of dollars on Apple computers, hence why the hefty price tags can 'sometimes' be justified (in a business scenario). But I would be more worried about buying a 16" MacBook Pro now than a 2019 Mac Pro. A 16" MacBook Pro bought today would likely be completely superseded within 12-18 months by an ARM version, whereas I really don't see a new Mac Pro being released within the next 3 to 4 years. And I'm on the optimistic side of the fence about the resale value of the 2019 Mac Pro; maybe they will somehow become sought after?
Anyway, with more Intel Macs still on the way (I'm looking at you, iMac), I really wonder how new sales of Intel Macs (including the Mac Pro) will be affected. I actually think the general user neither cares nor pays any attention.
 
  • Like
Reactions: R3k

bsbeamer

macrumors 601
Sep 19, 2012
4,313
2,713
There is no reason an MBP16,1 would have to be replaced in 12-18 months. A new model will eventually arrive, like it does every year for the 15"/16" form factor. The MBP16,1 will run the latest macOS for at least another 5 years (likely more), or it can dual-boot Windows/Linux if macOS becomes a complete and total mess that does not even meet some of your needs.

Personally think any Core i9 machine is still a decent purchase right now. The MBP16,1 with i9 rivals or beats the MP6,1 in many tasks. The iMac19,1 with upgrades does more than just rival.

The Xeon machines are just different beasts with a heftier price tag. If you need a Xeon in 2020, there are distinct reasons, and the machine can likely pay for itself. The question now is whether it can pay for itself in 2-3 years. If you don't need a Xeon in 2020, look at other options.
 

1221320

Cancelled
Jun 16, 2020
69
19
This thread seems to be mostly people talking about CPU performance of ARM versus Intel versus PPC, etc (sorry, I haven't read all 24 pages!).

But what I'm interested in is the GPU performance of an ARM-based Mac Pro. What is the ARM-supporting/supported equivalent of an AMD Radeon Pro Vega II Duo? This card costs £10,000 alone in the UK Apple Store. I would be happy to be corrected, but as far as I know, there is no equivalent technology that works with ARM.

Will Apple be making their own high-end GPUs? Seems unlikely. So what will the ARM Mac Pro use?
 

teagls

macrumors regular
May 16, 2013
202
101
This thread seems to be mostly people talking about CPU performance of ARM versus Intel versus PPC, etc (sorry, I haven't read all 24 pages!).

But what I'm interested in is the GPU performance of an ARM-based Mac Pro. What is the ARM-supporting/supported equivalent of an AMD Radeon Pro Vega II Duo? This card costs £10,000 alone in the UK Apple Store. I would be happy to be corrected, but as far as I know, there is no equivalent technology that works with ARM.

Will Apple be making their own high-end GPUs? Seems unlikely. So what will the ARM Mac Pro use?

They might try to. Especially seeing how it looks like Apple is now trying to compete in machine learning not just for inference but training. I just don't see how Apple can compete with NVIDIA in that space. Not sure about AMD, but they will probably use AMD for the high end for quite some time.
 
  • Like
Reactions: 1221320

1221320

Cancelled
Jun 16, 2020
69
19
They might try to. Especially seeing how it looks like Apple is now trying to compete in machine learning not just for inference but training. I just don't see how Apple can compete with NVIDIA in that space. Not sure about AMD, but they will probably use AMD for the high end for quite some time.
Right, so they can't compete with NVIDIA, and I doubt either NVIDIA or AMD will develop ARM GPUs that are the equivalent of their x86 GPUs.

I'm quite happy to be wrong, but I don't think this transition is going to pan out how Apple intended, at least for pro users.
 

gnasher729

Suspended
Nov 25, 2005
17,980
5,566
The MacPro will be replaced with something running on ARM when Apple has an ARM processor that is beating whatever the current Intel chips are. For all the 2-8 core Macs that will happen soon. For something with 28 cores that will take longer. Or it may never happen, if Apple decides that building an ARM chip with 24 or 32 cores is not worth the money. These chips will definitely be at the end of the list, so wait two years.

But also iMac Pro and Mac Pro are a range of different computers. So Apple could build an ARM chip with 12 cores and replace _some_ of the Pro machines. Just wait.
 

teagls

macrumors regular
May 16, 2013
202
101
Right, so they can't compete with NVIDIA, and I doubt wither NVIDIA or AMD will develop ARM GPUs that are the equivalent of their x86 GPUs.

I'm quite happy to be wrong, but I don't think this transition is going to pan out how Apple intended, at least for pro users.

The GPU instruction set doesn't really matter to the host (CPU) architecture, as there isn't really an "x86" GPU. GPU instruction sets are much simpler; NVIDIA's intermediate one is called PTX, and it changes all the time. Anyway, NVIDIA has GPUs that run with ARM processors. It's really up to the driver to handle the interaction between the host (CPU) and the device (GPU).

Personally I don't think Apple will be able to or want to compete at the high-end. It will be too much time and engineering investment. Remember it's non-trivial to design & manufacture multiple CPUs for different performance levels. Unless they follow AMD chiplet design, which would still take several years of engineering and iteration.

To expand on this a bit more: NVIDIA & AMD spend millions if not billions in R&D on next-gen GPUs/CPUs. They do that because it's the fundamental core of their business. They can recoup that R&D expense by selling tons of datacenter, consumer, and other products.

Apple's pro market is tiny compared to that. Why would Apple spend loads of cash on a high-end ARM chip? They wouldn't be able to recoup the cost. This works for the iPhone/iPad/MacBook/iMac because they sell lots of those.
 

gnasher729

Suspended
Nov 25, 2005
17,980
5,566
The first thing someone does when they get their A12Z dev kit is going to be to run Geekbench on it, and there's basically no way Ifixit doesn't get someone to throw them theirs to disassemble and we'll see what sort of cooling solution there is. The proof will be in the pudding soon enough.
What I would prefer instead of Geekbench: Take a nice large project. Then compile it with Intel Xcode on an Intel Mac, compile it with Intel Xcode on an ARM Mac, and finally compile it with ARM Xcode on an ARM Mac. Probably on machines with different numbers of cores. The interesting numbers obviously: What hardware do you need to run Intel code faster on ARM, and what do you need to make ARM code run faster than Intel code on Intel.
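The normalization behind that comparison could be scripted once real timings exist; here is a minimal Python sketch, where every build time is an invented placeholder (no DTK measurements are public yet):

```python
# Sketch of the build-time comparison proposed above. All timings are
# hypothetical placeholders; no real DTK measurements exist yet.

def speedup_table(times_sec):
    """Speedup of each configuration relative to the slowest one."""
    slowest = max(times_sec.values())
    return {name: round(slowest / t, 2) for name, t in times_sec.items()}

# Hypothetical wall-clock times for one clean build of the same project:
builds = {
    "intel_xcode_on_intel_mac": 600.0,  # native x86-64 baseline
    "intel_xcode_on_arm_mac":   780.0,  # x86-64 Xcode under emulation
    "arm_xcode_on_arm_mac":     540.0,  # native arm64
}

print(speedup_table(builds))
```

The interesting ratios fall straight out: native-ARM vs. native-Intel, and emulated-Intel vs. native-Intel.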
I respect that. At this point I have a couple of mission critical iOS apps which will be nice to run on my Mac, but I admit those are the exception.
My wife would play Candy Crush on any Mac if she could. Not quite mission critical...

Seriously, I'd expect Apple to add support for mouse and menu bar to iPad OS (like an OS call "do you have a menu bar"), and calls to add items to the menubar which fail on a real iPad, tell developers to make their code run on _any_ screen size, like a 6k monitor, and iPad apps will be good enough for many things. Certainly not for all.
 

Unregistered 4U

macrumors G4
Jul 22, 2002
10,612
8,636
Personally I don't think Apple will be able to or want to compete at the high-end. It will be too much time and engineering investment. Remember it's non-trivial to design & manufacture multiple CPUs for different performance levels. Unless they follow AMD chiplet design, which would still take several years of engineering and iteration.
Considering that “time” didn’t just start on Monday and engineering investment may have been going on for years AND considering that designing and manufacturing multiple CPU’s for different performance levels is pretty much what they’ve been doing for years, I think they’re in a good place to accomplish whatever they set out to do.

Additionally, aren’t GPU’s far simpler, architecturally, than CPU’s? While there’s nothing stopping them from using an AMD GPU, I wouldn’t at all be surprised if they designed their own.
 

teagls

macrumors regular
May 16, 2013
202
101
Considering that “time” didn’t just start on Monday and engineering investment may have been going on for years AND considering that designing and manufacturing multiple CPU’s for different performance levels is pretty much what they’ve been doing for years, I think they’re in a good place to accomplish whatever they set out to do.

Additionally, aren’t GPU’s far simpler, architecturally, than CPU’s? While there’s nothing stopping them from using an AMD GPU, I wouldn’t at all be surprised if they designed their own.

They've been designing chips, sure, but those chips were for low-end devices. Look at how AMD/NVIDIA/Intel all develop their chips: they usually design from the top down. They develop the most advanced/high-end chip first, then use cut-down versions of it (usually through binning) to fill out the lower tiers. Apple has been building mobile SoCs for years. It's just not the same.

Yes, GPUs are simpler in their instruction set. Maybe not architecturally, but tell that to Intel, who has been attempting a GPU for the better part of a decade.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
I'm sure what is there on the Mac Mini dev machines is a modified version of the A12Z on the iPad. Probably more cores running on higher frequencies that work with USB and other extra bells and whistles.
So it's definitely worth benchmarking once the devs get their hands on it.

Most likely not. What Apple stuffed into a Mac Mini case is likely a 2018-era board (designed in 2017 with other 2017 components) that originally had an A12X and now has an A12Z. The Z and X are the EXACT same die. This is probably a Z.

Apple could have kludged around having just two USB ports (provisioned through to USB-C and the new keyboard, which itself probably dates back to a 2016 baseline design) by using two USB hubs: take one port and make it into two. The HDMI port is probably there because they were (and probably still are) thinking of tossing this same die into an Apple TV. Ta-da, done. Apple has a design harness for Apple TV and for macOS on ARM with minor tweaks to the boot firmware.

These boards have probably been around for close to two years. It is more cost effective for Apple to shovel something they already had.

This DTK is probably not a prototype of the first product specifically designed to be a Mac. More likely this is just a "lab" board stuffed into a handy container. (They probably have Mac Minis sprinkled around for developers to use as build/test farms, so the Mini case fits that existing infrastructure as a bonus.)

It isn't trying to be a Mac Mini other than being a container that exists. It's more like an Apple TV with really bulked-up storage and some more ports.

Apps can be built on Intel machines for an ARM target, and these boxes can just be dropped into a developer org's current Mini build farm to do bulk testing of apps. Or they can be connected by remote login to a small shop's (single developer's) host machine as a headless adjunct build-test box.

Nobody is going to own the box long term (it is a $500 lease). It is primarily just needed to run tests on builds before deployment for a while. When there is an actual Mac product, developers will have to buy one of those. It doesn't have to be an end-all-be-all development machine.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
I’m worried about the performance of ARM.
Nothing at all was mentioned in the keynote with regard to performance comparisons. The keynote just says it's great, works fast, etc. If they're so enthusiastic about the performance benefit, why not offer performance comparison charts?

There is one general stab at this. It appears in many of the articles covering this.


[Image: Apple's WWDC chart plotting performance against power consumption]



(also in https://arstechnica.com/gadgets/202...ap-for-moving-the-first-macs-away-from-intel/ )


This doesn't have precise numbers, but it very likely highlights the "spot" where they want Apple Silicon SoCs to land in the future.

There are two migrations that will probably happen based on that quadrant. Notebooks will move "up" into the current desktop performance zone; not surprising, as that is what has been happening for the last 10+ years. Desktops will mainly shift "left" (some incremental performance "up" is lit up in that chart). Here Apple gets to undo more "designed into a corner" aspects of the desktop line. For example, the Mini cranks up performance and stays the same size as it is now; the iMac likewise (or the bulge in back gets thinned out a bit, a flatter back like the XDR).

From that chart I'd expect a future ARM-based SoC for a Mac Pro to land with about the same number of high-power cores as the Mac Pro has now (capping out around 32: 28 high-power and 4 low-power). Apple leverages two years of process improvements to push performance incrementally higher, so they can have some charts with improvements, and bets more of the farm on GPUs two years in the future to do more heavy lifting (and come from a 3rd party). Not revolutionarily faster host cores, but "fast enough" on a decent set of workloads to promote upgrades (or get left behind on OS upgrades).


The problem for the Mac Pro is that the competition isn't going to move that far "left" in the power-consumption space. More power will be thrown at higher core counts and more discrete GPUs. Apple seems very intent on their SoCs being more complete (diverse) SoCs. As long as a hefty chunk of the transistor budget is allocated to GPU, there will be fewer CPU cores. The upside is that GPU and CPU share unified memory. The double-edged sword is that this pragmatically caps how big the package can be. (If you bring a really hot CPU die into extreme close proximity with a really hot GPU die, you run into issues. If you separate them to get cooling space, the shared memory becomes more non-uniform (NUMA), or just slower.)
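The power-budget tension described above can be made concrete with back-of-the-envelope arithmetic. A sketch follows, where every wattage figure is a made-up assumption for illustration, not an Apple spec:

```python
# Back-of-the-envelope core-count arithmetic for a power-capped SoC.
# Every wattage figure below is an invented assumption, not an Apple spec.

def max_big_cores(package_budget_w, gpu_share, uncore_w, per_core_w):
    """High-performance cores that fit after reserving GPU and uncore power."""
    cpu_budget_w = package_budget_w * (1.0 - gpu_share) - uncore_w
    return int(cpu_budget_w // per_core_w)

# Hypothetical desktop-class SoC: 120 W package, 40% of it reserved for the
# integrated GPU, 15 W for fabric/memory, 3 W per high-performance core.
print(max_big_cores(120.0, 0.40, 15.0, 3.0))  # → 19
```

Shrinking `gpu_share` buys CPU cores and vice versa, which is exactly the transistor/power trade-off between a "complete" SoC and raw host core count.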



They have Final cut pro running natively on beta, why not compare that with intel one?

Because this is just an iPad Pro processor from two years ago at this point. It will do OK, but it would get trashed by the Mac Pro (even more so with a Mac Pro fitted out with Afterburner doing ProRes RAW 8K work).




Does “Apple Silicon” offer true scalability for a professional platform? If ARM chips are so great and have enormous potential, the server market should have been dominated by them already. Why not?

The ARM instruction set really isn't the major issue. What Apple does with the I/O attached to the package will matter more. Graviton 2 and the upcoming Ampere Altra push into power-consumption zones that Apple is probably going to walk away from.



Anyhow, we can all see that any major 3rd party software development for intel platform is dead. Good luck to all MP buyers and owners.

Dead? Shouldn't be right now. Apple still probably has two macOS versions to get through before they start to drop Intel. If Apple even averages selling 5M x86 Macs over the next two years, that would be over 10M users that developers would be walking away from. That is bonehead-dumb business in software: walking away from that many potential paying customers.

If Apple is only doing incremental subsets of the Mac lineup over time, and the rollout is spread over 1-1.5 years to get to the 60% range of new Macs on arm64, then the bulk and inertia of the deployed x86-64 base will dominate for at least a couple of years. Developers will still need to do updates and fat-binary builds for a decent amount of time going forward.
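On the mechanics of those fat-binary builds: a universal binary is just a small big-endian header listing per-architecture slices. A sketch that parses a synthetic header; the magic number and CPU type constants are the real values from Apple's <mach-o/fat.h>, while the offsets/sizes/alignments are dummy values:

```python
import struct

# Sketch: list the architectures in a Mach-O "fat" (universal) binary header,
# the format a dual x86-64/arm64 build produces. FAT_MAGIC and the CPU type
# constants come from Apple's <mach-o/fat.h>.
FAT_MAGIC = 0xCAFEBABE
CPU_NAMES = {0x01000007: "x86_64", 0x0100000C: "arm64"}

def fat_archs(header: bytes):
    magic, nfat = struct.unpack_from(">II", header, 0)
    if magic != FAT_MAGIC:
        raise ValueError("not a fat binary")
    names = []
    for i in range(nfat):
        # Each fat_arch record: cputype, cpusubtype, offset, size, align.
        cputype, _sub, _off, _size, _align = struct.unpack_from(
            ">5I", header, 8 + i * 20)
        names.append(CPU_NAMES.get(cputype, hex(cputype)))
    return names

# Synthetic two-slice header; offsets, sizes, and alignments are dummies.
hdr = struct.pack(">II", FAT_MAGIC, 2)
hdr += struct.pack(">5I", 0x01000007, 3, 0x4000, 0x1000, 14)
hdr += struct.pack(">5I", 0x0100000C, 0, 0x8000, 0x1000, 14)
print(fat_archs(hdr))  # → ['x86_64', 'arm64']
```

The same binary serves both populations, which is why shipping for the x86-64 installed base costs developers little extra.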


Apple probably isn't going to start their termination clock on Intel until the whole lineup is converted. If that is 2.5 years out, then even with a "short" 3-year clock that is still 5.5 years out. That is pretty much the lifetime a machine bought now would normally get if Apple updated on a yearly cycle (which they haven't in a while, but it is consistent with the older cadence).
 

codehead1

macrumors regular
Oct 31, 2011
117
98
Some seem worried that once the Mac line is replaced, that's the end for an Intel Mac you might already own. Cook made it clear compiling for Intel or Apple processors would be easy. Apple has sold 18-20 million units annually over the past few years, and the installed base is estimated at 100 million.

So, why would a software developer stop producing Intel builds—at least for several years to come? And Cook said they would continue to build the OS for Intel.

A new Mac Pro in 2020 is probably going to have whatever longevity you reasonably expected of it before the Apple processors were announced.
 

Slash-2CPU

macrumors 6502
Dec 14, 2016
404
268
Definitely not buying a MP 7,1 now. Zero resale value in a couple-few years, and support and updates will end sooner than originally expected.

My 4,1 flashed to 5,1 might get replaced in the interim by a maxed-out Mini. If I see TB3/4 on the first ARM Mac, I'll buy a maxed out Intel Mini and TB3 storage devices. Later I can swap the Mini out for a 2nd generation ARM Mini or equivalent.

Definitely not buying any first generation ARM Mac. First gen Apple anything is always an expensive paid beta test.

So, why would a software developer stop producing Intel builds—at least for several years to come? And Cook said they would continue to build the OS for Intel.

A new Mac Pro in 2020 is probably going to have whatever longevity you reasonably expected of it before the Apple processors were announced.

What's possible and what's probable are two different things. With no future production of Intel Macs, why would any developer bother compiling Intel binaries of any new programs? Sure, they'll update the current ones, but I seriously doubt you'll see any .0 releases of any of the big suites for Intel Macs again. You can also forget any of the new features, capabilities, or seamless experience that will be offered by Apple ARM. Five years from now, any Intel Mac you buy today will be in the same phase of its life as the 2010 Mac Pros are now: rapidly dwindling new software support and no resale.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
....

The big question is will they use chiplets like AMD has on Epyc, Threadripper and Ryzen or will they use a monolithic die like Intel? - I suspect for the Mac Pro they will go with a chiplet approach.

Chiplets are unlikely to 'save' the Mac Pro in a core-count "war" with AMD (and/or Intel) if Apple stays 100% grounded in it being an SoC in scope (some minimal GPU + NPU + other Apple custom fixed-function logic) as the first priority, rather than a raw focus on application host-processor core count.

Apple's constant ringing of the bell on "Unified Memory" points to substantially different design priorities. More likely we get a multi-chip-module build of a CPU chip with a GPU chip sharing a common memory pool than something more monomaniacally focused, with Apple doing that just to get better yields on big blocks of the chip rather than an even more expensive monolithic die. Or using RAM system-cache chiplets to free up space on the consolidated logic chips to reach a much bigger system pool.

Apple probably isn't going to get into a core-count war (past 64 cores at all). Pretty likely they will stay in a similar count zone to where the Mac Pro currently is (the 32-core range).


If the nominal Apple Silicon SoC could feed the nominal USB4/TB4 (or the incremental future of that line) ports with video, then the entry-level Mac Pro (and rack model) could free up another PCIe slot for other stuff. Apps optimized for a uniform-memory GPU workload would run "OK", and the workload could cover something that isn't so GPU-heavy (audio work, virtual-host server work, etc.).


In a non-Mac-Pro SoC, perhaps the cell modem goes on a chiplet so those don't have to be completely coupled, while the interconnect's power consumption stays low because it is so close (short). That could serve an iMac or MBP 16" SoC where they have a higher core-count target but budget individual die sizes.
 
  • Like
Reactions: whfsdude

Unregistered 4U

macrumors G4
Jul 22, 2002
10,612
8,636
They've been designing chips sure, but those chips were for low end devices.
They’ve only been RELEASING chips for low end devices. That’s what I meant by they didn’t start yesterday. They’ve been at this for years. And consider... these particular chips that they’ve been designing AND releasing, that are intentionally meant for cellular phones and tablets, are performing right up there with Intel’s i9 series. If that’s what they’re capable of when their intent is low power consumption over performance, then the chips they‘ve been designing, yet NOT releasing must be doing far better.

They develop the most advanced/high end chip first. Then use cut down versions of it usually through binning to fill out lower tiers.
This is another area where I think Apple‘s going to prove to be quite interesting. Other companies do as good as they can on the high end, and then remove features to create solutions for the low end. This is, of course, in order to fit into all the various places they’re expected to fit into. Apple has the luxury of knowing EXACTLY what systems they’re going to be fit into and PRECISELY what code they will be expected to run. So, when they’re shooting for that high end chip, they can make a beeline for it and not have to be concerned with how they’re going to remove features for the lower end.

The only difference between the current high end iPad Pro and the low end iPad Pro is how much storage and how it connects wirelessly. They could have intentionally cut down the processor for the lower end ones,

Yes GPUs are simpler in their instruction set. Maybe not architecturally, but tell that to Intel who has been attempting a GPU for the better part of a decade.
But, in that same time, Intel has shown themselves unable to do efficient and performant designs. I see their GPU efforts as part of that :)
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
We can only hope they go with chiplets. At the very least, they could shove a lot of cores into a Mac Pro that way for folks that do need it. But they also get the yield benefits on smaller unit counts that AMD is making good use of.

Pretty good chance Apple is going to point at GPU cores for those who have massive embarrassingly parallel compute needs. Current VRAM sizes are already in the 40GB range (A100). In 2-4 years, 40GB could be well within the classic Mac Pro budget range for many.

If Apple cranks their ARM core clocks up past where Intel has the current W-3200 series and keeps the core counts roughly the same, they'd get increased performance.


They might lose some folks who are "insatiable" host-core consumers, but the Mac Pro isn't going to be everything for everybody. Same reason why it is a single-CPU package now: dual (or more) CPU packages aren't the core of the target market.

Apple will probably shovel more cores in, but far more likely they will be NPU and GPU cores of their own design than arm64 ones: a "smarter" Mac Pro for computation and a more pervasive deployment of "Metal's best target GPU".



Which is why I’m baffled they didn’t mention one thing about GPUs, PCIe and TB3 (or USB4). They could have spoken to it at least, and the DTK isn’t much help, other than pointing out the I/O limits of the A12Z. I’ll be devouring the WWDC sessions this year in part because there is so much, but also because I want to see if there are any hints in the ARM coverage.

Not baffling when they currently don't have a publicly facing answer. The DTK is an A12Z that has no substantive PCIe I/O bandwidth comparable to an iMac processor, let alone the Mac Pro's. That kind of PCIe bandwidth is probably two years out.

GPU-wise, they pragmatically did the Maya demo, which was 'good enough', on a two-year-old GPU. When the A14X lands, that will get at least incrementally better. For Mac Pro users leaning heavily on GPU horsepower, the answer is the PCIe facility, not the Apple GPU... and as pointed out above, that is year(s) away.


In short, the Mac Pro isn't the leading, primary target of the Apple Silicon design philosophy. Apple isn't going out to attack the 250-300W zone of host processors. They presented a chart to say they are trying to move away and out of that space, not build something for it. More performance out of an already generous core-count budget with constrained power consumption is their announced path. Yeah, that will put a cap on embarrassingly parallel host-core performance, but Apple is probably going to be "happy enough" to live with that cap. (20-28 cores isn't the "norm" of systems bought now. So if the current 8-16 core buyers move up to an Apple Silicon SoC with 28-32 cores in the future, that will be a core-count upgrade. That the 28-core folks "roll off the top" of the Mac Pro range is probably not a big loss in units or earnings after BOM expenses.)


I fully expect the 2 year lag was mostly farting around with trying to make it perfect rather than getting it out the door on a schedule. The insane attention to detail that allows 3rd party drive cages, logic board carrying the 12V rails instead of cables, MPX slots, fan-less GPUs, etc all scream that.

A bigger contributor is that they were not working on it in earnest until 2017; getting the iMac Pro out the door was the higher priority. The same issue is likely in play here too: getting mobile "Apple Silicon" out the door first is probably the priority, with something for the iMac Pro / Mac Pro space on the "get to it eventually" list.

And yes, Apple Silicon means they will have more control over the case design and board component layout. It will probably be even further away from a generic "box with slots" with the cheapest off-the-shelf connectors. In fact, it wouldn't be surprising if they lose the CPU socket. No Apple SoC has had a socket; highly doubtful they are going to start now (or in two years). [CPU sockets exist for selling to different board vendors and for retail off-the-shelf CPU sales. Apple is only going to build their own boards and sell the CPU SoC to nobody else, so what is the socket for?]
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
...
But what I'm interested in is the GPU performance of an ARM-based Mac Pro. What is the ARM-supporting/supported equivalent of an AMD Radeon Pro Vega II Duo? This card costs £10,000 alone in the UK Apple Store. I would be happy to be corrected, but as far as I know, there is no equivalent technology that works with ARM.

There is about zero indication that Apple is going to go after the Vega II Duo space in GPUs. Their primary intent in all their commentary is integrated GPUs on the same "System on Chip". That might technically expand to a System in Package (with maybe a couple of chip dies inside a package), but it is still a single thing that Apple places on the logic board, not a card to be inserted. Apple thinks their "special sauce" in GPUs is in how the GPU is not decoupled from the CPU: they share the same system RAM (Unified Memory), and that is an insanely great feature. (It can cover more performance ground than many folks expect it to. It also happens to be a bit more power efficient, which is another of Apple's primary goals.)

The issue is how much support they will add to the SoC for having 3-4 GPU cards inserted into a future Mac Pro. I suspect real evidence for that won't show up until well into 2021.

Conceptually, if Apple adds a PCIe v4 controller with support for 64 lanes, they could support what the Mac Pro has now in terms of open PCIe sockets and MPX bays. There would be AMD GPU cards two years into the future (and a pretty good possibility of Intel ones) to plug into that future Apple Silicon powered Mac Pro. You could even plug in that Vega II Duo if you're not "done" with it because you sank so much money into it.


Will Apple be making their own high-end GPUs? Seems unlikely. So what will the ARM Mac Pro use?

No, pretty unlikely. AMD isn't going away any time soon. Intel is trying to break into the discrete GPU business (more so on the computational side than the gaming side). Apple would probably be open to a computational GPU that got along "fanatically well" with Metal, and Intel has been a good Metal "partner" for a number of years. This would be a way for Intel to pick up revenue to offset the CPU revenue loss. Apple would be happy to have two vendors competing hard for the "big" GPU slot in a Mac Pro (an even bigger reason to sideline any GPU vendor who won't get 100% on board with Metal).

I suspect, though, that Apple will bundle a moderately sized GPU into the Mac Pro SoC. They could have an empty-slots entry model (which could get the entry-level cost down a bit; not a huge drop, but a noticeable amount. Or they could swap the loss of the discrete GPU for a non-puny SSD capacity, pragmatically lowering the normal entry buy-in cost for most users.)

Apple's GPU is going to be "best" optimized for Metal, but Apple isn't going to make it exclusive, either here on the Mac Pro or likely on the iMac Pro / 27" models. The MBP 16"-class space gets into a foggier zone. If Apple has an exclusivity objective on GPUs, it is in the Mac laptop range; there they may be shooting to push out everyone (and snag the Mac Mini in the process, and maybe an entry iMac or two).

If HBM falls a bit in price and AMD gets better at timely GPU rollouts, then perhaps they can keep the MBP 16" and iMac wins. Not so sure Intel can sneak in there, but it is somewhat possible if they vastly improve their execution in 2 years. I don't think Apple wants to do an HBM memory controller; that has zero fit with the iPadOS and iOS lines of development. They are probably going to want to punt that work to someone else who can spread the sales of those GPUs out over more systems than the shallow end of the Mac product-range pond.
 

teagls

macrumors regular
May 16, 2013
202
101
They’ve only been RELEASING chips for low end devices. That’s what I meant by they didn’t start yesterday. They’ve been at this for years. And consider... these particular chips that they’ve been designing AND releasing, that are intentionally meant for cellular phones and tablets, are performing right up there with Intel’s i9 series. If that’s what they’re capable of when their intent is low power consumption over performance, then the chips they‘ve been designing, yet NOT releasing must be doing far better.


This is another area where I think Apple‘s going to prove to be quite interesting. Other companies do as good as they can on the high end, and then remove features to create solutions for the low end. This is, of course, in order to fit into all the various places they’re expected to fit into. Apple has the luxury of knowing EXACTLY what systems they’re going to be fit into and PRECISELY what code they will be expected to run. So, when they’re shooting for that high end chip, they can make a beeline for it and not have to be concerned with how they’re going to remove features for the lower end.

The only difference between the current high end iPad Pro and the low end iPad Pro is how much storage and how it connects wirelessly. They could have intentionally cut down the processor for the lower end ones,


But, in that same time, Intel has shown themselves unable to do efficient and performant. I see their GPU efforts as part of that :)

Scaling a mobile SoC to a desktop chip is not going to be linear. It just doesn't work like that.

Cut-down versions exist mainly because of binning and yield rates. If AMD knows a rough breakdown of their yield rates, they can then create different categories of products around that with varying levels of performance. It's not an entirely new chip, and it's not about removing features. It's basically a defective chip that couldn't make it as a higher tier chip, that's all it is.

Designing a custom chip for different tiers of products would be incredibly costly and the return on R&D wouldn't make up for it.
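The binning economics described above can be sketched with a toy model: assume defects land randomly on each die (roughly Poisson-distributed), a die with zero defects ships as the top-tier part, a die with one defect ships as a cut-down part with the bad block fused off, and anything worse is scrapped. All numbers here are illustrative assumptions, not real AMD or TSMC figures.

```python
import math
import random

def bin_wafer(n_dies=500, defects_per_die=0.4, seed=42):
    """Toy binning model with Poisson-distributed defect counts.

    0 defects  -> "full" part (top tier)
    1 defect   -> "cut_down" part (defective block disabled)
    2+ defects -> "scrap"
    Parameters are illustrative, not real foundry data.
    """
    rng = random.Random(seed)
    bins = {"full": 0, "cut_down": 0, "scrap": 0}
    threshold = math.exp(-defects_per_die)
    for _ in range(n_dies):
        # Sample a Poisson(defects_per_die) count by inversion
        # (Knuth's multiplication method).
        k, p = 0, 1.0
        while True:
            p *= rng.random()
            if p <= threshold:
                break
            k += 1
        if k == 0:
            bins["full"] += 1
        elif k == 1:
            bins["cut_down"] += 1
        else:
            bins["scrap"] += 1
    return bins

print(bin_wafer())
```

With an average of 0.4 defects per die, roughly two thirds of dies come out clean, about a quarter can be salvaged as cut-down parts, and only a small fraction is scrapped, which is why selling "defective" silicon as a lower tier is cheaper than taping out a separate smaller chip.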

I don't see why Apple would do it differently. The engineers that work there are the same ones that have worked at NVIDIA/AMD/Intel. I agree that knowing the thermal envelope / cooling capacity of the final product is important, but that doesn't matter for desktops. Even so, the chip is designed before the final computer. I guarantee they built the chip first and will design cooling around it; chip design takes way too long.

As for the iPad: Apple has high yield rates via TSMC. The chip is small, and it's an SoC, so everything on it is crucial for the iPad. So if anything is really defective on the chip, it basically goes in the garbage.
 
  • Like
Reactions: ssgbryan

Unregistered 4U

macrumors G4
Jul 22, 2002
10,612
8,636
I don't see why Apple would do it differently.
If Apple, a company designing a CPU for a known range of products, is not doing something different from Intel and AMD, who make CPUs for a wide variety of unknown products, then they’re not being smart about it.

I guarantee they built the chip first and will design cooling around it. Chip design takes way too long.
Based on what we’ve seen from Apple regarding CPU design, they design the CPU to hit a specific performance/power utilization target destined for products a couple of years out. And again, if Apple’s approach doesn’t take into account that the CPU designer has extremely deep access to the feature roadmaps for both the software and hardware sides of the house, something not possible for Intel or AMD, then they’re doing it wrong.

So if anything is really defective on the chip it basically goes in the garbage.
Based on that, do you believe that Apple would do something different on non-iOS-bound processors?
 
  • Like
Reactions: Adult80HD

bsbeamer

macrumors 601
Sep 19, 2012
4,313
2,713
Before the MP7,1 was originally teased, there were reports of "strange" unidentified macOS processors being tested in very limited capacity. Some assigned those to internal AMD CPU testing and others to unreleased Intel products. Neither were proven.

IF those were Apple ARM chips on some kind of macOS desktop device, it would mean they've been trying something in very limited production for at least a year already with some sort of testing. Not saying this makes the 2-year target more realistic for all machines, but they must have figured out something meaningful in that time or it would have likely been abandoned and/or delayed.
 
  • Like
Reactions: tevion5

Krevnik

macrumors 601
Sep 8, 2003
4,101
1,312
Pretty good chance Apple is going to point at GPU cores for those who have massively embarrassingly parallel compute needs. Current VRAM sizes are already in the 40GB range (A100). In 2-4 years, 40GB could be well within the affordable range for many in the classic Mac Pro budget range.

Honestly, if Apple does this, I think it'll be the dual GPU issue all over again: them trying to skate to where they think the puck will be, only for it never to materialize. Especially when Apple also has to push the boulder up the hill with Metal for compute.

All the more reason Apple is better off giving us some idea what the end goal of an "8,1" is in advance, IMO.

Not baffling when they currently don't have a publicly facing answer. The DTK is an A12Z that has no substantive PCIe I/O bandwidth comparable to an iMac processor, let alone the Mac Pro's. That kind of PCIe bandwidth is probably two years out.

That's my whole point though. It's not a good look.

On the GPU side, they pragmatically showed it with the Maya demo, which was 'good enough', and that was on a two-year-old GPU. When the A14X lands, that will get at least incrementally better. But for Mac Pro users leaning heavily on GPU horsepower, the answer is the PCIe facility, not the Apple GPU.... and as pointed out above, that is year(s) away.

Where their GPU work lands is important compared to Intel iGPUs. If they can make Intel look bad, great.

Not what my argument was about, though. My argument is that they should be talking about the PCIe facility, which the A12Z doesn't need because it isn't a desktop processor. They should at least say something to the effect of: "Yes, we are making sure that folks using higher end GPUs and other PCIe I/O have a way forward, we just aren't ready to show that off yet."

But even for the 27" iMac, they could really benefit from having PCIe lanes for a dGPU option.

In short, the Mac Pro isn't the leading, primary target behind the Apple Silicon design philosophy. Apple isn't going out to attack the 250-300W zone of host processors. They presented a chart to say they are trying to move away and out of that space, not build something for it. More performance out of an already generous core count budget with constrained power consumption is their announced path. Yeah, that will put a cap on embarrassingly parallel host core performance, but Apple is probably going to be "happy enough" to live under that cap. (20-28 cores isn't the "norm" of systems bought now. So if the current 8-16 core buyers move up to an Apple Silicon SoC with 28-32 cores in the future, that will be a core count upgrade. That the 28 core folks "roll off the top" of the Mac Pro range is probably not a big loss in units or earnings after BOM expenses.)

Yes, I saw and understood the chart. I would be happy to have comparable performance for lower wattage. I'm not asking them to deliver a 200+W processor. What I'm saying is that going from a 4/4 design to a 28/4 design isn't just "slap more cores" and call it a day. Adding desktop-class PCIe and memory controllers requires extra die space as well. It all adds more complexities as the die size grows. It's a substantial jump from the types of SoCs they've made to date, and it'd just be nice for them to demonstrate that they intend to actually land there, and that they have a good shot at doing it.
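The die-size concern above can be made concrete with the classic Poisson yield model, Y = exp(-A * D0): at a fixed defect density, yield falls exponentially with die area, which is one reason growing a mobile-sized die into a workstation SoC with extra cores, PCIe, and memory controllers is not a free lunch. The die areas and defect density below are rough illustrative assumptions, not actual TSMC figures.

```python
import math

def poisson_yield(area_mm2: float, defect_density_per_cm2: float) -> float:
    """Classic Poisson yield model: Y = exp(-A * D0).

    area_mm2: die area in mm^2 (converted to cm^2 inside)
    defect_density_per_cm2: defects per cm^2 (process-dependent)
    """
    return math.exp(-(area_mm2 / 100.0) * defect_density_per_cm2)

# Illustrative die sizes (assumptions, not official figures): a
# tablet-class SoC vs. a workstation-class SoC with many more cores
# plus desktop PCIe and memory controllers on the same process.
D0 = 0.1  # assumed defects/cm^2 on a mature process
for name, area_mm2 in [("tablet-class SoC, ~120 mm^2", 120),
                       ("workstation-class SoC, ~600 mm^2", 600)]:
    print(f"{name}: yield ~ {poisson_yield(area_mm2, D0):.0%}")
```

Under these assumed numbers, a 5x larger die drops yield from roughly 89% to roughly 55%, so every defect-free large die costs substantially more, on top of the extra design and validation complexity the post describes.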

I mean, Apple's basically claiming that they can go from the A12Z to workstation class SoCs in 24 months. If they actually can. Great. But this isn't like the Intel transition where the whole lineup they wanted pre-existed, and Intel's roadmap was not exactly a secret. Here we've only got Apple's word that they'll deliver. Apple's staking a lot on this, and I'd rather not see it become another AirPower issue where they announced too quickly.

A bigger contributor is that they were not working on it in earnest until 2017; getting the iMac Pro out the door was a higher priority. The same issue is likely in play here too: getting mobile "Apple Silicon" out the door first is probably the priority, with something for the iMac Pro / Mac Pro space on the "get to it eventually" list.

It still took them 2 years from announcing they were going to change direction from the trash can to putting out the demo at WWDC last year. That's the lag we were talking about.

I'm not even arguing against what you mention about the 2017 start date. But it's not a 2 year cycle to get something out the door unless you are hell bent on customizing the engineering design like Apple did with the 7,1. In some ways it was worth it, but in this case it was clearly a case of perfect being the enemy of the good, adding time to the project.
 

Unregistered 4U

macrumors G4
Jul 22, 2002
10,612
8,636
I mean, Apple's basically claiming that they can go from the A12Z to workstation class SoCs in 24 months.
That’s not what they’re claiming, though. They didn’t say, “Hey, we’ve got this 2 year old chip here today... so, yesterday Craig said we should probably replace our entire line with chips like these. So, we’re starting here. Today...TO GIVE IT OUR BEST SHOT FOR THE NEXT TWO YEARS!!” :)

I mean, they didn’t even get where they are in 24 months.
 
Register on MacRumors! This sidebar will go away, and you'll see fewer ads.