
vigilant

macrumors 6502a
Aug 7, 2007
715
288
Nashville, TN
The successor to the T2 is Apple Silicon. Unless they brand the SoC as T3, you're probably not seeing that. You might see a different co-processor designed for the Mac Pro, but even then, I'm pretty skeptical.

Trust me, I “hear you”. I’ve enjoyed reading your responses.

I think a case can be made that keeping a separate chip around for fixed purposes is still in the cards.

It does make complete sense for everything to be on one chip, but it also makes sense not to.
 

theorist9

macrumors 68040
May 28, 2015
3,880
3,060
Things like HBM and LPDDR are fundamentally not user-upgradeable, since they don't come as slotted RAM. For the same reason, I don't think it makes much sense to talk about cost; it's not something you can buy as an end customer anyway. The RAM itself is more expensive, but there are additional factors as well that increase cost.

I believe that RAM cost is not a factor for Apple, since their computers are already expensive. Apple could afford to use HBM at the current price point, especially since Apple Silicon would save them some money. Then there is the factor of product differentiation. Apple offering HBM in their computers would cement their premium status and offer new performance heights in this customer segment. And they can absorb the costs by making their own silicon and streamlining the model range. It's simply not possible in the PC world, where components are sold for profit.

And finally, using high speed RAM will “finally” provide the definitive justification for soldered on memory. If you are using regular old DDR, users will reasonably complain. If you are using something that’s 5-6 times faster... then it suddenly sounds like a much better tradeoff.

Thanks!
The one problem I see with that is that one of the Mac Pro's big selling points is it gave the pros the modularity they wanted -- as workloads change, they can upgrade the capabilities of their machine.

Since the HBM would be tied to the CPU and GPU, and since it's not user upgradable, would that make the CPU and GPU likewise not user upgradable?
 
  • Like
Reactions: Sarajiel

Jorbanead

macrumors 65816
Aug 31, 2018
1,209
1,438
Trust me, I “hear you”. I’ve enjoyed reading your responses.

I think a case can be made that keeping a separate chip around for fixed purposes is still in the cards.

It does make complete sense for everything to be on one chip, but it also makes sense not to.

They stated in their keynote that their Mac SoCs will have a Secure Enclave inside, which means they don't need a separate security chip.
 

Sarajiel

macrumors newbie
Aug 12, 2020
18
10
I expect 6 more years of no Mac Pro updates like with the trash can.
"...Apple will continue to support and release new versions of macOS for Intel-based Macs for years to come, and has exciting new Intel-based Macs in development."

Considering that it doesn't make much sense to have two competing CPU architectures for consumer-oriented products, this line from the press release reads more like a promise to their enterprise customers like Pixar/Disney that they won't be forced onto a specific architecture for some time. It's not out of the question that Apple keeps releasing workstation-class systems with new Xeon CPUs in the future for select customers, systems that aren't available to the wider audience.


I am hoping for a Trashcan V2.0 in a return of the Cube:

Mac Pro Cube - starting at US$5,999.00

48 P cores / 4 E cores / 96 GPU cores - CPU / GPU Chiplets & RAM on interposer / System in Package (SiP) design
HBM3 Unified Memory Architecture - 128GB / 256GB / 512GB
NVMe RAID 0 (dual NAND blades) 4TB / 8TB / 16TB
Eight USB4 / TB4 ports
Two 10Gb Ethernet ports
One HDMI 2.1 port
Three MPX-C slots (for use with asst. MPX-C expansion modules)

RAID 0 in a "Pro" desktop machine is utter nonsense. :eek: While I'm fully aware that a number of people who earn money with their machines, and as such match the definition of a professional, might use RAID 0 on a system drive, it's something that should never, ever be done on your system or data drive, even if you have backups. It's okay for scratch drives, but everything else is close to committing business suicide.
Anyone who actually understands the technology will usually want a RAID 1 setup for system drives and will most likely opt for something like RAID 5, 6, etc. for data storage in environments that typically run 24/7. The same also applies to ECC RAM, and it's really a shame that Apple didn't try to pressure Intel into releasing more non-Xeons with that feature.
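A back-of-the-envelope sketch of why this matters, assuming each drive fails independently within some period with a made-up probability (the 3% figure below is purely illustrative, not a real AFR):

```python
# RAID 0 stripes data across all drives: losing ANY one drive loses everything.
# RAID 1 mirrors data: you only lose data if ALL copies fail.
def raid0_fail(p: float, n: int) -> float:
    """Probability a RAID 0 array of n drives loses data."""
    return 1 - (1 - p) ** n

def raid1_fail(p: float, n: int) -> float:
    """Probability a RAID 1 mirror of n drives loses data."""
    return p ** n

p = 0.03  # assumed per-drive failure probability (hypothetical)
print(raid0_fail(p, 2))  # ~0.0591 — striping roughly doubles the exposure
print(raid1_fail(p, 2))  # ~0.0009 — mirroring shrinks it by over an order of magnitude
```

Parity schemes like RAID 5/6 sit between these extremes, surviving one or two drive failures while still using most of the raw capacity.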

While I wouldn't be surprised to see some form of Mac Cube with ARM CPUs, it's probably a better fit for a mid-range machine between the Mini and the Pro. The current Mini doesn't make much sense for consumers and is too limited for many corporate customers, as it's targeted at a very narrow and specific market like software developers, data wranglers, and small/scale-out servers without enterprise needs.

After the Trashcan disaster that alienated quite a number of corporate customers, I'd bet that Apple will most likely use the same or a very similar chassis for the AS Mac Pro. Maybe they change the color to space gray and adjust the layout of some ports/connectors, but overall the machines will be very similar from the outside and might even have some interchangeable parts like various MPX carriers or even complete graphic options for multi-GPU.
If you ever had to deal with rack-mounted Trashcans on a regular basis, you will understand why people, especially many system administrators, are still angry.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,674
The one problem I see with that is that one of the Mac Pro's big selling points is it gave the pros the modularity they wanted -- as workloads change, they can upgrade the capabilities of their machine.

Since the HBM would be tied to the CPU and GPU, and since it's not user upgradable, would that make the CPU and GPU likewise not user upgradable?

We can only speculate at this point, but I envision the next Mac Pro as a collection of multiple extension boards, connected with a very fast interface (not PCIe), where each board can host multiple CPUs/GPUs and its own RAM. The entire computer would then operate like a NUMA architecture with unified memory, with some devices being "closer" to each other than others.
 
  • Haha
Reactions: Sarajiel

leman

macrumors Core
Oct 14, 2008
19,521
19,674
RAID 0 in a "Pro" desktop machine is utter nonsense. :eek: While I'm fully aware that a number of people who earn money with their machines, and as such match the definition of a professional, might use RAID 0 on a system drive, it's something that should never, ever be done on your system or data drive, even if you have backups.

While you are fully correct, I wanted to point out that any Mac using an SSD is essentially running a setup not unlike RAID 0. After all, an SSD is made up of multiple storage chips being written to in parallel.
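The parallelism inside an SSD can be pictured as RAID 0-style striping: writes are dealt out in fixed-size chunks, round-robin, across the NAND channels. A toy sketch of the idea (not any real controller's firmware; the chunk size and channel count are made up):

```python
# Deal `data` into fixed-size chunks round-robin across `channels` lanes,
# the way RAID 0 (and, conceptually, an SSD controller) parallelizes writes.
# Note the RAID 0 property: losing any one lane loses the whole volume.
def stripe(data: bytes, channels: int, chunk: int = 4):
    lanes = [bytearray() for _ in range(channels)]
    for i in range(0, len(data), chunk):
        lanes[(i // chunk) % channels] += data[i:i + chunk]
    return [bytes(lane) for lane in lanes]

print(stripe(b"ABCDEFGHIJKLMNOP", 2))  # [b'ABCDIJKL', b'EFGHMNOP']
```

Each "lane" can be written simultaneously, which is where the bandwidth comes from; the reliability cost is that every lane is now a single point of failure.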
 
  • Like
Reactions: vigilant

Boil

macrumors 68040
Oct 23, 2018
3,477
3,173
Stargate Command
The one problem I see with that is that one of the Mac Pro's big selling points is it gave the pros the modularity they wanted -- as workloads change, they can upgrade the capabilities of their machine.

Since the HBM would be tied to the CPU and GPU, and since it's not user upgradable, would that make the CPU and GPU likewise not user upgradable?

If Apple has to have a line-up of Mac Pro desktops; Cube, xMac, Mac Pro...

Cube as above, with short MPX-C expansion cards

xMac is a smaller tower version of the current Mac Pro, Cheesegrater V2.0 & all (Cube could be like that as well), but with fewer PCIe slots; maybe one MPX slot (which is also two PCIe slots) & one PCIe slot...?

Mac Pro is what we got now, but with HEDT / HPC SoC (SiP?) & Apple (GP)GPUs...

Considering that it doesn't make much sense to have two competing CPU architectures for consumer-oriented products, this line from the press release reads more like a promise to their enterprise customers like Pixar/Disney that they won't be forced onto a specific architecture for some time. It's not out of the question that Apple keeps releasing workstation-class systems with new Xeon CPUs in the future for select customers, systems that aren't available to the wider audience.

From the Apple press release:

"Apple plans to ship the first Mac with Apple silicon by the end of the year and complete the transition in about two years. Apple will continue to support and release new versions of macOS for Intel-based Macs for years to come, and has exciting new Intel-based Macs in development."

This clearly states that they are talking about software, not about releasing new Intel-based hardware beyond the end of the transition to Apple silicon. We MIGHT see the iMac Pro updated with Navi / RDNA2 based GPUs; we should see the Xeon-powered Mac Pro updated with RDNA2 GPUs, maybe a spec bump to the CPU similar to what they did with the iMac Pro recently...?

While I would expect a few of the workstations to be at Disney / Pixar for evaluation & such, I think they are using Xeons & Quadros & Linux of some sort, plus all their custom software that us mere mortals will never navigate...?

RAID 0...

I was under the assumption that was what Apple was doing with the iMac Pro & Mac Pro as of late...?

I am familiar with differing RAIDs; dealing with SCSI RAIDs for video editing in the pre-OS X days, running an Intel Mac Pro Server with internal RAID 5 for an architectural office, hanging a 4-port USB hub off my Unibody MacBook with mismatched USB thumb drives so I could play WoW while waiting to replace my failed internal HDD... Good times...! ;^p

While I wouldn't be surprised to see some form of Mac Cube with ARM CPUs, it's probably a better fit for a mid-range machine between the Mini and the Pro. The current Mini doesn't make much sense for consumers and is too limited for many corporate customers, as it's targeted at a very narrow and specific market like software developers, data wranglers, and small/scale-out servers without enterprise needs.

After the Trashcan disaster that alienated quite a number of corporate customers, I'd bet that Apple will most likely use the same or a very similar chassis for the AS Mac Pro. Maybe they change the color to space gray and adjust the layout of some ports/connectors, but overall the machines will be very similar from the outside and might even have some interchangeable parts like various MPX carriers or even complete graphic options for multi-GPU.
If you ever had to deal with rack-mounted Trashcans on a regular basis, you will understand why people, especially many system administrators, are still angry.

Space Grey should have been an option from the start...!

A Cube is what my heart wants, but a xMac (see my other post somewhere in the thread?) would appeal more to the budget-minded pros who still need real PCIe slots (not some MPX-C bull)...
 
  • Like
Reactions: Sarajiel

Sarajiel

macrumors newbie
Aug 12, 2020
18
10
While you are fully correct, I wanted to point out that any Mac using an SSD is essentially running a setup not unlike RAID 0. After all, an SSD is made up of multiple storage chips being written to in parallel.

I probably should have made it a bit clearer that I was talking about workstation-class machines. I guess many people who never worked in a corporate environment don't get why there are two sockets for NAND blades in the Mac Pro.
Most server-oriented SSDs also add internal ECC protection or use an internal RAID 5 configuration of the flash chips, if I remember correctly.
 

Sarajiel

macrumors newbie
Aug 12, 2020
18
10
From the Apple press release:

"Apple plans to ship the first Mac with Apple silicon by the end of the year and complete the transition in about two years. Apple will continue to support and release new versions of macOS for Intel-based Macs for years to come, and has exciting new Intel-based Macs in development."

This clearly states that they are talking about software, not about releasing new Intel-based hardware beyond the end of the transition to Apple silicon. We MIGHT see the iMac Pro updated with Navi / RDNA2 based GPUs; we should see the Xeon-powered Mac Pro updated with RDNA2 GPUs, maybe a spec bump to the CPU similar to what they did with the iMac Pro recently...?

I think that the complete statement (the whole sentence) is a bit convoluted and ambiguous on purpose, to allow Apple a way out if they struggle with workstation-class or more powerful hardware.
Nothing "exciting" about the 27" iMac release or a possible spec bump of the still-missing MBP 16". Anything workstation-related, with big numbers the average Joe doesn't really understand, fits the description of "exciting" much better. It's also the type of delay Apple can easily hand-wave away, since the average consumer doesn't really care.
Maybe it's also just me, reading this marketing statement differently than a native speaker. o_O

The iMac Pro is only there for price discrimination, and a few people who need replacements or backup machines for their old turn-key systems that used them originally.

New GPUs for the Mac Pro are probably just a silent update after a press release. I'm not really following Intel's Xeon mess, but do they even have any new CPUs in the Xeon-W segment that are actually shipping? The rebranded extreme editions that were announced earlier this year are still nowhere to be seen from what I've noticed between all the AMD Epyc/Threadripper stuff that usually pushes news about Xeons aside.

I am familiar with differing RAIDs; dealing with SCSI RAIDs for video editing in the pre-OS X days, running an Intel Mac Pro Server with internal RAID 5 for an architectural office, hanging a 4-port USB hub off my Unibody MacBook with mismatched USB thumb drives so I could play WoW while waiting to replace my failed internal HDD... Good times...! ;^p
Hahahaha, I like that USB RAID.

The RAID 0 thing is just one of the few things that really, really annoys me, because it has such a stupid gamer vibe to it. Maybe I just had to deal with too many interns or film students who asked me why that 40 TB SAN storage with 6x internal RAID 6s connected to my Resolve workstation with multiple GPUs doesn't feel as snappy as their gaming PC... Better not ask me about my opinion of RGB LEDs. :cool:

Personally, I'd love to see a properly designed and well-thought-out cube as an alternative to the run-of-the-mill 2000 Euro gaming PC that is better at productivity stuff than actual gaming.

edit: spelling
 
Last edited:

Boil

macrumors 68040
Oct 23, 2018
3,477
3,173
Stargate Command
The iMac Pro is only there for price discrimination, and a few people who need replacements or backup machines for their old turn-key systems that used them originally.

Someone posted in a thread that the new 27" Intel iMac (Last Of Them Edition) got the Nano-texture option while the iMac Pro did not, as a sign the iMac Pro was not long for this world, and the 'spec shuffle' it just got kinda points to that as well...

New GPUs for the Mac Pro are probably just a silent update after a press release. I'm not really following Intel's Xeon mess, but do they even have any new CPUs in the Xeon-W segment that are actually shipping? The rebranded extreme editions that were announced earlier this year are still nowhere to be seen from what I've noticed between all the AMD Epyc/Threadripper stuff that usually pushes news about Xeons aside.

I want to say I have heard of (but not actually looked into) newer Xeons that would work in whatever socket the iMac Pro has, but there is no planned upgrade path beyond a more cores option of what is in the Mac Pro...? But see above in regards to the iMac Pro...

Hahahaha, I like that USB RAID.

Had to raid (on my bastard RAID) regularly to get my legendary daggers on my rogue, 26 weeks of PuGfest glory...! ;^p

The RAID 0 thing is just one of the things that really, really annoys me, because it has that stupid gamer vibe to it. Maybe I just had to deal with too many interns or film students who asked me why that 40 TB SAN storage with 6x internal RAID 6s connected to my Resolve workstation with multiple GPUs doesn't feel as snappy as their stupid gaming PC... Better not ask me about my opinion of RGB LEDs. :cool:

Personally, I'd love to see a properly designed and well-thought-out cube as an alternative to the run-of-the-mill 2000 Euro gaming PC that is better at productivity stuff than actual gaming.

I would love to see something like a modern-day high-end Apple-ified version of a SGI O2 (Cube/xMac) or Octane (xMac/Mac Pro) type workstation, and publicized proof of Disney/Pixar/ILM/some tier 1 animation/effects house(s) "making the transition"... But right now most of those folks are rocking Xeons & Quadros while running Linux, with highly specialized & proprietary software... Apple needs to make (buy) some 3D software & renderer (at least license a renderer?) & roll it into a Digital Content Creation suite, with Final Cut Pro X & Logic Pro X & revive Phenomenon (Shake), stuff like that...

All highly integrated with all of the ASICs & FPGAs & DSPs & GPUs & GPGPUs & Neural Engines & stuff from Apple silicon; making the best workstation for everything...!
 

vigilant

macrumors 6502a
Aug 7, 2007
715
288
Nashville, TN
The only reason the T2 is separate on Intel Macs is that Apple has to put these components somewhere. There is no reason for them to be in a separate chip once Apple Silicon ships. They will most certainly be part of the SoC.

Completely get that. I'm just thinking out loud.
 

Sarajiel

macrumors newbie
Aug 12, 2020
18
10
I would love to see something like a modern-day high-end Apple-ified version of a SGI O2 (Cube/xMac) or Octane (xMac/Mac Pro) type workstation, and publicized proof of Disney/Pixar/ILM/some tier 1 animation/effects house(s) "making the transition"... But right now most of those folks are rocking Xeons & Quadros while running Linux, with highly specialized & proprietary software... Apple needs to make (buy) some 3D software & renderer (at least license a renderer?) & roll it into a Digital Content Creation suite, with Final Cut Pro X & Logic Pro X & revive Phenomenon (Shake), stuff like that...

I think that train left the station a very long time ago. After completely killing off Shake, and that infamous Final Cut update when Apple decided that EDLs weren't needed anymore for professional film editing and negative cutting, pretty much anyone in high-end post-production would start looking for alternative software if Apple were to acquire the likes of Autodesk or their smaller competitors.
In the company I worked for at the time, we basically decided the day after the Trash Can was revealed that we wouldn't buy any of those in the future, and ordered a couple of maxed-out cheese graters from our retailer to bridge the gap to something better. We later got some Trash Cans with turnkey systems like Resolve Studio, but that was basically torture for the admins and the operators. That something better ended up being an HP Z-series workstation running Windows 7 or Red Hat Linux in most cases, like in so many other companies in the industry.
I'm not even mentioning that the talented developers would most likely run as fast as they could as well if they got bought by Apple. They already do if they work for companies bought by Autodesk. If it weren't for ProRes, nobody at the high end would care much about Apple except the audio guys, but I think some of them still haven't found replacements for their old G4s running Pro Tools.

[rant]
I don't know if you ever worked in anything professional-cinema-, broadcast-TV- or post-production-related, but if you did, you probably would have noticed at the Mac Pro presentation how little Apple and its marketing department actually understand professional needs in this sector. Especially the comparisons between a reference display and the Pro Display XDR were absolutely cringeworthy. That made those silly little joke comparisons between iPad Pros and $500 cheapo laptops by Phil Schiller look like scientifically researched market studies. To be completely honest, the whole Mac Pro, the P3 color gamut stuff, and even the current Mac Mini lineup feel like Pixar, DreamWorks and others who actually still heavily use macOS in many areas sent some angry letters with a wishlist to Tim Cook, who forwarded them to R&D with a note to make it happen.
[/rant]

Btw, if you are interested why high end 3D CGI is so heavily invested into Linux and OpenGL, look at the histories of old Silicon Graphics Inc., Nvidia and Autodesk.

There is basically no way Apple could capture the interest of companies doing large-scale 3D CGI for digital cinema today. It would require listening to very specific hardware and software demands, acting on them, and embracing open* systems, while at the same time shaving costs down for both to the absolute minimum.
Probably the biggest hurdle would be that Tim Cook would have to work with Leatherjacket, who actually seems to understand these very specific customers.
Almost all film studios or post-production facilities use the best tool for the job regardless of hardware platform, which leaves Apple typically in the audio departments, partially with video editing, 2D compositing, and color-grading within the highly optimized production pipelines.


*open mostly as in extendable or expandable, not so much as in free and open source
 
Last edited:
  • Like
Reactions: jinnyman

Jorbanead

macrumors 65816
Aug 31, 2018
1,209
1,438
Yeah, but the T2 is more than just security.

Their Mac SoC’s handle everything the T2 chip does including coprocessing, hardware acceleration, audio processors, etc. so it would be redundant. Going by the info they gave us in both the keynote and the platforms state of the union videos.
 
  • Like
Reactions: chabig

Jorbanead

macrumors 65816
Aug 31, 2018
1,209
1,438
Is it possible that Apple could stick with intel Xeon JUST for the Mac Pro line?

I think they would have been very clear if the Mac Pro weren't switching, so professionals knew and didn't have to worry about x86 apps not working. Plus, that would require them to always make an x86 and an ARM version of everything, which they will only do for a few years.

The thing people keep forgetting is that Apple has been working on this transition for years behind the scenes, and they wouldn't have announced a transition unless they were ready to take over their entire lineup and they knew definitively that their chips could beat out the competition.
 
  • Like
Reactions: Nütztjanix

Boil

macrumors 68040
Oct 23, 2018
3,477
3,173
Stargate Command
I think that train left the station a very long time ago. After completely killing off Shake, and that infamous Final Cut update when Apple decided that EDLs weren't needed anymore for professional film editing and negative cutting, pretty much anyone in high-end post-production would start looking for alternative software if Apple were to acquire the likes of Autodesk or their smaller competitors.
In the company I worked for at the time, we basically decided the day after the Trash Can was revealed that we wouldn't buy any of those in the future, and ordered a couple of maxed-out cheese graters from our retailer to bridge the gap to something better. We later got some Trash Cans with turnkey systems like Resolve Studio, but that was basically torture for the admins and the operators. That something better ended up being an HP Z-series workstation running Windows 7 or Red Hat Linux in most cases, like in so many other companies in the industry.
I'm not even mentioning that the talented developers would most likely run as fast as they could as well if they got bought by Apple. They already do if they work for companies bought by Autodesk. If it weren't for ProRes, nobody at the high end would care much about Apple except the audio guys, but I think some of them still haven't found replacements for their old G4s running Pro Tools.

[rant]
I don't know if you ever worked in anything professional-cinema-, broadcast-TV- or post-production-related, but if you did, you probably would have noticed at the Mac Pro presentation how little Apple and its marketing department actually understand professional needs in this sector. Especially the comparisons between a reference display and the Pro Display XDR were absolutely cringeworthy. That made those silly little joke comparisons between iPad Pros and $500 cheapo laptops by Phil Schiller look like scientifically researched market studies. To be completely honest, the whole Mac Pro, the P3 color gamut stuff, and even the current Mac Mini lineup feel like Pixar, DreamWorks and others who actually still heavily use macOS in many areas sent some angry letters with a wishlist to Tim Cook, who forwarded them to R&D with a note to make it happen.
[/rant]

Btw, if you are interested why high end 3D CGI is so heavily invested into Linux and OpenGL, look at the histories of old Silicon Graphics Inc., Nvidia and Autodesk.

There is basically no way Apple could capture the interest of companies doing large-scale 3D CGI for digital cinema today. It would require listening to very specific hardware and software demands, acting on them, and embracing open* systems, while at the same time shaving costs down for both to the absolute minimum.
Probably the biggest hurdle would be that Tim Cook would have to work with Leatherjacket, who actually seems to understand these very specific customers.
Almost all film studios or post-production facilities use the best tool for the job regardless of hardware platform, which leaves Apple typically in the audio departments, partially with video editing, 2D compositing, and color-grading within the highly optimized production pipelines.


*open mostly as in extendable or expandable, not so much as in free and open source

Mid-1990's I was the IT guy & doing captures & rough edits on a Media 100 system (beta decks & component I/O)

Late 1990's I did a bit of consulting for a tiny boutique VFX shop; started with a few Indys & Indigos & a deskside Onyx; that turned into a fridge-sized Onyx feeding six workstations; then a couple of O2s & Octanes, then the little over-equipped studio imploded... The costs of hardware & software (looking at you, alias|wavefront) at the time were ridiculous...!

Got a Power Computing PowerTower Pro & a license of EIAS around the same time, eventually lost my dongle in a divorce move, never repurchased...

Early 2000's I was the IT guy & SketchUp jockey for an architectural firm (had an Intel Mac Pro Server, four Intel iMacs, & an Intel Mac mini in there, had the principals looking at ArchiCAD on OS X), but they liked their Windows & AutoCAD...

Now my interests are in Cinema4D & Octane X, and I am hoping that whatever hardware Apple creates to bolster FCPX/LPX software performance (DSPs/FPGAs/ASICs/Neural Engines/GPGPUs/etc.) can also be taken advantage of by the DaVinci software suite...!

Is it possible that Apple could stick with intel Xeon JUST for the Mac Pro line?

Highly doubtful. The Cheesegrater V2.0 chassis will stick around for a while, but the internals will be Apple silicon after the two-year transition is complete. The bigger mystery is what Apple will do regarding high-end GPUs for the Pro market...
 

vigilant

macrumors 6502a
Aug 7, 2007
715
288
Nashville, TN
The iMac Pro is only there for price discrimination, and a few people who need replacements or backup machines for their old turn-key systems that used them originally.

New GPUs for the Mac Pro are probably just a silent update after a press release. I'm not really following Intel's Xeon mess, but do they even have any new CPUs in the Xeon-W segment that are actually shipping? The rebranded extreme editions that were announced earlier this year are still nowhere to be seen from what I've noticed between all the AMD Epyc/Threadripper stuff that usually pushes news about Xeons aside.
I expect the iMac Pro to be around for a while. Even with AS.

Intel seems to be really sweating their Xeons for longer than normal. I don't have reason to believe they have anything new, especially since Intel has a tendency to build a new Xeon only after they've released several really buggy laptop chips and iterated on them a few times.
 

Boil

macrumors 68040
Oct 23, 2018
3,477
3,173
Stargate Command
I expect the iMac Pro to be around for a while. Even with AS.

Intel seems to be really sweating their Xeons for longer than normal. I don't have reason to believe they have anything new, especially since Intel has a tendency to build a new Xeon only after they've released several really buggy laptop chips and iterated on them a few times.

Speculation had the iMac Pro dying off because they did not offer the Nano texture on it when they spec shuffled the CPU...

But there are supposed to be a newer line-up of Xeon-W CPUs that will work with the socket/chipset in the iMac Pro, so maybe Apple is just waiting for Big Navi to drop the last of the Intel iMac Pros, featuring RDNA2 GPUs & new Xeon-W CPUs, with a Nano texture option; I mean, wouldn't you hate to be the guy who bought a spec shuffled iMac Pro WITH a Nano texture option, and then Apple drops a SWEET new iMac Pro with RDNA2 GPUs...?!? And Apple decides to be really nice to their Pro users & includes a RAM access panel...! ;^p

I am gonna speculate on other possible The Last of Us Intel Macs now...

16" MacBook Pro brought up to 10th gen i9s & RDNA2 GPUs...

Mac Pro spec shuffled on CPU end (no upgrade Xeon-Ws for that socket, dead socket from day one?) & RDNA2 GPUs added as BTO options...

Mac mini updated to appropriate 10th gen Intel CPUs, maybe low-end RDNA2 (or low-end RDNA1 refresh) GPUs added as BTO option...?

After that, everything is Apple silicon; frak you, Intel...!
 
Last edited:

Boil

macrumors 68040
Oct 23, 2018
3,477
3,173
Stargate Command
the current Mac Pro Design makes no sense for Apple Silicon imo

Well, a lot of that space is the PCIe slots, and I doubt those will go away. Apple-designed discrete GPUs or AMD GPUs aside, there are plenty of other things Apple could make to fill those slots. They currently have their Afterburner card, which is an FPGA for a specific video codec; Apple could make other cards with assorted ASICs / DSPs / Neural Engine / GPGPU functions? And there are always the Pro Tools dudes with their need for PCIe slots.

But I also just watched this:


Now that dual-socket logic board, that was interesting, as was the water cooling. A logic board like that, extended for PCIe slots, would need the space of the Cheesegrater V2.0 chassis. And it could still use the same cooling: high static pressure fans up front pushing through massive 'wide wale' heatsinks...
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
I am 98% certain. Apple writes everything down to the firmware on the cards, to the drivers in macOS, and that is something NVidia simply doesn't allow. I say 98% certain because I'm sure there's an odd nuance somewhere in there, around something that NVidia would let them do, but it wasn't enough for Apple.

There is a difference between getting sources to work from (and for source code escrow) and "writing it". Does Apple do the 'build' on the stuff included with macOS at the GPU level? Probably yes. The older system with OpenGL was that Apple did the top "half" and the GPU vendors did the bottom "half" (not an exact 50-50 split, but the work was delegated).

Metal makes that a much smaller kitchen to work in. Apple probably does have a much bigger share of what they think is important. But low-level boot firmware (UEFI) written completely from scratch for someone else's hardware? That is extremely unusual. Far more likely Apple is taking 'credit' for the work after sprinkling it with Apple pee, putting it through an Apple build mechanism, and slapping an Apple version number on it (along with a secure Apple code signature). There is a shorter distance between the 'top' of the Metal API and the raw-metal hardware, so Apple's percentage would be up, but at the transition point to making the binary calls there should be a knowledge gap between what the vendor knows and what outsiders know, just based on experience (internal teams ramp up longer and earlier on new hardware than folks on the outside). There is likely a much thinner hardware abstraction layer there for the vendor as well, so the "lines written" percentage will be even more skewed toward Apple. But pull that last layer piece and you won't have a well-working system.

Likewise for writing the highly optimizing code generator completely firewalled off from, and decoupled from, the core hardware implementation group. The JIT shader compiler is perhaps thrown out of that 98% driver calculation, but if it's removed, do you pragmatically still have a complete driver? (Counting what you want to count because it's tagged as Apple and therefore "important".)

Just because Apple distributes the code doesn't necessarily mean they are the primary writers of key amounts of it. (Similar to other infrastructure sources that Apple pulls from FreeBSD, *BSD, and other open source repositories, but more constrained on claiming sole credit.)


[Apple is in the process of kicking almost every loadable driver out of the kernel, including some of their own. The GPU drivers will likely be the last ones out of the 'pool'. The distribution and authentication signing of those drivers is going to look like Apple, since only Apple-"originated" code will be in the kernel. Pragmatically, though, that doesn't eliminate subcontractors. It does eliminate 3rd-party GPU vendors going rogue and dropping drivers that do not originate from Apple and that have "halt and catch fire" directives, tripping up on every new kernel update (because the development cycles are out of sync). Apple is looking for 'silent' partners, not those looking to make their own 'noise'.

And increasingly a silent partner who is willing to let Apple put the Apple GPU first, exclude the other options on some Mac systems, and then assign whatever is 'leftover'. Tossing OpenGL, OpenCL, Vulkan, and other open APIs just puts Metal in the role of gatekeeper and moat digger for Apple GPUs going forward.]


I had a fairly deep conversation with a friend of mine who works for NVidia as an engineer (not going to name names nor say what he does)... but yes. It goes to the point where the Nintendo Switch's full stack for controlling the hardware and boot process is largely all NVidia.

I am not 100% certain how much of the Nintendo Switch was written by NVidia, but it's my understanding they contributed to it significantly.

Not particularly surprising, since the CPU, the GPU, and the rest of the central SoC in the system were created by Nvidia. Excluding Nvidia 100% in that context would be way past odd. Booting the SoC that they created should have been something they had to do from day zero (actually even prior to day zero, since it would be part of the simulation suite for the SoC design evaluation).
 
  • Like
Reactions: Sarajiel and leman