
DearthnVader

Suspended
Original poster
Dec 17, 2015
2,207
6,392
Red Springs, NC
I don't think we are going to see a classic-style workstation Mac much longer. Apple Silicon just doesn't lend itself to the factors that would lead me to think Apple is going to stick with a modular Mac Pro.

First we have the issue of all the Apple Silicon Macs having on-package memory; then come the limitations on the number of PCIe lanes that AS may support.

To top that off, we come to the issue of discrete graphics, with seemingly no path to standard off-the-shelf PC graphics cards being of any use in a MP8,1. The 8,1, whatever form it takes, is going to be a very small market for any graphics card company to really care about 3rd-party upgrades. Sure, AMD or nVidia would like the OEM deal if it comes to that, but who will offer upgrades?

Apple may have been a little overly ambitious with its two-year path to transition to AS across the entire line, and the COVID-fueled chip shortage is not helping.

What will Apple do?

We don't know the sales numbers for the 7,1, or how much market share and mind share Apple lost by sticking with the 6,1 for six years. I tend to think Apple will build a Cube 3.0 and try to pass it off as a Pro workstation. Sure, it may have some impressive features, but I don't think modularity or upgradeability will be among them.

Any thoughts?
 
  • Like
Reactions: exProfesso

mattspace

macrumors 68040
Jun 5, 2013
3,344
2,975
Australia
Any thoughts?

Some...

Everything we "know" about Apple Silicon "Macs" is from 4 machines, using what is, IMHO, merely a renamed A14X-class CPU that was initially only intended to power iPad Pros, and they're in Macs whose enclosures were designed as Intel machines (that iMac gets none of its thinness from the CPU; it's just the power supply being outboard).

The processor change schedule was brought forward by years in response to Intel's troubles, and COVID production slowdowns meant Apple was safer not competing against larger-volume customers for Intel's processors.

The current Apple Silicon machines are hack-job kludges. We get as much useful information from them as the proverbial four blind men, each touching a different part of the elephant, can tell you about the creature as a whole.
 
Last edited:

ZombiePhysicist

Suspended
May 22, 2014
2,884
2,794
I think the RAM issue is surmountable. The built-in memory could essentially become a large level-4 cache, with regular DDR RAM used for larger blocks. So the on-package 16-32 GB of RAM would act as a level-4 cache if a user adds more than the on-package amount of RAM in DIMM sockets, and otherwise acts as the normal RAM.
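As a purely illustrative sketch of the idea (the latency figures below are invented for the example, not measured Apple Silicon or DDR numbers), a two-tier memory like the one described can be reasoned about as a simple blended-latency model:

```python
# Hypothetical two-tier memory: fast on-package RAM acting as a large
# "L4 cache" in front of slower socketed DDR DIMMs.
# Both latency numbers are illustrative assumptions only.

ON_PACKAGE_LATENCY_NS = 100   # assumed on-package RAM access latency
DIMM_LATENCY_NS = 140         # assumed socketed DDR access latency

def average_latency_ns(hit_rate: float) -> float:
    """Average access latency when `hit_rate` of accesses are served by
    the on-package tier and the rest fall through to the DIMMs."""
    if not 0.0 <= hit_rate <= 1.0:
        raise ValueError("hit_rate must be between 0 and 1")
    return hit_rate * ON_PACKAGE_LATENCY_NS + (1.0 - hit_rate) * DIMM_LATENCY_NS

# With a high hit rate in the on-package tier, the blended latency stays
# close to the fast tier's figure:
print(average_latency_ns(0.9))  # ≈ 104 ns
```

The point of the sketch is just that if most working-set accesses hit the on-package tier, the average cost stays near the fast tier, which is why the "L4 cache" framing is plausible at all.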

They could still use PCIe slots, obviously, but it's not clear how those cards could be made compatible. The drivers *COULD* be much the same inside macOS, but it's not clear to me how much of PCIe support depends on Intel chipsets. Obviously AMD systems have things working without Intel chipsets, so this seems surmountable too.

As long as 3rd-party PC PCIe cards can work, then I think a slotted Apple Silicon Mac Pro is doable.
 

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
The current Apple Silicon machines are hack-job kludges.

Heh. No. They're very intentionally designed the way they are. Apple didn't work on these things for the last five years for M1 to be an oopsie. The Mac isn't getting leftover iPad CPUs. The iPad was getting beta Mac CPUs.

The latest big trend in CPUs is all-in-one designs. AMD, Nvidia, everyone is doing it. Intel is pretty much the only one that isn't, mostly because they don't have a good graphics option yet.

The reason is that as CPUs get faster, the biggest problem is the distance between components. Physics becomes the issue. You can increase the CPU speed, but you can't make signals travel to the GPU any faster. So the answer is to make your traces shorter and shorter and shorter until everything ends up on the CPU package.
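A rough back-of-the-envelope version of this argument (assuming signals in a PCB trace propagate at about 60% of the speed of light, a commonly quoted ballpark): compare how far a signal can physically travel in one clock period.

```python
# How far can a signal travel in one clock cycle?
# Assumption: propagation at ~60% of c in a copper PCB trace.

C = 299_792_458            # speed of light, m/s
SIGNAL_SPEED = 0.6 * C     # assumed signal speed in a trace, m/s

def distance_per_cycle_mm(clock_hz: float) -> float:
    """Distance a signal covers in one clock period, in millimetres."""
    period_s = 1.0 / clock_hz
    return SIGNAL_SPEED * period_s * 1000.0  # metres -> millimetres

# At 3.2 GHz a signal covers only a few centimetres per cycle, which is
# the intuition behind shortening traces (or integrating everything on
# one package) as clocks rise:
print(round(distance_per_cycle_mm(3.2e9), 1))  # 56.2 (mm)
```

This is only the one-way, speed-of-signal bound; real interconnect latency (serialization, buffering, protocol overhead) is larger, which if anything strengthens the integration argument.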

Like I said, Apple isn't alone here. There is AMD's Zen architecture, which is based on the same concept but is more consumer- and budget-focused.

The reason that these designs haven't been more popular is the upgradability issues. You end up with a wicked fast CPU that has no upgradability options. In fact, one thing to keep in mind is that _Apple Silicon that isn't a single package will be much slower._

So, reasons you won't see Apple change course:
- It's an intentional part of the design for performance, not a rush job
- Apple Silicon that uses discrete chips won't perform like Apple Silicon
- Bulldozing "legacy" upgrade options for the sake of a competitive or marketing advantage is a very Apple thing to do

Could Apple build an Apple Silicon Mac that tosses aside all its performance advantages to run like an Intel box for the sake of upgradability? Sure. But they won't. They'll take the wins of both having faster performance on paper and making everyone buy new Macs every few years. There's nothing they like more than bulldozing "legacy" use cases to make their hardware look more trendy.

You can't really have Apple Silicon and the Apple Silicon performance without the single-chip design. It's an intentional design choice. Already they're making feature choices that would be incompatible with discrete graphics, like pairing the Neural Engine with the GPU.

FWIW, I think the 8,1 will be another Intel Mac Pro. But I understand the idea of the thread.

Also, Apple has so far blocked all GPU drivers and eGPU use on Apple Silicon. So whether AMD is even able to offer cards at all for an Apple Silicon Mac in any form is a really open question right now. This was not addressed in Monterey either. Market share may be completely irrelevant if Apple decides they won't support any third-party GPUs in any form. Even the new 6900 drivers are Intel Mac only, whether eGPU or internal PCIe.
 
Last edited:
  • Like
Reactions: th0masp

mattspace

macrumors 68040
Jun 5, 2013
3,344
2,975
Australia
Heh. No. They're very intentionally designed the way they are. Apple didn't work on these things for the last five years for M1 to be an oopsie. The Mac isn't getting leftover iPad CPUs. The iPad was getting beta Mac CPUs.

Apple did not deliberately design the Mac to only have 8 or 16 GB of RAM, to only support 2 monitors, and to have no support for external graphics, after years of complaints that 16 GB wasn't enough, having only just moved their laptops to 32 GB and the Mini to 64 GB, and having only recently adopted external graphics.

Everything about the M1 reeks of "repurposed iPad hardware, running macOS within the limits of an iPad SOC."
 
  • Like
Reactions: Mactech20

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
Apple did not deliberately design the Mac to only have 8 or 16 GB of RAM

Did they deliberately design it to have a 16 gig RAM limit just like the previous Intel MacBook Air and two Thunderbolt port Intel MacBook Pro 13"? Yes, they did.

RAM limit is exactly the same as the Intel models they replaced.

to only support 2 monitors

Did the Intel models they replaced also have a 2 monitor limit?

Yes, they also did.

and to have no support for external graphics

AMD and Apple are over. We'll probably see AMD again in the next Intel Mac Pro, but that's it. Relationship over.

Would love to be wrong, but all signs point to AMD drivers never shipping on Apple Silicon.

having only just moved their laptops to 32gb

Again, MacBook Air and Dual Thunderbolt MacBook Pro 13" never supported 32 gigabytes of RAM.


Everything about the M1 reeks of "repurposed iPad hardware, running macOS within the limits of an iPad SOC."

Everything about M1 looks like a feature for feature match against the Intel chipset they replaced. Except for eGPU, but that comes down to the OS.
 

mattspace

macrumors 68040
Jun 5, 2013
3,344
2,975
Australia
Did they deliberately design it to have a 16 gig RAM limit just like the previous Intel MacBook Air and two Thunderbolt port Intel MacBook Pro 13"? Yes, they did.

But not the Mac Mini.

RAM limit is exactly the same as the Intel models they replaced.

But not the Mac Mini.

Did the Intel models they replaced also have a 2 monitor limit?

Yes, they also did.

No, none of them did.

The final Intel MacBook Air supported dual external monitors, PLUS the built-in display at the same time, so that's 3.

The 2018 Intel Mac Mini supported 3 displays on the integrated graphics.

The two-Thunderbolt-port 13-inch MacBook Pro, its display support? Dual external displays, PLUS the internal. So again, that's 3.

Then add eGPU, and you can theoretically plug in, what, 2-3 more per eGPU.

Know what supports two displays ONLY? The iPad.

Know what has Type-C Thunderbolt, but doesn't support eGPU on it? The iPad.

Everything about M1 looks like a feature for feature match against the Intel chipset they replaced. Except for eGPU, but that comes down to the OS.

Except that it doesn't: the only thing it matches is RAM, and even then, that's only for 3 out of 4 products, and on the one it doesn't match, it only supports a quarter as much as the config it replaces.

The "M1" is exactly where the iPad would have been expected to be by this stage: progress on the chronic RAM starvation that has plagued the previous versions, and Thunderbolt for mass storage and peripherals (especially Apple's own displays), but not for external graphics or displays beyond mirroring, which is expected given iPadOS is an inherently single-screen UX.
 
Last edited:
  • Like
Reactions: Mactech20

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
But not the Mac Mini.
But not the Mac Mini.
The 2018 Intel Mac Mini supported 3 displays on the integrated graphics.

They didn't discontinue the Intel Mac mini.

Know what has type-c Thunderbolt, but doesn't support external eGPU on it? The iPad.

They're not shipping AMD drivers anymore on ARM macOS or iOS. That's why eGPU doesn't work.

Except that it doesn't, the only thing it matches is RAM, and even then, that's only for 3 out of 4 products, and on the one it doesn't match, it only supports a quarter as much as the config it replaces.

That was a major part of your theory that M1 wasn't ready.

The "M1" is exactly where the iPad would have been expected to be at by this stage - progress on the chronic ram starvation that has plagued the previous versions, and thunderbolt for mass storage and peripherals (especially Apple's own displays), but not for external graphics or displays beyond mirroring, which is expected given iPadOS is an inherently single screen UX.

Apple didn't start calling the iPad a desktop class device five years ago because Tim Cook had some sort of stroke on the WWDC stage.
 

mattspace

macrumors 68040
Jun 5, 2013
3,344
2,975
Australia
They didn't discontinue the Intel Mac mini.

Your claim about the graphical capabilities of the Macs replaced by M1 models, which underpins your argument that the M1 machines are merely maintaining limitations of the models they replaced, is incorrect.


They're not shipping AMD drivers anymore on ARM macOS or iOS. That's why eGPU doesn't work.

eGPU doesn't work, because iPads have never had to deal with external graphics, and the M1 Mac is an iPad janked & kludged into a different form factor.

eGPU works fine, and has drivers for all Macs that are not iPads.

That was a major part of your theory that M1 wasn't ready.

And display support was a major part of your theory as to why the M1 was designed to be a Mac chip from the get-go. Adding more RAM to the iPad is a perfectly reasonable and logical thing to do, given RAM has been the major limiter on performance for the things the iPad was best suited to, drawing and painting apps. Subtracting display support and one specific Thunderbolt feature from the Mac is not.

Apple didn't start calling the iPad a desktop class device five years ago because Tim Cook had some sort of stroke on the WWDC stage.

Tim Cook says whatever marketing writes for him, and Apple is so deep into the weeds of being a post-truth company, the words they use have no connection to the function of language.
 
Last edited:

KeesMacPro

macrumors 65816
Nov 7, 2019
1,453
596
The reason is because as CPUs get faster, the biggest problem is the distance between components. Physics becomes the issue. You can increase the CPU speed but you can't make electrons flow to the GPU any faster. So the answer is make your traces shorter and shorter and shorter until everything ends up on the CPU. _Apple Silicon that isn't a single package will be much slower._
IMHO this is complete nonsense.
Unless you can come up with scientific proof from a reliable source, of course....

If we look at a PCB, it's a non-conductive board with copper paths on it.
The speed of electricity through a copper conductor is ~80-90% of the speed of light for an unshielded conductor and ~60-70% for a shielded conductor.
For high frequencies there are other factors involved, e.g. interference etc., but that's beside the point.

The speed of light is 299,792,458 m/s (≈ 1,079,252,849 km/h).
To stay on the safe side, let's say for a PCB path it's 60% = 179,875,475 m/s.
This equals ~180 mm per nanosecond.

So if I design a PCB with a copper path of, let's say, 10 cm length, the time delay will be ~0.556 nanoseconds.
I find it very hard to believe that this is a significant factor for a computer with today's standards, i.e. frequencies and transfer speeds.
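The arithmetic above can be reproduced in a few lines, using the same 60%-of-c assumption for signal speed in a copper trace:

```python
# Reproduce the propagation-delay estimate: signal speed assumed to be
# 60% of the speed of light in a copper PCB trace.

C_M_PER_S = 299_792_458
TRACE_SPEED = 0.6 * C_M_PER_S          # ≈ 179,875,475 m/s, i.e. ≈ 180 mm/ns

def trace_delay_ns(length_mm: float) -> float:
    """One-way propagation delay along a copper trace of the given length."""
    speed_mm_per_ns = TRACE_SPEED * 1e-6   # convert m/s -> mm/ns
    return length_mm / speed_mm_per_ns

# A 10 cm (100 mm) trace:
print(round(trace_delay_ns(100), 3))  # 0.556 (ns)
```

So the ~0.556 ns figure for a 10 cm path checks out under the stated assumption; whether that delay is significant relative to a multi-GHz clock period is the point being debated in this thread.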

So unless Apple is working on the construction of a UFO, I don't buy your theory.
I'm inclined to agree with @mattspace: the reason for Apple to go this route is financial, if I may put it very simply....

 
Last edited:

TrevorR90

macrumors 6502
Oct 1, 2009
379
299
Apple did not deliberately design the the Mac to only have 8 or 16gb of ram, to only support 2 monitors, and to have no support for external graphics, after years of complaints that 16gb wasn't enough, having only just moved their laptops to 32gb, and the Mini to 64gb, and having only recently adopted external graphics.

Everything about the M1 reeks of "repurposed iPad hardware, running macOS within the limits of an iPad SOC."


Apple designed it specifically for that. Anything beyond what you mentioned represents probably 1% of the Apple consumer base.

The vast majority of users don't use more than 16 GB, nor do they use more than 2 monitors. Less than 1% use an eGPU. In fact, users on MacRumors probably (just a guess) represent only 5% of Mac users in general. So I don't believe the M1 is a hack job from iPad parts; it was created to appeal to the masses.


In regards to the RAM, the 8 GB and 16 GB configurations aren't really comparable to traditional RAM sizes. The M1 handles memory much differently.
 

KeesMacPro

macrumors 65816
Nov 7, 2019
1,453
596
Anything beyond what you mentioned represents probably 1% of the Apple consumer base.
Assumption 1
The vast majority of users don't use more than 16gb nor do they use more than 2 monitors
Assumption 2
Less than 1% use an eGPU
Assumption 3
In fact, users on Macrumors probably (Just a guess) represent only 5% of Mac users in general.
Assumption 4
So I don't believe the M1 is a hack job from iPad parts; it was created to appeal to the masses.
assumption 5
In regards to the RAM, the 8 GB and 16 GB configurations aren't really comparable to traditional RAM sizes. The M1 handles memory much differently.
Could be true , any proof of this ?
 

TrevorR90

macrumors 6502
Oct 1, 2009
379
299
Assumption 1

Assumption 2

Assumption 3

Assumption 4

assumption 5

Could be true , any proof of this ?
Just like everything you stated are assumptions...

Have anything to share besides saying "assumption"? I could argue that literally everything in life is an assumption. Additionally, I have an advanced degree in statistics, so I think of everything in terms of data and statistics.
 

KeesMacPro

macrumors 65816
Nov 7, 2019
1,453
596
Have anything to share besides saying assumption?
No, not really.
Actually, as you may have noticed, coming up with a statement based on a handful of assumptions is not a valid argument, IMHO.
I could argue that literally everything in life is an assumption. Additionally, I have an advanced degree in statistics, so I think of everything in terms of data and statistics.
Interesting perception of life, but I see no point in getting into this.
 