My question about the whole stacking module system: what would Apple gain from it? Seems like more work to develop and ship a bunch of different pieces than a single hardware system.

If it's true, my thoughts are that they're accepting it's a niche system in an area in which they have no real ability to set the agenda or direction for the industry. They can't expect people to buy the whole widget, GPUs and all, every time they need an upgrade for a particular part - that strategy failed with the 2013.

Perhaps splitting the system and putting it on a common, non-constrained connection bus (i.e. not Thunderbolt) means that, for example, a person upgrading the processor module can leave everything else in place, then repurpose the old processor with new accessory modules. Or trade in / sell the old processor, which someone else can use as a more affordable entry to Apple's ecosystem (folks who can't afford a new core and won't buy an iMac - one group of people Apple is losing to Windows and Hackintoshes), buy new accessory modules for it, then eventually upgrade to a newer processor themselves.

It could reduce sticker shock and the need for the customer to carry risk by over-committing to the super-expensive part right off the bat.

Remember the rumour a while back that Apple was consulting with RED cameras on modularity? Rather than RED, have a look at how Phase One does their camera systems - $20k 150MP sensor backs that you can use on their own modular DSLR-style body (which accepts any number of generations of Phase One backs), OR on a third-party technical / medium format camera body. Importantly, when you want to trade up, Phase One will handle the trade-up and has its own secondhand sales channel - you can buy used older-generation backs direct from Phase One if you can't afford their latest and greatest.

The 2013 basically had sod-all resale value, because secondhand it was only ever an entirely outdated device. The cMP's upgradability not only meant people could hold onto them longer, it made them more saleable for folks who needed that sale to fund their next-generation machine.

Perhaps that's a niche Apple is seeing? Just like car dealerships depend on tradeins and used sales to keep new sales rolling over, perhaps they're starting to realise they need to be more accommodating to the reality that buying, owning and replacing a workstation just isn't the same as a phone or laptop.


Or, more to the point: the idea of building your system a la carte is cool, but I'm not sure why that requires this system versus the old Mac Pro style of one or two configs with customization from there.

Potentially smaller, so reduced shipping and warehousing costs; potentially a cheaper floor price, so a wider range of customer scenarios, etc.
 
Perhaps splitting the system and putting it on a common, non-constrained connection bus (i.e. not Thunderbolt) means that, for example, a person upgrading the processor module can leave everything else in place, then repurpose the old processor with new accessory modules. Or trade in / sell the old processor, which someone else can use as a more affordable entry to Apple's ecosystem (folks who can't afford a new core and won't buy an iMac - one group of people Apple is losing to Windows and Hackintoshes), buy new accessory modules for it, then eventually upgrade to a newer processor themselves.

Problem is, this requires Apple to either develop/validate their future upgrade modules (yeah, right) or license third parties to develop/validate addon/upgrade modules (yeah, right - although if the brand is trendy enough...).

As it stands, given the history with the 2012, and even the paltry selection of GPU options prior, there is no point in presupposing 'upgrade' anything. If these rumours are true, you can safely assume some options at time of release, and that is all from Apple for another decade or so.

Maybe they are hoping the sticker shock won't be so bad if they give you the option to defer your $1200 (or whatever it was) 1TB storage module to a later date.

I really hope these rumours are not true.
 
The Tailosive Tech podcast goes into a bit more detail; worth a listen. The unnamed source told Drew there were multiple competing designs for the MP. The source's team were pretty confident in their design's chances; they reckoned it was further along than the others, but obviously not certain to be picked. Their config's specs: Xeon W-3175X, 2 GPUs (not specified), up to 512GB RAM....
 
Welcome to 1983:
[image: PRODTHM-2964.jpg]

http://www.computinghistory.org.uk/det/8242/Convergent-Technologies/
 
Remember the rumour a while back that Apple was consulting with RED cameras on modularity, but rather than RED, have a look at how Phase One does their camera systems - $20k 150MP sensor backs that you can use on their own modular DSLR body (which accepts any number of generations of Phase One backs), OR use on a third party technical / medium format camera body. Importantly, when you want to trade up, Phase One will handle the tradeup and have their own secondhand sales channel - you can buy used older generation backs direct from Phase One if you can't afford their latest & greatest.


I don't think you can compare the Phase camera backs to Macs.
It'd be great if it was a valid comparison, though, and Apple could learn from it.
After all, Phase is highly flexible in a niche market of high-end pro gear, and successful because of that.

One difference is, Apple's computing product isn't Macs, it's OSX.
Which is the opposite of flexible, increasingly caters to mainstream consumers only - and is no success at all with demanding customers.
 
Unless it's: GPU 1, a low-end basic chip on a PCIe x1 link to drive Thunderbolt (like the low-end chips in servers), and GPU 2, a full x16 card.

The low-end one would be fine - they could even do that themselves with an A-series GPU. I'd think really strongly about buying a processor module and a GPU module, but there's no way in hell I'm going to bother with this if I've got to buy an (expensive) GPU as part of the basic CPU block - it's just needless content-stuffing.
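For scale, here's a minimal sketch of the bandwidth gap between those two links. Assuming PCIe 3.0 (8 GT/s per lane, 128b/130b encoding) - the generation is my assumption, not anything from the rumour:

```python
# Usable PCIe 3.0 bandwidth per direction: 8 GT/s per lane, 128b/130b encoding.
def pcie3_gbytes_per_s(lanes: int) -> float:
    bits_per_s = lanes * 8e9 * (128 / 130)  # raw transfer rate minus encoding overhead
    return bits_per_s / 8 / 1e9             # bytes per second, expressed in GB/s

print(round(pcie3_gbytes_per_s(1), 2))   # x1: ~0.98 GB/s, fine for a basic display chip
print(round(pcie3_gbytes_per_s(16), 1))  # x16: ~15.8 GB/s, what a full GPU card expects
```

A roughly 16x gap, which is why an x1 link is only plausible for a basic display-out chip, never for the "real" GPU.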
 
I think the next Mac Pro will be just another disappointment: totally overpriced, limited in terms of upgrades and support.
In the end, they should just have revamped and modernized the classic Mac Pro.
Dump the optical drives, improve the CPU cooling, offer 2x double-width PCIe slots, etc.
Maybe even dump the 3.5" drive bays.

How hard can it be to release something like this? Look at what kind of Windows machine you can build... this is just ridiculous.
And yeah, if you want to use Nvidia GPUs and CUDA, you simply can't.
 
The Tailosive Tech podcast goes into a bit more detail; worth a listen. The unnamed source told Drew there were multiple competing designs for the MP. The source's team were pretty confident in their design's chances; they reckoned it was further along than the others, but obviously not certain to be picked. Their config's specs: Xeon W-3175X, 2 GPUs (not specified), up to 512GB RAM....

It was regrettable even watching the video... not particularly expecting much better from the podcast.

I'm more than highly skeptical that there is more than one team working on the next Mac Pro. Apple can't even keep its products updated, let alone put even more systems on a build path to nowhere.

I wouldn't be surprised if this "stackable" thing was far more a "concept path" for the Mac Mini than the Mac Pro. Apple is already spinning "stackable" in the roll-out for the current Mini. A future Mini that was "more stackable" would make FAR, FAR, FAR more sense. The concept of a "brain box" with zero GPU is primarily why I was completely unmotivated to listen to the "in depth tech" podcast. That is a completely, 100% boneheaded idea for a system that is primarily GUI oriented. The primary point of macOS is to be graphical, yet the core base system you buy would have no display ability whatsoever. Yeah, that makes sense. *cough* Not!

The Mini with a typical Intel desktop-class CPU with an iGPU, then the "brain box" with an iGPU, makes some sense. You could buy just a "brain box" if you didn't have high-end graphical needs and just hook up a discrete monitor, just like previous Mini generations.

If you want to add a second (and 'bigger') GPU, then some "stack module" would blend nicely with the Mini. But for the class of processors that are reasonable to use in the context of the Mac Pro, you don't get a "free" iGPU with the CPU package. Pushing even the default GPU into another box is positively brain-damaged.

The second tech tidbit in the video, saddling the Mac Pro with Thunderbolt 4, is also indicative of lots of arm flapping as to what a rational Mac Pro project would be tasked with. Pie-in-the-sky future Mini... sure, Thunderbolt 4 arm flapping. But for a product that is over 4 years late... Thunderbolt v4 is utterly ridiculous. That is more indicative of "Bozo's Big Top circus" than anything remotely resembling technical engineering. ~60Gb/s over highly affordable cables probably isn't something "just around the corner".
It is almost an exact repeat of the mistakes with the Mac Pro 2013's tight coupling to Thunderbolt 2 and the pragmatic slide into early 2014 for volume shipments.

Apple's monitor is at 6Kx3K, so they don't need TBv4 to be viable at all. Folks who need 8K monitor support can just go with two-cable solutions (including Apple's) for the next couple of years. 8K monitors are mainly a solution in search of a problem for the mainstream and most pro uses. Shooting in 8K and doing reductions, sure. But manic pixel peeping at 8K?
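To put rough numbers on that, a back-of-envelope sketch of uncompressed display bandwidth (pixels x bits per pixel x refresh rate). The 6016x3384 and 7680x4320 resolutions, 10-bit-per-channel color, and 60Hz are my assumptions, and blanking intervals and DSC compression are ignored:

```python
# Uncompressed video bandwidth: width * height * bits-per-pixel * refresh rate.
# Ignores blanking intervals and DSC, so treat these as rough floors.
def display_gbps(width: int, height: int, bpp: int = 30, hz: int = 60) -> float:
    return width * height * bpp * hz / 1e9

print(round(display_gbps(6016, 3384), 1))  # 6K: ~36.6 Gb/s, fits one Thunderbolt 3 link
print(round(display_gbps(7680, 4320), 1))  # 8K: ~59.7 Gb/s, hence the two-cable solutions
```

Which is roughly where that ~60Gb/s single-cable-8K figure comes from.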


The third tidbit was "incrementally bigger than a Mac Mini" (a little bit taller and a little bit wider). The CPU in the Mini is around 65-90W nominal TDP (and yes, most run hotter at higher core counts). The Intel W class is 140W. That is over a 100% increase. So how do you get a slightly wider fan to do that work? If you have to move the fan out of the bottom and mount it vertically so it is far more efficient, how is it a "little bit taller"? Take a Mac Mini and rotate it from horizontal to vertical: is that just a "little bit taller"? It is stacked, so it can't vent up/down, so airflow will have to go front to back. It also has double the local power supply demand, so that is bigger and hotter now too. If it matches the iMac Pro's twin NAND daughtercards... more volume and cooling still. More than one 10GbE... yep. But still "just incrementally bigger than a Mac Mini".
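The TDP arithmetic above, sketched out. The 65-90W Mini range and the 140W Xeon W figure come from the post itself; note "over a 100% increase" only holds against the 65W end of that range:

```python
# Percent increase in nominal TDP from a Mini-class CPU to a 140W Xeon W part.
def pct_increase(old_w: float, new_w: float) -> float:
    return (new_w - old_w) / old_w * 100

print(round(pct_increase(65, 140)))  # ~115% over the 65W Mini chips
print(round(pct_increase(90, 140)))  # ~56% over the 90W configs
```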



P.S. I wrote the above and then thought... maybe, just maybe, there might be something in the podcast. But I just couldn't stomach it past the 28-minute mark or so.
The RAM is not in a separate module (as if it were a possibility)... give me a fracking break.

Please, this isn't even remotely technical. The "validity" of the source is that he doesn't sound like a high schooler; like someone who might have had some technical training, and is therefore extremely credible to be working on the next Mac Pro. This is comical analysis, not technical analysis.
 
Agreed, there are a lot of technical details that just sound wrong... a non-upgradeable CPU, for one. Granted, the last time Apple officially sanctioned a CPU upgrade option was, if I remember correctly, the 68K-to-PowerPC transition in 1994. However, every Xeon-equipped machine Apple has ever shipped, along with many, many other Macs, has had the CPU in a regular socket, including the iMac Pro. Endorsing/supporting processor upgrades is not and has not generally been Apple's policy; physically preventing them, thankfully, has often been another matter. Entirely possible Drew has someone jerking his chain....
 
I find not working in bright, reflective places a better option than avoiding glossy monitors. At least with a desktop I can control the location versus a laptop.
 
I find not working in bright, reflective places a better option than avoiding glossy monitors. At least with a desktop I can control the location versus a laptop.
I find that I've eWasted most of my glossy Apple monitors and replaced them with Dell monitors. (The ones that weren't eWasted were taken home for home office duty - no Apple monitors in the office today.)

(I didn't buy the Apple monitors - they came with a group that we acquired. Once the new people saw the Dell monitors that the rest of us were using -- they wanted to upgrade.)

And glossy was the primary reason for dumping the Apple and switching to Dell. (A secondary reason was that the Dell 4k monitors were "retina", and the Apple monitors were lower resolution.)

Only the drag queens in the group wanted a monitor that would double as a makeup mirror. ;)
 
I find that I've eWasted most of my glossy Apple monitors and replaced them with Dell monitors. (The ones that weren't eWasted were taken home for home office duty - no Apple monitors in the office today.)

(I didn't buy the Apple monitors - they came with a group that we acquired. Once the new people saw the Dell monitors that the rest of us were using -- they wanted to upgrade.)

And glossy was the primary reason for dumping the Apple and switching to Dell. (A secondary reason was that the Dell 4k monitors were "retina", and the Apple monitors were lower resolution.)

Only the drag queens in the group wanted a monitor that would double as a makeup mirror. ;)

We’ve moved to Dell monitors for everyone not on iMacs and happy with one display... I love mine and always got eye strain from the glossy stuff.
 
I find not working in bright, reflective places a better option than avoiding glossy monitors. At least with a desktop I can control the location versus a laptop.
Not all of us have a lot of choice in where we can place our workstation, whether it's at a place of work where someone may be assigned a desk, or whether it's at home where someone may have to make do with the home's layout.

And I think most people find a glossy laptop screen less of an issue than a large glossy desktop screen, because it's easier to either physically move somewhere else thanks to the device's portability, and/or slightly adjust its placement on a desk or lap to mitigate glare when it's an issue.

Personally I won't even consider glossy screens for the desktop. I'm not sure I'd get an iMac for other reasons as well, but I know there's a lot of potential buyers of the iMac that skip it because of the glossy screen alone.
 
I rather prefer glossy screens for desktops, but that presumes I can control the ambient light conditions. When reading/writing, dealing with text for coding, etc., the diffusion from non-glossy 'ripple' style AG coatings causes me incredible eye fatigue, including headaches, after long periods. The glossy screens do a far better job here for me; it's not that I like the 'glossiness' of them, far from it.

If I can't control the lighting, then begrudgingly I'll take a matte ripple AG style.

For video or graphics it's far less of an issue.
 