
deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
Another interesting point for the 2019 Mac Pro - regardless of what happens to the AS version, how far into the future will Apple provide MPX GPU upgrades?

So far they've been pretty good about it, albeit understandably a few months behind the PC versions. When the next-gen AMD GPUs come out, should we still expect support even if an AS Mac Pro is already announced or coming out?

Long term (i.e., > 3 years from now), probably not.

Depends upon two factors. First, when (or if) Apple planned a full-tower Mac Pro upgrade. Second, how much AMD changes with RDNA 3.

First factor: if in 2019 they planned to do a mid-to-late 2021 W-3300 update, then these Navi21 cards were probably planned for that upgrade and got released to fill the gap. That isn't really "pretty good about it"; more that we got lucky. If a W-3300-powered update drops in 2-5 months, that's where the cards came from (i.e., another Mac). Since there isn't likely a W-3400 on the drawing board, there would be no new Mac driving an MPX update in a post-2022 world.

Back in mid 2019 to early 2020, the W-3300 was supposed to ship around Q2 2021. That didn't work out for high-volume shipments for a variety of reasons both outside and inside Intel. Doubtful that Apple would have tried to couple a 2021 Mac Pro to 2022 RDNA 3 cards.

If Apple had no plan for a 2021 Mac Pro, but just used the "big Navi" cards to kick the can of doing anything into 2023, then that also really isn't a good sign. They would primarily be "stop gap" cards. Reading into that the notion that Apple has found super-duper GPU card modularity religion is probably bad expectation setting.




Second Factor. AMD and "RDNA3".

There is a rumbling that "Navi33 = Navi 21 + new core IP". Pretty good chance that may mean the 7600-7700 could be a Navi 21 put on a process shrink to TSMC N6 or N5. That there isn't a big architecture/API change, and that AMD's primary focus is making the die smaller (and hence cheaper) and lower power.
[ I wouldn't count on Apple passing along any of that cost savings though. The prices would probably stay the same while Apple takes a higher markup on lower volume. ] Basically, Navi33 may pragmatically be RDNA2.

If that is the case, then a card update could come in the future, because a new chip would only need very minimal driver work. That would actually be helpful for the MPX 6800 Duo. If the base GPU chips didn't consume as much power, it would be easier to run a pair of them on a single card at incrementally higher (actually 'normal') clock speeds. It wouldn't be a huge spike in performance, but it would be a new card in 2023 or so.

A lower-power Navi33 might also mean an effective full-die "6900"-class Duo card. That would be a performance jump that isn't offered now. Could Apple limp along with those in a Mac Pro 2019 offering into 2024? Sure.


If RDNA 3 as a whole is mostly a performance/watt optimization with 97+% of the driver API exactly the same, then we might also get some other updates, again because the driver work would be practically "free". If the changes in the API are mainly about factors that Metal won't cover, then the missing 3% wouldn't impact the drivers anyway (e.g., Metal ignores: media decoder assist improvements, Smart Access Memory (resizable BAR) improvements, better FSR super-resolution hooks into hardware, etc.); the drivers could largely be exactly the same after some GPU ID updates.

If Navi31 has a 30-50%+ new API and new low-level compilers and back-end optimization tools, then probably not. No new Metal stuff to track that, and no drivers. And if RDNA3 doesn't drive a big enough semantic gap between the driver/tools levels and Metal, then RDNA4 probably will.

The other tension is Apple going from buying close to 1M AMD GPUs per year to an amount one or two orders of magnitude lower (MBP 16", Retina iMac 21" and 27", and iMac Pro all gone). Either AMD will have to start charging Apple more money to support a diminished product flow (at which point Apple would likely go Scrooge McDuck and quit), or AMD just quits to pursue much higher-prospect opportunities (especially if GPU average selling prices remain higher over the long term, which it looks like they will).
Unless Apple uncorks drivers for the M1 Macs, the market is only going to shrink over time. It doesn't make much sense for AMD to apply substantive resources to an ever-shrinking pot.



If not in an MPX module, how about drivers for the PC version of the GPUs? I know we don't know these answers, but if anyone wants to take a guess at it knowing Apple's track record...

Probably not. The bulk of the work done on Metal for a new GPU is done by Apple. If there is no Apple product in the mix, then Apple is quite unlikely to do the work out of the goodness of its heart for zero money. If there were some indirect tie-in with new M-series Macs (e.g., eGPU on the iMac 24" helping some sales), maybe. For a zero-active-development product line (Intel Mac Pro in 2023+), probably not. Apple has no substantive history of doing that.



I think at least for the next-gen AMD GPUs they likely should get an upgrade, since those are likely here in the next 2 years. Of course PCIe Gen 3 is a bit of a limitation on the current Mac Pro, but not entirely; performance should still be better on next gen.

Depends upon just how "next gen" AMD goes. AMD's dual-die solutions: probably not. A big increase in CUs: maybe.




If they do update the Intel Mac Pro next year, then I definitely think we'll also see more MPX GPU updates. If they don't, then it is anyone's guess as to what they'll do. This can be kept separate from their AS Mac Pro line, of course.

If it arrives in early 2022 (i.e., the original plan was mid-to-late 2021), then it probably doesn't make much of a difference over the long term at all (not indicative of long-term support). Intel/the pandemic just slid that system into 2022.



And I agree MPX GPUs won't be a thing in the next AS Mac Pro version, since Apple is really pushing their own GPU performance. But I do agree as well that they should keep PCIe slots for the plethora of other cards that take advantage of them.

Longer term Apple could come up with a GPGPU API. Or at least a way for 3rd parties to add a "computational" API of their own if Apple doesn't want Metal to get entangled in it.


Here's a thought: 2013 Mac Pro was designed into a thermal corner as said by Apple, right? So they did the opposite in the 2019 Mac Pro. What if for the next AS Mac Pro, instead of avoiding the mistakes of the 2013 Mac Pro, they want to revisit that type of framework and this time dominate it with their own silicon? They can then say, "ha! we finally got you!" instead of resigning in perpetuity with that bad design.

Thermal corner was only one of four explicitly mentioned problems.

One of those was making folks buy something they don't necessarily want. Apple Silicon couples RAM capacity to CPU and GPU core count. If you want a substantively bigger GPU, then you have to buy more RAM (and more CPU cores, if the increase is large enough). That is a deeper coupling than the MP 2013 had. Not "dominating" that at all; it is a deeper "value proposition" hole.

They'll sell more to folks with large price elasticity (and who pragmatically spend other people's money). But overall sales volume will likely go down.

They aren't going to "dominate" that problem; it will likely be more misdirection from the problem: "Look how fast in a smaller system box."

If they try to do it with just one fan that they hobble in a substantive way, then they will be right back in that thermal-corner problem space again too. This Jade4C chiplet/tile SoC could run up in the 320+ W range. If they go "too cute" on the enclosure, they can easily mess that up.
 

Waragainstsleep

macrumors 6502a
Oct 15, 2003
612
221
UK
It's so hard to predict what Apple might do with an AS Mac Pro. I'm going to ramble and ponder at the same time.

Looking at the full range of M1 chips, "Max" suggests it will be the best ever version of the M1. So the Mac Pro will have to use M1 Max CPU(s) or some kind of M2 CPU(s) or a completely different AS CPU, maybe an X1 or P1 range of CPUs.

If Apple use the M1 Max, they can go the dual-CPU route (presumably) or they can clock it higher. If they used a different single SoC, it would be something bigger with 16 or 24 CPU cores or more, and maybe up to 128 GPU cores, which might bump it up to 128GB RAM, but that's a huge die. Looking at the difference in power requirements between the 20W M1 and the 100W+ M1 Max, such a chip is going to need considerably more cooling, so the current tower model may well remain entirely appropriate.

A different range of CPUs (Think an AS Xeon to the M1's Core iX) would only be done for two reasons. Firstly to raise the total RAM ceiling. Secondly to use a discrete GPU.
I wonder the following: Could Apple create something like level 2 RAM? Then the M1 Max with 64 GB could have a couple of TB or so of this L2 conventional RAM. This would be another reason to keep the current tower form factor.
Could they convert an M1 Max into a dedicated GPU? Disable some of the CPU cores and just use the GPU. Maybe they could even put them in an MPX Module. Or maybe even two of them.
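For what it's worth, that "level 2 RAM" idea is structurally just a cache hierarchy extended one more level. Here is a toy sketch of the concept (the class names, sizes, and LRU policy are my own illustrative assumptions, not anything Apple has described):

```python
# Toy model of a two-tier RAM system: a small pool of fast on-package
# memory acting as an LRU cache in front of a much larger pool of
# conventional "level 2" DRAM. Purely illustrative, not a real design.

from collections import OrderedDict

class TieredMemory:
    def __init__(self, fast_capacity):
        self.fast = OrderedDict()      # on-package RAM, kept in LRU order
        self.slow = {}                 # large conventional "L2" RAM
        self.fast_capacity = fast_capacity
        self.fast_hits = 0
        self.slow_hits = 0

    def write(self, addr, value):
        self.slow[addr] = value        # everything lives in the big tier

    def read(self, addr):
        if addr in self.fast:          # hot data already on-package
            self.fast.move_to_end(addr)
            self.fast_hits += 1
            return self.fast[addr]
        self.slow_hits += 1            # fetch from the big/slow tier...
        value = self.slow[addr]
        self.fast[addr] = value        # ...and promote it
        if len(self.fast) > self.fast_capacity:
            self.fast.popitem(last=False)   # evict least-recently-used
        return value

mem = TieredMemory(fast_capacity=2)
for addr in range(3):
    mem.write(addr, addr * 10)
for addr in [0, 1, 0, 0, 2, 0]:       # a workload with one hot address
    mem.read(addr)
print(mem.fast_hits, mem.slow_hits)   # 3 3
```

The point of the sketch: workloads with locality mostly hit the fast tier, so the big slow tier only has to absorb cold accesses, which is what would make a "couple of TB of conventional RAM" behind 64GB of on-package RAM plausible.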

If they do go with some version of the M1 Max, I wonder if they might also consider resurrecting the Xserve. I reckon they could fit 3 super slim rack mount M1 Max based servers in 1U of rack space. Basically headless MacBook Pros with a little extra cooling in their wider and deeper enclosures. That has to be a compelling notion for cluster computing purposes. Plus all those GPUs with ~60GB each of RAM available. I can see that doing very well.
 

Weisswurstsepp

macrumors member
Jul 25, 2020
55
63
I was expecting somebody might pick up on this. This time around I added the qualifier "recent" (pioneer) in my original post for insurance.

In what way do you believe Sony or AMD have "pioneered" UMA (a technology which, as explained, was pioneered by someone else more than 25 years ago)?

What a walk down memory lane. Like lots of ideas/concepts in computing, they indeed date back years, perhaps decades. In personal computers, "unified memory architecture" was born out of necessity, I believe. For example, the Apple II and the Macintosh were early examples of "unified memory architecture." I don't think either was unique in this regard among machines of that era. The first IBM PC was the "breakthrough" that introduced the "discrete GPU" to personal computers.

Around 1999, Intel introduced the first chipset that included an iGPU, which is also "unified memory architecture." Ever since, "unified memory architecture" has dominated the PC, and that remains true today. It has always been around, but people don't talk about it because the performance is nothing to brag about, since PC makers usually compromise. For example, very few laptops deployed LPDDR RAM, in the name of cost saving, and because of weak market demand.

Smartphone GPUs are "unified memory architecture" out of necessity, just like early personal computers. I think Apple's roots in mobile processors are one of the reasons they picked their current path with the M processors. I would think the other reason was the tremendous success of the PlayStation 4, which proved desktops could perform very well with "unified memory architecture."

You seem to conflate UMA with shared memory, which are two different concepts. In a "shared memory" system the GPU uses part of the physical system RAM as graphics memory, but the memory segments for RAM and graphics memory are still strictly separated (so there is still a need to move data between the two segments, just as there is on a system with a discrete GPU). The benefit of shared memory is cost, as it removes expensive dedicated VRAM from the BOM. The disadvantage is that it is pretty slow, not just because RAM is generally slower than VRAM but also because of the expense of the still-necessary data transfer between the RAM segment and the GPU memory segment.

On the other side, in a "Unified Memory Architecture" system the CPU and GPU not only share the physical memory but also the address space (i.e., there aren't separate RAM and GPU address spaces, there's only a single common address space; as a consequence, there is no need to transfer data between CPU and GPU memory, which is what causes the speedup of in-memory graphics operations).

Most of the things you mentioned (like Intel iGPUs) are simply shared memory systems, not UMA. The same is true for most smartphones (including most of Apple's own SoCs): these are shared memory systems, not UMA systems.

Also, the *only* PCs ever made based on UMA have been the SGI VW320 and VW540. Everything else (including all Macs aside from the new M1 based ones) which didn't use discrete graphics were shared memory systems, not UMA.
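The shared-memory vs. UMA distinction described above can be sketched as a toy model (all class and method names here are mine and purely illustrative; no real graphics driver exposes an API like this):

```python
# Toy model: in a shared-memory system the CPU and GPU segments are
# separate and buffers must be copied between them; in a UMA system
# the GPU reads the very buffer the CPU wrote, so no copy happens.

class SharedMemorySystem:
    """GPU borrows a slice of system RAM, but the segments stay separate."""
    def __init__(self):
        self.cpu_segment = {}
        self.gpu_segment = {}
        self.copies = 0

    def make_visible_to_gpu(self, name, data):
        self.cpu_segment[name] = data
        self.gpu_segment[name] = list(data)  # explicit transfer (a copy)
        self.copies += 1

class UnifiedMemorySystem:
    """CPU and GPU share one address space: one buffer, zero transfers."""
    def __init__(self):
        self.memory = {}
        self.copies = 0

    def make_visible_to_gpu(self, name, data):
        self.memory[name] = data  # GPU already sees this exact buffer

shared, unified = SharedMemorySystem(), UnifiedMemorySystem()
vertices = [1.0, 2.0, 3.0]
shared.make_visible_to_gpu("vertices", vertices)
unified.make_visible_to_gpu("vertices", vertices)
print(shared.copies, unified.copies)  # 1 0
```

The `copies` counter is the whole argument in miniature: the shared-memory design pays one transfer per buffer handed to the GPU, while the UMA design pays none.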
 

bsbeamer

macrumors 601
Sep 19, 2012
4,313
2,713
If they do go with some version of the M1 Max, I wonder if they might also consider resurrecting the Xserve.

Interesting thought. Several potential issues for most customers, but the main one might be the lack of a promise of longevity, or 5-7+ years of guaranteed support, on whatever they buy. This is becoming an important selling point for those dropping the kind of money those machines cost at the start. Or do they go lease-only? Or via cloud subscription only?

I almost cringe whenever I hear Xserve. I loved them back in the day, but when Apple basically abandoned them and the RAID units that FCP users were using, it sent everyone into a scramble. Then classic FCP basically was abandoned and it started the spiral out of "Professional" completely. Is that coming back, or are they sticking with it?

The drama being created with no Mac/Windows crossover on a single machine may complicate this a little. Very few facilities are 100% Mac. There was a time when it trended that way. For as much promise as the M1/M1X has, it will take a while to convince those in charge of budgets to take the gamble. Apple bringing back their professional business-focused teams might help.
 

Waragainstsleep

macrumors 6502a
Oct 15, 2003
612
221
UK
Interesting thought. Several potential issues for most customers, but the main one might be the lack of a promise of longevity, or 5-7+ years of guaranteed support, on whatever they buy. This is becoming an important selling point for those dropping the kind of money those machines cost at the start. Or do they go lease-only? Or via cloud subscription only?

I almost cringe whenever I hear Xserve. I loved them back in the day, but when Apple basically abandoned them and the RAID units that FCP users were using, it sent everyone into a scramble. Then classic FCP basically was abandoned and it started the spiral out of "Professional" completely. Is that coming back, or are they sticking with it?

The drama being created with no Mac/Windows crossover on a single machine may complicate this a little. Very few facilities are 100% Mac. There was a time when it trended that way. For as much promise as the M1/M1X has, it will take a while to convince those in charge of budgets to take the gamble. Apple bringing back their professional business-focused teams might help.


The original Xserves flourished because the G5 was the only 64-bit show in town. It offered something unique to at least some scientists and others.
Intel Xserves never did. They had equivalent machines from Dell and HP that were updated much more often and because such devices sat in racks and weren't touched by end users, running Mac OS wasn't the same draw it was back then on a MacBook or iMac.
Apple Silicon brings back some uniqueness to the party. They'd probably still need some compelling software to drive any kind of small-business sales, but it's not like Apple can't do that. I used to want a proper iTunes media server, but that's not needed any more. Really fast, robust, secure file sharing would be essential. A half-decent Apple MDM service would be compelling. I always thought a FaceTime PBX server could be nifty. There are definitely some cool things they could do for small-business users without having to try to compete with Microsoft's enterprise offerings.
Forgot to mention: my 1/3U M1 Max Xserves would have a Touch Bar on the front which displayed stats like temps and loads, and enabled you to easily spot one in trouble and reboot it without needing a screen or remote desktop.
That said, imagine if you had 126 of them in one big rack and an iPad in your hand, and you could use Sidecar to connect to any of the servers in the rack by tapping a Sidecar button on its Touch Bar and authorising with the Touch ID button on the end of the bar. Now that would be awesome.

The huge shared video RAM and low power consumption would make these Macs a nice option for a render farm.
 

4wdwrx

macrumors regular
Jul 30, 2012
116
26
In what way do you believe Sony or AMD have "pioneered" UMA (a technology which, as explained, was pioneered by someone else more than 25 years ago)?



You seem to conflate UMA with shared memory, which are two different concepts. In a "shared memory" system the GPU uses part of the physical system RAM as graphics memory, but the memory segments for RAM and graphics memory are still strictly separated (so there is still a need to move data between the two segments, just as there is on a system with a discrete GPU). The benefit of shared memory is cost, as it removes expensive dedicated VRAM from the BOM. The disadvantage is that it is pretty slow, not just because RAM is generally slower than VRAM but also because of the expense of the still-necessary data transfer between the RAM segment and the GPU memory segment.

On the other side, in a "Unified Memory Architecture" system the CPU and GPU not only share the physical memory but also the address space (i.e., there aren't separate RAM and GPU address spaces, there's only a single common address space; as a consequence, there is no need to transfer data between CPU and GPU memory, which is what causes the speedup of in-memory graphics operations).

Most of the things you mentioned (like Intel iGPUs) are simply shared memory systems, not UMA. The same is true for most smartphones (including most of Apple's own SoCs): these are shared memory systems, not UMA systems.

Also, the *only* PCs ever made based on UMA have been the SGI VW320 and VW540. Everything else (including all Macs aside from the new M1 based ones) which didn't use discrete graphics were shared memory systems, not UMA.
I think the PlayStation also had unified memory; I don't think it's new.
 

rondocap

macrumors 6502a
Original poster
Jun 18, 2011
542
341
For me, I am most curious how real-world results play out between a maxed-out MacBook versus a maxed-out Mac Pro with quad GPUs, a 28-core processor, and an Afterburner.

Heavier stuff like 8K RED raw, heavier color grading, 3D work, etc. - I still think the gap should be big. For consumer mirrorless cameras, the MacBook will likely win in many categories, though.
 

Boil

macrumors 68040
Oct 23, 2018
3,478
3,173
Stargate Command
If they try to do it with just one fan that they hobble in a substantive way, then they will be right back in that thermal-corner problem space again too. This Jade4C chiplet/tile SoC could run up in the 320+ W range. If they go "too cute" on the enclosure, they can easily mess that up.

Rumors mention a design that mixes the ventilation of the 2019 Mac Pro & is also reminiscent of the G4 Cube...

I could see a Cube on feet, intake from the bottom & exhaust out the top; these would be the portions that have the 3D air holes, with 140mm fans top & bottom...

Ports on the back, to one side, running vertically; heat sink similar to the one in the 2019 Mac Pro...
 

mattspace

macrumors 68040
Jun 5, 2013
3,344
2,975
Australia
Rumors mention a design that mixes the ventilation of the 2019 Mac Pro & is also reminiscent of the G4 Cube...


Apple has done two "thermal chimney" computers - the Trashcan, and the Cube. Both of them suffered endemic overheating that led to catastrophic hardware failure.

What makes people think that Apple has the capability to get it right on a third attempt? How many chances do they get before people set their expectations based upon the prior results?

Might as well be optimistic about another Apple-designed liquid-cooling solution.
 
Last edited:

Boil

macrumors 68040
Oct 23, 2018
3,478
3,173
Stargate Command
Rumors mention a design that mixes the ventilation of the 2019 Mac Pro & is also reminiscent of the G4 Cube...

I could see a Cube on feet, intake from the bottom & exhaust out the top; these would be the portions that have the 3D air holes, with 140mm fans top & bottom...

Ports on the back, to one side, running vertically; heat sink similar to the one in the 2019 Mac Pro...

Apple has done two "thermal chimney" computers - the Trashcan, and the Cube. Both of them suffered endemic overheating that led to catastrophic hardware failure.

What makes people think that Apple has the capability to get it right on a third attempt? How many chances do they get before people set their expectations based upon the prior results?

Might as well be optimistic about another Apple-designed liquid-cooling solution.
Your quote of my post attributed it to another user...?

And your reply to my post totally neglects the cooling solution I outlined (which should work excellently)...!
 

mattspace

macrumors 68040
Jun 5, 2013
3,344
2,975
Australia
Your quote of my post attributed it to another user...?

Weird, corrected.

And your reply to my post totally neglects the cooling solution I outlined (which should work excellently)...!

What solution? You described a thermal chimney, with Apple's trypophobia-inducing decorative holes.

Apple has demonstrated twice that it is incapable of producing a vertical cooling solution, that is survivable for the product to which it's applied.

 

kvic

macrumors 6502a
Sep 10, 2015
516
460
To differentiate the Smaller Mac Pro from the rumored beefed-up Mac mini PRO, one could reasonably assume the Smaller Mac Pro will carry some sort of modularity & internal expandability in its design. A bottom-to-top airflow chassis is less likely. The more traditional front-to-back airflow, like a smaller version of the MacPro7,1 in a compact mATX-like size, is one possibility:
[Image: compact chassis size mockup]

The volume in the red rectangle can be utilised to carry multiple modules horizontally, with one MPX-like slot plus a couple of standard PCIe slots: PSU at the bottom, MPX-like module carrying SoC/RAM/built-in ports at the top. Enough room to house 4-5 single-wide PCIe add-in cards, or one extra MPX slot + a couple of single-wide PCIe slots. Supports card lengths up to around 260mm.

Plus socketed SSD blades

Won't it cover a LOT of existing Mac Pro user base?
 

mattspace

macrumors 68040
Jun 5, 2013
3,344
2,975
Australia
The volume in the red rectangle can be utilised to carry multiple modules horizontally, with one MPX-like slot plus a couple of standard PCIe slots: PSU at the bottom, MPX-like module carrying SoC/RAM/built-in ports at the top. Enough room to house 4-5 single-wide PCIe add-in cards, or one extra MPX slot + a couple of single-wide PCIe slots. Supports card lengths up to around 260mm.
I keep advocating that Apple should effectively just recreate Intel's Beast Canyon NUC, but with MPX modules - a bridge board with 3 full-size MPX bays; you put the processor (with iGPU) / RAM / default IO in one MPX module, leaving 2 bays free for MPX or PCIe cards (including graphics). No PCIe-only slots between them, just 3 slots total. Use the same form-factor power supply as the current Mac Pro, etc.

[Image: Intel Beast Canyon NUC baseboard]
 

Waragainstsleep

macrumors 6502a
Oct 15, 2003
612
221
UK
Apple has demonstrated twice that it is incapable of producing a vertical cooling solution, that is survivable for the product to which it's applied.
If at first you don't succeed.....

To be fair, the Cube had no fan at all, and I don't remember them having too many heat issues. And I installed a 1.7GHz G4 upgrade in one once; the upgrade did include a fan, though, as I recall.

Cubes had issues with the voltage regulators flaking out, and the power buttons used to turn the machines on at all hours of the day and night.
 

Boil

macrumors 68040
Oct 23, 2018
3,478
3,173
Stargate Command
What solution? You described a thermal chimney, with Apple's trypophobia-inducing decorative holes.

Apple has demonstrated twice that it is incapable of producing a vertical cooling solution, that is survivable for the product to which it's applied.

I am not a huge fan of the 3d air holes myself, but if it is what Apple is gonna do, what can we do...?

A reason I hope Apple gives us a Space Gray / Space Black option, darker color to kinda hide the weird holes...

As for my thermal chimney; yes, it is a thermal chimney...

BUT IT HAS FANS, REAL MOVE SOME AIR NOW FANS...!!! Pretty sure a 140mm fan intaking cool air from under the chassis, forcing it thru a massive heat sink (like the one in the 2019 Mac Pro), and having a SECOND 140mm FAN to exhaust out the top; pretty sure that can cool two or more M1 Max SoCs just fine...?

Get over your hate for the "Thermal Chimney", it can work just fine...!

And another thing Apple could do would be to mate a 2019 Mac Pro style heat sink to a proper vapor chamber, both for the new Mac Pro Cube & for the new Mac mini (smaller heat sink here, of course)...!
 

StuAff

macrumors 6502
Aug 6, 2007
391
261
Portsmouth, UK
I wouldn't hold your breath for these. The SSD controller is built into the M1 now.
The T2 chip does the same job for those Macs, and the 2019 Mac Pro can have SSDs replaced (as can the iMac Pro, if you disassemble it). Perfectly possible with AS, though whether it will be an option remains to be seen.
 

mattspace

macrumors 68040
Jun 5, 2013
3,344
2,975
Australia
To be fair, the Cube had no cooling at all. I don't remember them having too many heat issues. And I installed a 1.7GHz G4 upgrade in one once. It did include a fan though as I recall.

Cubes had issues with the voltage regulators flaking out and the power buttons used to turn themselves on all hours of the day and night.

They suffered endemic optical drive & power button failures, due to the system cooking those parts of the machine. Its "cooling" was one of the main promotional aspects of the machine - how advanced its fanless cooling was, which didn't cool it, and how amazing its translucent polycarbonate shell was, which suffered from crazing/cracking and was produced in a second-rate process that left mold lines on the surface.
 

mattspace

macrumors 68040
Jun 5, 2013
3,344
2,975
Australia
BUT IT HAS FANS, REAL MOVE SOME AIR NOW FANS...!!!

The 2013 Mac Pro had a big fan, promoted in the "look how advanced our design and engineering are" style.

It still cooked its components.


Get over your hate for the "Thermal Chimney", it can work just fine...!

It works wonderfully in my Xbox Series X. I just don't believe that Apple can pull off a thermal chimney.
 

Boil

macrumors 68040
Oct 23, 2018
3,478
3,173
Stargate Command
The 2013 Mac Pro had a big fan, promoted in the "look how advanced our design and engineering are" style.

It still cooked its components.

It works wonderfully in my Xbox Series X. I just don't believe that Apple can pull off a thermal chimney.

Come on dude, you keep referencing old Apple designs...

I am talking about perfectly good standard 25mm thick 140mm fans, one on either side of the SoC heat sink; you know, like virtually all the tower CPU coolers on the market (with dual fans)...

The 2013 Mac Pro is a horrible example of how a thermal chimney should be, what with it trying to cool components with different cooling requirements, but all sharing the same heat sink...

And that was, for all intents and purposes, a blower-fan design with off-axis blades; I am (again) talking about standard high-performance PC fans; think of two Noctua A14 PWM fans, plenty of cooling potential...!

And if you think about it, it would be the same cooling in the 2019 Mac Pro, just better... the 2019 Mac Pro uses three fans on the front of the unit, pulling in cool air & forcing it thru the heat sinks (MPX modules have massive heat sinks), but there was nothing to exhaust the air except the pressure provided by the intake fans...

So please tell me how a proper intake fan, paired with a proper exhaust fan, and a proper high-flow heat sink; with a straight-thru bottom-to-top air flow design (that "thermal chimney") won't properly cool a collection of M1 Max SoCs...? And please have an explanation that is more than "Apple screwed up twice, so a third attempt must also be a bust"...?!?
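As a sanity check on the dual-140mm-fan argument, here is a back-of-envelope airflow calculation (the numbers are my own assumptions: standard sea-level air properties, all SoC heat carried away by the through-flow, and it deliberately ignores heatsink impedance and fan static-pressure limits, which are where such designs actually go wrong):

```python
# How much airflow does a ~320 W SoC need? Heat carried by air is
# P = rho * V_dot * cp * dT, so solve for the volumetric flow V_dot.

def required_airflow_cfm(watts, delta_t_c):
    """Airflow (CFM) needed to remove `watts` of heat with an air
    temperature rise of `delta_t_c` degrees Celsius."""
    rho = 1.2      # kg/m^3, air density at sea level (assumed)
    cp = 1005.0    # J/(kg*K), specific heat of air
    m3_per_s = watts / (rho * cp * delta_t_c)
    return m3_per_s * 2118.88   # convert m^3/s to cubic feet per minute

# 320 W package, allowing the air to warm by 15 C on its way through:
print(round(required_airflow_cfm(320, 15), 1))  # 37.5
```

Under these assumptions, ~320 W needs only about 37 CFM, comfortably within what typical 140mm fans are rated for in free air (roughly 60-100 CFM); so the contested question is really pressure drop through the heat sink and acoustics, not bulk airflow.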
 

mattspace

macrumors 68040
Jun 5, 2013
3,344
2,975
Australia
Come on dude, you keep referencing old Apple designs...

So please tell me how a proper intake fan, paired with a proper exhaust fan, and a proper high-flow heat sink; with a straight-thru bottom-to-top air flow design (that "thermal chimney") won't properly cool a collection of M1 Max SoCs...? And please have an explanation that is more than "Apple screwed up twice, so a third attempt must also be a bust"...?!?

The Apple that made the 2013 Mac Pro, which put Decorative values ahead of Design values, is the same company, and the same culture, that is still designing their products.

The Apple that put the Decorative value of having equal side and top bezels on the laptop's screen ahead of the Design value of having an uninterrupted menu bar - i.e., putting how it looks ahead of how it works - is the same Apple that is likely to put an inadequate cooling solution into a machine by prioritising the decorative shape, the decorative audio profile, the decorative size, etc.

Their pathological obsession with Decorative Minimalism will cause them to prioritise having only one fan, just so they can say "we did this with only one fan".
 
Last edited:

Abazigal

Contributor
Jul 18, 2011
20,395
23,898
Singapore
But here's the thing - with the new chips, the M1 Macs are more than powerful enough to handle whatever tasks 99% of people throw at them.

You used to be able to conflate "pro" with "hobbyist" back when even the most powerful computers were barely good enough for "real work", but over time, standard configurations of off-the-shelf computers have become good enough for the majority of computer work, even for heavy users.

What's frustrating these enthusiasts isn't the lack of performance, but that the machines are sealed (i.e., non-tinkerable and non-upgradeable), meaning they have to pay Apple's rates for better specs instead of being able to shop for the cheapest alternatives online and install them on their own. Someone like MKBHD may need a powerful computer to crunch 8K raw footage, but they are not going to lose sleep about whether they are able to install their own RAM vs simply paying to have the extra added at the time of purchase. To them, it's just the cost of doing business.

I have used this exact argument in the past to rationalise why Apple hasn't, and may not ever, make a mid-tier modular Mac: you have the Mac mini at the low end, the iMac for everything in the middle, and the Mac Pro to take on tasks that even a maxed-out iMac Pro cannot handle. These products do have an audience; none of them browse MacRumors.

The M1 chip seems to have upended this argument somewhat, with the Mac mini and the iMac sharing the same processor. I don't know if we will subsequently see a "Mac mini Pro" aka iMac Pro without the screen, but it likely won't be upgradeable either.

To sum up, I think PC performance has improved to the point where the intersection between Pro and Enthusiast is becoming increasingly small, and you have to decide which one you belong to.
 

mattspace

macrumors 68040
Jun 5, 2013
3,344
2,975
Australia
Someone like MKBHD may need a powerful computer to crunch 8K raw footage, but they are not going to lose sleep about whether they are able to install their own RAM vs simply paying to have the extra added at the time of purchase. To them, it's just the cost of doing business.

And yet the 2013 Mac Pro, which was built for exactly the business case you're describing, and advocated for the exact same reasons you're advocating, was an abject failure that did lasting damage to Apple's position in the pro content-creation industries, so much so that they had to completely reverse course and build a super-slotbox.

See, the thing is the people you're describing are a subset of pro users, but conflating them with the majority is a mistake.

There is no market for a computer that costs as much as a fully reconfigurable, upgradable slotbox but has none of that reconfigurability or upgradability, no matter how much Apple would like to gaslight us into believing the "advantages" it offers are worth the majority of its working life being slower than a comparable slotbox that has received continuous upgrades.
 