When I was going for my PC technician certification a long time ago, the instructor said there would come a day when PC components would be truly modular. Meaning that when the CPU becomes outdated, the user would simply pull out the CPU module and swap in a faster, better one. Yeah, you can do that now, but what he meant was that not only would the CPU be swapped out, but the other components that make the CPU tick would be pulled out along with it. So the user wouldn't have to worry about whether that CPU is compatible with that motherboard, about applying TIM or a heatsink to the CPU, and what have you. I haven't really kept up with the latest PC technology, but from what I have read, some of that is already starting to come true. Just my two cents.
What your instructor predicted is impossible in a practical sense.
We already have full modularity now (if you're building a PC): if some component fails, you can swap in another. But there are dependencies that have to be taken into account.
Some real-life examples (Totally not drawn from my own experience!):
1) You upgrade your graphics card to the best model available. But then you discover that it needs extra power leads that your old power supply doesn't have. So you have to buy a new power supply as well.
2) You've upgraded your power supply, and your graphics card is now running. But after trying out some games and benchmarks, you discover that it's running at half or less of its theoretical performance. Turns out the bottleneck is your CPU. It can't pre-process the graphics data (model transformation, shader compilation, etc.) fast enough to keep the graphics card working full-time.
3) You buy a new, faster CPU. But then you discover that it uses a newer kind of CPU socket that's not compatible with your old one. So you need a new motherboard...
4) You buy a new motherboard, and then plug the new CPU into it, but now it's the CPU that's not running as fast as it should. You discover that your RAM is too slow. The CPU keeps waiting on the RAM, and the Graphics Card keeps waiting on the CPU.
5) You buy some new RAM, and performance greatly improves. But when you start editing video, you discover that your CPU and Graphics card are now being bottlenecked by your slow hard drive. Game loading takes forever too. So you replace it with an SSD.
6) With the SSD, video editing is less of a torture, and games load a lot faster. But then you discover that SATA SSDs are a lot slower than NVMe SSDs. So you buy an NVMe SSD for even more speed.
7) When the NVMe SSD arrives, you're perplexed. It looks like a weird RAM stick, and you don't know where to put it. You then discover that your motherboard has no NVMe-capable M.2 slot, so you either need a PCIe adapter card (which your old board may not even boot from) or a new motherboard with an M.2 NVMe slot. Off to Amazon again...
8) The NVMe SSD is now plugged into your new (2nd) motherboard. You now have a fast, working, upgraded computer. But you've replaced pretty much everything aside from the case and your old hard drive (now used for data backup).
9) You decide that you need more graphics horsepower and purchase a second graphics card to link up with the first one. This works great... until your computer fails to boot one day. Poking around, you discover that the NVMe SSD failed from overheating. Why? Because the NVMe slot sat between the two graphics cards, and there wasn't enough airflow to cool it properly.
10) So you buy your THIRD motherboard, one with its NVMe slots in a better location. Of course this means reinstalling Windows a third time, which trips the activation check. You then spend an hour on the phone explaining to a Microsoft support tech why you've been installing their OS so often lately.
And the thing is, it HAS to be this way. You can't design a single component interconnection standard that will last forever, because you can't perfectly anticipate human needs for technology... and even if you could, you can only base a solution on the tech that's currently available, not on what might be available 5-10 years from now.
The iMac Pro has about the same size "thermal corner" that the Mac Pro 2013 painted itself into. Apple could add back a standard slot or two and crank the system volume and power back up to near the old levels (800-900W). The display GPU could be integrated with Thunderbolt (and the rest of the system), but the "compute GPU" really doesn't need to be tightly coupled.
The only "network computer" aspect I would expect is that it won't have 3-6+ HDD sleds. Extremely large bulk storage is "elsewhere". Probably less of a dependency than the iMac Pro (one and only one drive), but fewer drives than before (bigger, with modern 10+ TB singles). Apple is a bit fixated on the "single, big-enough SSD works great" solution across the whole lineup. It will be a task to even walk them back a bit from that position.
It wouldn't be a "clone" of an HP Z or Dell workstation, or even of the old 2006-2009 Mac Pros, but there will likely be some linkages. The folks who are almost solely focused on buying a container for stuff will go elsewhere, but it will probably still be recognizable as a workstation in function.
Yep. I think we'll be very lucky if we even get one slot for a video card in the new Mac Pro. It's far more likely that Apple will build another headless all-in-one system line that absorbs the old Mac Mini, where the primary difference between Mac and Mac Pro will be the number of TB3 ports that you can plug external peripherals (including their final eGPU solution) into.
Because let's face it... Apple is a hardware company, and they need to move product. Producing an updated cheese-grater Mac that can be upgraded for a decade after purchase both lowers their sales and gives the Hackintosh segment of the market the software hooks it needs to keep avoiding Apple's hardware.
So if there's an internal expansion port for a GPU at all, expect it to be a proprietary Apple connector with slightly higher performance that only AMD is invited to produce cards for. The primary stumbling block to building a Hackintosh is proper support for the nVidia cards that everyone really wants, so why should Apple make Hackintoshing any easier? At least not until they have T2 chips in every system that can effectively thwart the Hackintoshers without having to shut out nVidia.