What your instructor predicted is impossible in a practical sense.
We already have full modularity now, if you're building a PC: when a component fails, you can swap in another. But there are dependencies that have to be taken into account.
Some real-life examples (Totally not drawn from my own experience!):
1) You upgrade your graphics card to the best model available. But then you discover that it needs extra power leads that your old power supply doesn't have. So you have to buy a new power supply as well.
2) You've upgraded your power supply, and your graphics card is now running. But after trying out some games and benchmarks, you discover that it's running at half or less of its theoretical performance. Turns out the bottleneck is your CPU. It can't pre-process the graphics data (model transformation, shader compilation, etc.) fast enough to keep the graphics card working full-time.
3) You buy a new, faster CPU. But then you discover that it uses a newer CPU socket that your old motherboard doesn't support. So you need a new motherboard...
4) You buy a new motherboard and plug the new CPU into it, but now it's the CPU that's not running as fast as it should. You discover that your RAM is too slow: the CPU keeps waiting on the RAM, and the graphics card keeps waiting on the CPU.
5) You buy some new RAM, and performance greatly improves. But when you start editing video, you discover that your CPU and Graphics card are now being bottlenecked by your slow hard drive. Game loading takes forever too. So you replace it with an SSD.
6) With the SSD, video editing is less of a torture, and games load a lot faster. But then you discover that SATA SSDs are a lot slower than NVMe SSDs. So you buy an NVMe SSD for even more speed.
7) When the NVMe SSD arrives, you're perplexed. It looks like a weird RAM stick, and you don't know where to put it. You then discover that your options are a USB enclosure, which would throttle it to roughly your old SSD's speed, or a motherboard with an NVMe-capable M.2 slot. Off to Amazon again...
8) The NVMe SSD is now plugged into your new (second) motherboard, and you finally have a fast, working, upgraded computer. But at this point you've replaced pretty much everything except the case and your old hard drive (now demoted to data backup).
9) You decide that you need more graphics horsepower, and purchase a second graphics card to link up with the first one. This works great... until your computer fails to boot one day. Poking around, you discover that the NVMe SSD failed from overheating. Why? Because the NVMe slot sat between the two graphics cards, and there wasn't enough airflow to cool it properly.
10) So you buy your THIRD motherboard, which has its NVMe slots in a better location. Of course this means reinstalling Windows a third time, triggering an activation alert. You then spend an hour on the phone explaining to a Microsoft support tech why you've been reinstalling their OS so often lately.
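The bottleneck whack-a-mole in the list above can be sketched as a toy pipeline model (all throughput numbers here are made up, purely for illustration): the whole chain runs at the speed of its slowest stage, so upgrading one part just moves the bottleneck to the next-slowest one.

```python
# Toy model of the upgrade treadmill: treat the PC as a pipeline whose
# overall throughput is capped by its slowest stage. The numbers are
# invented for illustration, not real benchmarks.

def pipeline_fps(stages):
    """Effective frames/sec of the whole chain = the slowest stage's rate."""
    return min(stages.values())

def bottleneck(stages):
    """Name of the component currently holding everything else back."""
    return min(stages, key=stages.get)

rig = {"hdd": 40, "ram": 120, "cpu": 60, "gpu": 200}
print(bottleneck(rig), pipeline_fps(rig))   # the HDD caps the system at 40

rig["hdd"] = 500                            # swap in an SSD...
print(bottleneck(rig), pipeline_fps(rig))   # ...and now the CPU is the limit
```

Real systems are messier (stages overlap and buffer), but this min() rule is why each upgrade in the story just exposed the next-slowest component.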
And the thing is, it HAS to be this way, because you can't design a single component-interconnection standard that will last forever. You can't perfectly anticipate human needs for technology, and even if you could, you can only base a solution on the tech that's currently available, not what might be available 5-10 years from now.
Yep. I think that we'll be very lucky if we even get one slot for a video card in the new Mac Pro. It's far more likely that Apple will build another headless all-in-one line that absorbs the old Mac Mini, where the primary difference between the Mac and the Mac Pro will be the number of TB3 ports you can plug external peripherals (including their eventual eGPU solution) into.
Because let's face it... Apple is a hardware company, and they need to move product. Producing an updated cheese-grater Mac that can be upgraded for a decade after purchase both lowers their sales and gives the Hackintosh segment of the market the driver support it needs to keep avoiding Apple's hardware.
So if there are internal expansion ports for a GPU at all, expect a proprietary Apple connector with slightly higher performance that only AMD is invited to produce cards for. The primary stumbling block to building a Hackintosh is proper support for the NVIDIA cards that everyone really wants, so why should Apple make Hackintoshing any easier? At least until they have T2 chips in every system, which could effectively thwart the Hackintoshers without having to freeze out NVIDIA.
I hereby nominate ThatSandWyrm for Post of the Month! That had to be one of the best-written analogies I've read in a very long time. Well done, sir!