In Windowsland, other than home-built machines, the best example of this was the built-to-order machines Dell/Gateway sold in the late 1990s: an Intel motherboard with nothing onboard beyond IDE/parallel/serial, and a discrete graphics card, sound card, modem and network card pre-installed in your PCI slots. Those machines look so weird today, coming from the factory with potentially 4-5 slots already used.
That's a really good point. Back in the 90s/early 00s I was assembling my own PCs, and I also remember plugging in things like hard drive adapters (in the 80s, for any hard drive at all - later on if you wanted more than 2 hard drives, or ATA/66), SCSI cards (some scanners used to come with one, and you needed them for early writable CD drives) and FireWire cards (yes, on PCs, if you wanted to edit footage from your MiniDV camcorder). There were also more reasons to upgrade things like the GPU for prosumer purposes - e.g. I got a new GPU for my work PC because I needed it to run dual DVI displays, and there were hugely popular things like the Matrox Mystique/Rainbow Runner video capture/editing combo, which was about the first consumer-priced full-screen ("better than VHS quality") video editing system.
The point is that, back then, expansion slots in a PC were a must-have for a large proportion of users, including many consumers and most "prosumers"/hobbyists. Since then, a lot of that functionality has migrated first to the motherboard and increasingly - with Apple Silicon - onto the system-on-a-chip.
A lot of on-board interfaces have also become "good enough". The old RS232/parallel ports, and even USB 1, were never really adequate for external drives; any sort of 3D gaming needed a better-than-stock GPU; and any full-screen video editing (even just touching up your holiday vids) would likely need additional hardware and a better-than-base-level machine. By contrast, USB 3 came out in 2008 and still provides more bandwidth than is needed by any single mechanical hard drive, or even most bog-standard consumer SSDs. Only a minority of users need to pay the premium for even a Thunderbolt device - and while an SSD plugged into a 16x PCIe 4.0 slot sounds awesome, there are only so many 4K video streams a person can watch. The cheapest MacBook can run 3D games (maybe at reduced, but still playable, quality levels) and edit your iPhone-shot videos out of the box.
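If you want a rough sense of the numbers behind that "good enough" claim, here's a back-of-envelope sketch - round, typical spec-sheet figures from memory, not benchmarks I've run:

    # Back-of-envelope throughput comparison - rough, typical spec-sheet numbers, not benchmarks.
    MB_PER_S = {
        "USB 3.0 Gen 1 (5 Gbit/s, 8b/10b)": 5_000 * (8 / 10) / 8,  # ~500 MB/s payload ceiling
        "7200 rpm HDD, sequential":         250,                   # a fast spinning disk
        "SATA consumer SSD":                550,                   # capped by SATA 3 itself
        "PCIe 4.0 x16 slot":                16 * 2_000,            # ~2 GB/s usable per lane
        "4K Blu-ray-quality video stream":  128 / 8,               # ~128 Mbit/s -> ~16 MB/s
    }
    for name, mb_per_s in MB_PER_S.items():
        print(f"{name:35s} ~{mb_per_s:>7.0f} MB/s")

Even the oldest flavour of USB 3 comfortably outruns a spinning disk and roughly matches a SATA SSD, while the PCIe 4.0 x16 figure is around 2,000 times the bitrate of a high-quality 4K stream - which is the point about there being only so much video a person can watch.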
I know people who use PCs professionally and always used to depend on a full-sized PCIe tower or two - but they are now switching to mini PCs for everything but 3D gaming.
So it's not that people calling for better support for "pro" applications don't personally need the power they're asking for - it's that they're becoming a smaller and smaller fraction of the overall market, and with "consumer" electronics, economies of scale are everything. I'm not saying that Apple's $7000 price for a $4000 Mac Studio with PCIe slots and a grotesquely over-engineered case is justified, but they'll certainly be making those in tiny quantities compared to the Studio, which itself will sell in tiny quantities compared to the MacBooks.
In Macland, I would probably guess (not having been on the Mac side at the time) that the peak of this era was the B&W G3 and the G4s.
Probably, as in they were reasonably affordable Macs with PCI slots - but they were still a "pro" alternative to the laptops and the iMacs. I don't think there was ever a time when low/mid-range Macs relied heavily on expansion slots the way mainstream PCs did. The lower-end Macs were mostly either all-in-ones or "pizza boxes": "classic" Macs sitting on top of SCSI drive enclosures were a common sight, and (I don't know about the very first Mac) they usually had on-board SCSI and LocalTalk networking, with at most a single slot for an Ethernet card. It was the higher-end Macs that had first NuBus and then PCI slots - and for the first half of Mac history "workstation" probably meant a Sun or SGI Unix box, and the high-end Macs were closer to those than to 80s-to-early-90s PCs.
And the last gasp of that type of machine was the 2010 Mac Pro. A little too expensive and workstationy, but it was still a modular half-affordable desktop in a way that nothing that followed was.
The 2006 Mac Pro was pretty keenly priced for a Xeon/ECC workstation (RAM was expensive - but not so much because of Apple gouging, more because it used FB-DIMM modules that cost a packet even from third parties).
It may come as a shock to many, but shipments of desktop dGPUs have been on a downward slope since 2005.
Not really a shock - again, relatively basic GPUs have been becoming "good enough" for all but specialist purposes (it's nice to game in 4K, but you can have a lot of fun at 720p), and higher-end GPUs have been becoming more and more about GPU-based computing and less about displaying big spreadsheets.
Mac desktops, on the other hand, are underwhelming. Apple doesn't have a satisfying answer to the basic question: If you already have a Mac laptop, why should you buy a Mac desktop?
Unless you have no need for mobility, or need the power of the M2 Ultra, the straight answer is that you probably shouldn't - and I think that's what Apple is working towards. I think part of the reason for the demise of the 27" iMac is that a MacBook Pro can now offer the same computing power. MBP owners don't need an iMac for extra performance, but they might want a Studio Display when they're using their MBP at a desk - it's clearly designed as the ultimate MacBook Pro docking station. A lot of money has gone on cramming in a power supply big enough to power an MBP, which is totally wasted if you pair it with a desktop Mac.
The question is, what else can Apple do in the "pro machines for real pros" sector? AMD Threadrippers are offering silly numbers of cores and insane PCIe bandwidth. Even back in 2019, if you looked beyond the Xeon W to the "scalable", multi-CPU-capable versions, there were more powerful systems with more cores, more PCIe lanes and higher memory capacity. Many people calling for better Mac Pros are also calling for NVIDIA GPUs, and Apple seem to have burned their bridges on that one.
Apple could keep using Intel in the Mac Pro (or switch to AMD) and keep using AMD GPUs - in which case they'd have a system that is exactly as powerful as cheaper, larger-volume generic PCs using the same chipsets - or they could funnel a huge amount of cash into making an ARM-based Threadripper-killer just for their smallest-selling machine. (No, an M2 Extreme or an M3 Ultra isn't going to support 1TB+ of RAM or 128 PCIe lanes, let alone beat that - they'd need a new die - plus, going for external GPUs and RAM throws away some of the power-consumption-vs-performance advantages of Apple Silicon.)