What if... we see an updated Mac Pro and Mac mini that are very similar in design, just big and small, with the expected differences in specs? It would make the line-up nice and simple. The delay might be a combination of waiting for the right CPUs, more widespread USB-C/TB3 adoption, and figuring out where to manufacture them (the US doesn't seem to be working out that well, according to some rumors).
And what if... you could daisy-chain them for increased performance? You can already use external GPUs, so why not chain together a bunch of minis or Pros, or a combination, and have an app on a single device draw on the CPU and GPU power of the others as well? I don't know if that's feasible with TB3 throughput and all that, but it could be a way of keeping the device itself simple (read: non-expandable) while still offering an Apple-supported path to more power.
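For fun, the dispatch side of that idea is the easy part. Here's a minimal task-farming sketch in Python, assuming each chained machine ran a small worker service - the hostnames and the /render endpoint are entirely made up, and the actual hard problem (moving GPU work and data over TB3) is hand-waved away:

```python
# Hypothetical sketch: farm render chunks out to other machines on the chain.
# Assumes each one runs a made-up worker service with a /render endpoint.
import concurrent.futures
import json
import urllib.request

WORKERS = ["http://mini-1.local:8080", "http://mini-2.local:8080"]  # hypothetical hosts

def render_on(worker_url: str, frames: range) -> bytes:
    """Send one chunk of frames to one worker and block until it replies."""
    payload = json.dumps({"start": frames.start, "stop": frames.stop}).encode()
    req = urllib.request.Request(worker_url + "/render", data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

def render_distributed(total_frames: int) -> list:
    """Split the job evenly across the workers and gather results in order."""
    chunk = total_frames // len(WORKERS)
    jobs = [range(i * chunk, total_frames if i == len(WORKERS) - 1 else (i + 1) * chunk)
            for i in range(len(WORKERS))]
    with concurrent.futures.ThreadPoolExecutor(max_workers=len(WORKERS)) as pool:
        return list(pool.map(render_on, WORKERS, jobs))
```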
Not sure about daisy-chaining. That sounds very much like the discontinued Xgrid, which was very much a niche product. If I was spending that much money on compute, I wouldn't trust Apple to keep it going.
Final Cut Pro users could hand off rendering jobs to other machines, but again, why risk the investment on something Apple can kill on a whim, with no prior warning, when it's not making them enough profit?
The current Mac Pro is old technology that's had a price increase in the UK/Europe and is so painfully niche it's not true. They drove away photographers by killing Aperture. They built a machine with two mediocre GPUs where one does the compute and the other just drives one or more displays, because there's no OS-level CrossFire. Final Cut Pro X only uses both GPUs for certain functions; the rest of the time it's CPU-based, and the Mac Pro is falling behind on that.
I don't think developers have been falling over themselves to write apps that are truly optimised for two GPUs and hugely parallelised workloads. Instead, Apple should go for the mass market of people who would pay for one powerful GPU, plenty of fast NVMe flash, and a powerful CPU, in a package that is sufficiently cooled to be silent and, crucially, doesn't come with a built-in screen.
Sell it for under $1,999 and the top-end Mac mini moaners will consider it the beefed-up Mac mini Pro.
One new point to make: if Apple delay much longer after March, the Skylake-EP series of Xeon CPUs becomes available, with more cores, more PCIe lanes, and a very interesting chipset.
Before we reach that point, Apple could use a variety of Intel CPUs for an entry-level new Mac Pro configuration if they were carefully selective about how they spend the smaller number of PCIe lanes.
For example: a Xeon E3-1585 v5 - 4 cores / 8 threads, with Iris Pro P580 graphics, 3.5GHz (slower than the 27" Retina iMac), $556. That gives 16 PCIe lanes from the CPU plus up to 20 PCIe lanes from the PCH on the motherboard.
If you eliminate one of the graphics cards, you have enough I/O for 2 NVMe SSDs (which could be in a RAID configuration), attached to one side of the triangular heatsink for cooling - because they do run hot. The lane budget works out like this (tallied in the sketch after the list):
2x4 PCIe lanes for 2 NVMe sticks
1x8 PCIe lanes for the GPU (AMD RX 460)
4x4 PCIe lanes for 4 USB-C ports
2x1 PCIe lanes for 4 USB 3.1 ports (without constricting the bandwidth like the current Mac Pro does)
1x1 PCIe lane for Ethernet
1x1 PCIe lane for the Wi-Fi and Bluetooth module
And an HDMI 1.4 port for a 4K monitor
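Sanity-checking that budget in a few lines of Python - the split of which devices hang off the CPU's 16 lanes versus the PCH's 20 is my guess, not anything Apple or Intel has specified; only the totals matter here:

```python
# Tally of the hypothetical lane budget above against a Skylake E3 platform:
# 16 PCIe 3.0 lanes from the CPU + up to 20 from the PCH = 36 available.
devices = {
    "2x NVMe (x4 each)":        2 * 4,   # 8
    "GPU (x8)":                 1 * 8,   # 8
    "4x USB-C (x4 each)":       4 * 4,   # 16
    "4x USB 3.1 (x1 per pair)": 2 * 1,   # 2
    "Ethernet (x1)":            1 * 1,   # 1
    "Wi-Fi/BT (x1)":            1 * 1,   # 1
}
available = 16 + 20  # CPU lanes + PCH lanes

used = sum(devices.values())
print(f"lanes used: {used} / {available}")  # -> lanes used: 36 / 36
```

It comes out at exactly 36 of 36 lanes, which is why nothing idles in the next paragraph.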
Depending on how much flash storage you bundle, you might have the basis of a solid machine for $1,999.
So there you go: nothing gets wasted, nothing idles, no bottlenecks - unless Apple decide to offer a faster GPU that would do better with more than 8 PCIe lanes. But benchmarks have repeatedly shown that even powerful graphics cards gain less than 10% going from 8 lanes to 16.
If Apple decide they want an even cheaper entry-level option, then do one with no graphics card and up to 4 NVMe sticks, OR 2 NVMe sticks plus 2 additional USB-C ports, to match up to the higher-end Mac Pros.
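Both of those headless variants balance to the same 36 lanes, assuming the x8 freed up by dropping the GPU gets reallocated - again, just my arithmetic, not a real configuration:

```python
# Variant A: no GPU, 4 NVMe sticks (the GPU's x8 becomes 2 more x4 NVMe slots).
variant_a = 4*4 + 4*4 + 2*1 + 1 + 1   # NVMe + USB-C + USB 3.1 + Ethernet + Wi-Fi = 36
# Variant B: no GPU, 2 NVMe sticks, 6 USB-C ports (x8 becomes 2 more x4 ports).
variant_b = 2*4 + 6*4 + 2*1 + 1 + 1   # = 36
print(variant_a, variant_b)            # -> 36 36
```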
The only problem for Apple is that machines like these will probably last us folks 10 years if they've engineered them right. Bluntly speaking, that's a bad idea for Apple!