First, let's assume the form factor does not change. Apple put a lot of investment and engineering into the current design; love it or hate it, it's going to be around for a while.
Agreed.
We will start with CPUs. Since the nMP came out, Haswell-E processors and a new chipset (X99) have been released, as upgrades from the Ivy Bridge-E and X79 used currently. Neither is earth-shattering, with maybe a 5% increase in per-core performance. One benefit is the potential for more cores, up to 18 I think (I may be wrong on this). The modest per-core performance benefit is likely why Apple didn't upgrade to it last year. Apple could wait for Broadwell-E, but that won't launch until Q1 2016, and if the consumer Broadwell chips already released are any indication, the performance increase won't be anything too exciting. Additionally, Broadwell-E doesn't include any chipset enhancements, something that may be important for Thunderbolt (more on this later).
Technically the Xeons in the Mac Pro come from the EP series... not the E.
Benchmarks showed some performance increases over Ivy, but also some decreases. Net performance was up 3% on average.
You're right that the top-end EP chip offers 18 cores, but it comes at an insane $4,500 price point, which means only the super rich will be able to afford one after Apple's normal margins are applied.
The only real benefit to mainstream buyers is the new Haswell-EP 8-core part at around $1,000 vs. Ivy's 8-core part at $1,700. The low-end 1600-series 4-core and 6-core parts haven't changed in price, so there's unlikely to be any change in entry-level Mac Pro pricing.
Second, and more interestingly, are the GPUs. Let's assume Apple sticks with AMD, since AMD seems much more willing than Nvidia to make custom chips for a given form factor (see the consoles, iMac, and current Mac Pro), not to mention their chips perform a bit better in compute workloads. The GPUs in the Mac Pro are currently a generation old, and with the release of AMD's 300-series GPUs in the next two months they will soon be two generations old, so it's reasonable we could see the new parts in a Mac Pro. While Apple doesn't always go with the latest and greatest tech in their products, they were the first to use AMD's Tonga architecture with the M295X in the iMac last year. Another thing to keep in mind is that despite the current GPUs being "workstation" parts, they are more like underclocked desktop chips on a custom circuit board, so there is no need for Apple to wait for the workstation-class parts that ship a few months after the consumer ones. I could see Apple being among the first to use AMD's new GPUs, especially if AMD has improved the performance-per-watt significantly. Another factor that may work in favor of this is new SDKs to make the two chips work together. There is a lot of work on this in Windows with AMD's Mantle and DirectX 12, and while those technologies obviously can't translate directly to OS X, something like them could be announced for OS X (perfect material for a developers conference).
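(To make the "work together" point concrete: here's a minimal sketch of what an app has to do today with OpenCL, which OS X already ships, to drive both GPUs by hand; the whole point of a Mantle/DX12-style SDK is to hide exactly this plumbing. This assumes the pyopencl package is installed, and the naive 50/50 split is purely illustrative.)

```python
# Minimal sketch: enumerate the GPUs and split work across them by hand.
# Uses only the stock pyopencl API; the even 50/50 split is an assumption,
# not how a real scheduler would balance a workload.
import numpy as np
import pyopencl as cl

# Find every GPU on the system (a nMP should report two).
gpus = [d for p in cl.get_platforms()
          for d in p.get_devices()
          if d.type & cl.device_type.GPU]
print("GPUs found:", [(d.name, d.max_compute_units) for d in gpus])

data = np.arange(1_000_000, dtype=np.float32)
halves = np.array_split(data, len(gpus))  # naive even split

# One context/queue per device; today the app does all this bookkeeping.
for gpu, chunk in zip(gpus, halves):
    ctx = cl.Context([gpu])
    queue = cl.CommandQueue(ctx)
    buf = cl.Buffer(ctx, cl.mem_flags.READ_WRITE |
                    cl.mem_flags.COPY_HOST_PTR, hostbuf=chunk)
    # ... enqueue kernels against `buf` here ...
    print(f"{gpu.name}: {chunk.nbytes} bytes dispatched")
```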
The current parts use AMD Pitcairn and Tahiti GPU cores. There was a minor bump in performance (20%?) with Hawaii and Tonga, but it came with a thermal penalty... they ran hotter. The new part coming out is the Fiji core, and that will be top-end only, so it could form the basis for a D700 successor. It will likely use HBM memory for better bandwidth, but there's only so much AMD can do to improve performance without compromising thermals until they move to 20nm.
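(Back-of-the-envelope numbers on the HBM point, treating the Fiji figures (4 stacks x 1024-bit at 500 MHz DDR) as rumor, against Tahiti's shipping 384-bit GDDR5 at 5.5 Gbps/pin:)

```python
# Back-of-envelope memory bandwidth, in GB/s.
# Tahiti numbers are the shipping D700 config; the Fiji/HBM
# figures are the rumored spec (4 stacks x 1024-bit @ 500 MHz DDR).

def bandwidth_gbs(bus_width_bits, data_rate_gbps_per_pin):
    """Bus width (bits) x per-pin data rate (Gbit/s) / 8 -> GB/s."""
    return bus_width_bits * data_rate_gbps_per_pin / 8

tahiti = bandwidth_gbs(384, 5.5)       # GDDR5 @ 5.5 Gbps/pin -> 264 GB/s
fiji   = bandwidth_gbs(4 * 1024, 1.0)  # HBM @ 1 Gbps/pin (500 MHz DDR) -> 512 GB/s

print(f"Tahiti GDDR5: {tahiti:.0f} GB/s")
print(f"Fiji HBM (rumored): {fiji:.0f} GB/s  ({fiji / tahiti:.1f}x)")
```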
And, while the Fiji core offers an upgrade for the D700s... what about the D500 and D300 GPUs? Maybe they inherit the D700 and D500 cores respectively?
Also, you need to keep in mind that Apple is running two high-end Tahiti cores and a Xeon CPU off a 450W power supply with a single cooling fan. This means they are certainly binning the GPUs for optimal thermal performance. That will likely mean some hysteresis between any new GPU launch and it being added to the nMP.
I believe that until we see GPUs move to a 20nm process, we're not going to see any significant improvements. And with Apple pretty much consuming all available 20nm fab capacity with the A8 (and its successor), I'm starting to wonder if we'll ever see 20nm GPUs.
And unless Nvidia changes their tune about doing custom cards, we'll never see Nvidia in a nMP.
Last, what pulls everything together is a potential Retina display. Let's assume Apple really wants this for their "pro" machines, but doesn't want the dual-cable hack that current 5K monitors use and isn't moving away from Thunderbolt. DisplayPort 1.3 was finalized last fall and can carry a 5K signal; the earliest we may see it is this summer, potentially in new GPUs from AMD. However, Thunderbolt 2 does not have enough bandwidth to carry DisplayPort 1.3, so to see a Retina Thunderbolt display we would need Thunderbolt 3. There is a rumored Thunderbolt controller from Intel (Alpine Ridge) that is supposed to ship alongside the new Skylake CPUs (as Skylake has more bandwidth coming off the CPU). However, Skylake is only being released for consumer parts (laptops/desktops) this year, and we won't see workstation/server chips (like those used in the Mac Pro) until 2017. If Apple paired the new Thunderbolt 3 controller with Haswell-E, there would be a bandwidth bottleneck shared between two GPUs and three Thunderbolt 3 controllers. Since I am only an armchair enthusiast, can anyone correct me if it is in fact possible to use Thunderbolt 3 controllers with existing CPUs?
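(The bandwidth claim is easy to sanity-check; a rough sketch, ignoring blanking overhead, which only makes the shortfall worse:)

```python
# Rough check that Thunderbolt 2 can't carry an uncompressed 5K/60 stream.
# Ignores blanking intervals, which only add to the required bandwidth.

width, height, hz, bpp = 5120, 2880, 60, 24    # 5K at 60 Hz, 24-bit color
signal_gbps = width * height * hz * bpp / 1e9  # ~21.2 Gbit/s raw pixel data

tb2_gbps  = 20.0    # Thunderbolt 2 total link bandwidth
dp12_gbps = 17.28   # DisplayPort 1.2 effective (HBR2, after 8b/10b coding)
dp13_gbps = 25.92   # DisplayPort 1.3 effective (HBR3, after 8b/10b coding)

print(f"5K@60 needs  ~{signal_gbps:.1f} Gbit/s")
print(f"TB2 offers    {tb2_gbps} Gbit/s  -> doesn't fit")
print(f"DP 1.2 offers {dp12_gbps} Gbit/s -> doesn't fit (hence dual-cable 5K)")
print(f"DP 1.3 offers {dp13_gbps} Gbit/s -> fits in a single cable")
```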
I agree with you that a new 5K TB display is coming at some point... but I doubt it is imminent. All of Apple's displays since 2008 have been designed as docking stations for MacBooks, not for use with Mac Pros; you only have to look at the short pigtail cable with a MagSafe charger that comes attached to the display to understand this. Mac Pro owners should forget about a new display from Apple until the MacBook line gets refreshed to DP 1.3, which will probably happen over time via USB-C... not Thunderbolt.
Thoughtful post.
You forgot to mention that, whether Apple wants to or not, they will basically be shipping faster PCIe SSDs since Samsung has moved to the SM951.
So they can offer 50% faster drives just by shifting to the new part, and possibly more if they replumb the SSD to PCIe 3.0; that could DOUBLE drive throughput. (No doubt 50-100% faster would be more than "incremental".)
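(For reference, here's where the "double" figure comes from; I'm assuming the current SSD sits on a PCIe 2.0 x4 link and the SM951 can run PCIe 3.0 x4:)

```python
# Where the "double throughput" estimate comes from: raw PCIe x4 link rates.
# Assumes the current nMP SSD uses a PCIe 2.0 x4 link and the SM951 runs
# PCIe 3.0 x4; real drives land somewhat below these ceilings.

def pcie_x4_gbs(gt_per_s, encoding_efficiency):
    """4 lanes x transfer rate (GT/s) x encoding efficiency / 8 bits -> GB/s."""
    return 4 * gt_per_s * encoding_efficiency / 8

gen2 = pcie_x4_gbs(5.0, 8 / 10)     # PCIe 2.0: 8b/10b coding  -> 2.00 GB/s
gen3 = pcie_x4_gbs(8.0, 128 / 130)  # PCIe 3.0: 128b/130b coding -> ~3.94 GB/s

print(f"PCIe 2.0 x4: {gen2:.2f} GB/s")
print(f"PCIe 3.0 x4: {gen3:.2f} GB/s  ({gen3 / gen2:.1f}x)")
```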
You might also want to include RAM, something I know little about, so perhaps someone else could chime in.
I think they should include Nvidia GPUs and stop pretending that a reworked desktop card becomes a "workstation" card just because they say so.
I think it quite possible that they will have a nMP announcement, but I doubt it will be as big a deal as 2013.
They should either come out with a new display, lower the price, or cancel the 27"; it's just an embarrassment now. The last OS update added a reference to a specific display for use with the Retina iMac. I think I was wrong about it being the Asus 321; it's more likely the 5K Dell. If they are putting Dell displays in the OS in April, I can't see them bringing out a new display in June.
All guessing obviously.
Agree on the SSDs... that's probably the most compelling upgrade available at the moment.
DDR4 isn't going to offer anything in the way of performance gains except in rare edge cases... it's great for reduced power consumption in mobile computers, but in desktops it offers little advantage. Large CPU cache sizes take a lot of pressure off the memory subsystem.
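(Quick numbers on why: peak per-channel bandwidth, comparing the nMP's current DDR3-1866 against the DDR4-2133 that Haswell-E supports:)

```python
# Peak theoretical bandwidth per DIMM channel: DDR3-1866 (current nMP)
# vs DDR4-2133 (what Haswell-E/X99 supports). 64-bit channel = 8 bytes/transfer.

def channel_gbs(mt_per_s):
    """Transfer rate (MT/s) x 8 bytes per transfer -> GB/s."""
    return mt_per_s * 8 / 1000

ddr3 = channel_gbs(1866)   # ~14.9 GB/s
ddr4 = channel_gbs(2133)   # ~17.1 GB/s

print(f"DDR3-1866: {ddr3:.1f} GB/s per channel")
print(f"DDR4-2133: {ddr4:.1f} GB/s per channel  (+{(ddr4 / ddr3 - 1) * 100:.0f}%)")
```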
Agree, they should definitely switch to Nvidia, but unfortunately, as we know, that's probably up to Nvidia.
Agree also that if they do announce something, it will be a quiet update to the website. Nothing will warrant keynote time.
And yeah, the 27" Thunderbolt Display with old MagSafe and USB 2 for $999 is a ridiculously poor value and a total embarrassment for Apple. It should have been updated years ago.
----------
I have a front-row seat and a large box of popcorn. The only thing missing from this back-and-forth is ole' tesselator! Now those were the good old days! :D
lol!