I hate to repeat myself, but nobody in this thread has actually answered the question posed by the OP.
Yes, we get it, lack of large expandable physical RAM and upgradeable video cards means there are certain tasks this Mac Pro can't do (yet, anyway).
I'd like to see the M2 Ultra compared head-to-head with the Xeon - yeah, with an Afterburner card and at least one of those dual Vega MPX modules - on a set of common tasks. Show where each excels or fails, determine why (unoptimized code, hardware limits, etc.), and publish a comprehensive report to help buyers understand what it's good or bad at.
The Mac Pro probably ships in the lowest numbers of any Mac - far below MacBook Pros - because it addresses a niche high-end market, and some of those use cases are a niche within a niche.
Not to say those aren't useful or valuable tasks - I'm totally on the side of the people who are mad about it - but I'm trying to understand Apple's product stance here. They must have known their decision would anger their high-end niche users, yet they needed to release something to meet their self-imposed deadline.
I am pretty sure they are working on something with the M3 that will enable much larger package-on-package RAM capacities (maybe DDR5, too). I wonder whether it was just the worldwide chip shortage and foundry issues that prevented them from stockpiling enough RAM to build larger on-SoC memory pools, but we'll see.
That said, I think the path they might go down is reviving Xgrid to enable clustering - and quite possibly macOS blade servers that let you mix SoC compute modules with pure storage modules.
One of the engineers at JetBrains built a similar system out of Raspberry Pi units; each blade plugs into an M.2 slot, and they're powered over their Ethernet connections.