I agree, and from what rumors I've seen, I'm guessing M2 Extreme yields were so low that it didn't make financial sense. If history is an indicator, the M2 Extreme was going to be even larger than the M1 Ultra - if that is the case, then each wafer produces fewer M2 dies, driving the unit costs up.
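The die-size part of that is just geometry. A rough back-of-envelope for gross die candidates per 300mm wafer (the die areas below are my guesses, not known figures):

```python
import math

def gross_dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Common gross-die estimate: wafer area / die area, minus an
    edge-loss term for partial dies around the circumference."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

# Illustrative die areas only -- not official Apple numbers.
print(gross_dies_per_wafer(450))   # ~Max-class die          -> ~125 candidates
print(gross_dies_per_wafer(700))   # hypothetical bigger die -> ~75 candidates
```

Fewer candidates per wafer is the "unit cost goes up" part, before yield even enters into it.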
There weren't many details about yields, though. According to the rumors, Apple was composing the bigger SoCs out of "M_ Max"-sized dies - two for the Ultra, four for the Extreme. The component die yields of the "Max", "Ultra", and "Extreme" would all be the same; it would be the same die being used in three different ways. (I don't believe it would be the laptop "Max" die - it would need to be a different die in a similar size range. i.e., it wouldn't magically shrink to M1 Pro or plain M1 size; it would still be > 300mm^2.) The Ultra is still coming, and there is a pretty decent chance it will use the same die it would have shared with an Extreme. It would not have made much sense to construct them from completely different dies.
If either one was going to provision multiple high-bandwidth PCIe slots, the basic building block couldn't be the laptop "Max"-class die. (Nor does it make much wafer-utilization sense to print millions of UltraFusion connectors, on an increasingly expensive fab process, that are never going to be used. Once there are multiple desktops using chiplets, they can fork that design off from the laptop "Max" die.)
From the descriptions, it was more likely a wafer shortage. If Apple can only get XXXX wafers and needs to make M2 Pro and Max for laptops, plus Max and Ultra for Studios... then they would have to "rob Peter to pay Paul" to also churn out Extreme SoCs. It wasn't that the Extreme dies were horrible and the others were insanely great; Apple had constraints on all of them. Does Apple make four $3,299 laptops ($13,196) or one $12,999 MP Extreme? The first makes them more money. It gets worse if we're talking about fourteen $1,199 phones ($16,786) versus one $12,999 Mac Pro.
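A rough illustration of that trade-off in retail dollars returned per mm^2 of scarce leading-edge silicon (list prices, but the die areas and the four-chiplet assumption are my guesses):

```python
# Hypothetical revenue-per-silicon-area comparison. Die areas are guesses.
products = {
    #                        (retail revenue $, rough SoC silicon in mm^2)
    "14 iPhones":            (14 * 1199, 14 * 110),
    "4 MacBook Pro (Max)":   (4 * 3299,  4 * 500),
    "1 Mac Pro 'Extreme'":   (12999,     4 * 500),   # four Max-sized chiplets
}

for name, (revenue, area) in products.items():
    print(f"{name:22s} ${revenue:>6,} over {area:4d} mm^2 "
          f"-> ${revenue / area:5.2f} per mm^2")
```

Roughly the same pile of silicon, but the phones and laptops hand back as much or more per wafer - that is the "rob Peter to pay Paul" problem.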
It has little to do with the performance.
The primary way the M2 Extreme package yields would get substantially lower while using the same 'chiplets' as the M2 desktop Max or Ultra is if the 3D packaging technology was ruining known-good dies during assembly. There are likely some small losses, but probably not huge ones. (The Ultra already uses that 3D packaging technology, and there are no big rumors about tons of good M1 Max dies being tossed in the trash can.)
Or if Apple was using an even bigger 'chiplet' to construct an Extreme (try to mash the "Ultra" into a monolithic die and it balloons up into the > 500mm^2 zone). The M1 Max is already too big and chunky as a chiplet. Going bigger would just give worse 'chiplet' characteristics (although it could get better perf/watt).
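To put some (made-up) numbers on why a bigger building block hurts: with the simple Poisson defect model, die yield falls off exponentially with area, and a chiplet flow lets you test each die before assembly, so one defect only scraps one small die instead of the whole part. The defect density here is purely illustrative, not a real N3 figure:

```python
import math

def poisson_yield(die_area_mm2, defects_per_cm2):
    """Share of dies with zero random defects: Y = exp(-A * D0)."""
    return math.exp(-(die_area_mm2 / 100.0) * defects_per_cm2)

D0 = 0.10                             # defects/cm^2 -- made-up value
chiplet    = poisson_yield(450, D0)   # ~Max-class chiplet (area is a guess)
monolithic = poisson_yield(900, D0)   # two of them mashed into one die

print(f"~450 mm^2 chiplet:        {chiplet:.0%}")      # ~64%
print(f"~900 mm^2 monolithic die: {monolithic:.0%}")   # ~41%
```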
If Apple is effectively the only volume customer on N3B (N3), then there is probably a maximum number of wafers that TSMC will want to allocate to N3B. If all the other customers are off waiting on N3E, which gets allocated from a different (and bigger) wafer pool, then Apple has to juggle a cap.
That juggling gets even more protracted if the A17 is on N3B also. (Flipping the A17 to N3E may or may not have happened; reportedly that switch requires a non-trivial redesign.)
The volume sales problem with the > $11K Mac Pro isn't the chips inside; it is the price. Fewer folks have that much money to spend - at $20K fewer still, at $30K even fewer. It isn't the 'unit' cost of the M2 Extreme. If the M2 Extreme costs $1,000 to make and Apple charges the customer $4K, Apple is clearly making money. And if the Mac Pro costs $12K... again, the customer is clearly covering the $1K unit cost. The more pressing issue is the number of people paying, versus the number of other people who want to buy different, also very profitable Apple stuff. If there are 10x as many of them, the balance can get tipped in their favor.
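Toy numbers for that "balance can get tipped" point (all figures invented for illustration, not estimates of Apple's actual margins or volumes):

```python
# One fat-margin niche product vs. 10x as many buyers of a thinner-margin one.
mac_pro_buyers, mac_pro_margin = 1_000, 3_000    # invented numbers
laptop_buyers, laptop_margin   = 10_000, 1_500   # "10x as many of them"

print(f"Mac Pro pool: ${mac_pro_buyers * mac_pro_margin:,}")   # $3,000,000
print(f"Laptop pool : ${laptop_buyers * laptop_margin:,}")     # $15,000,000
```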
Apple and TSMC making choices based on production constraints means there are wafer limitations - not that the defect density is very high (or yields extremely bad).
I think there is a set of folks who don't want to pay $3-4K for an Apple GPU and who spun that production limitation into an "Extreme is bad" story, as a backdoor way to push Apple into doing 3rd-party display GPU drivers. Good luck with that (I don't think that is going to work).
Pretty good chance this is more a case of Apple having 'bet the farm' on N3 and it not paying off as well as they hoped. It's not going to be completely bad, but they are going to have to take some lumps on the upper niche of the Mac Pro lineup.