I decided to log back in after several years because I saw this question come up. A lot of people are going about this the wrong way. Here's a different perspective:
Why does Apple act like Intel is always holding them back from releasing current machines every year? Does Intel not release new chips every single year?
Currently Apple has chips in their MacBooks, iMacs, and Mac Pros that are not available to other vendors. These chips are binned from Intel's main pool and reserved for Apple, because Apple wants the CPUs with the best power consumption and the lowest leakage. Binning takes time to accumulate the stock a customer requires, so Apple has to wait at least two to three months for initial inventory to build up before they can use the chips in their Macs.
In the meantime, they have to design their hardware and iterate on their prototypes to make sure they're ready for production. This is where the bottleneck currently is. Had they been second-sourcing AMD at this point, they wouldn't have been put in this position.
As a reminder, developing new notebooks takes an average of two years from prototyping to production, sometimes three if you need new tooling. If you need new factory tooling, you're going to reuse that tooling for 5 to 10 years to make up for the cost. You have existing relationships with other manufacturers that can't be affected by slowdowns at the bottleneck.
Why couldn’t Apple have updated it every single year with 9th gen. and now 10th gen. to keep it current?
Every vendor has been having issues getting their allocations out of Intel. Vendors are currently in a sort of bidding war to get product on time because 14nm and 10nm production were under strain. Intel even restarted their 22nm production and backported chipsets to it to make up the volume needed for chipset orders (the 300 and 400 series chipsets were 14nm).
On its own, this isn't a big problem. But larger orders were taking up allocations ahead of smaller ones. You had Dell and Lenovo and Pegatron sucking the channel dry to keep their sales channels stocked, while smaller vendors (in terms of order volume) struggled for their allocations.
So Apple has to wait more than the allotted time for their chips, and they get fewer chips overall because they also have to order chipsets.
But wait. Meltdown and Spectre have entered the fray. Not only do you now have chips that need to have features turned off to guarantee that your attack surfaces are smaller, you're also getting into situations where extensions you may have relied on for performance reasons are disabled. Intel just disabled TSX again, as a matter of fact.
If Apple wanted silicon with hardware Spectre and Meltdown mitigations, they would have had to wait even longer. This pushed them to rely on their T2 chip more and more.
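As an aside, you can see which of these mitigations your own machine is running with: recent Linux kernels report per-vulnerability status through sysfs. A minimal sketch (the sysfs path is the standard kernel interface; the function name is my own):

```python
import glob
import os

def read_vulnerabilities(base="/sys/devices/system/cpu/vulnerabilities"):
    """Return {vulnerability_name: kernel-reported status} from sysfs.

    Returns an empty dict on non-Linux systems or kernels predating
    the vulnerabilities interface.
    """
    report = {}
    for path in sorted(glob.glob(os.path.join(base, "*"))):
        try:
            with open(path) as f:
                report[os.path.basename(path)] = f.read().strip()
        except OSError:
            pass  # file vanished or unreadable; skip it
    return report

if __name__ == "__main__":
    vulns = read_vulnerabilities()
    if not vulns:
        print("No vulnerability report found (non-Linux or old kernel).")
    for name, status in vulns.items():
        print(f"{name}: {status}")
```

On an affected Intel machine you'll typically see entries like `spectre_v2: Mitigation: ...` or `tsx_async_abort`, which is exactly the class of issue that forced Intel to disable TSX via microcode.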
I could be totally wrong, but it sure looks like Apple is just lazy to me. I mean if I built a PC tomorrow it would have 10th gen. Intel silicon in it. Then next year I would update it again to 11th.
Here's the next hurdle: 10nm production and the move to 10th Gen silicon. On the desktop, the socket changes from LGA1151 to LGA1200, and mobile gains compatibility with LPDDR4X memory. As 10th Gen arrives, chips and chipsets are limited by production availability at the 10nm and 14nm fabs. Meanwhile, LPDDR4X production is also low.
In fact, RAM production in general is lower, which compounds the existing issues. The mobile phone market isn't affected by anything going on at Intel, so its orders and purchasing power remain the same; there's just less left for everyone else.
So 10th Gen requires a board redesign, and there are new power requirements and a different boosting algorithm to work with. Seeing as Apple is already behind schedule thanks to Intel's production issues, they're not likely to adopt 10th Gen immediately. They work out the kinks, get the boards ready, and lower their orders from other vendors so they're not left with overstock.
As a result, there's a cascading failure in the supply chain of third parties who relied on Apple's orders to make up the bulk of their sales. As Charlie from S|A pointed out in 2015, Intel's failures with 10nm will affect every business related to theirs, or their partners, in the supply chain.
It sure does feel like Apple plays the victim a bit and throws shade when it seems like they could get off their hind end and update their Intel machines with what Intel releases each year.
This isn't unique to Apple. Every other vendor threw shade at Intel over the last two years for their inability to meet demand, thanks to their fumbling of 10nm and the lower output of 10nm+ production. 10nm++ is also on the horizon with Tiger Lake. The difference is that those other vendors could, and did, pick up AMD silicon to make up for Intel's shortfalls. A number of vendors have even given AMD more space in their product lines now that they've seen how good the silicon is and how well it sold.
Apple couldn't do that. They had designs ready for Intel silicon that was expected to hit the market two years in the future. They didn't have AMD prototypes that they could eventually turn into a new product to compensate for that loss.
This is not a debate about the benefits of Apple using its own silicon, but rather simply a question of why they have never kept current with what Intel has available.
Here's the difference between Apple Silicon and Intel: Apple can buy guaranteed production from TSMC. They order wafers, TSMC delivers.
Not only can they guarantee that production, they can design a modular arch that scales up and down their product stack, and they can use their fab allocation to figure out how much goes to mobile chips, how much to Mac and Desktop, and so on.
Yearly iterations from Intel would have been possible if they hadn't lost the plot. Apple's mistake was not second-sourcing x86 chips from AMD, but since they're all in on their own chips now, this isn't going to be a problem anymore.