Don't forget that Intel offered a Pentium D with two "HotBurst" cores.
To fill in the history lesson a bit more, in the late 1990s, AMD started to have a seriously competitive Athlon line. Back then, processors were not typically advertised with model numbers but rather with clock rates. Intel wanted to respond and out-MHz/GHz them. They famously tried to make a 1.13GHz Coppermine PIII that was effectively overclocked; it ended up being discontinued/recalled because it was shown to be unstable by the enthusiast reviewers (I forget if it was Tom's Hardware, AnandTech, etc, but one or two of those guys were responsible for exposing this embarrassing blunder). So their next architecture, the NetBurst architecture in the Pentium 4, was mandated by the marketing department to deliver high clock rates that would clobber the Athlons. They did that at the cost of less performance per clock, so the first Willamette P4s at 1.4-1.6GHz were barely faster than 1GHz PIIIs.
Now, back in those days, Intel had an advantage that they have now lost, namely the best transistors. So... maybe a year after Willamette, they moved to smaller transistors with the Northwood core, and it was... pretty good. Fairly hot, but... not too bad. Some 'desktop replacement' laptops even came with desktop P4 chips.
But AMD was not being defeated, so the marketing department said they needed more GHz. The engineers responded with the Prescott core, which, again, was supposed to deliver higher clock rates at the cost of performance-per-clock. Despite smaller transistors (90nm), heat output went way up, performance was not that much better, and the fastest Prescott shipped at 3.8GHz, only 400MHz more than the fastest Northwood.
Now, two things were happening at this time:
1) Increased heat output was becoming a challenge, so Intel came out with the "BTX" case standard that moved things around and made it easier to output the heat from a P4.
2) Mobile was growing, and the desktop P4s stuck in laptops were clearly not a viable option. So Intel went back to the PIII core (P6 microarchitecture), updated it, shrunk it to a smaller process, etc, and launched the 'Pentium M'. Lower clock rates, higher performance per clock. And they had a couple generations of those.
Unfortunately, AMD remained undefeated and, instead, successfully moved the marketing conversation away from clock speeds. The "popular" AMD CPU of that time became the Athlon X2 3800+ (and its predecessor the Athlon 64 3xxx+), which... was not clocked at 3.8GHz.
Intel marketing, though, wanted to crush AMD with GHz again, so they had the engineering department work on another iteration of the NetBurst microarchitecture with even lower performance per clock and even higher clock rates. That was the Tejas core, which ran substantially hotter than Northwood and Prescott... and was cancelled. That was the end of marketing driving architectural design.
As a short-term solution, Intel launched a dual-core version of the Prescott, called the Pentium D, then moved those chips to a 65nm process, but that was the end of the line for the "HotBurst" architecture. Around the same time, AMD launched popular dual-core chips, the Athlon X2. The first dual-core version of the Pentium M shipped as the short-lived, 32-bit-only Yonah "Core Duo".
Longer term, they decided to take the Pentium M and scale that architecture back up to desktop chips. The result was Conroe, which shipped in summer 2006 under the "Core 2 Duo" name. Probably the greatest leap forward in x86 CPUs ever. AnandTech, for example, described it as "the most impressive piece of silicon the world has ever seen - and the fastest desktop processor we've ever tested." (https://www.anandtech.com/show/2045) The most popular model, the E6600, was clocked at 2.4GHz... and utterly annihilated the Pentium 4s in every benchmark despite a 1.4GHz lower clock. (My recollection is that that architecture was only offered as a dual-core, and later quad-core, part, though I might be forgetting a few weird low-end chips sold as Celerons or something.)
And, rather ironically, the 2.4GHz E6600 Conroe accomplished what 5-6 years of high-GHz NetBursts had not - it shoved AMD to the side, a position from which they would not recover until the launch of the Zen microarchitecture over a decade later. The quad-core version, the Q6600, would become the de facto enthusiast CPU standard of the late 2000s, replacing the Athlon X2 3800+.
(The Xeon version of the Conroe core was used in the first Mac Pros, I might add... and the mobile version landed in MacBooks and iMacs. And when Steve Jobs announced the Intel switch and talked about Intel's performance-per-watt roadmap, this is the chip he had in mind, not the Pentium 4 in the developer transition kit.)
Around that time, without hot P4s to cool anymore, Intel abandoned the BTX motherboard/case standard, which had only been used by a few machines from large OEMs like Dell, and went back to ATX, which continues to this day.
Also interestingly - to this day, 17 years later, I would argue that the Conroe C2Ds are still the 'base' processor for Windowsland. Microsoft is killing them with Windows 11 and its steep hardware requirements, but I believe that a Conroe with enough RAM can still run Windows 10, productivity applications, etc perfectly adequately. (Disclaimer - I sold my E6600 over a decade ago. I still have some 45nm C2D machines in the closet, but have not booted them up in 5+ years. So maybe web-technologies garbage like Electron has finally slain the mighty Conroe.)
When they moved to the 45nm process a year or two later, they added various power management features that massively cut idle power consumption. (This is why I got rid of my E6600 - a 45nm C2D would idle at 30-40W less.) And then they invented turbo boost, i.e. the opposite - something that, if the chip was cool enough, would allow it to consume more power and overclock itself for a short period of time.
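To make the turbo idea concrete, here's a rough conceptual sketch in Python of the "boost when there's headroom" behavior described above. All the numbers and names are made up for illustration - this is not Intel's actual algorithm, just the general shape of it:

    # Conceptual sketch of opportunistic turbo: run above the base clock
    # only while there is thermal and power headroom. Illustrative values,
    # not real Intel parameters.
    BASE_CLOCK_GHZ = 2.4     # guaranteed sustained frequency
    MAX_TURBO_GHZ = 3.0      # short-term opportunistic frequency
    TDP_WATTS = 65           # long-term power budget
    TEMP_LIMIT_C = 90        # thermal limit

    def pick_frequency(temp_c: float, recent_avg_power_w: float) -> float:
        """Pick a clock for the next time slice based on available headroom."""
        if temp_c >= TEMP_LIMIT_C:
            # No thermal headroom: stay at (or fall below) the base clock.
            return BASE_CLOCK_GHZ
        if recent_avg_power_w < TDP_WATTS:
            # Cool and under the power budget: borrow the headroom and boost.
            return MAX_TURBO_GHZ
        # At the power budget: hold the guaranteed base clock.
        return BASE_CLOCK_GHZ

    if __name__ == "__main__":
        print(pick_frequency(temp_c=55.0, recent_avg_power_w=40.0))  # 3.0 (boosts)
        print(pick_frequency(temp_c=92.0, recent_avg_power_w=40.0))  # 2.4 (too hot)
        print(pick_frequency(temp_c=70.0, recent_avg_power_w=65.0))  # 2.4 (at budget)

The point is simply that boost is opportunistic and time-limited: once the accumulated heat or average power catches up with the limits, the chip falls back to its base clock.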
Anyways, during all of this time, Intel always had the best transistors. The best transistors and a meh architecture created passable products like the Pentium 4s. The best transistors and a great architecture created legends like Conroe, Sandy Bridge, etc. that pushed both performance and performance-per-watt substantially forward and left AMD without a response for years. (Many Sandy Bridges are still in active use over a decade later...) And the best transistors kept the x86 juggernaut going/growing in the face of otherwise-technically-superior ISAs from others (e.g. PowerPC).
But then two things happened:
1) Intel had major, major league trouble getting their 10nm process running. Instead of a new process every two years (which powered the so-called tick-tock model) as we had seen since, well, forever, they ended up stuck at 14nm for a long time. 14nm desktop chips came out in 2015 (Skylake); 10nm ("Intel 7") desktop chips came out in late 2021 (Alder Lake), so six years instead of two.
2) The PC industry stopped being the driving force of innovation in semiconductors, replaced by smartphones. And so, with the money being in smartphones, the foundries that produce smartphone chips (Samsung, TSMC) became the leaders.
AMD, wisely, saw #2 coming, got rid of their fabs, and rearranged their business to have TSMC make their chips.
The bottom line is that the best transistors are not from Intel anymore - they are from TSMC. And that is how you end up with today's world:
1) Intel's stagnation at 14nm (and the increases in clock rates/core counts/power consumption used to mask that stagnation, see #3) was undoubtedly one of the factors that drove Apple to scale up their TSMC-made ARM chips for Mac use - my guess is that if Intel had continued innovating at the rate of the 2006-2014 era, Macs would still be Intel,
2) TSMC transistors have powered AMD's resurgence, as AMD is now offering the x86 chips with the best transistors (which led AMD to be the first to offer Windows enthusiasts a worthwhile upgrade to their ~2011-2015-era chips), and
3) Intel's only short-term response has been to jack up the clock rates, particularly by relying on turbo boost. And as the clock rates and boost rates have gone up, power consumption has gone back up, with modern CPUs, especially the enthusiast K versions, hitting substantially higher TDPs than the peaks of the NetBurst era.