I had to read this several times. I've read the whole thread and I don't agree that hardware being leaps and bounds ahead of software would cause software stagnation.
The first computer I used as a kid ran Windows 95 on a 200MB partition. Yes, I had no space to do anything, but the computer wasn't mine. As the Pentium processors advanced, software took advantage of the available hardware. Over the last decade, graphics cards have really taken off and software has advanced because of that (digital currency, large-scale compute projects, crazy good graphics, etc.).
While our CPU needs have somewhat plateaued, we've found that just increasing the GHz clock doesn't give us gains we actually notice, especially today. Instead, we increase throughput by running more processes in parallel. IMO, the operating system from Windows 95 to now is not all that different, so running a Pentium 4 vs. a modern chip today isn't going to show a lot of difference for most operations.
This is just my opinion, but hardware drives software. Better hardware results in software being written to take advantage of it. CPUs haven't grown much over the last decade, IMO. Apple's M-series CPUs are amazing, the biggest jump in computing power I've seen in my lifetime.
On my M1 Max, I can run Windows 11 ARM in Parallels compiling a Visual Studio project, run multi-GB-memory applications working against massive Oracle databases, seamlessly keep 2-3 browsers open with 20-30 tabs each plus dozens of other apps, and stream a Teams meeting while sharing my screen, and according to Activity Monitor I'm using 20% of my CPU. All this without an audible fan. My i7 MacBook would run its fans at full blast just booting Windows 11, never mind doing anything in Visual Studio or Chrome.
Software can now be written to take advantage of this power.
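To make that concrete, here's a minimal sketch (in Swift, with a made-up `crunch` workload, not anything from this thread) of what "taking advantage of this power" usually means in practice: the code has to be explicitly structured to spread work across the many cores these chips offer, because a faster clock alone won't do it for you.

```swift
import Foundation

// Stand-in for any CPU-bound job: hashing, encoding, compiling a file, etc.
// (Hypothetical workload, just to have something to parallelize.)
func crunch(_ seed: Int) -> Int {
    (0..<2_000_000).reduce(seed) { ($0 &+ $1) % 997 }
}

// Fan the jobs out across all available cores with a task group.
// On a single fast core this is no better than a plain loop; on an
// M-series chip it scales roughly with the number of cores available.
func crunchAll(_ seeds: [Int]) async -> [Int] {
    await withTaskGroup(of: Int.self) { group in
        for seed in seeds {
            group.addTask { crunch(seed) }
        }
        var results: [Int] = []
        for await value in group {
            results.append(value)
        }
        return results
    }
}

// Usage, from any async context:
//   let results = await crunchAll(Array(0..<16))
```

The point isn't the specific API, it's that the hardware gains only show up once the software is restructured to use them.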
In all my years of using computers, the power of the M1 continues to blow me away. Do I think this is stagnating software? No. Is it overpriced? No. (My opinion, of course.)
Right now Apple is most likely highly dependent on Apple Silicon, and they're probably putting as much research into it as they can, because it's a race for the SoC. It's one of those times where Apple has to prove once again that it can compete with the likes of its rivals.
Not only that: because of Apple Silicon, software has to change just as fast as the hardware, and the way they'll have to pull that off will be a massive undertaking.