Well... as an old dog software developer, I'll say that the only reason for the hardware to exist is so that the software can run.

We all agree that the M1 chip is amazing. But, objectively speaking, the 7th gen i5 on my work PC should be plenty good enough to run Office and one bespoke piece of software which is, ultimately, a front end to a SQL database. Office and that software run like dogs.
With the M1 giving all this processing headroom, will Apple (with macOS) and other developers just get lazy, so that in a few years we end up exactly where we were a year ago: tremendously powerful machines that are just a bit laggy and sluggish?
It has been DECADES since software was well designed and tightly written. As hardware became faster and cheaper, there was less justification for taking the extra time needed to improve design and coding. It is easier (and therefore cheaper) to simply cobble together code that works and get it out the door.
Standard frameworks allowed developers to produce software "above their weight," at the expense of performance.
and... err... excuse me a moment... "Hey, you kids! Get off my keypunch machine!"