You hit on it. There is no incentive. Bad code is irrelevant when a modern CPU and GPU can chew right through it and when bandwidth isn't limited. As to storage, yeah that too.
Writing good code stopped being a thing a long time ago. Maybe it's still taught in some places, but all anyone paying you to write code cares about is that your code does what they want it to do.
This has been a long-term problem and it has manifested itself in many areas. I remember an article from the Pentium I era in which a journalist criticised a decline in the efficiency of games programming. They noted that when programmers were writing for machines with "static" hardware: non-upgradeable CPUs and RAM, they focused on optimising their code and routines to coax the best performance out of the hardware. By the mid-90s, however, the journalist had witnessed a shift towards simply raising the system requirements instead. Even then, refining, tweaking and finessing code until it was optimal was already becoming a lost art.
From my own experiences during the 80s/90s: in the manual for one of my arcade-adventure games, the programmers proudly explained that they had used every last byte of the computer's 32K RAM to deliver an outstanding game - and they certainly did. I recall the preview for another game from that period in which the lead programmer said it wasn't ready for release because improvements were needed to speed up the gameplay. Today, the solution would quite likely be to insist on a faster CPU.
Is the G4 dead?
To Apple: yes.
To me: absolutely not. The Sawtooth G4 was my first Mac. I had it delivered to my workplace, and my colleagues excitedly gathered round as it was unboxed and plugged in.
Beyond the sentimental value, and allowing for the obvious technical constraints, it still holds up well in 2019 for gaming, productivity and creativity.