What actually happens is that software engineers must meet tight deadlines and have to spend most of their time on flashy new features instead of optimising the performance and memory footprint of new and old features. That's how we get "bloated" software that needs many GHz and many GB of RAM to do what much slower computers with much less RAM could do a couple of decades ago. Add to that the bloated craziness of the modern internet...
Also, what happens is that software features expand over time, and so do data sizes - both because user data gets bigger and because larger memory buffers are required.
edit: below is not aimed at the quoted poster above, who gets it - but at plenty of others whining about modern software "bloat".
You might be doing most of the same basic tasks on your machine as before, but graphics are now 4K HDR instead of 256-colour 480p, sound is 24-bit at 96 kHz and/or Dolby surround instead of 22 kHz, your internet connection is 100 megabit or faster instead of dialup and hence needs MUCH larger network buffers, your disk is much faster but optimised for large reads and writes and needs huge memory buffers, etc.
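To put rough numbers on that (back-of-envelope only - the assumed formats here are 640x480 at one palette byte per pixel for the old case, 3840x2160 at 4 bytes per pixel for packed 10-bit HDR, and 8-bit 22 kHz mono vs 24-bit 96 kHz stereo PCM for audio):

```c
#include <stdio.h>

/* Back-of-envelope only: real buffer sizes depend on pixel format,
 * alignment, and how many frames the OS/driver keeps in flight. */
int main(void) {
    /* 256-colour "480p": 640x480, 1 byte (palette index) per pixel */
    long old_frame = 640L * 480L * 1L;
    /* 4K HDR: 3840x2160, assume 4 bytes per pixel (packed 10-bit RGB) */
    long new_frame = 3840L * 2160L * 4L;

    /* One second of raw PCM audio */
    long old_audio = 22050L * 1L * 1L;   /* 22 kHz, 8-bit, mono    */
    long new_audio = 96000L * 3L * 2L;   /* 96 kHz, 24-bit, stereo */

    printf("frame: %ld -> %ld bytes (~%ldx)\n",
           old_frame, new_frame, new_frame / old_frame);
    printf("audio: %ld -> %ld bytes/sec (~%ldx)\n",
           old_audio, new_audio, new_audio / old_audio);
    return 0;
}
```

That's roughly 100x more data per raw frame and about 25x more per second of audio, before you even get to codecs, compositing, or keeping several frames buffered.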
Video codecs are now far more efficient in terms of compression ratio, but far heavier on CPU and memory; ditto for audio codecs, etc.
If you were to re-write a modern OS in a low-level language and highly optimise it, many things would happen:
- a lot of the OS features would simply not be feasible to implement at all. They'd never make it to the real world.
- the OS would cost a heap more due to the far higher development effort required
- many applications would have fewer features or maybe not even exist, as the barrier to entry for writing them would be too high. They'd also cost more.
- there would be a lot more security problems, because memory management at a low level is hard and people suck at it. A lot of what people consider "bloat" is a side effect of abstracting the complexity away to make these problems manageable (see the sketch after this list). Managing it is a case of throwing a little more cheap resources (CPU/RAM) at the problem.
- if you wrote direct to hardware in low-level code instead of using OS-provided high-level libraries, your apps would simply be unable to take advantage of new hardware as it is released. Use the Apple-provided libraries like Metal and you instantly get the benefit of improved hardware features.
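On the memory management point, here's a minimal, contrived C sketch of the classic mistake (use-after-free) that managed runtimes spend those "bloated" CPU cycles and bytes of RAM preventing:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Contrived use-after-free: compiles cleanly, often "works" in testing,
 * and is exactly the kind of bug that turns into a security hole.
 * Garbage-collected / reference-counted runtimes prevent it by tracking
 * ownership for you - at the cost of extra CPU and RAM. */
int main(void) {
    char *name = malloc(32);
    if (!name) return 1;
    strcpy(name, "alice");

    free(name);            /* ownership given up here...             */
    printf("%s\n", name);  /* ...but still read: undefined behaviour */
    return 0;
}
```

In a managed language that second use simply can't happen, because the runtime won't release memory that's still reachable; in low-level code it's on every programmer, every time.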
It's all well and good to whine about inefficient code and bloat (which, honestly, isn't the big problem most people think it is), but you need to understand it's the only way the modern internet could ever exist. A modern browser, for example, is a massive, massive code-base and there's simply no way it would ever have come into being if written in low-level code and highly optimised.
Programmers and programmer time are expensive and a rare resource. CPU and memory are cheap and get exponentially cheaper over time. This is what we were taught in computer science 30+ years ago and it's still the exact same trend today. Hardware catches up. Build the new thing that wasn't possible before; don't focus on optimising the software equivalent of the wheel, which already exists and works well enough on a reasonable machine.
It mostly comes down to:
- Stop being cheap and buy a machine capable of running modern software, if that's the software you want to run. The requirements are what they are - if you think you can do better, try it yourself. It's hard.
- If you want to run "optimised" (in reality: low feature set and limited capability) old-school software from 1995 or whatever - do so and live within the feature set that software has. Because you sure as hell aren't getting the modern feature set for free! Enjoy your 240p crap RealVideo internet!
Seriously, the glasses really are rose-tinted. Go back and actually try to live for a week with a machine from 1995 or whatever. Almost nothing you want to do with the thing today will work, especially if it involves the internet.