That is the biggest issue: you can get by for years without needing a new CPU, and you can add an external drive for more disk space, but you can never add RAM. This is pertinent because so many lauded the first batch of M1 devices with "you will never need more than 8GB of RAM", which quickly turned into "you really need 16GB of RAM".
I honestly think we're close to the point now where end user RAM capacity requirements are going to remain reasonably static for some time for "most people".
So long as streaming throughput from storage into RAM is fast enough, most end users (consumers, not professionals) won't need much more than 16-32GB for a while.
I mean, even 8 GB has been fine (for most consumers) for the last decade or so. The step to 16 GB will be fine for most people for another decade, absent any revolutionary tech that isn't yet on the horizon. For the coming wave of "mass market" AR/VR it is more than fine.
There will of course be niche users who need more (for my day job, I am one of those, due to network device simulation, etc.), but for most consumers, anything in their workflow that needs more than that will likely be offloaded to a server in future.
But again, as network bandwidth improves, the big growth in memory capacity for most people will be on the server side.
We're gradually coming full circle, where a lot of the back-end work is handled by a server (the cloud) and the end device is just display and input. Back to the dumb terminal days, albeit with MUCH, MUCH prettier dumb terminals (AR/VR). Same concept as back in the 70s, just with a much nicer UI.