Some do, but your argument seems not to:
Of course that wasn't my actual point. My point was that nobody can reliably predict what will be needed in the future, and everyone is especially bad at predicting what other people will need. One person saying they know 8GB will be fine and another saying they know it won't are both just guessing.
If I buy a box with a certain configuration today, I should be able to do with it in 5 years exactly what I can do with it today. If I have no intention of using it for anything else, there's no reason to spend more on it than that. If I want to plan for features that don't yet exist, or for data and files that haven't yet been created, I can try to run a curve through past trends to predict some needs, but that's an unreliable predictor at best and it completely ignores other disruptive changes that may matter more. I can guess that I might need more RAM to run some future ML model, for example, but when the time comes I may find that I need both more RAM and an updated Neural Engine. That is to say, more RAM may be necessary but not sufficient to get the "optimal" computing you're seeking, in which case my extra spend on memory would have been wasted compared with saving that outlay today and putting it toward a future purchase.
This is especially true in the Apple ecosystem, where Apple ties features to product generations and rarely differentiates on spec requirements within a generation.
For some people, myself included, there's a bit of an "imagine all I could do with some extra resources" angle to a purchase, because we get excited by raw horsepower the same way people get excited reading the spec sheet of a supercar. For other people, and I'd venture most, the machine they had worked OK, and now for whatever reason they need another one that works about as well.