In years to come all this will be irrelevant, because computers as we currently know them, where the heavy lifting is carried out locally and capped by the specification of your machine, won't exist. Computers will just be windows (pardon the pun) onto a virtual machine that you access over terabit internet, OS and all. Processing power, storage capacity and RAM will be things you subscribe to on monthly or yearly plans; the machine itself will only need the bare minimum of local processing power required to shuttle huge amounts of data to and from the internet. Playing AAA games will require no more local processing power than your TV currently needs to stream and display a 4K film. Sounds far-fetched? Not really. Try telling someone in 1993 that by 2023 features in your new car would be enabled or disabled remotely over the internet by the manufacturer, depending on whether you subscribed to optional services.
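To get a feel for why the gaming claim is plausible, here's a quick back-of-envelope sketch in Python. The bitrates are ballpark assumptions on my part, not measured figures, but they're in the right neighbourhood: a 4K cloud-gaming stream needs bandwidth in the same league as a 4K film, and both are a rounding error on a terabit link.

# Back-of-envelope: streaming bandwidth vs. a terabit link.
# All bitrates below are rough assumptions, not measurements.
TERABIT_MBPS = 1_000_000  # 1 Tbit/s expressed in Mbit/s

streams_mbps = {
    "4K film stream": 25,             # assumed typical 4K video bitrate
    "4K/60 cloud-gaming stream": 45,  # assumed typical game-streaming bitrate
}

for name, mbps in streams_mbps.items():
    share = mbps / TERABIT_MBPS * 100
    print(f"{name}: ~{mbps} Mbit/s = {share:.4f}% of a terabit link")

On those assumed numbers, even the game stream consumes under 0.005% of the pipe.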
That is one possible future. It depends on how small and how cheap computing devices get. An alternative is that twenty hardware generations from now, computers will be 20x as fast, fit into a lapel pin and cost $10, so that compute power is everywhere rather than being piped across the net.
In that kind of future, low-power architectures will have a key advantage, and interfaces will likely have been reinvented along the way. It all depends on which arrives first: pervasive, publicly available fiber-optic cabling for fast internet, or cheap, tiny compute power.
Certainly, bigger jobs will be pushed into the cloud more and more. If you had asked a desktop publishing professional thirty years ago whether he'd ever trust Adobe to store his data on the internet, you'd probably have gotten 'zero chance' back, but look what happened. Software makers are not stupid; they know that to extract maximum value from their users they have to control the process.