I think it's possible that in 5-10 years, computers will look very different than they do today.
More and more of our data is being stored in the "cloud". Some of it literally, using services like iCloud, iTunes Match, Dropbox, and so on. Or the stuff we really care about lives in services like Google (Gmail, Calendar, etc.) and in social networks like Facebook and Twitter (photos, profile info, friends, collections of status updates). Or we pay to access data that's not really "ours", like Netflix.
We access this stuff using our PCs, sure, but more and more from our smartphones and tablets, from media devices like smart TVs, the Apple TV, and the Boxee Box, and from game consoles like the PS3 and Xbox.
As more "devices" become powerful and computer-like, the traditional multipurpose desktop computer (running a desktop operating system) is being used comparatively less.
At work, I really just use my Dell PC as an email and web browsing client, and as a terminal into a Linux server where I do all my real work. I see that everywhere. The bank teller is using her PC to remotely access the bank server. The cashier at the grocery store is using his PC as a terminal to the store database.
I could imagine that the computers of tomorrow will essentially be thin clients in a variety of shapes and sizes: interfaces to the real data, which is stored elsewhere.
I suspect Apple imagines this too, which is why they are slowly bringing iOS features to desktop computers. The latest rumor that they may be interested in making their own chips would seem to add fuel to the fire. "But you won't be able to run Windows! Or desktop apps!" is the main cry against it. "Exactly," I think Apple is thinking. "And you won't need to."
However, I don't see Microsoft ceasing to exist in such a world. They would simply take on more of a behind-the-scenes role. IBM no longer sells PCs or consumer software, but they still very much exist.