Oh yes, I remember those days...
I remember the big RAM shortage of the mid-1980s (after a failed attempt at propping up the US RAM industry with tariffs on Japanese RAM). It took us months of shopping around (and about $500) to find 256K of RAM to boost that computer to 640K. We had previously boosted it to 384K by installing an expansion card that also added a DMA chip. Things were exponentially better by the time I upgraded to a 16MHz 386SX PC with 1MB of RAM and a 40MB HD a few years later. I think that machine cost about as much as that 8MHz 8088 PC with 256K.
Those were certainly the days. I was a DOS user back then (virtually skipping Windows until shortly before 95 came out). I got pretty good at speeding up those PCs with disk caching utilities, defragmenting hard drives, and using memory managers to shove DOS and drivers into the "high memory area" so we could squeeze the most out of the 640K we had.
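For anyone who never had the pleasure, the usual incantation in an MS-DOS 6-era CONFIG.SYS looked roughly like this (driver names and paths here are just the stock defaults, not anything from my actual setup):

    REM HIMEM.SYS provides extended memory and the high memory area
    DEVICE=C:\DOS\HIMEM.SYS
    REM EMM386 with NOEMS skips expanded memory and just opens up upper memory blocks
    DEVICE=C:\DOS\EMM386.EXE NOEMS
    REM Load DOS itself into the HMA and let it use UMBs
    DOS=HIGH,UMB
    REM DEVICEHIGH pushes drivers out of the precious 640K of conventional memory
    DEVICEHIGH=C:\DOS\ANSI.SYS

Then SMARTDrive (the disk cache) went into AUTOEXEC.BAT with LOADHIGH so it also stayed out of conventional memory:

    LH C:\DOS\SMARTDRV.EXE

Run MEM /C afterwards and admire all the free conventional memory.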
Macs had somewhat better memory management back then, but my first experience with the Mac was System 7.6 on a PowerBook 540c. It had a similar footprint to my MacBook Air, but was about 4 times as thick. With a 240MB HD, 4MB of RAM, and a color screen, it was pretty advanced for its day.
In all seriousness, those old computers handled word processing, spreadsheets, and even basic PowerPoint presentations reasonably well. True, those machines would choke on SD video (let alone HD video) and didn't have enough storage for even 1/3 of a CD, but that helps put into perspective why even "ancient" chips like the Core 2 Duo still hold their own for the tasks the average user throws at them. Hardware always advances more quickly than software. Looking back at the 25 years or so I've used computers, I've noticed that hardware advancement is pretty steady while software advancement comes in fits and starts. That means that, for a time, even old hardware manages to keep pace.
On the PC side, the adoption of the GUI and the breaking of the 640K barrier in the early 1990s sparked a massive increase in demand for computing power. Before then, hardware had gotten significantly ahead of software. The next big development was multimedia, and then the Internet. Shortly after, Steve Jobs returned to Apple, and since then he has largely dragged the rest of the industry along with him. The iPod and iTunes were the next major advances, IMO, as they ushered in a new era of media consumption. Since about 2007, most of the development has been on the mobile front. As far as PCs/Macs are concerned, we're still digesting the conversion to 64-bit (which has been as tortuous as the transition to 32-bit was on the PC side in the 1980s/1990s, though a bit quicker). I think we're still waiting for the "next big thing" to take us to the next level on the PC front (using PC in the broad sense). Maybe it's "Watson," or maybe it's still in a lab or an entrepreneur's mind at this point; in the meantime, the practical effect is that the CPU and even RAM haven't been the bottlenecks for a while.