So what is your take on the current situation? Just about every day, while reading the major programming websites, I hear about a new idiom that seems to be "the next big thing", promising to reduce the time spent programming while keeping the code easy to debug.
Yet over the last 20 years computer hardware has grown exponentially in power, while software has not shown the corresponding increase in speed one would expect. This suggests one of two things: 1) software has become more complex (meaning it has more features), or 2) programming techniques have become so sloppy that we actually need the extra hardware power just to keep our software running at a reasonable speed.
Personally, I think it is a mixture of the two: software certainly does a lot more than it did 20 years ago, but that alone does not account for how little faster software has become relative to the underlying hardware.
I am interested in your opinions on this.