It's not a 'cult of the SSD', it's just that not only do hard drives become slow after 6-7 years of use
Really??? How in the world does that work? Do they mechanically start spinning more slowly? Do they start reading only every other byte after a while? Is there some buffer somewhere that kicks in after 6-7 years, collecting the data being transferred and pausing before allowing it to continue?
I think the only way an HD can seem to "slow down" after a long period of use is if the data stored on it has become "fragmented"; i.e., each file has been broken into multiple small pieces scattered across different places on the platter. OSX contains features to minimize fragmentation, but the problem can still occur on a drive that has been used for a long time and is nearly filled to capacity.
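To see why fragmentation hurts an HD but not an SSD, here's a toy model (my own sketch, not anything OSX actually does; the seek/transfer times and block layouts are made-up illustrative numbers). An HD pays a mechanical seek penalty every time the head has to jump to a non-adjacent block, so a file in 100 scattered pieces costs ~100 seeks instead of 1; an SSD has no moving head, so every block costs the same regardless of layout.

```python
# Toy model of HDD fragmentation cost. All numbers are hypothetical,
# chosen only to illustrate the seek-penalty argument.

def seek_count(blocks):
    """Count head seeks needed to read a file stored at these block
    addresses: one initial seek, plus one more for every jump to a
    block that isn't physically adjacent to the previous one."""
    jumps = sum(1 for a, b in zip(blocks, blocks[1:]) if b != a + 1)
    return 1 + jumps

# A 100-block file stored contiguously vs. scattered across the disk.
contiguous = list(range(1000, 1100))       # one unbroken extent
fragmented = [i * 37 for i in range(100)]  # 100 widely scattered blocks

print(seek_count(contiguous))  # -> 1
print(seek_count(fragmented))  # -> 100
```

With a typical ~9 ms seek time, that's the difference between ~9 ms and ~900 ms of pure head movement for the same file, which is why a badly fragmented HD "feels" slow while an SSD is unaffected.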
There are several defragmentation utilities out there, but probably the easiest way to defragment an HD is to copy the contents to another drive, reformat the fragmented drive, and then copy the data back.
(Note! Don't do this for an SSD. Fragmentation doesn't cause problems for SSDs, and unnecessary extra writes only speed up the rate at which they wear out.)
but I also think that newer versions of OSX are optimised for flash storage (or fusion).
And here, too, I've only ever heard this claim on this forum board. I can find no mention of it on Apple's website, and see no signs of it on my own El Capitan machine. So far as I can see, SSDs provide their particular speed benefits to every machine they are installed in, regardless of OS version.
Could you point to the new features of OSX that, in its optimization for SSDs, have actually
degraded its performance on HDs? Thanks!