I do video/motion picture work (editing and colour grading) with my machines, so my hardware needs are generally a fair bit higher than the average Joe's.
When I eventually replaced my Mac Pro 1,1 in 2013 (it was still ticking away perfectly after a solid seven years of service, but it just didn't have the processing power to keep up with modern video footage), I went with a maxed-out late 2012 iMac (because the Trashcan Mac Pro was not fit for purpose).
That's the only iMac I've owned (so not much of a sample size), but with the consumer-grade iMac I experienced exactly the same thing I'd had with the consumer-grade PCs I'd built myself before switching to Apple in 2006: it gave up the ghost only three-and-a-bit years after purchase, simply unable to handle the workload of regularly crunching through high-res video work.
I've since moved backwards to heavily outfitted (and now hacked) 2010 Mac Pro 5,1s, because Apple's ProRes codec was too fundamental to my work to move back to a PC.
And although they were ancient in 2016 when I first made the move, and practically prehistoric when I upgraded to a revised 5,1 configuration back in December (after financing fell through on the eye-watering $17,000 AUD I was looking at for a basic Mac Pro 7,1 config), the advantage of the old Xeons is that they just don't die. So if you're going to be throwing heavy-lifting duties at your machine, that's really where the painfully high costs of the server-class Xeon systems show their worth.
For less demanding workloads it's obviously not a big deal, and plenty of people get long usable lives out of their consumer-grade machines. But with the workloads I run, that certainly hasn't proved to be the case in my experience.
I feel like we've entered an interesting space now, as video codecs and raw formats have somewhat stabilised in the last couple of years. And a machine like a maxed-out (10-core, 5700 XT) 2020 iMac can actually handle those files pretty well (with some help from a powerful eGPU or two).
It has me wondering whether (assuming the recording formats hold roughly where they are for the next few years) a high-end consumer-grade machine might (this time) last a fair bit longer than its predecessors did, since the workload demanded of it won't be increasing as steeply.