Um, no, you won't. A very large portion of the time, the memory may flip a bit in data that is irrelevant.
Well, yes, but irrelevant data is, er, what's the word... Oh yes,
irrelevant. I think I covered that:
If a bit flip matters, you'll know.
Plenty of memory locations that aren't in the middle of an uncompressed bitmap file or sound sample do matter, and if you have a significant problem with memory errors, sooner or later a pointer or a byte of code gets changed and there are consequences. Plus, this was in the context of people saying that ECC will warn you of a failing memory chip. I guess it would, and there are more advanced forms of ECC like 'chipkill' that can cope with an entire failed memory chip, but that's not really the #1 reason for wanting ECC. If you have a failing chip, you'll get clues. If your data is so mission-critical that you can't risk a single bit out of place, then you should have data integrity checks up the wazoo, because ECC only guards against one, fairly specific, class of error.
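To make "one fairly specific class of error" concrete, here's a toy sketch (my own illustration, not what any particular DIMM implements) of a plain Hamming(7,4) single-error-correcting code. One flipped bit is repaired transparently, but two flipped bits get silently *mis*-corrected; real ECC DIMMs add an extra parity bit (SECDED) so double errors are at least detected, and anything beyond that is out of scope entirely:

```python
# Toy Hamming(7,4) single-error-correcting code. Parity bits sit at
# 1-based positions 1, 2 and 4; data bits at 3, 5, 6 and 7.

def encode(d):
    """Encode 4 data bits as a 7-bit codeword."""
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def correct(c):
    """Return a copy with any single-bit error corrected."""
    # Each syndrome bit re-checks one parity group; together they
    # spell out the 1-based position of the flipped bit (0 = clean).
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3
    c = c[:]
    if pos:
        c[pos - 1] ^= 1
    return c

cw = encode([1, 0, 1, 1])

one_flip = cw[:]
one_flip[4] ^= 1
print(correct(one_flip) == cw)   # True: single flip repaired

two_flips = cw[:]
two_flips[2] ^= 1
two_flips[4] ^= 1
print(correct(two_flips) == cw)  # False: double flip "corrected" to the wrong word
```

The double-flip case is the point: the syndrome still points somewhere, the code confidently "fixes" the wrong bit, and nothing downstream knows.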
Like many features of data-centre grade equipment, ECC really only matters at scale, when million-to-one chances crop up nine times out of ten and it's worth spending money on even a partial mitigation. People can cope if their personal Mac Studio suffers a glitch every six months; if you're running a data centre with hundreds of systems, that level of error could take hours out of every week... but then, Apple hasn't even pretended to make data-centre grade equipment since about 2010, when they dropped the Xserve.
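Back-of-envelope version of that scale argument (the one-glitch-per-six-months rate is from above; the 300-machine fleet size is my own illustrative stand-in for "hundreds of systems"):

```python
# One glitch per machine per six months, spread across a fleet.
per_machine_interval_h = 6 * 30 * 24           # ~4320 hours between glitches per box
fleet_size = 300                               # assumed fleet size
fleet_interval_h = per_machine_interval_h / fleet_size
glitches_per_week = 7 * 24 / fleet_interval_h

print(round(fleet_interval_h, 1))   # 14.4 -> a glitch roughly every 14 hours
print(round(glitches_per_week, 1))  # 11.7 -> about a dozen incidents a week
```

A rate you'd shrug off on one desktop becomes a near-daily operational problem for a fleet, which is why the mitigation is worth paying for there and a much harder sell on a single workstation.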
I suspect that the idea that if it ain't ECC it ain't a workstation is largely thanks to Intel marketing (it's a corollary of "if it ain't Xeon it ain't a workstation", since Intel, for a long time, studiously avoided supporting ECC on their Core i series). After all, the one thing I don't see here is people saying "I had to throw away my iMac/Studio because it was plagued with memory errors".