One of the benefits of integration is actually much improved durability and reduced failure rates for every subsystem.
Yeah, I’m aware of that. Eliminating connectors helps: nothing to shake loose, no debris collecting between conductors, no chemistry slowly changing where two metals meet, and so on. That’s robustness in the finished product. What about manufacturing?
Parts are so small that manufacturing tolerances are extremely tight. We’ve read of Apple rejecting parts manufacturers over low yields of defect-free parts, like high-PPI displays. What becomes of the rejected material? How much failure is tolerated at that level just so we can have these compact devices with no moving parts?
There is apparently enough routine failure in manufacturing that companies have decided to sell defective product rather than accept lower profits (or is it so bad that they’d make no profit at all?). They demand we accept dead pixels in displays as “normal”; screens with multiple dead pixels are deemed “non-defective”.
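To put rough numbers on the dead-pixel question, here’s a back-of-envelope yield sketch. The panel size and per-subpixel defect rate are my own guesses, purely for illustration, not anyone’s published data:

```python
import math

# Back-of-envelope yield sketch. All numbers are my assumptions.
pixels = 2880 * 1800       # a 15" Retina-class panel, for example
subpixels = pixels * 3     # R, G, B per pixel
defect_rate = 1e-7         # assumed probability any one subpixel is dead

# With independent defects, P(zero dead subpixels) = (1 - p)^N ≈ exp(-N*p).
p_flawless = math.exp(-subpixels * defect_rate)
print(f"{subpixels:,} subpixels -> P(flawless panel) ≈ {p_flawless:.0%}")
# ≈ 21%: a strict zero-dead-pixel policy would scrap roughly 4 of every
# 5 panels under these assumed numbers, which is presumably why a few
# dead pixels get defined as "non-defective".
```

The point isn’t the specific figures; it’s that with tens of millions of subpixels per panel, even a microscopic defect rate makes truly flawless panels a minority, and the policy follows from the math.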
What about flash storage? How many cells are defective on day one but invisible thanks to over-provisioning? How rapidly do cells fail in normal use? How much over-provisioning headroom is there before functionality is impacted? (MP3 files on my iPhone 4 have suddenly become corrupted, and that device is only 12 years old; is it the flash storage?) What counts as “normal” use? Does lots of VM paging affect this? By how much?
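For scale, here’s a crude sketch of the write-endurance side. Every number below is an assumption on my part, not a spec: lifetime before rated wear-out is roughly capacity × P/E cycles ÷ (daily writes × write amplification).

```python
# Crude wear-out estimate for soldered-in NAND. Every number here is
# assumed; real endurance ratings, wear leveling, and write
# amplification vary a lot by part and controller.
capacity_gb = 64            # an older-iPhone-class device
pe_cycles = 3000            # assumed program/erase rating of its NAND
write_amplification = 2.0   # assumed controller/filesystem overhead

total_writable_gb = capacity_gb * pe_cycles / write_amplification

for daily_gb in (1, 5, 20):  # 20 GB/day stands in for heavy VM paging
    years = total_writable_gb / (daily_gb * 365)
    print(f"{daily_gb:>3} GB/day -> ~{years:,.0f} years to rated wear-out")
```

The paging question drops straight out of the same arithmetic: every extra gigabyte written per day divides the remaining years proportionally. Note that none of this covers retention; on a 12-year-old, rarely-powered device, charge slowly leaking out of the cells seems at least as plausible a cause of corrupted files as cycle wear.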
In the case of integrated flash storage, we can’t replace the parts we KNOW for a fact will wear out before the others.
We’re being pushed to throw things away after three years of use. What if you WANT to keep using one of these flash-storage devices for more than 5 years? 10? Longer?
It may sound ridiculous to the bleeding-edge ethos of tech geeks, but (as I said elsewhere), I wonder if SoC computers will ever even make it to “classic computer” status.
For end users (as opposed to content creators), processors and storage don’t need to get “better” anywhere near as fast as companies want to drive repeat sales. Software always obsoletes devices faster than they physically wear out, and that should be an actively controversial issue. The hardware can’t wear out in that artificially short period of use, so we act like longevity doesn’t matter.
Eventually it will.
We are going to turn our world into a Max Headroom dystopia (where almost nothing is made new anymore) because we’re going to exhaust easily-utilized materials.
Our culture has been conditioned by tech capitalism to treat all of this as a silly joke. It’s pretty bizarre to me how far the bleeding-edge ideology goes. There’s a guy on the forum complaining that the Mac Studio will be a useless boat anchor in a few years because Apple didn’t make it bleeding-edge enough.