I thought I could trust Apple, considering they stood up to the FBI before, refused to build a backdoor, and have been innovating with privacy features in recent years.
The CSAM detection feature feels like the first time since all of those stands that Apple didn’t work to preserve privacy, and it’s broken my trust. I knew Apple had already bent to the will of the Chinese government in China, but still, Apple was innovating with privacy here in the U.S.
Now, Apple has built a way to scan your images right into the operating system. Yes, there are multiple checks in place (the hashes have to come from two separate child-safety organizations, there’s human review, iCloud Photos has to be turned on, and there’s a 30-picture threshold), but I just can’t stomach it. It’s essentially a backdoor, it’s open to abuse, and Apple no longer has the excuse that they can’t do something if forced by a government.
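To make it concrete, here’s a deliberately simplified sketch of how that kind of threshold-gated matching works in principle. This is not Apple’s actual implementation (the real system uses NeuralHash and cryptographic threshold secret sharing, so the device never directly learns the match count); every name here is hypothetical, and it only illustrates the safeguards described above:

```python
# Hypothetical sketch, NOT Apple's real code: illustrates the gates
# described above — iCloud Photos must be on, hashes come from a
# known-hash list, and nothing is flagged below a 30-match threshold.

THRESHOLD = 30  # number of matches required before human review

def should_flag(photo_hashes, known_hashes, icloud_enabled,
                threshold=THRESHOLD):
    """Return True only when iCloud Photos is enabled and the number
    of matches against the known-hash set meets the threshold."""
    if not icloud_enabled:
        # Scanning applies only to photos bound for iCloud
        return False
    matches = sum(1 for h in photo_hashes if h in known_hashes)
    return matches >= threshold
```

The worry, of course, is that every one of these gates is a policy choice in software: the hash list, the threshold, and the iCloud check could all be changed in a later update.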
The fact that pretty much all of the major privacy advocates are saying “stop! This is dangerous” only makes me feel worse. I would’ve been more okay with this whole thing if Apple had just scanned images on the iCloud servers, even if that was more privacy-invasive, because at least I already expected my privacy to be invaded when my content sits on someone else’s server. But when it’s baked into the OS, it just doesn’t sit right with me.
I’ve already been in the process of moving to more privacy-driven software anyway, but maybe this was the reality check and kick in the butt I needed to really get that process into gear. I feel like I put too much trust in Apple.