This isn't meant to be political, but with COVID, all the people who refuse to wear masks or get vaccinated are not thinking of the children. Just saying.
It is amazing to say the least. Oh well. I know this post will get deleted.
People who drive with a phone in their hand aren't thinking of the children, or anyone else around them. My risk of being T-boned by someone running a red light at an intersection is astronomically higher than my risk from COVID, mask/vaccine or not. It's on the local news every day: multiple people's lives turned upside down (or worse) because of impatience or distracted driving.
Have there been studies on the detrimental health effects of persistent mask wearing, such as reduced O2 levels or an increased contaminant load from what gets trapped on the inside of the mask? I personally had brain fog and a lack of motivation that took a couple of months of not wearing a mask to shake.
It's about risk assessment. With Apple's scanning system, since it's been well pointed out that it won't be very effective at CSAM detection because, among other things, it's easy to avoid, it must be good at something else. Since Apple has put themselves in the middle, they can only really verify that the database isn't being abused by their sources once accounts are flagged. And since what the reviewers see are visual derivatives (isn't that effectively reversing the hash?), that's not a lot of protection. So there is ripe potential for folks who innocently save pictures from the internet to get put through the wringer because some automated system accused them of something and human reviewers didn't accurately vet the report. It's like the video I watched last night of a guy riding his motorcycle: a bunch of other riders fly past him at way above the speed limit, but the guy doing the speed limit gets pulled over and put in handcuffs. Guilty until proven innocent. He would've spent time in jail had he not recorded his ride.
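To make the flagging mechanics concrete, here's a minimal sketch of how a generic perceptual-hash match-and-threshold scheme works. The hash function, threshold value, and database below are hypothetical placeholders for illustration only, not Apple's actual NeuralHash or its private set intersection protocol.

```python
# Minimal sketch of a perceptual-hash match-and-threshold scheme.
# Everything here (hash, database, threshold) is an illustrative placeholder,
# NOT Apple's actual NeuralHash or private-set-intersection protocol.

from typing import Iterable

FLAG_THRESHOLD = 30  # hypothetical number of matches before human review

def perceptual_hash(image_bytes: bytes) -> int:
    """Stand-in for a perceptual hash; real systems use NeuralHash/PhotoDNA-style models."""
    return hash(image_bytes) & 0xFFFFFFFF  # toy placeholder, not a real perceptual hash

def count_matches(photos: Iterable[bytes], known_hashes: set[int]) -> int:
    """How many photos hash into the provider-supplied database."""
    return sum(1 for p in photos if perceptual_hash(p) in known_hashes)

def should_flag_for_review(photos: Iterable[bytes], known_hashes: set[int]) -> bool:
    # The account is only surfaced to human reviewers once matches pass the
    # threshold; below it, nothing is reported (in the real design this gating
    # is enforced cryptographically, which a plain counter can't capture).
    return count_matches(photos, known_hashes) >= FLAG_THRESHOLD
```

The point of the sketch is that everything downstream of the match, including the human review of visual derivatives, depends entirely on the quality of the provider-supplied hash database.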
It's not about whether I trust Apple; I don't trust the database providers to refrain from abusing the system. Apple put themselves in a very vulnerable position with their own system and stabbed all their users in the back. At face value, it'll only catch those who collect CSAM, and only old material at that. These people need help, and hopefully possessing CSAM is as far as they go and they never do actual harm to children. Apple's announcement, if the situation on iCloud is as bad as they make it seem, hurts Apple and all Apple users, and it creates more demand for new, unknown content, causing harm to MORE children, not fewer. Industry-standard approaches and good, old-fashioned police/detective work actually go after the people making and sharing new content... as in, the ones actually doing harm to children.
That doesn't make it right to put this system on-device. Apple can already see everything in our iCloud Photo libraries (they hold the encryption keys), so why go through all this hassle? E2EE iCloud isn't even a good enough answer. If Apple is worried about CSAM being hosted on their servers, that implies it's being shared with other people. Sharing photos with other users on iCloud necessarily breaks end-to-end encryption (otherwise the people you share them with couldn't see the photos). Scan the stuff that is shared directly on the server and leave the endpoint devices alone.
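For contrast, here's a toy sketch of that alternative: run the same kind of hash check server-side, and only on content the user has actually shared (and which the server can therefore already read), leaving endpoint devices and unshared libraries alone. The Photo type, the shared flag, and the placeholder hash are invented for illustration; none of this reflects a real iCloud API.

```python
# Toy sketch of the server-side alternative: check only photos the user has
# actually shared (already readable by the server), and never touch the device
# or unshared libraries. Photo, .shared, and the placeholder hash are invented
# for illustration; this is not a real iCloud API.

from dataclasses import dataclass

def perceptual_hash(image_bytes: bytes) -> int:
    """Same toy placeholder as in the earlier sketch."""
    return hash(image_bytes) & 0xFFFFFFFF

@dataclass
class Photo:
    data: bytes
    shared: bool  # True if the owner put it in a shared album

def server_side_scan(photos: list[Photo], known_hashes: set[int]) -> int:
    """Count database matches among shared photos only; unshared photos are skipped."""
    return sum(1 for p in photos if p.shared and perceptual_hash(p.data) in known_hashes)
```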