Apple has always had access to the photos in your iCloud Photo Library, and it will continue to have that access after this change. Apple hasn't been completely transparent on the details, but most seem to agree they've already been performing CSAM checks on iCloud Photo Library uploads since at least 2019.
The flaw in your comparison to Ring, etc. is that there's no way to detect such videos without a human reviewing each and every video. Not only is that impractical, it's also a violation of privacy, which is something Apple is NOT doing, since they are only alerted to illegal images uploaded to their servers rather than having people peruse all of your photos looking for something.
Wrong. Any of these manufacturers could make use of the same "AI" and do device-level monitoring of the audio from the cameras, listening for certain sounds or phrases and only sending a notification if it hears X number of trigger sounds/phrases. It's 100% the same thing, since all of the video and audio is being sent to their individual cloud servers anyway.
Moving the CSAM check to the device level doesn't do anything meaningful for privacy, since Apple already has access to the raw photo as part of the user's iCloud Photo Library. It's just a guess on my part, but based on the way they describe the voucher mechanism and the fact that they've implemented it at the device level, they are most likely going to try to expand this to the iMessage service at some point in the future. Since iMessages are end-to-end encrypted, the check has to be performed on the individual devices.
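To make the voucher idea concrete, here's a rough sketch of the general flow as I read the technical summary. Everything in it (function names, the threshold value, the plain boolean voucher) is made up for illustration; the real system uses NeuralHash plus cryptography (private set intersection and threshold secret sharing) so the device never learns the match result and the server learns nothing below the threshold.

```python
# Illustrative sketch only; all names and numbers are hypothetical,
# not Apple's implementation.
import hashlib
from dataclasses import dataclass

THRESHOLD = 30  # hypothetical number of matches before anything is reported

def perceptual_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash; a real one tolerates small edits."""
    return hashlib.sha256(image_bytes).hexdigest()

@dataclass
class Voucher:
    """Travels with each upload; in the real design its contents are encrypted."""
    is_match: bool

def device_prepare_upload(image_bytes: bytes, known_hashes: set[str]) -> Voucher:
    # Runs on the device, but only for photos being uploaded to iCloud Photos.
    return Voucher(is_match=perceptual_hash(image_bytes) in known_hashes)

def server_review(vouchers: list[Voucher]) -> bool:
    # The server only acts once matching vouchers cross the threshold; in the
    # real design, cryptography keeps individual results unreadable below it.
    return sum(v.is_match for v in vouchers) >= THRESHOLD
```

Note that whether device_prepare_upload runs at all for photos that never leave the device is just a policy choice in that code path, not a technical limit.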
I'm sorry, but that's not the same thing at all as comparing file hash info.
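To illustrate with a toy example (the crude "average hash" below is purely for demonstration and nothing like the real NeuralHash algorithm): an exact file hash breaks under the slightest edit, while a perceptual hash is designed not to.

```python
# Toy contrast between an exact cryptographic hash and a perceptual hash.
import hashlib

def file_hash(data: bytes) -> str:
    # Exact hash: changing a single byte changes the whole digest.
    return hashlib.sha256(data).hexdigest()

def average_hash(pixels: list[int]) -> int:
    # Crude perceptual hash: one bit per pixel, set if brighter than the mean.
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

original = [10, 200, 30, 180, 90, 220, 15, 170]
edited   = [12, 198, 33, 181, 88, 219, 14, 172]  # slightly brightened/darkened

print(file_hash(bytes(original)) == file_hash(bytes(edited)))           # False
print(hamming_distance(average_hash(original), average_hash(edited)))   # 0
```

The exact hashes no longer match after a tiny edit, but the perceptual hashes stay within a small distance of each other, which is how slightly edited copies of a known image can still be flagged.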
Then let them monitor it when it hits their cloud. Until then it's on MY device, and that is akin to surveillance in MY house. They can monitor it when it's left MY house (phone) and is in their house (servers). NOT before.

That is utter and complete BS. That's like saying that if a business monitors their security cameras for illegal activity and reports it to the police, they are "acting as an agent of law enforcement." No, they're simply reporting a crime they've observed.
I can see Apple expanding the scanning in iMessage to include text phrases that a predator seeking to lure a child may use.

You're just babbling at this point...
I suspect that the TOS for iCloud will be changed to include Consent, so you can either cancel your iCloud or accept the new TOS.

I wonder about potential legal challenges to this, as it applies to personal devices and to general privacy rights (i.e., non-cloud). One of the most personal items we have today is our cell phones. Even with aggregated data analysis (i.e., tagging or "hashes" of photos), there is still access to that data, in this case without consent. The latter part is significant.
What I have found, over the years, is that there appears to be an uptick in using child porn or human trafficking as the basis for taking invasive actions that, I surmise, have a much different agenda and provide for broader abuses of privacy.
Getting the public riled and angry about abuse is an easy way to gain psychological pliancy. This tactic is classic "perception management" used in information warfare, etc.
I just find this move by Apple to be very perplexing and disappointing.
The EFF has chimed in with their take on this:
Apple's Plan to "Think Different" About Encryption Opens a Backdoor to Your Private Life
If there is a larger legal challenge to this, the EFF will most certainly need to get involved.
That Apple continues to hold and control the encryption keys to all iCloud content is yet another obvious measure (or failure) on their part, likely part of an intentional design strategy, and it is not optional.
I suspect that the TOS for iCloud will be changed to include Consent, so you can either cancel your iCloud or accept the new TOS.
Either way, you can't judge a person by what they may do in the future, only what they have already done.

Nope, the announcement this week was that the scanning will happen on your device. But, yes, it only happens if you have iCloud Photos turned on. That's simply a policy decision though, not some sort of technical limitation; it could change at any time to run whether you're using iCloud or not.
Either way, you can't judge a person by what they may do in the future, only what they have already done.
So far, Apple's stance is that they will only scan photos destined to be uploaded to cloud storage, no different from what Microsoft and Google are already doing, so it seems fair to grade them along those lines. In this regard, Apple is actually behind the curve, playing catch-up to these companies.
We can argue until the cows come home about a thousand different hypothetical scenarios that could happen. While the technology could in theory be used for other purposes (like, say, scanning all photos regardless of whether iCloud is turned on or off), the problem with a slippery slope argument like this is that while I cannot prove it won't happen, you can't prove that it will necessarily happen either (although the ability certainly is there).
In the end, we are just arguing over each other, and nobody is listening.
Can you share where Google is doing on-device scanning?
I do know they provide tools to help combat this (https://protectingchildren.google/intl/en/), and I know they have been scanning shared images (Cloud, Gmail, etc.) for a number of years. I have not heard of on-device scanning.
Please come forward and be the pioneer to dismantle any form of encryption, outlaw encryption technology outside of military and secret services, and encourage full unconditional disclosure of any information held by any citizen.

Of course it's still true. Apple cannot see any of the scanning info on your phone. The only time they would ever see any scan results is if you have multiple CSAM images on your device and you attempt to upload those to iCloud. You never had any true privacy on iCloud to begin with, as Apple can access all your files there if they so desire. Read the iCloud legal document on Apple's website.
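The "multiple images" part is the threshold at work. Here's a toy Shamir secret-sharing demo of the underlying idea (this is not Apple's actual protocol, just the general threshold concept it builds on): nothing is recoverable until enough shares exist.

```python
# Toy Shamir threshold secret sharing: the secret is only recoverable once
# at least T shares are available; fewer than T shares reveal nothing useful.
import random

PRIME = 2**127 - 1  # prime modulus, large enough for a small demo secret
T = 3               # threshold: number of shares needed to reconstruct

def make_shares(secret: int, n: int, t: int = T):
    # Random polynomial of degree t-1 with the secret as the constant term.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term (the secret).
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = make_shares(secret=123456789, n=10)
print(reconstruct(shares[:3]))  # 123456789: threshold reached
print(reconstruct(shares[:2]))  # unrelated number: below threshold, nothing learned
```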
Ironically, things are now going to be even more private than before, yet many of you are acting like this is a step backwards because, again, you are interpreting things through a distorted lens.
I felt it was true, until this "scanning library" surfaced.

Let's see the real size of that "screeching minority", and let's see what percentage believes that "1 in 3 trillion false positives" and "slightly edited images are also detected" can be true at the same time.
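For what it's worth, those two claims aren't mutually exclusive. With completely made-up numbers (the real per-image false-match rate isn't public), a fuzzy per-image match plus a high reporting threshold can still give an astronomically small per-account rate:

```python
# Back-of-the-envelope check that a fuzzy per-image match (tolerant of slight
# edits) and a tiny per-account false-positive rate can coexist.
# All numbers below are assumptions for illustration, not Apple's figures.
from math import comb

p = 1e-6      # assumed chance that one innocent photo falsely matches
n = 20_000    # photos in a hypothetical library
t = 30        # assumed threshold of matches before an account is flagged

# Union bound on the chance that at least t of the n photos falsely match:
upper_bound = comb(n, t) * p**t
print(f"{upper_bound:.3e}")  # roughly 4e-84: effectively never
```

Whether the specific numbers Apple quotes hold up is a separate question, but detecting slightly edited copies and an extremely low account-level false-positive rate aren't contradictory on their face.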