I didn't mean you in particular but the uproar this has caused all over the web.
CSAM Detection will be included in an upcoming release of iOS 15 and iPadOS 15.
Nope, the announcement this week was that the scanning will happen on your device. But, yes, it only happens if you have iCloud Photos turned on. That's simply a policy decision, though, not some sort of technical limitation; it could change at any time to run whether you're using iCloud or not.
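To make the "policy, not technical" point concrete, here is a minimal sketch of on-device matching (hypothetical names and a toy hash stand-in, not Apple's actual code or its NeuralHash function): the iCloud condition is a single guard clause around matching logic that would work identically without it.

```python
import hashlib

# Hypothetical sketch, not Apple's code. Database entries are opaque digests
# shipped to the device; the placeholder values below are made up.
KNOWN_HASHES = {"9f86d081884c7d65", "60303ae22b998861"}

def image_hash(image_bytes: bytes) -> str:
    # Stand-in only: a real system uses a perceptual hash (Apple's is called
    # NeuralHash) that survives resizing and recompression; a cryptographic
    # hash like SHA-256 does not.
    return hashlib.sha256(image_bytes).hexdigest()[:16]

def scan_before_upload(image_bytes: bytes, icloud_photos_enabled: bool) -> bool:
    """Return True if this photo would be flagged on its way to iCloud."""
    if not icloud_photos_enabled:
        # The iCloud-only behavior lives entirely in this one guard clause:
        # a policy decision, not a technical limitation of the matching.
        return False
    return image_hash(image_bytes) in KNOWN_HASHES
```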
Consider this: should the makers of indoor security cameras for consumer home use (Logitech, Eufy, Ring, etc.) be able to monitor those cameras for instances of domestic violence, illegal drug use, child abuse, etc., because the video is stored on their servers via a cloud service? That will be the next step.
What happened to the presumption of innocence?
It is false advertising, and I would not be surprised if there's a lawsuit in the future. Hopefully it goes to the Supreme Court.
"At Apple, we believe privacy is a fundamental human right"? If that's the case, why scan an iPhone without the user's authorization?
Due to an alarm for flagged photos in your library, we had to review the following pictures (including copies of 35 very private photos of your wife). Unfortunately, they matched the hashes in our database, which usually happens for only one in a trillion photos, so we checked them and found them legit. We have added a supplementary hidden stamp to them to avoid further checks. They are safe to use now.
So let's say I am a dictator. I want everyone to drink cola; people who drink just plain water are subhuman to me. I know you are a political opponent, or whatever, because you went public with your disagreement in a viral video. I take your picture, a video screenshot, and/or your voice from the interview, make a hash of it, and Apple will tell me if you are one of their users, and if you are, what your name is, where you live, and so on.
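The flagging step of that scenario can be sketched in a few lines (same toy hashing as above; all names and bytes are hypothetical). The point it illustrates: the device only ever compares opaque digests, so nothing in the matching pipeline knows, or could verify, that the database contains only CSAM.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Same toy stand-in as the earlier sketch; a real system would use a
    # perceptual hash so edited copies of the image still match.
    return hashlib.sha256(image_bytes).hexdigest()[:16]

# The database is just a set of opaque digests. The matching code cannot tell
# a CSAM hash from the hash of a dissident's photo; whoever controls the
# database contents controls what gets flagged.
known_hashes: set[str] = set()

dissident_photo = b"...bytes of a screenshot from the viral interview..."
known_hashes.add(image_hash(dissident_photo))  # a quiet database update

print(image_hash(dissident_photo) in known_hashes)  # True: same pipeline, new target
```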
You are presumed innocent until the detection says you have too many matching photos and a manual review at Apple agrees with the system's assessment.
That is wrong, and you know it.
If iPhone users were presumed innocent, the assessment would not be carried out in the first place.
Users ARE assumed guilty until the assessment proves otherwise.
The difference is: if they are doing it on their servers, I cannot stop them from doing it. That is OK for me. I do not have to use their server services.
But now they are scanning "on device" as a default activity and retrieving my private things if a certain value is triggered. So my privacy is gone. If my government decides to introduce an "on device" scan regarding political activities, I am lost...
Well, if no check is required to be presumed innocent, then you are not presumed to be innocent. Just like other situations in life:
* You have to prove you should be allowed to enter the US, by someone checking your passport, ID, or similar.
* You have to prove you are at least 21 to be served alcohol, by showing ID.
* You have to prove you are allowed to drive a vehicle when renting a car, by someone checking your driver's license.
* Your belongings have to be searched to prove that you aren't in possession of something belonging to your employer.
Yeah, if everyone was always presumed to be innocent, there wouldn't be any safety checks ever.
Wrong again. Not a thing on this bullet-pointed list applies when in MY home. Plus: showing ID is not being searched. Totally different thing.
You can't really stop Apple from doing it on your phone either. They could at any moment retrieve everything on your phone using existing code already on your iPhone, or by making a very small change to it.
They can do a lot of things. I cannot control it, but it might be discovered that they are cheating on me.
I think it has nothing to do with you. They probably legally have to scan for CSAM since EVERY major company does it.
With iOS 15 they are communicating that EVERY picture I upload to my iPhoto Library is scanned, and maybe reviewed, by default. I am assumed guilty, and Apple Control is checking because Apple Control assumes I am a liar.
That is utter and complete BS. That's like saying if a business monitors their security cameras for illegal activity and reports it to the police, they are "acting as an agent of law enforcement." No, they're simply reporting a crime they've observed.
No, that's more like whoever built and sold you your house bugging it to monitor for a particular illegal activity, and then reporting you to the police when they find something.
There seem to be two groups of people regarding this issue. There are those who think only in the present and believe you have nothing to worry about as long as you don't download CSAM, and there are those who look ahead and realize what this can turn into, and that's what we're worried about.
I think the issue at stake for many here is the aspect of bargaining away more personal privacy for the perceived societal good of subject X.
Not even close. Apple is not wanting to know what's on YOUR phone - they're wanting to know whether you upload illegal images to THEIR servers.
That is completely incorrect. If all Apple cared about was whether you were uploading the images to their servers, then they would simply remove the images and prevent you from using their services so you can't do it again. No, they're reporting you to the authorities if they find something because they do care about what's on your phone.
Slippery slope fallacy.
What is the likelihood that a collection of innocent images in your iCloud library matches exactly a collection of images stored in the known CSAM library? 1 in 1 trillion. That makes you scared? You have a better chance of being arrested for literal murder while being innocent.
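For context on where a figure like "1 in 1 trillion" can come from: it describes the whole account crossing the roughly-30-match threshold, not a single photo. A rough sketch under two loud assumptions, a made-up per-photo false-match rate and independence between photos (these are not Apple's published parameters):

```python
from math import comb

def flag_probability(n_photos: int, p_false: float, threshold: int) -> float:
    """P(at least `threshold` of n_photos independent photos falsely match)."""
    q = 1.0 - p_false
    # First tail term: P(X == threshold) for X ~ Binomial(n_photos, p_false).
    term = comb(n_photos, threshold) * p_false**threshold * q**(n_photos - threshold)
    total = term
    k = threshold
    # Successive terms shrink geometrically once k >> n*p; stop when negligible.
    while k < n_photos and term > total * 1e-16:
        term *= (n_photos - k) / (k + 1) * (p_false / q)
        k += 1
        total += term
    return total

# Hypothetical inputs: a 10,000-photo library and a 1-in-a-million chance that
# any single innocent photo matches a database hash (NOT Apple's numbers).
print(flag_probability(10_000, 1e-6, 1))   # ~0.01: one stray match is conceivable
print(flag_probability(10_000, 1e-6, 30))  # ~4e-93: thirty at once is vanishingly rare
```

Even with a fairly generous per-photo error rate, requiring around thirty simultaneous matches drives the account-level probability far below one in a trillion, which is presumably the kind of calculation behind the claim.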
I addressed this a few posts up. Nobody is worried about being falsely flagged.
Then it's a non-issue. Move along.
You fall into the first group.
If you're not falsely flagged, then all of your things remain private. What's the issue here? Are you afraid you'll be that 1 in a trillion who somehow has 30+ innocent photos that just so happen to match 30 photos in a known CSAM database, and then has those 30 photos reviewed by Apple? Then, if they truly are innocent photos, literally nothing happens.