Even if I had iCloud ON, it should NOT be scanning in my house. THAT is my point, way to miss it.
If you have iCloud photos disabled, is it SCANNING in your house (phone)? No.
And it's not true that your device isn't constantly being monitored. You can have your kids' device constantly monitored, with both incoming and outgoing images checked for AI-detected "nudes". But that's a different thing.
The act of uploading a file means you're giving Apple that file. What part of that don't you get? The scanning only takes place during the upload process, not before. Your device isn't constantly being monitored. If you don't upload anything, nothing is scanned. Get it now?
No, but the software is still there, which Apple will remotely activate at any point without notifying the user, and then send the police to their house to arrest them.
If you have iCloud photos disabled, is it SCANNING in your house (phone)? No.
WRONG, they are scanning locally on your device.
That's what they're doing though. They actually see nothing about your photo library until 30 photos are found to be CSAM. At that point, there's a secondary server-side scan, then if it still somehow passes through that, it's up for human review, and then if it passes through that, it's shown to NCMEC, and then if they determine it's still CP, it goes to the police.
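(For anyone trying to follow the mechanics being argued about here: the escalation described in the quoted post can be sketched roughly as below. This is only an illustration of the claimed flow written in Swift, not Apple's actual code; the names are invented, and the 30-image threshold comes from the post above.)

    // Rough sketch of the escalation described above. Illustrative only.
    let matchThreshold = 30

    enum ReviewOutcome {
        case nothingVisibleToApple   // below threshold: Apple sees nothing about the library
        case dismissed               // server-side re-scan or human review rejects the matches
        case reportedToNCMEC         // past human review; NCMEC decides whether police get involved
    }

    func evaluate(matchedPhotoCount: Int,
                  passesServerSideScan: Bool,
                  passesHumanReview: Bool) -> ReviewOutcome {
        guard matchedPhotoCount >= matchThreshold else { return .nothingVisibleToApple }
        guard passesServerSideScan && passesHumanReview else { return .dismissed }
        return .reportedToNCMEC
    }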
Not likely.
And it's not true that your device isn't constantly being monitored. You can have your kids' device constantly monitored, with both incoming and outgoing images checked for AI-detected "nudes". But that's a different thing.
Just to point out though... that AI detection could very easily be switched to look for, say, BLM supporters. Rebel Flags. Rainbow Flags. Whatever they would like to look for....
Actually, the photo is scanned and tagged, and THEN uploaded.
The act of uploading a file means you're giving Apple that file. What part of that don't you get? The scanning only takes place during the upload process, not before. Your device isn't constantly being monitored. If you don't upload anything, nothing is scanned. Get it now?
That's my point. It's a closed system. They control it, the user does not.
They could have done that a decade ago, and every moment since then. They haven't yet.
Serious questions regarding this line of thinking: Since your photo will not be scanned unless you turn on iCloud Photos, isn't turning it on the same as 'making a choice'? How is the choice being taken away if you 100% still have the choice to not turn on iCloud Photos and not have your pic scanned?
Because... once again... if I upload it to the cloud, I'm making a choice and agreeing to the TOS of the cloud provider. With Apple's scanning, that choice is being taken away. It's a difference between being REACTIVE (upload naughty files to Dropbox, get what you get) versus PROACTIVE (images are scanned on your device without you having any choice in the matter). Do you see the difference now?
No, Apple says the photos are only scanned when they're being uploaded. If you turn off iCloud Photos, nothing is uploaded, therefore nothing is scanned for any reason.
Actually, the photo is scanned and tagged, and THEN uploaded.
I used iCloud in the expectation that, like Dropbox, they scan for illegal content once the content lands on their server or when I decide to share files. Not before, as is now the case with iCloud Photos. Let Apple use their own energy to do this scanning. I've wasted enough energy and bandwidth trying to get iCloud to work at all in the first place. The initial sync nearly melted my MacBook.
It's more about the POTENTIAL of that being taken away, because they are using ON-Device Scanning.
Serious question regarding this line of thinking: Since your photo will not be scanned unless you turn on iCloud Photos, isn't turning it on the same as 'making a choice'? How is the choice being taken away if you 100% still have the choice to not turn on iCloud Photos and not have your pic scanned?
They also have a random false-positive generator on-server, so they're constantly trying to decode your photo library.
That's what they're doing though. They actually see nothing about your photo library until 30 photos are found to be CSAM. At that point, there's a secondary server-side scan, then if it still somehow passes through that, it's up for human review, and then if it passes through that, it's shown to NCMEC, and then if they determine it's still CP, it goes to the police.
Where is this?
They also have a random false-positive generator on-server, so they're constantly trying to decode your photo library.
I got my info directly from Apple. Files are only scanned during the process of uploading the file.
DO you NOT know what the words ON-DEVICE MATCHING mean?
Wait, did you REALLY edit my quote?
Where is this?
I didn't mean to quote you on that. I was trying to quote someone else, sorry.
Wait, did you REALLY edit my quote?
https://www.apple.com/child-safety/ It's RIGHT in their Child Safety document. Did you not even bother reading Apple's statement on this before arguing?
Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations.
The technical workflow is:
No, Apple says the photos are only scanned when they're being uploaded. If you turn off iCloud Photos, nothing is uploaded, therefore nothing is scanned for any reason.
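(Side note, since "on-device matching using a database of known CSAM image hashes" keeps coming up: mechanically it means the device computes a perceptual hash of a photo and compares it against a database of known hashes. The Swift sketch below is a deliberately naive illustration of that idea, with invented names; in Apple's published design the database is blinded, so the device itself never learns whether a hash matched.)

    import Foundation

    // Naive illustration of matching a perceptual hash against a known-hash database.
    // Invented types; the real system does not expose the match result on-device like this.
    struct KnownHashDatabase {
        let knownHashes: Set<Data>          // hypothetical: hashes shipped to the device

        func isKnownImage(_ perceptualHash: Data) -> Bool {
            knownHashes.contains(perceptualHash)
        }
    }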
Which one?
In the technical document.
That does NOT mean the scan happens in transit, only that it's done on ones marked for transit.
Page 7
This feature runs exclusively as part of the cloud storage pipeline for images being uploaded to iCloud Photos and cannot act on any other image content on the device. Accordingly, on devices and accounts where iCloud Photos is disabled, absolutely no images are perceptually hashed. There is therefore no comparison against the CSAM perceptual hash database, and no safety vouchers are generated, stored, or sent anywhere.
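(A minimal sketch of the gating that passage describes: perceptual hashing and safety-voucher generation only run on photos that are actually queued for upload to iCloud Photos, and not at all when iCloud Photos is off. Every name below is invented for illustration; none of it is Apple's API.)

    import Foundation

    // Invented types and helpers, used only to illustrate the quoted gating.
    struct Photo { let id: String }
    struct SafetyVoucher { let payload: Data }

    func perceptualHash(of photo: Photo) -> Data { Data(photo.id.utf8) }    // stand-in
    func makeSafetyVoucher(for photo: Photo, hash: Data) -> SafetyVoucher { SafetyVoucher(payload: hash) }
    func upload(_ photo: Photo, with voucher: SafetyVoucher) { /* network call */ }

    func processUploadQueue(iCloudPhotosEnabled: Bool, uploadQueue: [Photo]) {
        // iCloud Photos off: nothing is hashed, no vouchers are generated.
        guard iCloudPhotosEnabled else { return }

        // Only images already marked for upload are ever hashed.
        for photo in uploadQueue {
            let hash = perceptualHash(of: photo)
            let voucher = makeSafetyVoucher(for: photo, hash: hash)
            upload(photo, with: voucher)    // the voucher travels with the upload
        }
    }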
I got this information directly from Apple. It looks to ME like all files are scanned. Only when you upload them does it also include the voucher to iCloud.
I got my info directly from Apple. Files are only scanned during the process of uploading the file.