
Will you leave the Apple ecosystem because of CSAM?


The act of uploading a file means you're giving Apple that file. What part of that don't you get? The scanning only takes place during the upload process, not before. Your device isn't constantly being monitored. If you don't upload anything, nothing is scanned. Get it now?
And it's not true that your device isn't constantly being monitored. You can have your kids' device constantly monitored, scanning both incoming and outgoing images for AI-detected "nudes". But that's a different thing.

Just to point out though... that AI detection could very easily be switched to look for, say, BLM supporters. Rebel Flags. Rainbow Flags. Whatever they would like to look for...
 
If you have iCloud Photos disabled, is it SCANNING in your house (phone)? No.
No, but the software is still there that Apple will remotely activate at any point without notifying the user, and then send the police to their house to arrest them.
 
That's what they're doing though. They actually see nothing about your photo library until 30 photos are found to be CSAM. At that point, there's a secondary server-side scan, then if it still somehow passes through that, it's up for human review, and then if it passes through that, it's shown to NCMEC, then if they determine it's still CP, it goes to the police.
WRONG, they are scanning locally on your device.
 
And it's not true that your device isn't constantly being monitored. You can have your kids' device constantly monitored, scanning both incoming and outgoing images for AI-detected "nudes". But that's a different thing.

Just to point out though... that AI detection could very easily be switched to look for, say, BLM supporters. Rebel Flags. Rainbow Flags. Whatever they would like to look for...
Not likely.
 
No, but the software is still there that Apple will remotely activate at any point without notifying the user, and then send the police to their house to arrest them.

They could have done that a decade ago, and at every moment since then. They haven't yet.

“Will activate”. I’m glad you have inside info from Apple that no one else has access to, and know what they will and won’t do in the future. Please pass along anything you know that the general public doesn’t.
 
The act of uploading a file means you're giving Apple that file. What part of that don't you get? The scanning only takes place during the upload process, not before. Your device isn't constantly being monitored. If you don't upload anything, nothing is scanned. Get it now?
Actually, the photo is scanned and tagged, and THEN uploaded.

I used iCloud in the expectation that, like Dropbox, they scan for illegal content once the content lands on their server or when I decide to share files. Not before, as is now the case with iCloud Photos. Let Apple use their own energy to do this scanning. I've wasted enough energy and bandwidth trying to get iCloud to work at all in the first place. The initial sync nearly melted my MacBook.
 
Because... once again... if I upload it to the cloud, I'm making a choice and agreeing to the TOS of the cloud provider.

With Apple's scanning, that choice is being taken away. It's the difference between being REACTIVE (upload naughty files to Dropbox, get what you get) and being PROACTIVE (images are scanned on your device without you having any choice in the matter). Do you see the difference now?
Serious questions regarding this line of thinking:

Since your photo will not be scanned unless you turn on iCloud Photos, isn't turning it on the same as 'making a choice'?

How is the choice being taken away if you 100% still have the choice to not turn on iCloud Photos and not have your pic scanned?

Whether the scanning is on your phone or in the cloud, the result is the same: your pics won't be scanned unless you have iCloud Photos turned on, so what is all this fuss about?

And I'm not looking for conspiracy based answers. I would like an answer based on the available facts.
 
Actually, the photo is scanned and tagged, and THEN uploaded.

I used iCloud in the expectation that, like Dropbox, they scan for illegal content once the content lands on their server or when I decide to share files. Not before, as is now the case with iCloud Photos. Let Apple use their own energy to do this scanning. I've wasted enough energy and bandwidth trying to get iCloud to work at all in the first place. The initial sync nearly melted my MacBook.
No, Apple says the photos are only scanned when they're being uploaded. If you turn off iCloud Photos, nothing is uploaded, therefore nothing is scanned for any reason.
 
Serious question regarding this line of thinking: Since your photo will not be scanned unless you turn on iCloud Photos, isn't turning it on the same as 'making a choice'? How is the choice being taken away if you 100% still have the choice to not turn on iCloud Photos and not have your pic scanned?
It's more about the POTENTIAL of that being taken away, because they are using ON-DEVICE scanning.
 
That's what they're doing though. They actually see nothing about your photo library until 30 photos are found to be CSAM. At that point, there's a secondary server-side scan, then if it still somehow passes through that, it's up for human review, and then if it passes through that, it's shown to NCMEC, then if they determine it's still CP, it goes to the police.
They also have a random false-positive generator on-server, so they're constantly trying to decode your photo library.
 
Where is this?
Wait, did you REALLY edit my quote?


https://www.apple.com/child-safety/ It's RIGHT in their Child Safety document. Did you not even bother reading Apple's statement on this before arguing?

Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations.
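
For anyone trying to picture what "on-device matching against a database of known hashes" actually means, here's a rough sketch. This is NOT Apple's NeuralHash or their blinded database; the hash function, database shape, and matching rule below are all stand-ins for illustration only.

```swift
import Foundation

// Rough illustration only. NOT Apple's NeuralHash or blinded database;
// the hash type, database format, and matching rule are all stand-ins.
typealias PerceptualHash = UInt64

struct KnownHashDatabase {
    // In Apple's described design this database is blinded and ships with the OS;
    // a plain Set is used here purely to show the idea of a lookup.
    let knownHashes: Set<PerceptualHash>

    func matches(_ hash: PerceptualHash) -> Bool {
        knownHashes.contains(hash)
    }
}

// Placeholder: a real perceptual hash is computed from image content so that
// visually similar images hash alike; hashing raw bytes like this does not do that.
func perceptualHash(of imageData: Data) -> PerceptualHash {
    PerceptualHash(truncatingIfNeeded: imageData.hashValue)
}

// The on-device step boils down to: hash the image, check it against the database,
// and attach the (encrypted) result to the upload as a safety voucher.
func voucherIndicatesMatch(imageData: Data, database: KnownHashDatabase) -> Bool {
    database.matches(perceptualHash(of: imageData))
}
```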
 
Wait, did you REALLY edit my quote?


https://www.apple.com/child-safety/ It's RIGHT in their Child Safety document. Did you not even bother reading Apple's statement on this before arguing?

Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations.
I didn't mean to quote you on that. I was trying to quote someone else, sorry.

I read all of Apple's documentation. I'll try to find where I read it.
 
No, Apple says the photos are only scanned when they're being uploaded. If you turn off iCloud Photos, nothing is uploaded, therefore nothing is scanned for any reason.
The technical workflow is:

1. Photo taken or saved
2. Photo hashed, tagged, and prepped for upload
3. Photo gets uploaded to iCloud

2 and 3 happen automatically after 1, unless you snap more than ~10 images, in which case it waits for Wi-Fi and power. Also, which of my devices gets the pleasure of back-checking over 20,000 images already on iCloud? My Mac? So I get to listen to it scream all night? If anything is going to hash all night, it's going to be my Bitcoin miner.
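
For what it's worth, here's how that order of operations reads to me in sketch form, assuming the iCloud Photos toggle is what gates steps 2 and 3. The function names and voucher type are invented; this is just the claimed flow, not actual Apple code.

```swift
import Foundation

// Sketch of the claimed order of operations; names and types are invented.
struct SafetyVoucher { let payload: Data }

func savePhotoLocally(_ photo: Data) { /* 1. write to the local photo library */ }
func hashAndTag(_ photo: Data) -> SafetyVoucher { SafetyVoucher(payload: photo) } // 2. hash/tag, prep for upload
func uploadToICloud(_ photo: Data, voucher: SafetyVoucher) { /* 3. enqueue the upload */ }

func handleNewPhoto(_ photo: Data, iCloudPhotosEnabled: Bool) {
    savePhotoLocally(photo)
    // If iCloud Photos is off, steps 2 and 3 never run (per Apple's documentation);
    // if it's on, the hashing happens on-device before the bytes leave the phone.
    guard iCloudPhotosEnabled else { return }
    let voucher = hashAndTag(photo)
    uploadToICloud(photo, voucher: voucher)
}
```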
 
Wait, did you REALLY edit my quote?


https://www.apple.com/child-safety/ It's RIGHT in their Child Safety document. Did you not even bother reading Apple's statement on this before arguing?

Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations.

Page 7

This feature runs exclusively as part of the cloud storage pipeline for images being uploaded to iCloud Photos and cannot act on any other image content on the device. Accordingly, on devices and accounts where iCloud Photos is disabled, absolutely no images are perceptually hashed. There is therefore no comparison against the CSAM perceptual hash database, and no safety vouchers are generated, stored, or sent anywhere.
 
In the technical document.
Which one?

From what I read, they only have the keys to unlock 30 photos, but only after that threshold has been met. And then those 30 photos go through a perceptual hash process to confirm that they visually match the CSAM counterpart. Then it goes to human review.
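
The threshold part, as I understand it, works roughly like the sketch below: the server can't decrypt anything until enough matching vouchers pile up. The real design uses threshold secret sharing rather than a simple counter, and all the names here are invented, so treat this as a simplification of the idea only.

```swift
import Foundation

// Simplified sketch of the match-threshold idea; the real scheme uses
// threshold secret sharing rather than a simple counter. Names are invented.
struct MatchVoucher { let encryptedPhotoDerivative: Data }

struct ThresholdGate {
    let threshold = 30                       // the reported match threshold
    private var matched: [MatchVoucher] = []

    // Returns nil while below the threshold (nothing is decryptable server-side);
    // at or above it, the matched vouchers become available for the server-side
    // check, human review, and the NCMEC referral described above.
    mutating func receive(_ voucher: MatchVoucher) -> [MatchVoucher]? {
        matched.append(voucher)
        guard matched.count >= threshold else { return nil }
        return matched
    }
}
```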
 
You know, I'm not really reading where the scan is ONLY performed IF the image is uploaded.
This reads as if every image is read and hashed, and a voucher generated. It sounds like the only thing that happens is the voucher is uploaded with the photo, IF you send it to iCloud. There's nothing really in here that says the scan ONLY happens when the image is uploaded.

[attached screenshot]
 

Page 7

This feature runs exclusively as part of the cloud storage pipeline for images being uploaded to iCloud Photos and cannot act on any other image content on the device. Accordingly, on devices and accounts where iCloud Photos is disabled, absolutely no images are perceptually hashed. There is therefore no comparison against the CSAM perceptual hash database, and no safety vouchers are generated, stored, or sent anywhere.
That does NOT mean the scan happens in transit, only that it's done on ones marked for transit.
 