How would your pictures have gotten into NCMEC's database? I'm assuming you don't post them on illicit sharing sites.
It wouldn't. But all NN/ML algorithms have some margin for error. Also, before we even get into my paranoia about getting my images falsely matched with some database... this is an enormous invasion of privacy. What I do on my phone is truly my business. The "if you're not guilty, you have nothing to worry about" defense is the worst possible take on privacy-related issues. It's not Apple's business to scan my personal device without my consent. Hashed, encrypted, anonymized, or any other abstraction technique is not a defense either. They are still running an algorithm on my device, with my data, without my consent, and then sending the resulting data externally. And that is where the argument should end.
 
How would your pictures have gotten into NCMEC's database? I'm assuming you don't post them on illicit sharing sites.
Interesting question. He certainly would not post them on illicit sites, but someone could have gotten that photo and posted it. The photo gets flagged as abusive - you are the "producer" of that photo and own it (it is your son...). Since it is flagged, you now own abusive material.

Interesting development...
 
Huh? A false positive would immediately be recognized and dismissed once it is reviewed by an actual person. It would never even be sent to law enforcement. In fact, you probably won't ever even know about it.
Stop and think about that: “Hey, some complete stranger, unbeknownst to me, reviewed an image of mine that I did NOT put on a public server, and that’s OK because I didn’t know about it.”
That may not be what you intended, but that’s how it came across.
 
It's still a slippery slope here. What next, will Apple disallow nudity in iCloud, because, #morals?

Plus, the whole appeal of iCloud over, say, Google Photos was that there was no cloud AI meddling with your photos or scanning them at all.

Now, what is the difference really?

It's not even a slippery slope...I'd argue that scanning every photo on every iOS device on the planet is already the bottom of a sheer cliff.
 
Let me just ask: How are they going to personally review a picture they don’t have access to? I’m just asking because I don’t know how this can be done.

But they DO have access to it. From the article:

Before an image is stored in iCloud Photos, Apple said an on-device matching process is performed for that image against the unreadable set of known CSAM hashes. If there is a match, the device creates a cryptographic safety voucher. This voucher is uploaded to iCloud Photos along with the image, and once an undisclosed threshold of matches is exceeded, Apple is able to interpret the contents of the vouchers for CSAM matches. Apple then manually reviews each report to confirm there is a match, disables the user's iCloud account, and sends a report to NCMEC. Apple is not sharing what its exact threshold is, but ensures an "extremely high level of accuracy" that accounts are not incorrectly flagged.
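To make the flow that quote describes a bit more concrete, here is a rough sketch of the match-then-threshold idea. Everything in it is an assumption for illustration: Apple's real system uses a perceptual NeuralHash plus private set intersection and threshold secret sharing, so the device never learns whether a match occurred, and none of the names or the plain SHA-256 digest below are Apple's actual API.

Code:
import Foundation
import CryptoKit

// Rough sketch only. The real system uses a perceptual NeuralHash, private set
// intersection, and threshold secret sharing; the device cannot tell whether a
// photo matched. This simplification uses an exact SHA-256 digest and made-up
// names purely to illustrate the "match, voucher, threshold" flow.

// Placeholder for however the blinded hash database would reach the device.
func loadKnownHashDatabase() -> Set<String> {
    return []   // empty in this sketch
}

let knownCSAMHashes = loadKnownHashDatabase()

struct SafetyVoucher {
    let imageIdentifier: String
    let matchedHash: String
}

// Step 1: before upload, hash the photo on-device and check it against the set.
func voucherIfMatched(imageData: Data, identifier: String) -> SafetyVoucher? {
    let hex = SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
    guard knownCSAMHashes.contains(hex) else { return nil }
    return SafetyVoucher(imageIdentifier: identifier, matchedHash: hex)
}

// Step 2: nothing is readable or reviewed server-side until the number of
// matching vouchers for an account crosses the threshold.
let reviewThreshold = 30   // stand-in value; the article says the real threshold is undisclosed

func shouldEscalateForHumanReview(vouchers: [SafetyVoucher]) -> Bool {
    vouchers.count >= reviewThreshold
}

The point of the real cryptography is that the match count stays hidden from both the device and Apple until that threshold is crossed; the sketch just shows where the threshold sits in the flow.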
 
I find it ironic that this is the ad displaying on the thread for this article…
 

Attachments: [screenshot of the ad referenced above]
Stop and think about that: “Hey, some complete stranger, unbeknownst to me, reviewed an image of mine that I did NOT put on a public server, and that’s OK because I didn’t know about it.”
That may not be what you intended, but that’s how it came across.

Let's not get sidetracked. The guy I was replying to was saying someone's life could be ruined by a false positive. All I'm saying is obviously Apple is not going to report a false positive to anyone after they identify it, thus you will not be falsely arrested for possession of child porn and have your name sullied - which is the only thing I can imagine they meant by having your life ruined.

EDIT, for all of you with disagreeable and angry reactions: please explain how your life would be ruined by a false positive that never reaches law enforcement. This should be interesting.
 
That's a ******** argument. That's like arguing that because you bought a Tesla and consented to a software update being installed, it was okay for Tesla to start charging you an additional $0.10 per mile driven.

You know that user agreement everyone just clicks on "accept" for when installing an update? That stuff is actually real and legally binding. Just saying.
 
So basically, they force an upload of the image to iCloud Photos even if you don't use iCloud Photos, even if you only have your local library, right?

My understanding is that the scanning is happening on the phone, but it doesn't affect anything unless you actually upload flagged images to iCloud.

@peanuts_of_pathos why the angry face? Read the article:
Before an image is stored in iCloud Photos, Apple said an on-device matching process is performed for that image against the unreadable set of known CSAM hashes. If there is a match, the device creates a cryptographic safety voucher. This voucher is uploaded to iCloud Photos along with the image, and once an undisclosed threshold of matches is exceeded, Apple is able to interpret the contents of the vouchers for CSAM matches.
 
It wouldn't. But all NN/ML algorithms have some margin for error. Also, before we even get into my paranoia about getting my images falsely matched with some database... this is an enormous invasion of privacy. What I do on my phone is truly my business. The "if you're not guilty, you have nothing to worry about" defense is the worst possible take on privacy-related issues. It's not Apple's business to scan my personal device without my consent. Hashed, encrypted, anonymized, or any other abstraction technique is not a defense either. They are still running an algorithm on my device, with my data, without my consent, and then sending the resulting data externally. And that is where the argument should end.
If your pictures aren't in NCMEC's database, then there wouldn't be a match. Apple isn't scanning photos for pictures of kids, they're matching hashes.
 
It wouldn't. But all NN/ML algorithms have some margin for error. Also, before we even get into my paranoia about getting my images falsely matched with some database... this is an enormous invasion of privacy. What I do on my phone is truly my business. The "if you're not guilty, you have nothing to worry about" defense is the worst possible take on privacy-related issues. It's not Apple's business to scan my personal device without my consent. Hashed, encrypted, anonymized, or any other abstraction technique is not a defense either. They are still running an algorithm on my device, with my data, without my consent, and then sending the resulting data externally. And that is where the argument should end.

The first feature discussed is a parental controls option that runs ML locally on your children's conversations and images. This is very clever and invaluable to parents whose kids have been targeted.

The second is a comparison of a hash of your photo against known hashes of commonly distributed CSAM photos. The actual photo you took is not made available, and this is not an ML algorithm run on your photos to determine their content. Only if you get a threshold number of matches against known CSAM photo hashes does it trigger action on their side, and that is extremely unlikely to be wrong.
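For what it's worth, here is a toy illustration of why a hash lookup is different from content analysis. It assumes an exact cryptographic hash, which is a simplification: Apple's NeuralHash is perceptual, so it also matches resized or recompressed copies of a database image, but it still only matches against the known database rather than judging what your photo depicts.

Code:
import Foundation
import CryptoKit

// Toy demo (simplified assumption: exact hashing, not Apple's perceptual
// NeuralHash). With an exact hash, even a one-bit change to a photo gives a
// completely different digest, so a photo that was never in the database
// cannot match it.

func hexDigest(_ data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

let original = Data("family photo bytes".utf8)
var tweaked = original
tweaked[0] ^= 0x01           // flip a single bit

print(hexDigest(original))   // prints one digest...
print(hexDigest(tweaked))    // ...and a completely unrelated one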
 
Yeah, that's the problem. Idk about you, but I definitely don't want some random guy at Apple Park perusing my personal photos without my knowledge.

According to the article, the chance of an account being incorrectly flagged by this technology is 1 in 1 trillion. I'm sure you'll just say you don't believe that, but I can't control your imagination.
 