Does this mean Apple will be analysing everything we see? That would make it like the child porn detection feature, where the main takeaway was that all photos would be analysed by a third party.
Yes, they have "analysts" looking at all of our photos. And that's how the CSAM detection worked also, you totally nailed it bro.
 
Built-in weenie detector? Wonder how they trained that AI…
This guy won’t like it!

[Attachment: 1686226271062.jpeg]

Neither will this one…

[Attachment: IMG_0025.jpeg]
 
(Edited because everyone has said the same thing)
The old child porn detector was also on-device, and it never analysed the images themselves. It compared a hash of each image against hashes of known child porn supplied to Apple. It only alerted the appropriate authority when a match occurred, as a lead for investigation. No images left the device for searching. I still can’t see what the issue was.
A secure (cryptographic) hash produces a totally different output if even one pixel is modified, which would make circumventing that system trivial. To work around that, Apple’s system (like others) applies a degree of fuzziness, a perceptual hash, to “undo” small modifications. That fuzziness inherently also increases the likelihood of hash collisions, where two different images produce the same hash.
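
To make that concrete, here is a minimal sketch of a generic perceptual "difference hash", not Apple's actual NeuralHash; the 9x8 grid, the matching rule, and all names here are illustrative assumptions only:

```swift
// Minimal sketch of a generic perceptual "difference hash" (dHash).
// NOT Apple's NeuralHash; assumes the image is already decoded and
// downscaled to a tiny grayscale grid.
func dHash(grid: [[UInt8]]) -> UInt64 {
    precondition(grid.count == 8 && grid.allSatisfy({ $0.count == 9 }),
                 "expects an 8x9 grid of grayscale samples")
    var hash: UInt64 = 0
    var bit: UInt64 = 1
    for row in grid {
        for col in 0..<8 {
            // Each bit only records whether brightness rises left-to-right,
            // so recompression or a tweaked pixel rarely flips many bits.
            if row[col] < row[col + 1] { hash |= bit }
            bit <<= 1
        }
    }
    return hash
}

// Matching is usually "hashes differ in at most N bits", not exact equality.
// That tolerance is exactly what opens the door to collisions between
// genuinely different images.
func isMatch(_ a: UInt64, _ b: UInt64, maxBitsDifferent: Int = 4) -> Bool {
    (a ^ b).nonzeroBitCount <= maxBitsDifferent
}
```

Tighten maxBitsDifferent and small edits start slipping past; loosen it and false matches between unrelated images go up. That trade-off is the fuzziness described above.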

The consequences of a hash collision are, of course, pretty dire in this specific case. Even an account being flagged for review could mean massive headaches for an innocent person. Apple tried to design around this by requiring manual review, and requiring a number of matches before that review would occur, but that doesn’t fix the separate issue of who controls the hash databases.

Apple was going to require a hash to be present in multiple databases, but as we’ve seen recently with Russia and Belarus, for example, it’s possible for a state to develop so much influence over its region that it effectively has puppet states, and it’s not as if Russia’s government is currently known for its love of freedom of expression. So what happens when an oppressive government with puppet states tells Apple to add its preferred hashes or face an import ban? This leaves the system vulnerable to abuse against innocent people for reasons completely unrelated to CSAM.
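
Roughly sketched (the types, names, and the 30-match figure below are my own illustration of the publicly described safeguards, not Apple's code), the policy amounted to: a hash only counts if independent organizations all list it, and nothing is escalated until a threshold of such matches is reached:

```swift
// Illustrative only: a hash counts as a CSAM match when it appears in the
// databases of at least two independent organizations, and an account is
// surfaced for human review only after a threshold of such matches.
struct MatchPolicy {
    let databases: [Set<UInt64>]   // one hash set per child-safety organization
    let requiredSources = 2        // a hash must appear in this many databases
    let reviewThreshold = 30       // matches needed before any human review

    func isKnownHash(_ hash: UInt64) -> Bool {
        databases.filter({ $0.contains(hash) }).count >= requiredSources
    }

    func shouldFlagForReview(uploadedHashes: [UInt64]) -> Bool {
        uploadedHashes.filter(isKnownHash).count >= reviewThreshold
    }
}
```

Which is exactly the weak point above: if one government effectively controls two of those "independent" databases, requiredSources stops being a safeguard at all.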

That’s what the issue was.
 
Hopefully the feature will include the ability to add "preferences" to fine-tune which images not to block! It shouldn't be all or nothing, after all! :)
 
Does this mean Apple will be analysing everything we see? That would make it like the child porn detection feature, where the main takeaway was that all photos would be analysed by a third party.

🙄 The feature you are lying about is analyzed on-device, and Apple never planned to have the CSAM one analyzed by a third party either. The third party would simply have provided a database of known hashes for your own device to compare against and detect those images.
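
A heavily simplified sketch of that division of labour (Apple's published design used a blinded database and private set intersection, so the device never even learns which photos matched; a plain set lookup is used here only to show where the comparison happens):

```swift
// Sketch only: the third party ships a list of hashes; the comparison runs
// entirely on the device, and the photo itself never leaves it.
struct OnDeviceMatcher {
    let knownHashes: Set<UInt64>   // supplied database, e.g. bundled with an OS update

    // Only a locally computed perceptual hash is checked against the list.
    func matches(photoHash: UInt64) -> Bool {
        knownHashes.contains(photoHash)
    }
}
```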
 
I know this is not CSAM; I was merely pointing out that for detecting CSAM, you’d want a human to verify the contents of the picture BEFORE reporting it to law enforcement. We know how badly automating law enforcement can go wrong.

No, I do not. Setting aside whether cops are humans (they aren't), I do not want ANYONE able to "verify" ANYTHING on anyone's device without the user's permission. That's why we were so upset about the "but think of the children" excuse Apple tried to use for the plan to have our devices report ANYTHING to ANYONE. Today it's kiddy porn, tomorrow it's criticism of the government.

If this feature is truly isolated on-device, with appropriate safeguards to ensure it's not sending anything off-device, it's mostly harmless. I'd still rather not have any image analysis going on without me explicitly initiating it on a specific image, whether that's a weenie detector or a face detector.
 
No, I do not. Setting aside whether cops are humans (they aren't), I do not want ANYONE able to "verify" ANYTHING on anyone's device without the user's permission. That's why we were so upset about the "but think of the children" excuse Apple tried to use for the plan to have our devices report ANYTHING to ANYONE. Today it's kiddy porn, tomorrow it's criticism of the government.

If this feature is truly isolated on-device, with appropriate safeguards to ensure it's not sending anything off-device, it's mostly harmless. I'd still rather not have any image analysis going on without me explicitly initiating it on a specific image, whether that's a weenie detector or a face detector.
At some point someone has to be trusted to check for this stuff, or we just have to live with CSAM.
 
Does this mean Apple will be analysing everything we see? That would make it like the child porn detection feature, where the main takeaway was that all photos would be analysed by a third party.
Someone didn’t see that all the processing is done on-device. Do you have any reading comprehension?
 
Imagine the poor souls who had to develop this technology and do all kinds of testing... lmao
 
Why is everybody assuming this is for pictures of male anatomy only???
Because it's rare for a guy to receive unsolicited nudie pics of women.😉

There was that one time I was patching my roof after Hurricane Ike. I received some unsolicited MMS messages stating, "Hot shingles in your area needing to be laid" and "Hot shingles within 5 miles looking to get nailed." 😅
 
Apple doesn't see your photos; your phone just checks them on-device.
So customers pay for the devices, pay for the energy to charge them, yet Apple can usurp those devices again in the name of protection, and again it's ON DEVICE, where Apple has no right to act as arbiter. I'm involved in sensitive work, and I would not be allowed to use Apple devices if APPLE can interrogate my device. There have been so many instances where so-called optional functions have proven to operate regardless. Plenty of companies have been fined for misusing customers' data, but in this case it's worse: selling you a device, expecting you to pay for the energy to charge it, and then running functions that must affect performance. This may not be CSAM in name, but the principle is the same. SURVEILLANCE ON DEVICE. NO NO NO Apple.

Just take a cursory look at how many times Apple has been fined over privacy. It's no good talking the talk if they don't walk the walk.

It's the same slippery slope that industry experts condemned Apple for with the CSAM ON DEVICE surveillance, and it's the fact that it's ON DEVICE that makes it that slippery slope.

Of course they can check whatever they like on their own servers, but usurping users' purchased Apple devices is not on. We've already seen Apple acquiesce to China's demands in certain situations. The same is true of the US agencies crying out for a backdoor, and this is the slippery slope that gives them one.

Anyone who suggests Apple has never been in breach of the rules only has to do a cursory check. They are by no means the worst, but this is the slippery slope again, in the guise of protection, and it flies in the face of Apple's own comments on privacy. Don't be fooled by it being an option; we've seen in the past how "optional" functions operated without users ever switching them on.
 
Can they also obscure it if somebody inadvertently sends me today's Wordle answer?
 
So customers pay for the devices, pay for the energy to charge them, yet Apple can usurp those devices again in the name of protection, and again it's ON DEVICE, where Apple has no right to act as arbiter. I'm involved in sensitive work, and I would not be allowed to use Apple devices if APPLE can interrogate my device. There have been so many instances where so-called optional functions have proven to operate regardless. Plenty of companies have been fined for misusing customers' data, but in this case it's worse: selling you a device, expecting you to pay for the energy to charge it, and then running functions that must affect performance. This may not be CSAM in name, but the principle is the same. SURVEILLANCE ON DEVICE. NO NO NO Apple.

Just take a cursory look at how many times Apple has been fined over privacy. It's no good talking the talk if they don't walk the walk.

It's the same slippery slope that industry experts condemned Apple for with the CSAM ON DEVICE surveillance, and it's the fact that it's ON DEVICE that makes it that slippery slope.

Of course they can check whatever they like on their own servers, but usurping users' purchased Apple devices is not on. We've already seen Apple acquiesce to China's demands in certain situations. The same is true of the US agencies crying out for a backdoor, and this is the slippery slope that gives them one.

Anyone who suggests Apple has never been in breach of the rules only has to do a cursory check. They are by no means the worst, but this is the slippery slope again, in the guise of protection, and it flies in the face of Apple's own comments on privacy. Don't be fooled by it being an option; we've seen in the past how "optional" functions operated without users ever switching them on.
Stop caring so much about it, and just find a dumb phone to use. That's what I'm going to do soon. I've seen enough now to know where this is all going. So I'm checking out. Done and done.
 
Does this mean Apple will be analysing everything we see? That would make it like the child porn detection feature, where the main takeaway was that all photos would be analysed by a third party.

You have misunderstood how the CSAM Detection System worked.

It was exceptionally bad at detecting nudity or pornography in general.
 