
MacRumors
macrumors bot, Original poster
Apple is adding a new feature in iOS 17 that is designed to automatically block incoming messages and files that may contain sensitive content like nudity.


Opt-in blurring can be applied to sensitive images sent in Messages, AirDrop, Contact Posters for the Phone app, FaceTime messages, and third-party apps, preventing adult iPhone users from being subjected to unwanted imagery. All nudity is blurred by default, but it can be viewed by tapping the "Show" button.

Sensitive Content Warnings work like the Communication Safety functionality that Apple added for children, with all detection done on device so Apple does not see the content that's being shared. The feature is an expansion of the Communication Safety options that Apple introduced for children last year.

Communication Safety detects and blocks nude images before children can view them, and with iOS 17, this too will expand to encompass AirDrop, the systemwide photo picker, FaceTime messages, and third-party apps.

Article Link: iOS 17 Can Automatically Block Unsolicited Nude Photos With 'Sensitive Content Warnings'
 
Does this mean Apple will be analysing everything we see? This would make it like the child porn detection feature, where the main takeaway was that all photos would be analysed by a third party.
That is not at all how these algorithms work. They're not very complicated and can run on the device. There are intro-level Coursera courses with code examples that do this to a limited extent, and it's not terribly difficult to extend them to this kind of detection; it's purely the device examining the image locally for patterns.

If it were doing something like automatically identifying any person, ever, it would need the cloud just for the pattern database. But for "looks like a wang", that's a very small set of patterns that can easily be stored locally.
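For what it's worth, iOS 17 even exposes this on-device check to third-party developers through the new SensitiveContentAnalysis framework. A rough sketch of how an app might call it (the function name and URL parameter here are illustrative, and the app needs a special entitlement from Apple):

```swift
import SensitiveContentAnalysis

// Sketch: ask the on-device model whether an image is sensitive.
// Nothing is uploaded; the analyzer runs entirely on the device.
func shouldBlur(imageAt url: URL) async throws -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // The check is a no-op unless the user has opted in to Sensitive
    // Content Warnings (or Communication Safety is on for a child).
    guard analyzer.analysisPolicy != .disabled else { return false }

    let analysis = try await analyzer.analyzeImage(at: url)
    return analysis.isSensitive // caller blurs and offers a "Show" button
}
```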
 
That is a good question. Depends on whether it happens on-device or not.
Sensitive Content Warnings work like the Communication Safety functionality that Apple added for children, *with all detection done on device so Apple does not see the content that's being shared*.
I bolded the text from the article for you ...
 
Does this mean Apple will be analysing everything we see? This would make it like the child porn detection feature, where the main takeaway was that all photos would be analysed by a third party.

(Edited because everyone has said the same thing)
The old child porn detector was also on-device, and it never analysed the images themselves. It compared a hash of each image against hashes of known child porn provided to Apple, and only alerted the appropriate authority when a match occurred, as a lead for investigation. No images left the device for searching. I still can't see what the issue was.
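For anyone curious what that matching looks like mechanically, here's a schematic sketch. It uses a toy "average hash" rather than Apple's actual NeuralHash (a private, neural-network-based perceptual hash), so it only illustrates the shape of the approach:

```swift
// Toy "average hash": each bit records whether a pixel in an 8x8
// grayscale thumbnail is brighter than the thumbnail's mean.
func averageHash(of thumbnail: [UInt8]) -> UInt64 {
    precondition(thumbnail.count == 64, "expects an 8x8 grayscale thumbnail")
    let mean = thumbnail.reduce(0) { $0 + Int($1) } / 64
    var hash: UInt64 = 0
    for (i, pixel) in thumbnail.enumerated() where Int(pixel) > mean {
        hash |= 1 << UInt64(i)
    }
    return hash
}

// Matching is a Hamming-distance comparison against a locally stored
// set of known-bad hashes; no image data has to leave the device.
func matchesKnownHash(_ hash: UInt64, against database: [UInt64],
                      maxDistance: Int = 4) -> Bool {
    database.contains { (hash ^ $0).nonzeroBitCount <= maxDistance }
}

// Usage (hypothetical local database of known-bad hashes):
// let flagged = matchesKnownHash(averageHash(of: pixels), against: knownHashes)
```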
 
Simple machine learning feature. Most likely based on some existing models, quantized down to the 16-bit precision used for on-device ML.
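As a toy illustration of what 16-bit means here (not Apple's actual pipeline): storing model weights as Float16 halves their memory footprint and matches the Neural Engine's native precision, at the cost of everything beyond roughly three significant decimal digits:

```swift
// Float16 keeps ~3 significant decimal digits vs. ~7 for Float32,
// but halves the storage and runs natively on the Neural Engine.
let fullPrecision: [Float] = [0.123456789, -1.000001, 3.14159265]
let halfPrecision = fullPrecision.map { Float16($0) }
print(halfPrecision) // values round to roughly [0.1235, -1.0, 3.14]
```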
 
Does this mean Apple will be analysing everything we see? This would make it like the child porn detection feature, where the main takeaway was that all photos would be analysed by a third party.

The involvement of a third party was not in the analysis of the content, but as the source of the "hashes" that represent questionable content. If anyone believed that Apple was going to pass content over to a third party for analysis, that was *never* going to happen. The analysis was *always* happening on-device. They stated that clearly.

As per this article...
...all detection done on device [sic] so Apple does not see the content...

Analysis is done "on-device". Read that again ... "on-device". Are people not reading the article? The editor should have emphasized this important piece of info.
 