(Edited because everyone has said the same thing)
The old child-porn detector was also on-device, and it never analysed the images themselves. It compared each image's hash against a database of hashes of known child-porn images provided to Apple, and only alerted the appropriate authority when a match occurred, as a lead for investigation. No images left the device for searching. I still can't see what the issue was.
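For anyone hazy on how that worked: it was hash-set membership, not image inspection. A rough sketch of the idea in Swift (Apple's real proposal used a perceptual hash called NeuralHash plus a private-set-intersection protocol, not a plain SHA-256 like this):

```swift
import CryptoKit
import Foundation

// Illustration only: compare a fingerprint of the image against a set of
// known fingerprints. No image content is ever "looked at" directly.
let knownHashes: Set<String> = [] // in reality, supplied by child-safety orgs

func isMatch(imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}
```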

One of the issues was that Apple would show the image to a person as a final check.

Another issue was that Apple was using your phone’s resources to spy on you.
 
About time. I never understood those folks who feel the need to take a knob photo and message it to a stranger or near-stranger. I mean, do they do that in public when they see a person they like?
Chances are, yes, they do. Or will escalate to doing that later.
 
My kids at work are constantly showing everyone else the dick, tits, and ass pics they get sent on Snapchat, Instagram, and texts. They make sure I see all the dick pics 🤤. It's not just the kids but the women too. Almost all of my under-21 employees nowadays are non-binary and bisexual. I have one full-on straight boy who's the outlier.
 
I'm open to trying this out. Send those DUCK pics :)

Lol, is it Taco Tuesday? Oh no, that was yesterday. Got some duck pics last week, in fact.
 
Yes? How else do you think they also identify the faces of your family and friends, and now even your dogs?
That's not "everything"; Apple analysing images in its photo library service is less intrusive than analysing every image rendered by the device.
 
This is just a rebadge of the CSAM debacle. It is nothing to celebrate that ON DEVICE scrutiny is being employed. That was the whole point of being so against the CSAM situation: it was ON DEVICE, on your device, a device you have paid for, and even if it's optional at present, you have no way of knowing whether scrutiny of your data is taking place and simply not being surfaced. It is a very short step from scrutinising everything on YOUR device to providing a backdoor for regimes, governments, or organisations that want control over everything. It inevitably comes in the guise of safeguarding, but it is not about safeguarding, it is about snooping. It defeats the point of end-to-end encryption and makes Apple's policy on privacy look rather empty.

We went through this argument previously, when so many in the industry realised the dangers. In my opinion this is just a rebadging of the same situation, showing determination by Apple towards not safeguarding but the opposite!

What would be the point of end-to-end encryption between devices if ON DEVICE surveillance is taking place?

It shows as optional, but that is not to say the surveillance is not taking place; it may simply not be used for screening.

By all means screen on Apple's servers, screen in the cloud, as that is not our device, but leave our devices, devices we have paid for, alone. A very slippery slope indeed, and the fact Apple has wheeled out this reincarnated surveillance in a different guise again is worrying. No one should be celebrating it being ON DEVICE; that is the problem! It's YOUR DEVICE, YOU PAID FOR IT, YOU PAY TO POWER IT, YOU PAY FOR THE PROCESSING SPEED, and YOU SHOULD EXPECT THE HIGHEST LEVEL OF PRIVACY.

By all means screen servers owned by Apple, Google or anyone else, off-device, because we don't own those.

One poster mentioned a cousin repeatedly sending pictures of his balls... EASY ANSWER: BLOCK HIM!

I doubt there are many unsolicited communications from unknown senders; in any case, a customer has the option on THEIR device to block the sender, or to warn them that they will be blocked. A DEVICE OWNER'S DECISION ON THEIR DEVICE.

In the UK it's already illegal to send unsolicited photos of that ilk, so it doesn't require Apple to be surveillance master of OUR DEVICES.
 
One poster mentioned a cousin repeatedly sending pictures of his balls... EASY ANSWER: BLOCK HIM!

I doubt there are many unsolicited communications from unknown senders; in any case, a customer has the option on THEIR device to block the sender, or to warn them that they will be blocked. A DEVICE OWNER'S DECISION ON THEIR DEVICE.
This is about hiding the content BEFORE the user blocks it.
 
Yeah, you definitely want a human to check the image before law enforcement gets involved.
This isn't CSAM at all. It's not forced on across all devices. It doesn't report back to employees for review or sic the police on you. It will be interesting to see how it works in practice. The downside, I suspect, is that for people who enable this, if someone sends pics about breast cancer or Michelangelo's David, they will be blocked. With that said, if the sender identifies what they're sending, the recipient can choose to see it if they really want to. You'll be able to ignore goatse boy pics too! 😉

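On how it works in practice: Apple shipped a SensitiveContentAnalysis framework at WWDC23, which is presumably the same machinery behind this feature. A minimal sketch of how an app might use it, assuming the API behaves as documented (I haven't shipped with it):

```swift
import Foundation
import SensitiveContentAnalysis

// Sketch based on the documented SCSensitivityAnalyzer API; exact
// behaviour in shipping iOS may differ.
func blurIfSensitive(imageURL: URL) async {
    let analyzer = SCSensitivityAnalyzer()

    // The analyzer is inert unless the user has opted in via
    // Sensitive Content Warning (or Communication Safety for kids).
    guard analyzer.analysisPolicy != .disabled else { return }

    do {
        let result = try await analyzer.analyzeImage(at: imageURL)
        if result.isSensitive {
            // The app blurs the image and offers a "Show anyway" button.
            // Nothing is sent to Apple; no one is reported.
        }
    } catch {
        // Analysis can fail (unreadable file, etc.); the app decides
        // whether to fail open or closed.
    }
}
```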
 
This isn't CSAM at all. It's not forced on across all devices. It doesn't report back to employees for review or sic the police on you.
I know this is not CSAM; I was merely pointing out that for detecting CSAM, you'd want a human to verify the contents of the picture BEFORE reporting it to law enforcement. We know how badly wrong automating law enforcement can go.
 
Gotta make sure you never see the tools that were used to create you.
It becomes such a big thing, and increasing restrictions upset people more. What is forbidden invites more and more troublesome emotions in people. What I have seen in my travels is different from Sweden, and I don't see a lot of upset, or even trouble, with nudes here.

I don't have trouble with nudes reaching my devices, and I manage my contact settings to suit my ethics pretty well. That said, it's great to have settings for our devices; they save time with all kinds of spam.
 
The feature will prevent adult iPhone users from being subjected to unwanted imagery.
What about non-adult users? Do they not deserve this setting as well? Maybe it's not actually on-device, if an account is needed to activate the feature.
 
I want this on the AR headset - to blur out people that I find offensive to look at... :p

TBH I don't know anyone (men or women) who has received unwanted nudes - but this may be dependent on country and culture.
 
I really appreciate this feature and wonder if Apple can implement an adjustment to the sensitivity settings. There are lots of images that might not be full nudity but nonetheless would be nice to have hidden.
 
A sensitivity bar with tolerance from minimum to maximum, just like an audio volume slider, could make everyone happy according to their needs and tolerance.
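Purely as a sketch of what I mean (hypothetical, since the current framework only returns a yes/no verdict, not a confidence score):

```swift
// Hypothetical: if Apple ever exposed a confidence score, a sensitivity
// slider could map straight onto a blur threshold.
func shouldBlur(score: Double, slider: Double) -> Bool {
    // slider in 0...1: 0 blurs only near-certain nudity,
    // 1 blurs anything even mildly suggestive.
    let threshold = 1.0 - 0.7 * slider // 1.0 down to 0.3
    return score >= threshold
}
```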
 
This would be a great feature if it blocked **** meme gifs instead. Preferably before they get downloaded to my phone.
 