(Edited because everyone has said the same thing)
The old child-porn detector was also on-device, and it never analysed the images themselves. It compared a hash of each image against a database of hashes of known child-porn images supplied to Apple. It only alerted the appropriate authority when a match occurred, as a lead for investigation. No images left the device for searching. I still can't see what the issue was.
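To make that mechanism concrete, here is a deliberately simplified Swift sketch of on-device hash matching. Apple's actual design used a perceptual hash (NeuralHash) with blinded, threshold-based matching rather than a plain cryptographic digest, so treat this only as an illustration of the idea that the image itself never has to leave the device; the hash list and matcher shown here are placeholders, not Apple's implementation.

```swift
import CryptoKit
import Foundation

// Simplified illustration of on-device hash matching (NOT Apple's NeuralHash).
struct KnownHashMatcher {
    // Placeholder: in the real system the hash list was supplied to Apple by
    // child-safety organisations, not hard-coded on the device like this.
    let knownHashes: Set<String>

    // Hash the raw image bytes and check membership in the known-hash set.
    // Nothing here uploads the image; only the match result would be reported.
    func matches(_ imageData: Data) -> Bool {
        let digest = SHA256.hash(data: imageData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return knownHashes.contains(hex)
    }
}
```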
> One of the issues was that Apple would show the image to a person as a final check.
Yeah, you definitely want a human to check the image before law enforcement gets involved.
Another issue was that Apple was using your phone’s resources to spy on you.
> About time. Never understood those folks who feel the need to take a knob photo and message it to a stranger or near-stranger. I mean, do they do that in public when they see a person they like?
Chances are, yes, they do. Or they will escalate to doing that later.
> Gotta make sure you never see the tools that were used to create you.
Well, I DEFINITELY don't want to see pics of THOSE, thank you very much!
> Yes? How else do you think they also identify the faces of your family and friends, and now even your dogs?
That's not “everything”. Apple analysing images in its Photos library service is less intrusive than analysing every image rendered by the device.
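For what it's worth, on-device face detection of the kind Photos relies on is exposed publicly through Apple's Vision framework. This is a minimal sketch, not the Photos pipeline itself, just a demonstration that this sort of analysis runs locally:

```swift
import Foundation
import Vision

// Run Apple's on-device face detector on a local image file and return the
// number of faces found. Vision runs locally; no image data is sent anywhere.
func countFaces(in imageURL: URL) throws -> Int {
    let request = VNDetectFaceRectanglesRequest()
    let handler = VNImageRequestHandler(url: imageURL, options: [:])
    try handler.perform([request])
    return request.results?.count ?? 0  // [VNFaceObservation]
}
```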
> Hopefully the processing happens on device.
It does. 👍🏻
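For reference, the public API behind this kind of check appears to be the SensitiveContentAnalysis framework (iOS 17 / macOS 14), whose classifier runs on device. A minimal sketch, assuming that framework is what backs the Messages feature and that the calling app holds the Sensitive Content Analysis entitlement:

```swift
import Foundation
import SensitiveContentAnalysis

// On-device check: returns true if the image is classified as sensitive.
// The analyzer only works when the user has turned on Sensitive Content
// Warning (or, for child accounts, Communication Safety) in Settings.
func isSensitive(imageAt url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()
    guard analyzer.analysisPolicy != .disabled else { return false }
    do {
        let analysis = try await analyzer.analyzeImage(at: url)
        return analysis.isSensitive  // classification never leaves the device
    } catch {
        return false
    }
}
```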
This is just a rebadge of the CSAM debacle. It is not to be celebrated that ON DEVICE scrutiny is being employed. That was the whole point of the opposition to the CSAM situation: it was ON DEVICE, on your device, a device you have paid for, and even if it's optional at present, you have no way of knowing whether scrutiny of your data is taking place and simply not being acted on. It's a very short step from scrutinising everything on YOUR device to providing a backdoor for regimes, governments, or organisations that want control over everything. It inevitably comes in the guise of safeguarding, but it is not about safeguarding; it is about snooping. It defeats the point of end-to-end encryption and makes Apple's policy on privacy look rather empty.
We went through this argument previously, when so many in the industry realised the dangers, yet in my opinion this is just a rebadging of the same situation, showing that Apple's determination is not about safeguarding but the opposite!
What would be the point of end-to-end encryption between devices if ON DEVICE surveillance is taking place?
It shows as optional, but that is not to say the surveillance isn't taking place and simply not being used for screening.
By all means screen on Apple's servers, screen in the cloud, since that is not our device, but leave our devices, devices we have paid for, alone. A very slippery slope indeed, and the fact that Apple has wheeled out this reincarnated surveillance in a different guise is worrying. No one should be celebrating it being ON DEVICE; that is the problem! It's YOUR DEVICE, YOU PAID FOR IT, YOU PAY TO POWER IT, YOU PAY FOR THE PROCESSING, AND YOU SHOULD EXPECT THE HIGHEST LEVEL OF PRIVACY.
By all means screen servers owned by Apple, Google or anyone else off-device, because we don't own those.
One poster mentioned a cousin repeatedly sending pictures of his balls... EASY ANSWER: BLOCK IT!
I doubt there are many unsolicited communications from unknown senders, and in any case a customer has the option on THEIR device to block the sender, or to warn them that they will be blocked if it happens again. A DEVICE OWNER'S DECISION, ON THEIR DEVICE.
In the UK it's already illegal to send unsolicited photos of that ilk, so it doesn't require Apple to be the surveillance master of OUR DEVICES.
> One poster mentioned a cousin repeatedly sending pictures of his balls... EASY ANSWER: BLOCK IT!
This is about hiding the content BEFORE the user blocks it.
> Yeah, you definitely want a human to check the image before law enforcement gets involved.
This isn't CSAM at all. It's not forced on across all devices. It doesn't report back to employees to review and sic the police on you. It will be interesting to see how it works in practice. The downside, I suspect, is that for people who enable this, if someone sends pics about breast cancer or Michelangelo's David, they will be blocked. That said, if you identify what you're sending, they can choose to see it if they really want to. You'll be able to ignore goatse pics too! 😉
> This isn't CSAM at all. It's not forced on across all devices. It doesn't report back to employees to review and sic the police on you.
I know this is not CSAM; I was merely pointing out that for detecting CSAM, you'd want a human to verify the contents of the picture BEFORE reporting it to law enforcement. We know how badly wrong automating law enforcement can go.
Were you successful in your scientific trials? There must be more than enough material on the internet for that 🤔
Built-in weenie detector? Wonder how they trained that AI…
> Gotta make sure you never see the tools that were used to create you.
It becomes such a big, big thing and upsets people more with increasing restrictions.
> The feature will prevent adult iPhone users from being subjected to unwanted imagery.
What about non-adult users? Do they not deserve this setting as well? Maybe it's not actually on-device, if an account is needed to activate the feature.
I really appreciate this feature and wonder if Apple can add an adjustable sensitivity setting. There are lots of images that might not be full nudity but would nonetheless be nice to have hidden.
> Why is everybody assuming this is for pictures of male anatomy only???
Because this forum is full of frat boys.