
Rafagon
macrumors 6502a · Original poster
Greetings.

Immediately below this paragraph is the original screenshot (after I cropped it) from a post on Threads. It's a screenshot of a Russian grammar lesson; in the original post, the lady explains the concept in question. I'm trying to erase her face using the Photos Clean Up tool, because it contributes nothing to my screenshot.

[Attached screenshot: IMG_2015.jpeg]

Immediately below this paragraph is what the Clean Up tool does when I attempt to erase her face. Instead of erasing it, the tool pixelates it and, as you can see in my screenshot, displays the message "Safety filter applied" near the bottom of the screen. The message disappears after a couple of seconds.


[Attached screenshot: IMG_2023.png]

I'm not trying to protect her identity; I'm trying to erase her face completely!

I have found no way to change this.

Can I go back to erasing the part of the photo I circle, instead of having it protect the person's identity by pixelating their face?

Why does it assume I'm trying to hide the person's identity?

And if that is in fact why the Clean Up tool pixelates the person's face (to protect their identity), then why won't it let me erase her face completely? That would protect her identity even better than merely pixelating it.

I know the Clean Up tool is perfectly capable of erasing an entire person from a photo. Why won't it let me erase just the face?

This is extremely frustrating.

OS and device: iOS 18.2 RC on an iPhone 16 Pro Max

Any and all help will be greatly appreciated.
 
The safety filter actually has nothing to do with protecting the person's identity. It gets triggered when a picture is recognized as NSFW; in this case, your picture was mistakenly identified as NSFW. You may submit feedback so Apple can improve their detection logic.
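For anyone curious how this kind of on-device NSFW check behaves: Apple doesn't document which classifier Clean Up uses, but it does ship a public one in the SensitiveContentAnalysis framework (iOS 17+). Here's a minimal Swift sketch, assuming the required com.apple.developer.sensitivecontentanalysis.client entitlement and that Sensitive Content Warning is enabled in Settings, that tests whether a given image trips it:

import Foundation
import SensitiveContentAnalysis

// Asks Apple's on-device sensitivity classifier about one image file.
// Whether Clean Up uses this same model internally is an assumption,
// not something Apple confirms.
func checkSensitivity(of imageURL: URL) async {
    let analyzer = SCSensitivityAnalyzer()

    // The analyzer only runs when the user has enabled Sensitive
    // Content Warning (or Communication Safety) in Settings.
    guard analyzer.analysisPolicy != .disabled else {
        print("Sensitivity analysis is disabled on this device.")
        return
    }

    do {
        let analysis = try await analyzer.analyzeImage(at: imageURL)
        print(analysis.isSensitive
            ? "Flagged as sensitive (the kind of false positive worth reporting)."
            : "Not flagged.")
    } catch {
        print("Analysis failed: \(error)")
    }
}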
 
I would never have guessed that was the case.

Thank you for your reply. Yes, I'll submit feedback to Apple; hopefully that option isn't too difficult to find, and hopefully it'll even send a screenshot to Apple so they can see just how badly their NSFW-detection algorithm failed in this particular case.
 
Oh. I just looked again at the screenshot I was trying to use the Clean Up tool on.

I see. Her mouth was open. I'm sure that had something to do with the NSFW false alarm.

As if women don't open their mouths to utter words.

Apple Intelligence's mind was very much in the gutter.
 

Yeah, I was quite surprised about that one too. I guess that's why it's called a beta 😊
 
Same issue in the Photos app on macOS Sequoia. Anything Apple Intelligence decides is a face (including things that aren't, like some graphics) gets pixelated rather than removed by Clean Up. And they've removed the Retouch tool.

"Apple Intelligence" is proving to be an oxymoron, while Apple's penchant for playing nanny, deciding what we really want or need, is irritating.
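If you want to see those face false positives for yourself, Apple's public Vision framework exposes its face detector. A quick sketch, with the caveat that this is my assumption of a similar detector; Apple doesn't document which model gates the pixelation in Photos:

import Foundation
import Vision

// Lists the face bounding boxes Vision finds in an image file.
// Graphics that show up here are exactly the misfires described above.
func detectFaces(in imageURL: URL) throws -> [CGRect] {
    let request = VNDetectFaceRectanglesRequest()
    let handler = VNImageRequestHandler(url: imageURL, options: [:])
    try handler.perform([request])
    // Bounding boxes are normalized (0...1), origin at the bottom-left.
    return (request.results ?? []).map(\.boundingBox)
}

Running that over a screenshot like the one above shows whether the detector fires on the inset talking-head frame, or on graphics that merely look face-like.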
 