
Rogifan

macrumors Penryn
Original poster
Nov 14, 2011
I have some photos from the beach. I’m trying to remove some people far in the background. On some of the photos it works perfectly, but on others it’s all pixelated and I see a message at the bottom that a safety filter is applied. Maybe it’s an 18.1 bug, but if not, will there be a way to turn this off, or does Apple get to decide what’s removable and what’s not?
 
Good question. I wish I had the answers, because it’s stupidly annoying that Apple feels the need to tell me what I can erase versus what gets pixelated.
 
I also have issues with beach pictures applying the safety filter… strange. I wonder what is causing it.
 
It seems like anything with skin gets pixelated. I tried removing a tattoo from my chest and it got pixelated. Same thing with one on my arm. It’s things like this that occasionally make me consider switching OSes. Samsung and Google don’t blur images.

Hopefully this will change as they make improvements.
 
You have to use the Pencil to paint over the object: paint a wider area around it and cover it solid, then apply Clean Up to the painted area.
 
Doesn't work for me.

This is excessive nannying in my view. Clearly Apple Intelligence is scanning the photo for nudity and, instead of just letting me edit it, has decided it needs censoring. This is a big overreach of censorship. I'm a grown adult and can decide for myself what is and isn't appropriate for me to look at.

I'm also very concerned about how the photos are being scanned: is it on-device or in the cloud? And what is the determining factor for a "safety filter" being applied or not? Most of the photos I've tried it on where the filter was applied were not NSFW, so whatever algorithm it's using isn't very good at its job either.

I'm surprised there isn't a louder reaction to this from users. Is this Apple covertly going back on their previous decision not to scan all your photos in the cloud for CSAM? Not only would that be a massive invasion of privacy, but they're also clearly not very good at determining what is and isn't NSFW content, and are flagging things incorrectly.

The privacy issue is a big concern for me. What are they doing?
 
I’ve found that if you cover the entire image by drawing over it, or crop the photo down with a shape crop to just the part you want edited, the AI won’t pick it up, but you have to hide as much as you can for it to work. Yeah, super annoying.
 
It's usually things in the photo, like dirt on mirrors, that I used to be able to blur out in the desktop app with the heal tool. That's the kind of thing I want to edit, but that tool doesn't exist now and you can ONLY use the AI fix tool. I hate it.
 
Insane! We live a block from a beach and recently had a family outing. I started trying to remove things like pop cans and garbage bins from some of the photos, but because there were men in swim trunks (knee-length, mind you) in the photos, I could only “redact” items. The bulk of my photos are at the beach, so this feature is now useless! What a joke. If AI is so intelligent, it should do a better job of recognizing inappropriate content, or let us override it.
 
Just now running into this. Terrible. Nanny Apple overreaching for sure here.
 
Apple removed Retouch and replaced it with Clean Up, on both iOS and macOS.
If Clean Up is meant to be a 'safety' thing, it seems to work in reverse of its intent, making something obvious when you'd rather just delete it. Further, if it 'thinks' something might be a face or graphic, it will pixelate it. Retouch, while hardly perfect, was far more granular and let the user decide what to keep. The current offering (now into its second OS version) is terrible.
 
I want Retouch back. They don't even have to get rid of Clean Up. Leave it there, I won't use it; I just want the Retouch tool. I'm sick of basically all the photo editing apps now having only AI editing tools. It's all garbage.
 
Best advice? Pay the £5/$5 for Pixelmator and use the repair tool.

Personal betterment > relying on AI.
 
And how does this approach even make sense? If Apple views nudity as inherently bad, shouldn't that be reason enough to allow a user to edit out the supposedly nude people?

Beyond that, this view on nudity makes no sense. Nude beaches, for example, are typically entirely legal where they exist. Women can be legally topless in NYC or San Francisco (to name just two such places), and this has been true for decades. There are many, many other situations in which nudity is non-objectionable, or at least legal. All of this, of course, is in addition to the fact that Apple's fairly unintelligent AI is mistakenly confusing skin generally with particular body parts it thinks must remain hidden.

I'm all for companies like Apple taking steps to combat exploitation of minors, assuming that's the goal here. But if those measures affect normal use, then they aren't ready for prime time.
 