
Will you update to iOS 15 and let Apple’s operating system search your personal devices for CSAM?

  • Yes, I will update to iOS 15
    Votes: 117 (68.0%)
  • No, I will stay on iOS 14.7 or earlier
    Votes: 55 (32.0%)
  • Total voters: 172
Everyone, make sure to read Ben Thompson's piece on this.

Really, really good - particularly how he makes the distinction between a Policy Decision by Apple and the Capability they've now built into every device.

 
One’s device ought be one’s property, with all of the expectations of ownership and privacy that entails; cloud services, meanwhile, are the property of their owners as well, with all of the expectations of societal responsibility and law-abiding which that entails. It’s truly disappointing that Apple got so hung up on its particular vision of privacy that it ended up betraying the fulcrum of user control: being able to trust that your device is truly yours.
From the source above.
 
Well, I was going to. I was really looking forward to the new Safari features to help reduce tab clutter, but that’s a big, Fat HELL NO now. What was the idea behind this? So now I can get in trouble when some random joker sends me a picture over SMS? What happens when an account is breached and a bunch of “unapproved” photos get uploaded to iCloud and then synced to your devices?
Same exact thing that can happen RIGHT NOW on iOS 14. If someone uploads CSAM files to your iCloud, they're scanned. There's no difference whatsoever. You can still get "pranked".
 
Well, it's a shame that you are never going to be able to update your iPhone again because you misunderstood what Apple is doing. The child safety features in Messages have NOTHING to do with the CSAM reporting for iCloud. All Messages will do is warn children before they view or send sexually explicit material, and their parents will be notified only IF they choose to view or send that material.

Also, aren't you using a strong password and 2FA for your iCloud account? You should be! In fact, 2FA is mandatory. So in the extremely remote chance that your iCloud account is breached by some expert hacker who also has one of your Apple devices in their possession (to get the 2FA code) AND your device passcode AND who happens to know and dislike you and wants to ruin your life by uploading CSAM to your iCloud account, then I guess you have something to be worried about. If you think there's a good chance of that happening, then I'd submit that either you're running with the wrong crowd or you're an extremely paranoid person.
In fact, all of this can happen right now without iOS 15, as iCloud already scans your uploaded photos for CSAM.
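
To keep the two systems straight, the Messages feature described above boils down to something like the toy Swift sketch below. Every name here is hypothetical (this is not Apple's API), and this feature is entirely separate from the iCloud CSAM detection:

```swift
// Toy model of the Communication Safety feature in Messages, as described
// above. All names are made up for illustration; nothing here is Apple's
// real API, and this feature is separate from CSAM detection for iCloud.

enum ChildChoice {
    case declined
    case viewedAnyway
}

struct CommunicationSafety {
    /// Parents opt in to notifications via Family Sharing (per Apple's announcement).
    let parentalNotificationsEnabled: Bool

    /// Called after an on-device classifier has flagged an image as sexually
    /// explicit; nothing has left the device at this point.
    func handleFlaggedImage(choice: ChildChoice) -> String {
        switch choice {
        case .declined:
            return "Image stays blurred; no one is notified."
        case .viewedAnyway:
            return parentalNotificationsEnabled
                ? "Image is shown; parents are notified."
                : "Image is shown; no one is notified."
        }
    }
}

// Example: the child chooses to view anyway and parental notifications are on.
let safety = CommunicationSafety(parentalNotificationsEnabled: true)
print(safety.handleFlaggedImage(choice: .viewedAnyway))
```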
 
Your devices would already scan photos and perform object recognition to enable search and to group photos by the people in them, but there wasn't a framework to report any of it to authorities, or to Apple. This update adds a whistleblower to your phone: if some of your image hashes match a secret list, you become a big red dot on Apple's radar.
Just like before, when all the scanning was done at Apple. This is no different: your life could be ruined all the same if you uploaded CSAM to Apple. People don't realize that their photos were already being scanned for this content well before this announcement.
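
For anyone wondering what "image hashes matching a secret list" actually amounts to, here is a heavily simplified Swift sketch of the control flow being debated. The real system uses Apple's NeuralHash (a perceptual hash) against a blinded database, so the device cannot read the list and cannot even tell locally whether a match occurred; the hash function and types below are placeholders, not Apple's implementation:

```swift
import CryptoKit
import Foundation

// Heavily simplified, hypothetical sketch of on-device hash matching.
// Apple's real design uses NeuralHash plus a blinded database; SHA-256 is
// used here only as a stand-in so the example runs.

struct KnownHashDatabase {
    // In the real design this table ships inside the OS in blinded form.
    let hashes: Set<Data>
}

/// Returns an illustrative "safety voucher" only when the image matches the
/// database; non-matching photos produce nothing that anyone can read.
func safetyVoucherIfMatched(imageData: Data,
                            database: KnownHashDatabase) -> Data? {
    let digest = Data(SHA256.hash(data: imageData))   // placeholder hash
    guard database.hashes.contains(digest) else { return nil }
    // The real voucher is encrypted and only uploaded alongside the photo
    // as part of an iCloud Photos upload.
    return digest
}
```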
 
In fact, all of this can happen right now without iOS 15, as iCloud already scans your uploaded photos for CSAM.

Yes, but even some people who know that still have a problem with it because they mistakenly think an on-device scan is more invasive, when in reality it's much more private since Apple can't see any scan results except CSAM matches. So if a user has no CSAM on their iPhone or iPad, then Apple isn't seeing ANY scan data from their photo library. In fact, NO ONE is seeing any scanning data in that case.
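
To make the "Apple sees nothing unless there are matches" point concrete: Apple's technical summary describes a match threshold that has to be crossed before any safety vouchers can even be decrypted for human review (Apple later cited a figure of roughly 30). A toy sketch of that gating, with made-up types and numbers:

```swift
// Toy model of the threshold gating described in Apple's technical summary.
// Types, names, and the exact number are illustrative, not Apple's API.

struct SafetyVoucher {
    /// Only decryptable server-side once the account crosses the threshold.
    let encryptedPayload: [UInt8]
}

struct AccountMatchState {
    var vouchers: [SafetyVoucher] = []
    let matchThreshold = 30   // Apple publicly cited roughly 30 matches

    /// Below the threshold the server cannot decrypt any voucher, so no
    /// human ever sees scan data for the account.
    var eligibleForHumanReview: Bool {
        vouchers.count >= matchThreshold
    }
}

// Example: an account with zero matches exposes nothing at all.
let cleanAccount = AccountMatchState()
print(cleanAccount.eligibleForHumanReview)   // false
```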
 
Yes, but even some people who know that still have a problem with it because they mistakenly think an on-device scan is more invasive, when in reality it's much more private since Apple can't see any scan results except CSAM matches. So if a user has no CSAM on their iPhone or iPad, then Apple isn't seeing ANY scan data from their photo library. In fact, NO ONE is seeing any scanning data in that case.

Exactly.

The problem with your post is you're using a seldom-used thing known as common sense. (That's a joke/sarcasm, people.) Which is sadly very much missing when it comes to this non-issue of a "feature".
 
Yes, I will be upgrading. There's a ton of misinformation and knee-jerk doomsday posts going around, with a lot of people not understanding how any of this actually works.
 
Exactly.

The problem with your post is you're using a seldom-used thing known as common sense. (That's a joke/sarcasm, people.) Which is sadly very much missing when it comes to this non-issue of a "feature", as demonstrated by a few of these posts IMO.


It is technically more private, but is it right?

This is like having an HOA private cop in your house who checks and verifies before you do "A", and who will report you if you do wrong.

For the little bit that Apple gets from this, it seems to be a big effort for very little, if any, gain.

Good read.
 
This is like having an HOA private cop in your house who checks and verifies before you do "A", and who will report you if you do wrong.

If that "HOA private cop" is a non-human/robot who never reports anything but illegal activity when you attempt to share that illegal activity with others, then yes. Again, no one from Apple or any other human is seeing anything from your phone except scan data from illegal material, and only then if you upload it to iCloud.
 
If that "HOA private cop" is a non-human/robot who never reports anything but illegal activity when you attempt to share that illegal activity with others, then yes. Again, no one from Apple or any other human is seeing anything from your phone except scan data from illegal material, and only then if you upload it to iCloud.

Human, AI, or puppy; I do not want nor care for the snoop coverage.
Aside from that, why is Apple even doing this? It makes no sense at all, and the odds of them discovering CSAM this way are extremely slim at best. Look at Google's results from server/Gmail scanning. Heck, I wonder what Apple has found in their own server and mail scans ...

Based on the current info, this amount of effort makes no sense unless they have a secondary or tertiary agenda. If this was an effort to make Apple appear more "private", someone there borked the cow.
 
  • Like
Reactions: turbineseaplane
Human, AI, or puppy; I do not want nor care for the snoop coverage.
Aside from that, why is Apple even doing this? It makes no sense at all, and the odds of them discovering CSAM this way are extremely slim at best. Look at Google's results from server/Gmail scanning. Heck, I wonder what Apple has found in their own server and mail scans ...

Based on the current info, this amount of effort makes no sense unless they have a secondary or tertiary agenda. If this was an effort to make Apple appear more "private", someone there borked the cow.

You simply don't get it (or perhaps don't want to get it). If all the explanation from me, others, and Apple themselves hasn't gotten through, then there's no point in repeating it all 👋
 
You simply don't get it (or perhaps don't want to get it). If all the explanation from me, others, and Apple themselves hasn't gotten through, then there's no point in repeating it all 👋

To you it may be crystal clear; however, the more I dig into this, the muddier it gets.
Best of luck to you. Meanwhile, I will continue to dig deeper to better understand just what this is, and not the rhetoric that many post/publish.
 
To you it may be crystal clear; however, the more I dig into this, the muddier it gets.
Best of luck to you. Meanwhile, I will continue to dig deeper to better understand just what this is, and not the rhetoric that many post/publish.
👌🏻
 
Turn off iCloud Photo Library and no scanning will be done. That's how you opt out.
Wrong.
The scanning always happens. The results only get uploaded to Apple when you turn iCloud Photos on (or when Apple turns it on for you without consent, as they usually do with every major update).

And soon Apple will scan all keyboard entries unless you disable spell check, search suggestions, emoji, etc.

Following that, Siri will scan 24/7 for suspicious words, sending hour-long AAC recordings if you said "worst President" more than five times.
 
Everyone, make sure to read Ben Thompson's piece on this.

Really, really good - particularly how he makes the distinction between a Policy Decision by Apple and the Capability they've now built into every device.

Yes, this is the key point for me - a policy is much weaker and requires much more trust (especially in light of Apple's past policy revisions, e.g. in China) than simply not having the capability built into the device at all.
But even beyond that, it's a solid article, as the author recognizes that it's not all black and white and that there is a necessary tradeoff between privacy and law enforcement in the Internet age.
 
Doubt it. It's probably too big and complex to just sneak into a point release of iOS 14. I would bet my life on it.
I agree; it's not even going to be in 15.0 at launch time. Apple specifically said "CSAM Detection will be included in an upcoming release of iOS 15 and iPadOS 15."
 