I agree, I’ve got nothing to hide. Just some adult women’s nudes that I’ve been sent; let ’em get a view.
Consider the Cypherpunk manifesto:
Privacy is necessary for an open society in the electronic age. Privacy is not secrecy. A private matter is something one doesn't want the whole world to know, but a secret matter is something one doesn't want anybody to know. Privacy is the power to selectively reveal oneself to the world.
 
It’s actually not only about having illegal nudes on your phone. It’s about Apple using iOS to scan your phone, and that technology being abused at some point to keep tabs on what you do. If Apple said they would implement their CSAM algorithm on iCloud servers only, that would be fine by me. But enabling scanning on my phone by adding that algorithm to the iOS code is too invasive for my taste. Yes, Apple says that if you don’t want scanning on your phone, don’t subscribe to iCloud. But if you don’t subscribe to iCloud, you lose a lot of the it-just-works features across Apple devices. So I ditched my iPhone but kept my iMac and iPad for now.
 
You do realize this is also going onto the iPad and Mac … :(
 
If you can’t trust Apple about how their highly talked about and subsequently scrutinized CSAM scanning system works, how do you know that they don’t already scan and catalogue everything on your phone? How is it people are fixating on this one thing? This highly targeted FUD is nonsensical and illogical.
 

We don’t.
Until this came to light, we assumed and believed that Apple was protecting our privacy and the privacy of our devices.

Now, we are questioning what they have really been doing…
 
I find this selectively suspicious.

My annoyance is that it sounds more like they are offloading the CPU/power costs of the scanning onto our phones so they don’t have to pay for it.

On the plus side, this may allow them to do the required* scanning before uploading, which means they may finally move to encrypting our photos before they are stored in the cloud (at least the non-publicly-shared ones); a rough sketch of what I mean follows the footnote below.

*required?: not sure about the legalities of the requirement and their legal responsibility to scan for CSAM, except it seems “everybody (else) is doing it (on their servers)”
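Roughly the ordering I’m picturing, as a purely hypothetical sketch: the SHA-256 match and AES-GCM call are stand-ins (Apple’s actual NeuralHash / PSI design is far more involved), and every name here is made up.

import Foundation
import CryptoKit

// Hypothetical ordering only: check the photo against an on-device hash list
// first, then encrypt it so the server never sees the plaintext.
struct UploadPackage {
    let matchedKnownHash: Bool
    let ciphertext: Data
}

func prepareForUpload(photo: Data,
                      knownHashes: Set<Data>,
                      key: SymmetricKey) throws -> UploadPackage {
    // 1. The required* scan happens locally, before anything leaves the device.
    let digest = Data(SHA256.hash(data: photo))
    let matched = knownHashes.contains(digest)

    // 2. Only ciphertext goes to the cloud.
    let sealedBox = try AES.GCM.seal(photo, using: key)
    return UploadPackage(matchedKnownHash: matched,
                         ciphertext: sealedBox.combined!)
}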
 
*required?: not sure about the legalities of the requirement and their legal responsibility to scan for CSAM, except it seems “everybody (else) is doing it (on their servers)”
Ah, you're missing some detail here that might be helpful. Apple does not have* a legal responsibility to scan for CSAM. They do have a legal requirement to report it if discovered. Until now, they have simply not been looking.
*cannot have, actually, but will leave that to you to learn about
 
If you can’t trust Apple about how their highly talked about and subsequently scrutinized CSAM scanning system works, how do you know that they don’t already scan and catalogue everything on your phone? How is it people are fixating on this one thing? This highly targeted FUD is nonsensical and illogical.
Excellent question. While a forum discussion can certainly be nonsensical and illogical, you might enjoy the take over at Stratechery: it highlights the difference between trust about technology and trust about policy.
 
If you can’t trust Apple about how their highly talked about and subsequently scrutinized CSAM scanning system works, how do you know that they don’t already scan and catalogue everything on your phone? How is it people are fixating on this one thing? This highly targeted FUD is nonsensical and illogical.
I have to agree, I'm not 100% sure why so many people started fixating on Apple's CSAM measures either. Like, Apple technically has the power to remotely control devices, but no one complains about that and its potential abuse.
 
Apple does have that power if they build in the ability to do so. They currently have not, AFAIK.
With the new scanning tool, they are building in some of that ability and using it to report violations (unknown to you) to authorities. That leads to the other concern, over false positives.

The thing to keep in mind: this has nothing to do with CSAM. It is the ability of the tool to execute on-device scanning and surreptitiously report to outside authorities. Every security and privacy group has come out against this.

As there is no legal requirement for Apple to build this, no other device company has built this, and scanning server-side better accomplishes Apple’s stated goal, you have to wonder … why? Apple is being mum about this.
 
Many people are just paranoid about what Apple might do, without any facts other than the paranoia in their brains.

Meanwhile, millions are busy ordering new iPhones and getting on with life.
 
Sad for them, really. And if everyone is just getting on with it, then why did they postpone the implementation? Hmmm.
 
Excellent question. While a forum discussion can certainly be nonsensical and illogical, you might enjoy the take over at Stratechery: it highlights the difference between trust about technology and trust about policy.
Thank you for this link. I would say I agree with many of the objections to in-phone scanning, but I think the following quote from the linked article is the key bit. One could argue that Apple is jumping the gun, or that maybe they are “skating to where the puck will be”. So, yeah, maybe they shouldn’t be so preemptive with the rollout.

The key quote, IMO:

“Then again, Apple’s policy isn’t the only one that matters: both the UK and the EU are moving forward on bills that mandate online service companies proactively look for and report CSAM.”
 
Apple does have that power if they build in the ability to do so. They currently have not, AFAIK. […] As there is no legal requirement for Apple to build this, no other device company has built this, and scanning server-side better accomplishes Apple’s stated goal, you have to wonder … why? Apple is being mum about this.

I guess my main point is that the “AFAIK” is doing a lot of work in making your above point.

I’m glad folks are asking the question and not simply accepting everything as it comes, but as more comes to light about how the system works I am still feeling comfortable with its described design. Correct me if I have misunderstood, but the source list of specific examples is the logical AND of examples from two responsible organizations, and the hashes of those examples are baked into the OS release. There is also a significant threshold of matched images that needs to be reached before reports are pushed out to Apple’s human review team. So, as described, it’s not a system some government can simply have Apple insert images into and then use generalized ML to identify “politically unacceptable” content, or OCR to identify and report photo contents. Apple would have to build something specific to the requesting government’s specs, which leaves us where we are right now as far as a government being able to push for on-phone scanning. In other words, to make this work as anything other than its described intention, it would have to be totally rewritten.
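As a rough sketch of that flow as I understand it (every name, the string-based “hashes”, and the exact threshold value are my own illustration, not Apple’s actual code):

import Foundation

// Only hashes present in BOTH organizations' lists get shipped with the OS,
// and nothing goes to human review until an account crosses a match threshold.
let reviewThreshold = 30

func buildShippedHashList(orgA: Set<String>, orgB: Set<String>) -> Set<String> {
    // Logical AND of the two source databases.
    orgA.intersection(orgB)
}

func shouldEscalateToHumanReview(accountImageHashes: [String],
                                 shippedHashes: Set<String>) -> Bool {
    // Individual matches trigger nothing; only the aggregate count matters.
    let matchCount = accountImageHashes.filter { shippedHashes.contains($0) }.count
    return matchCount >= reviewThreshold
}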

But I am listening to folks like you who are concerned. Without the legal requirement being in place yet, it does feel a bit overly aggressive and preemptive, so I appreciate this conversation.

Re why Apple is doing this and why they are mum about it, two guesses:
1. Why do it?: The European Union is heading toward requiring scanning (but yeah, why put it out before being required? Testing? Getting the public’s reaction?)
2. Why not respond yet?: PR? It needs to be bandied about in public so they don’t look immediately defensive; they can hear the specific complaints and then hopefully address them.

So, what have I missed or misunderstood?
 
That they shouldn't be doing it on the device. Do it on their servers if they feel the need.
 
I guess my main point is that the “AFAIK” is doing a lot of work in making your above point. […] So, what have I missed or misunderstood?

That is part of the problem. Many of us trusted Apple when it came to privacy, like they claim in their statements and on their billboards. Then they announce three new features, one of which is the CSAM checker, followed by instructions on how to get around it. Huh?

As for the number of matches, that is, according to Apple, there to lower the chance of false positives. A threshold of 30 weighs towards a lack of accuracy, IMO. Definitely not risk-averse.
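For what it’s worth, the back-of-the-envelope math on what a 30-match threshold does to the per-account odds looks roughly like this; the per-image false-match rate and library size below are made-up assumptions, not Apple’s figures:

import Foundation

// Made-up numbers purely for illustration; Apple has only cited an
// account-level "1 in a trillion per year" figure, not a per-image rate.
let perImageFalseMatchRate = 1.0 / 1_000_000
let photosInLibrary = 50_000.0
let threshold = 30

// With a tiny per-image rate, false matches are roughly Poisson-distributed.
let expectedFalseMatches = perImageFalseMatchRate * photosInLibrary  // 0.05 here

// P(at least k false matches) under a Poisson(mean) model.
func poissonTail(atLeast k: Int, mean: Double) -> Double {
    var term = exp(-mean)              // P(0)
    var below = term
    for i in 1..<k {
        term *= mean / Double(i)       // P(i) from P(i-1)
        below += term
    }
    return 1.0 - below
}

print(poissonTail(atLeast: threshold, mean: expectedFalseMatches))
// Effectively zero with these assumed numbers; the real argument is over
// whether the assumed per-image rate is anywhere near reality.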

As for inserting images, I suspect they can. Surreptitiously? Likely not. Just taking it over? Maybe. Likely. Especially other nation states. Can it search even with iCloud Photos off? Likely. Can it search for other things? Likely. How often does the database get updated? There are just too many unknowns and Apple isn’t talking. The biggie is: why on device? That part makes little sense from both a privacy perspective and with Apple’s stated goal of cleaning up CSAM in iCloud. Highly inefficient.

1. If the EU were pushing for this, why not start it in the EU? They have, by their own admission, the biggest chunk of CSAM globally.
2. That is what many of us are hoping: that they engage in open, honest discussion. It isn’t the scanning, it is the tool. In a couple of interviews, even some NCMEC board members are concerned, especially with regard to privacy. The legality is wide open. Unchallenged. PR? Maybe. Maybe it was all a PR stunt that went wrong … We don’t know.

In the absence of answers, assume the worst-case risk assessment and try to get Apple to the table to discuss it / do another evaluation. That is my hope.

I think you got a chunk of it; however, I suspect, based on what I have learned, that this process is more fragile and produces more false positives than is “safe”. There is a reason MS keeps PhotoDNA out of public view.

Now it is a wait-and-see. One thing for me: I have learned that Apple is no better than Google, MS, and others when it comes to my privacy.
 