
Will you leave the Apple ecosystem because of CSAM?


Still doesn't answer the question. I understand the proposed process. What I am having difficulty understanding is the "two or more child safety organizations operating in separate sovereign jurisdictions".

When I went digging, I found that most institutions like NCMEC get their data from ICMEC or a global partner. These "databases" are shared under government jurisdiction, and they pretty much all use PhotoDNA hashing as a "fingerprint". So even if Apple goes to two or more organizations, those organizations already have buy-in from the same governments, and Apple is getting pretty much the same info and the same hashes.
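For reference, the requirement under discussion is an intersection rule: only hashes present in the databases of at least two organizations under different governments are supposed to ship to devices. A minimal sketch of that rule, with made-up hash values (not real database contents or Apple code):

```python
# Toy illustration of the "two independent jurisdictions" intersection rule.
# Hash values are made up; the real databases use perceptual hashes
# (PhotoDNA, NeuralHash), not short strings like these.
ncmec_hashes = {"a1b2", "c3d4", "e5f6"}       # hypothetical first provider
second_org_hashes = {"c3d4", "e5f6", "9f9f"}  # hypothetical second provider

# An entry supplied by only one organization ("a1b2", "9f9f") never reaches
# the on-device database, so a single provider cannot insert a target alone.
shipped_set = ncmec_hashes & second_org_hashes
print(shipped_set)  # {'c3d4', 'e5f6'}
```

Whether the two source databases are genuinely independent of each other is, of course, exactly what the post above is questioning.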
The MacRumors team originally wrote the article, which the forum member kindly quoted in answering your question. Since you were still not happy with the response, instead of trying to trip up the forum member to find the information you require, why not ask the MacRumors editor(s) who wrote the article to explain what they wrote?
 
We complained a little and Apple delayed the entire thing. You have no idea of our true power :)
I sincerely doubt the "we", above, was MacRumors posters. If MacRumors posters really held sway over Apple policies, features and future directions:
  1. Tim Cook would be long gone and replaced with someone who knew what they were doing, like Musk
  2. Sideloading and multiple app stores would have already been implemented
  3. An under-glass fingerprint sensor would have been implemented
  4. The notch would be long gone
  5. Any version of iOS would be available for update
  6. iPhones would be bigger, with bigger batteries, and cheaper
  7. iPhone would lead Android on features, instead of "copying" features from Android
  8. Apple Stores would never have been redesigned and would have retained the old format
  9. iPhones would have had USB-C already
  10. Apple would have lowered their margins and absorbed the cost of implementing features and currency (FX) fluctuations
  11. ...
There's much more on the list, but you get where we are going. :p
 
I sincerely doubt the "we", above, was MacRumors posters. If MacRumors posters really held sway over Apple policies, features and future directions: …
There's much more on the list, but you get where we are going. :p
Apple would have the USB port on the front of the Magic Mouse 😉
 
How can a device hash an image, have a reference database locally stored, and not know if it's a match? It has to know whether the photo matches or not, or the server couldn't decrypt positively-matched vouchers. The server wouldn't even know how many positive vouchers there are. It has to be flagged in some manner, and there's no way for the device to do that if it doesn't know if an image matches or not.

"Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image."


That human review is our only protection from abuse. Frankly, I have little faith in it.
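To make the quoted claim concrete, here is a deliberately simplified, standard-library-only sketch. It is not Apple's private set intersection protocol (the real design uses a blinded on-device table and threshold secret sharing, and the device does carry a database it cannot read); it only illustrates the core trick being described: the device can emit a voucher that is only openable by whoever already knows the matching hash, so the device itself never computes a match result. All names and values below are made up.

```python
import hashlib
import hmac

def key_for(image_hash: bytes) -> bytes:
    # The voucher key is derived purely from the image's own perceptual hash.
    return hashlib.sha256(b"voucher-key|" + image_hash).digest()

def _stream(key: bytes, length: int) -> bytes:
    # Minimal keystream for illustration only; do not use as real crypto.
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def seal(key: bytes, payload: bytes) -> bytes:
    ciphertext = bytes(a ^ b for a, b in zip(payload, _stream(key, len(payload))))
    tag = hmac.new(key, ciphertext, hashlib.sha256).digest()
    return ciphertext + tag

def open_sealed(key: bytes, sealed: bytes):
    ciphertext, tag = sealed[:-32], sealed[-32:]
    if not hmac.compare_digest(hmac.new(key, ciphertext, hashlib.sha256).digest(), tag):
        return None  # wrong key: the voucher stays opaque
    return bytes(a ^ b for a, b in zip(ciphertext, _stream(key, len(ciphertext))))

# Device side: it never consults a readable hash list and never computes a
# "match / no match" bit; it just seals a voucher under the image's own key.
image_hash = b"hash-of-the-photo-being-uploaded"          # made-up value
voucher = seal(key_for(image_hash), b"visual derivative + metadata")

# Server side: it holds the known-hash list, so it can open vouchers only for
# hashes it already knows; everything else stays unreadable.
known_hashes = [b"hash-of-some-known-image", b"hash-of-the-photo-being-uploaded"]
opened = [p for h in known_hashes if (p := open_sealed(key_for(h), voucher))]
print(opened)  # payload recovered only because this hash was on the list
```

The takeaway of the toy: the match is resolved wherever the hash list can be used to derive keys, not on the device. Apple's actual protocol layers blinding and a threshold scheme on top of this basic idea.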
It's not your only protection. There's another hashing process done on the server that's separate from the on-device system, except it only runs once an account has at least 30 matched images, and only on those images rather than your whole library.

It's a hybrid system. The first part is on device, the second is on the server, and only when an image passes through both of those does it get to human review.
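A minimal sketch of the gating order described in that reply, assuming the threshold of roughly 30 that Apple's threat-model paper mentions. The function and parameter names are illustrative, not Apple APIs, and the real system enforces the threshold cryptographically (threshold secret sharing) rather than with an if-statement.

```python
MATCH_THRESHOLD = 30  # figure cited by Apple; treated as an assumption here

def vouchers_for_human_review(matched_vouchers, second_hash_matches):
    """Return only the vouchers that survive both server-side stages."""
    # Stage 1: below the threshold the server can't read anything at all.
    if len(matched_vouchers) < MATCH_THRESHOLD:
        return []
    # Stage 2: an independent perceptual hash re-checks the visual
    # derivatives before anything is shown to a human reviewer.
    return [v for v in matched_vouchers if second_hash_matches(v)]

# e.g. five matches and a permissive second check still yield nothing:
print(vouchers_for_human_review(["v"] * 5, lambda v: True))  # []
```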
 
The MacRumors team originally wrote the article, which the forum member kindly quoted in answering your question. Since you were still not happy with the response, instead of trying to trip up the forum member to find the information you require, why not ask the MacRumors editor(s) who wrote the article to explain what they wrote?

Actually no. The original article stated some things I have had trouble verifying, and what I have found does not support the point. On the surface Apple's answer sounds good, but the deeper I dig below the surface of Apple's explanation, the murkier the "solution" becomes.
 

"Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image."



It's not your only protection. There's another hashing process done on the server that's separate from the on-device system, except it's only done to a minimum of 30 images rather than your whole library.

It's a hybrid system. First part is on device, second is on server and only when it passes through both of those does it get to human review.
That's the shortest Wikipedia article I've ever seen. Will do more research.

I understand this system works on both sides, but if there are non-CSAM hashes in the database, they'll pass straight through both automated checks, because the system assumes every hash in the database is CSAM. Only at the human review stage could abuse be identified (why does this account have a flag for saved memes?).
 
I sincerely doubt the "we", above, was MacRumors posters. If MacRumors posters really held sway over Apple policies, features and future directions:
  1. Tim Cook would be long gone and replaced with someone who knew what they were doing, like Musk
  …
There's much more on the list, but you get where we are going. :p
Oh god nothing would have me ditching Apple and running for the hills quicker than bringing Elon ”Ego” Musk 🙄
 
That's the shortest Wikipedia article I've ever seen. Will do more research.

I understand this system works on both sides, but if there are non-CSAM hashes in the database, they'll pass straight through both automated checks, because the system assumes every hash in the database is CSAM. Only at the human review stage could abuse be identified (why does this account have a flag for saved memes?).
Playing devil's advocate here, but if the CSAM database were only on the server, couldn't they add anything they want to it anyway?

In technical terms, nothing has changed. If you don't upload any photos, nothing is checked for CSAM, but if you do, then you allow your images to be checked. Period.
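A small sketch of the gating claim in that last paragraph, based on how Apple's documentation describes the design rather than on actual iOS code; the helper and field names are hypothetical.

```python
# If iCloud Photos upload is off, no voucher is ever generated, so nothing
# is checked; turning uploads on is what opts the library into matching.
def store_photo(photo_bytes: bytes, icloud_photos_enabled: bool) -> dict:
    record = {"photo": photo_bytes, "voucher": None}
    if icloud_photos_enabled:
        record["voucher"] = b"safety voucher placeholder"  # hypothetical step
    return record

print(store_photo(b"...", icloud_photos_enabled=False)["voucher"])  # None
```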
 
It is disconcerting that a State entity is moving so quickly to possibly leverage this.

Haven’t ditched Apple - currently - but I am in the process of doing so.
My move started well prior to this Apple announcement for other reasons.
And so publicly too! I wonder if there's anything else like this that's public but hasn't been as well covered in the news.
 
Playing devil's advocate here, but if the CSAM database were only on the server, couldn't they add anything they want to it anyway?

In technical terms, nothing has changed. If you don't upload any photos, nothing is checked for CSAM, but if you do, then you allow your images to be checked. Period.
This is only true if you are taking Apple's word. I don't. The evidence doesn't support it.

1. If the system actually works the way Apple claims, then it is a weaker system in terms of protecting children than the ones in use by others such as Facebook, Google, and Microsoft. It actually tells perps how not to get caught.
2. If that were the case, the National Center for Missing & Exploited Children would not have reacted with such glee. They were very happy because Apple was finally going to break the "sacred" device barrier.
3. Apple has lied and been deliberately obtuse many times in the past when it served them. Once again, no matter how this system starts out, it can be altered at any time, and without notice, depending on how the EULA reads. Most EULAs state that the terms can change without notice, including Apple's own.

Again, your argument comes down to nothing more than just trusting Apple, even though they have proven to be very capable of lying and screwing over their users in the past.
 
Playing devil's advocate here, but if the CSAM database were only on the server, couldn't they add anything they want to it anyway?

In technical terms, nothing has changed. If you don't upload any photos, nothing is checked for CSAM, but if you do, then you allow your images to be checked. Period.
Again, anything in theory is possible, but we are dealing with real life here, not some Mission: Impossible movie. I will go back to my original statement: any proposal to frame targeted citizens by tainting the CSAM database is going to require jumping through so many hoops that it's probably easier for the government to just trump up some other crime to charge them with, like planting a bag of drugs in their car or honeypotting them.
 
Playing devil's advocate here, but if the CSAM database were only on the server, couldn't they add anything they want to it anyway?

In technical terms, nothing has changed. If you don't upload any photos, nothing is checked for CSAM, but if you do, then you allow your images to be checked. Period.
They could, but if the whole system is only on-server, that's Apple's prerogative. Just don't put this on my device.

Right, nothing has changed at the moment... because we fought Apple on this. The best way to avoid a slippery slope is to not even start down it.
 
I've just gone further in my 'downgrading' process (which, IMO, feels more like an 'upgrade'). I switched out the S20 FE for my old Pantech Breakout (Android 2.3). I'm actually using SeaMonkey 1.1 to browse this site (it doesn't support any trackers at all). I've skeuomorphed everything. My S4 Mini got switched out for my HTC Thunderbolt. I forgot how wonderful those devices were: thick bricks, easy on the hand and easy to look at without hurting my eyes. No more PWM. I was going to reactivate the Samsung Flight II (Cingular, 2009) to go total privacy, but I've held off on that for now. Privacy is the intent as much as possible, but the wonderful UI design and everything being skeuomorphic is perfect compared to the mix I had before.
 
There's another hashing process done on the server that's separate from the on-device system, except it only runs once an account has at least 30 matched images, and only on those images rather than your whole library.
That's interesting; I think I missed that in my review of the initial papers. Can I impose on you to point me towards the details? Appreciate it!
 
On some sites, I've read that China monitors its citizens via WeChat for government dissent, unpopular opinions, or 'misinformation' (whatever that is; it seems to be the new term for critical thinking or skepticism today), but take that with a grain of salt. Does anyone have any way to know if that's happening? If it is, then I'm sure we're not far from it, and CSAM scanning is just the start.
 
That's interesting; I think I missed that in my review of the initial papers. Can I impose on you to point me towards the details? Appreciate it!

Page 13 of https://www.apple.com/child-safety/...del_Review_of_Apple_Child_Safety_Features.pdf

Human review and reporting

"Once Apple's iCloud Photos servers decrypt a set of positive match vouchers for an account that exceeded the match threshold, the visual derivatives of the positively matching images are referred for review by Apple. First, as an additional safeguard, the visual derivatives themselves are matched to the known CSAM database by a second, independent perceptual hash. This independent hash is chosen to reject the unlikely possibility that the match threshold was exceeded due to non-CSAM images that were adversarially perturbed to cause false NeuralHash matches against the on-device encrypted CSAM database. If the CSAM finding is confirmed by this independent hash, the visual derivatives are provided to Apple human reviewers for final confirmation."
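To see why that second, independent hash helps, here is a runnable toy with two stand-in "hash functions" (salted SHA-256 here, which real perceptual hashes are not): an image crafted to collide with the database under the first hash still fails the second, unrelated check, so it never reaches a reviewer. Everything below is a made-up stand-in, not NeuralHash or Apple's server-side hash.

```python
import hashlib

# Stand-in "perceptual hashes". Real ones tolerate resizing and recompression,
# which salted SHA-256 does not; the point is only that colliding under one
# function does not imply colliding under an independent one.
def first_hash(image: bytes) -> str:         # stand-in for on-device NeuralHash
    return hashlib.sha256(b"device|" + image).hexdigest()

def second_hash(image: bytes) -> str:        # stand-in for the server-side hash
    return hashlib.sha256(b"server|" + image).hexdigest()

known_image = b"placeholder bytes of a known database image"
db_first = {first_hash(known_image)}
db_second = {second_hash(known_image)}

# Pretend an attacker perturbed a harmless image until its first-stage hash
# collided with a database entry (simulated by reusing the known value).
adversarial_image = b"placeholder bytes of a perturbed harmless meme"
adversarial_first_hash = first_hash(known_image)   # simulated collision

passes_stage_1 = adversarial_first_hash in db_first           # True
passes_stage_2 = second_hash(adversarial_image) in db_second  # False
print(passes_stage_1, passes_stage_2)  # True False -> rejected before review
```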
 
BTW, we will not cause damage to Apple, since we are a minority that understands the implications of CSAM scanning. I have asked a few ordinary iPhone users about it; some said "what is CSAM?" and others said "I don't care, they can search my stuff and they won't find anything". Oh well, c'est la vie.
 
BTW, we will not cause damage to Apple, since we are a minority that understands the implications of CSAM scanning. I have asked a few ordinary iPhone users about it; some said "what is CSAM?" and others said "I don't care, they can search my stuff and they won't find anything". Oh well, c'est la vie.

Unfortunately that is true… But Apple postponed this feature because they are, for sure, afraid of hurting sales. They can't risk the September event, so they delayed activating the code.

I think it is not so "small" anyway, and it is not only the CSAM scanning but the other features too. Apple had to make this decision when they realized that their PR stunt didn't work. It seems "small", but I guess it would be quite a notable thing for teens, for example (parents spying on you: who wants an iPhone when you know your parents can spy on you? Apple can't risk losing teens), and for many countries, e.g. China, to see how it can be used against people. (China is the second-largest market and Apple can't forget/bypass it.)
 
This postponement could be bad news as well. What if they're silently 'updating' it to include scanning for far more than just CSAM now?
 