
Will you leave the Apple ecosystem because of CSAM?


It could be that having CSAM in the cloud is what is stopping Apple from implementing end-to-end encryption? We can conjecture any rationale, but we will most likely never find out the answer.

Think about what you just wrote .... 🤨

It will have no effect on what is already there ...
As long as alternative methods of uploading are available it won't stop it ...

So how does this stop E2EE?
It doesn't.
 
That's why they require two separate organizations that are not part of the same jurisdiction to each provide a list of hashes of known CSAM, and anything that exists on one list but not the other is thrown away, so you would need two rival governments to somehow work together and get the same image into each database.

Apple will also provide the root hash of the CSAM database in the settings of the device, as well as in a knowledge base on their website, so you can verify that it hasn't been changed in any way.
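For what it's worth, here is a rough Python sketch of what that intersection and root-hash idea could look like. The function names and the flat SHA-256 over the sorted list are my own simplification, not Apple's pipeline (the real system uses NeuralHash plus a blinding step that isn't reproduced here).

```python
import hashlib

def intersect_hash_lists(org_a_hashes, org_b_hashes):
    """Keep only entries submitted independently by BOTH organizations;
    anything on one list but not the other is discarded."""
    return sorted(set(org_a_hashes) & set(org_b_hashes))

def root_hash(hash_list):
    """One digest over the whole sorted database, so the value shown in
    device settings can be compared against a published value."""
    digest = hashlib.sha256()
    for entry in hash_list:
        digest.update(entry)
    return digest.hexdigest()

# Toy data: byte strings standing in for real image hashes.
org_a = [b"img-hash-1", b"img-hash-2", b"img-hash-3"]
org_b = [b"img-hash-2", b"img-hash-3", b"img-hash-4"]

database = intersect_hash_lists(org_a, org_b)   # img-hash-1 and -4 are dropped
print(database)
print(root_hash(database))
```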

What two orgs?

When I looked for alternative legal repositories that could be used for this, I found that they pretty much share and align their data and use the same hashing tech.

If you have more info on this kindly share.
 
Page 6:
"The on-device encrypted CSAM database contains only entries that were independently submitted by two or more child safety organizations operating in separate sovereign jurisdictions, i.e. not under the control of the same government. Mathematically, the result of each match is unknown to the device. The device only encodes this unknown and encrypted result into what is called a safety voucher, alongside each image being uploaded to iCloud Photos. The iCloud Photos servers can decrypt the safety vouchers corresponding to positive matches if and only if that user's iCloud Photos account exceeds a certain number of matches, called the match threshold."



Pages 5 & 6:
"Could governments force Apple to add non-CSAM images to the hash list
No. Apple would refuse such demands and our system has been designed to prevent that from happening. Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. The set of image hashes used for matching are from known, existing images of CSAM and only contains entries that were independently submitted by two or more child safety organizations operating in separate sovereign jurisdictions. Apple does not add to the set of known CSAM image hashes, and the system is designed to be auditable. The same set of hashes is stored in the operating system of every iPhone and iPad user, so targeted attacks against only specific individuals are not possible under this design. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system identifies photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it.

Can non-CSAM images be “injected” into the system to identify accounts for things other than CSAM?
Our process is designed to prevent that from happening. The set of image hashes used for matching are from known, existing images of CSAM that have been acquired and validated by at least two child safety organizations. Apple does not add to the set of known CSAM image hashes. The same set of hashes is stored in the operating system of every iPhone and iPad user, so targeted attacks against only specific individuals are not possible under our design. Finally, there is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. In the unlikely event of the system identifying images that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC."
 

Still doesn't answer the question. I understand the proposed process. What I am having difficulty understanding is the "two or more child safety organizations operating in separate sovereign jurisdictions".

When I went digging, I found that most institutions like NCMEC get their data from ICMEC or a global partner. These "databases" are shared under government jurisdiction, and they pretty much all use PhotoDNA hashing as a "fingerprint". So even if Apple goes to two, or more, that means they already have buy-in from that government, and they are getting pretty much the same info and the same hash.
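As an aside on the "fingerprint" part: these databases store perceptual hashes, which are designed so that visually similar images produce similar or identical hashes, unlike a cryptographic hash where changing a single pixel changes the whole digest. PhotoDNA's exact algorithm isn't public, and Apple's NeuralHash is a neural-network-based variant, so the snippet below is only a generic average-hash sketch of the general idea (it assumes the Pillow library and placeholder file names).

```python
from PIL import Image   # pip install Pillow

def average_hash(path, hash_size=8):
    """Generic perceptual fingerprint: shrink to an 8x8 grayscale image, then
    record whether each pixel is above or below the mean brightness."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = "".join("1" if p > mean else "0" for p in pixels)
    return int(bits, 2)

def hamming_distance(h1, h2):
    """Count of differing bits; a small distance means the images look alike."""
    return bin(h1 ^ h2).count("1")

# Usage (file names are placeholders):
# print(hamming_distance(average_hash("original.jpg"), average_hash("resized.jpg")))
```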
 
I think the point Apple was trying to make is that they won't accept any hashes that governments decide to just slip in there. So Russia can't sneak some in, nor can China, and so on.
 
At the end of the day, Apple will do whatever it needs to do to make sure its products get sold, and if that means making decisions that are unpleasant to some of its customers, then so be it. They are in the business of selling, not in the business of making you feel good about yourself.

People will still buy iPhones because buyers will convince themselves that having one's device scanned is something that was eventually going to happen, so they might as well get it over with now rather than later.
It is the kind of attitude that you are describing that sets the stage. No, it is not "something that will eventually happen"; that is only true if you just accept it. You should watch some of those old cop shows from the '50s and '60s. It is unreal how much latitude police were given back at the start of the war on drugs. They could just come in and search your house, your car, etc. People pushed back, and the laws were rewritten. So no, you don't have to sheepishly accept whatever someone, including the government, wants.
 
This leads to another likely point. Apple probably no longer even has a choice. They will be (or have already been) strong-armed into doing something like this, and it will happen, with Apple's full cooperation or otherwise.

That is not to suggest any goodwill or innocence on the part of Apple - they are no less and no more malevolent than any government agency. It's increasingly difficult to tell the difference between the two.
Apple has a choice; it is just that now they have backed themselves into a corner. There is no way for them to make everyone happy now. It is too bad they didn't think this through before coming up with this "feature". Apple seems to be laboring under the delusion that what we would not accept from Google or Facebook, we would accept from them. That users would just take their word for it that their spyware really was designed to protect their privacy while invading it. "It is private because we say it is."
 
I am sure that some people are really up in arms and have gotten rid of, or are planning to get rid of, their Apple gear. And they should, if that is the message they want to send to Apple. However, IMO, it would be like throwing a bucket of water into the ocean and expecting the tide to rise.
It is not about trying to get to Apple. It is about what you will or will not tolerate. I myself do not use any of Apple's services. Theirs is the "toy" OS of my vast collection; I really, really like tech, and I run every operating system, including Linux. If more people become aware and stop using their services, a major part of their revenue, it will have an impact. But the most salient outcome of this, in terms of damage, is what Apple has done to their whole privacy marketing campaign.
 
I think the point Apple was trying to make is that they won't accept any hashes that governments decide to just slip in there. So Russia can't sneak some in, nor can China, and so on.

That could be. Thx.

What I can see is the US and a couple of other countries coordinating? Maybe.
US and UK?

What I did find in my digging is that most countries that are global partners to combat CSAM run in the US/UK/DE/etc... circles.
 
Apple has a choice; it is just that now they have backed themselves into a corner. There is no way for them to make everyone happy now. It is too bad they didn't think this through before coming up with this "feature". Apple seems to be laboring under the delusion that what we would not accept from Google or Facebook, we would accept from them. That users would just take their word for it that their spyware really was designed to protect their privacy while invading it. "It is private because we say it is."
The interesting thing is that during this "learning search" I have run into three (four?) papers that predate Apple's announcement, and they all reached the same conclusion: this would be cool, however it is a privacy violation, will likely be challenged, and will likely be abused by state actors.

Yet Apple is trying to make something work that others have warned is a "dangerous" thing.

Something is very wrong with that last statement.
 
It is not about trying to get to Apple. It is about what you will or will not tolerate. I myself do not use any of Apple's services. Theirs is the "toy" OS of my vast collection; I really, really like tech, and I run every operating system, including Linux. If more people become aware and stop using their services, a major part of their revenue, it will have an impact. But the most salient outcome of this, in terms of damage, is what Apple has done to their whole privacy marketing campaign.
That’s my point. One has to do what they feel is right. I personally don’t think there will be this huge backlash. And, in my opinion, privacy is not CSAM…unless you believe it’s your right to have copies of it.

And whether they are a “toy” is opinion. But as always horses for courses.
 
That's why they require two separate organizations that are not part of the same jurisdiction to each provide a list of hashes of known CSAM, and anything that exists on one list but not the other is thrown away, so you would need two rival governments to somehow work together and get the same image into each database.

Apple will also provide the root hash of the CSAM database in the settings of the device, as well as in a knowledge base on their website, so you can verify that it hasn't been changed in any way.
So, we are back to, just believe them.
That’s my point. One has to do what they feel is right. I personally don’t think there will be this huge backlash. And, in my opinion, privacy is not CSAM…unless you believe it’s your right to have copies of it.

And whether they are a “toy” is opinion. But as always horses for courses.
The OS itself is fine (it is Unix), but the policies that surround it have always been questionable for me, and this is just icing on the cake. The privacy issue is about on-device scanning itself, not about the content, so your comment about CSAM is irrelevant. Drugs are illegal, but that does not mean it is okay for the police to come in and search your house.
 
What I can see is the US and a couple of other countries coordinating? Maybe.
US and UK?
The worst abusers, by a wide margin, will be the US and EU governments. Hunting down dissident political views will overtake CSAM as the primary use almost immediately. COVID and refugee skeptics will be targeted more than pedophiles.
 
So, we are back to, just believe them.

The OS itself is fine (it is Unix), but the policies that surround it have always been questionable for me, and this is just icing on the cake. The privacy issue is about on-device scanning itself, not about the content, so your comment about CSAM is irrelevant. Drugs are illegal, but that does not mean it is okay for the police to come in and search your house.
I got you, but the difference is that there is a body of law surrounding what the police can and cannot do. There are no laws in that regard stopping Apple from doing this, although the optics aren't great.
 
The worst abusers, by a wide margin, will be the US and EU governments. Hunting down dissident political views will overtake CSAM as the primary use almost immediately. COVID and refugee skeptics will be targeted more than pedophiles.
So we’re just supposed to believe that?
 
I am sure that some people are really up in arms and have gotten rid of, or are planning to get rid of, their Apple gear. And they should, if that is the message they want to send to Apple. However, IMO, it would be like throwing a bucket of water into the ocean and expecting the tide to rise.
We complained a little and Apple delayed the entire thing. You have no idea of our true power :)
 
UK Home Secretary Priti Patel:

"Recently Apple have taken the first step, announcing that they are seeking new ways to prevent horrific abuse on their service. Apple state their child sexual abuse filtering technology has a false positive rate of 1 in a trillion, meaning the privacy of legitimate users is protected whilst those building huge collections of extreme child sexual abuse material are caught out. They need to see though that project."

source: https://homeofficemedia.blog.gov.uk/2021/09/08/new-safety-tech-fund-challenge/

When Apple said "These efforts will evolve and expand over time" I didn't realize they would move so fast!

Did you?

Good thing I'm fast too. I ditched Apple within days of the CSAM announcement. In just the last 24 months I had spent nearly $5k on Apple devices. Leaving had never crossed my mind. I had been a Mac addict my whole life.

Today I'm running Pop OS on my brand new System76 Lemur Pro. Upgraded my mouse and keyboard as well! My Mac, formerly beloved, now gathers dust. Once you disable iCloud it becomes a lot less of an "ecosystem" and it's just a bunch of shiny devices. Recommend this as a first step for those who want to wean off Apple.

Free (from spyware) Open Source Software is the only sustainable path forward. Learn more at https://itsfoss.com/what-is-foss/

Now that this is real -- what's next?
 
Mathematically, the result of each match is unknown to the device. The device only encodes this unknown and encrypted result into what is called a safety voucher, alongside each image being uploaded to iCloud Photos.
How can a device hash an image, have a reference database locally stored, and not know if it's a match? It has to know whether the photo matches or not, or the server couldn't decrypt positively-matched vouchers. The server wouldn't even know how many positive vouchers there are. It has to be flagged in some manner, and there's no way for the device to do that if it doesn't know if an image matches or not.

Apple does not add to the set of known CSAM image hashes, and the system is designed to be auditable.
Bull. So, they'll never add any new CSAM content? How can they audit something they're legally not allowed to see?

Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system identifies photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.
That human review is our only protection from abuse. Frankly, I have little faith in it.

The set of image hashes used for matching are from known, existing images of CSAM that have been acquired and validated by at least two child safety organizations.
Two organizations that likely share the same database?
 
Hunting down dissident political views will overtake CSAM as the primary use almost immediately.

I have to think this is very possible, which is one reason why I've been against this technology: I can easily imagine it being employed for more than what they are saying now, and I can imagine it being done on all phones, whether or not one uses iCloud.

And I even wonder about one thing: there are stories that Apple is being pressured into this technology. I think some of this has been discussed here. If that's true, I almost have to wonder whether the people pressuring Apple weren't really interested in uses other than CSAM all along. Maybe, maybe not. But I certainly have no doubt that there are governments out there that are right now actively thinking about forcing Apple into employing this technology for things other than CSAM.
 
But you have already bailed on your other thread because you could not answer the questions.
You are struggling with the OS and so far have nothing in place to provide the cloud services and integration you have lost; you really have no solution other than a new laptop.
Simply installing a NAS and taking control of your cloud data would have saved you a lot of money, and you would still be using your Apple products on your own cloud.

I’m sure we in Britain will have a far more logical response than what we are seeing from some on the forum.
 