
Natrium

macrumors regular
Aug 7, 2021
125
246
Here is the thing: Apple says it is necessary, and I believe them. When end-to-end encryption is implemented in iCloud, it will be impossible to detect CSAM. Therefore, it will be necessary to make the comparison at the device level, before the photo is uploaded to iCloud and encrypted.

Since you don’t, it looks like you are going down the Open-Source Linux rabbit-hole.
Firstly, there’s absolutely no proof Apple is planning to implement end to end encryption for iCloud.

Secondly, Apple’s backdoor has access to your personal photos before they’re even encrypted, completely defeating the purpose of end to end encryption.

Thirdly, it is not necessary to scan photos on your own device to prevent storing illegal content on iCloud. They can scan it on the iCloud servers. Nobody had a problem with that and nobody asked them for this dangerous backdoor that literally opens the door for mass surveillance on a scale the world has never seen before.
 

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
Right now it’s done, in those places, at the cloud level. Is that any better for you? At least if it’s done at the device level you can disconnect to prevent it.
Like they can track your iPhone, even if it's off? That plus this scanning thing just gives me an extremely bad taste in my mouth. No, it's not going to change my wanting an iPhone/iPad for now, but the next step down this slope probably will. Now Macs, I don't really need those, so I'll be thinking long and hard before I purchase another one.

And yes, I know you'll say that I can turn off find my iPhone, so don't bother, it doesn't change what I'm worried about.
 

GrumpyCoder

macrumors 68020
Nov 15, 2016
2,127
2,707
Here is the thing: Apple says it is necessary, and I believe them. When end-to-end encryption is implemented in iCloud, it will be impossible to detect CSAM. Therefore, it will be necessary to make the comparison at the device level, before the photo is uploaded to iCloud and encrypted.
First, please read this: https://www.hackerfactor.com/blog/index.php?/archives/929-One-Bad-Apple.html
I just found this now, but some of his arguments mirror what I've been saying.

Back to end-to-end encryption in iCloud... you said earlier, "Until it can be proven otherwise, the rest is conspiracy theory." Until E2EE actually arrives in iCloud, it's wishful thinking. Also, even if they had it, there's another problem.

I quote from the above link and Apple:
Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images. Apple then manually reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC.
So how is Apple manually reviewing a report if content in iCloud can't be accessed due to end-to-end encryption? They'd have to access your/our systems for that. Without the encryption, they could just look at the iCloud content. So in the end, are we talking about a "backdoor" which is used in case of a report, assuming E2EE is in place? If no E2EE is available, then why not scan it in the cloud? I hope the problem can be seen here.
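To make that mechanism concrete, here's a rough sketch of the threshold gate Apple describes in that quote. This is my own made-up Swift, not Apple's code or API; the struct names and the threshold value are placeholders.

```swift
import Foundation

// Hypothetical sketch of the threshold gate described above: individual
// matches stay opaque, and only once the number of matching vouchers
// crosses a threshold can their contents be opened for manual review.
// Names and the threshold value are illustrative, not Apple's actual API.
struct SafetyVoucher {
    let imageID: String
    let matchedKnownHash: Bool
    let encryptedPayload: Data   // stands in for the low-resolution derivative Apple mentions
}

struct ThresholdGate {
    let threshold: Int

    /// Payloads become readable only when enough vouchers match;
    /// below the threshold, nothing is revealed to a reviewer.
    func reviewablePayloads(from vouchers: [SafetyVoucher]) -> [Data] {
        let matches = vouchers.filter { $0.matchedKnownHash }
        guard matches.count >= threshold else { return [] }
        return matches.map { $0.encryptedPayload }
    }
}

let gate = ThresholdGate(threshold: 30)   // placeholder value; Apple hasn't published the real one here
let vouchers = (0..<10).map {
    SafetyVoucher(imageID: "img-\($0)", matchedKnownHash: false, encryptedPayload: Data())
}
print(gate.reviewablePayloads(from: vouchers).count)   // 0 – below the threshold, nothing is reviewable
```

But even in that sketch, the payload has to be something a human can eventually look at, which is exactly my question about where that leaves E2EE.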

This is concerning (from the above link):
In the six years that I've been using these hashes at FotoForensics, I've only matched 5 of these 3 million MD5 hashes. (They really are not that useful.) In addition, one of them was definitely a false-positive. (The false-positive was a fully clothed man holding a monkey -- I think it's a rhesus macaque. No children, no nudity.) Based just on the 5 matches, I am able to theorize that 20% of the cryptographic hashes were likely incorrectly classified as CP. (If I ever give a talk at Defcon, I will make sure to include this picture in the media -- just so CP scanners will incorrectly flag the Defcon DVD as a source for CP. [Sorry, Jeff!])
So is Apple going to look at confidential files on my system if there's a false positive? Let's say confidential corporate information? I have confidential government information on my systems, which I need for my research. Would that be made available to Apple? I have a former colleague working in security. He's often tasked with securing forensic evidence in CSAM cases. What does that mean for ongoing legal cases if such information is made available to third parties (Apple)?
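To illustrate how blunt that kind of hash-list matching is, here's a toy Swift version of it. The list entries are made up (the first is just the MD5 of empty data), but it shows the point: anything whose digest happens to be on the list gets flagged, content unseen.

```swift
import Foundation
import CryptoKit

// Toy version of hash-list matching: a file's MD5 digest is compared against
// a set of "known" hashes. The entries below are placeholders, not the
// contents of any real list; the first one is simply the MD5 of empty data.
let knownHashes: Set<String> = [
    "d41d8cd98f00b204e9800998ecf8427e",
    "9e107d9d372bb6826bd81d3542a419d6"
]

func isFlagged(_ fileData: Data) -> Bool {
    let digest = Insecure.MD5.hash(data: fileData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

// A perfectly benign input is flagged purely because its digest is on the list:
print(isFlagged(Data()))   // true
```

The matcher never looks at what the file actually shows, which is how a fully clothed man holding a monkey ends up classified as CP.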

Several arguments have been made in this thread. I can't address them all; here's another quote from the link above on illegal searches:
Think of it this way: Your landlord owns your property, but in the United States, he cannot enter any time he wants. In order to enter, the landlord must have permission, give prior notice, or have cause. Any other reason is trespassing. Moreover, if the landlord takes anything, then it's theft. Apple's license agreement says that they own the operating system, but that doesn't give them permission to search whenever they want or to take content.
Food for thought.
Since you don’t, it looks like you are going down the Open-Source Linux rabbit-hole.
Maybe. As I said before, we'll have to see what happens. This whole thing is not going to fly well with anyone outside of the US. Whether it's activated on a per-country basis or included in the local OS release remains to be seen. One could easily install a foreign OS version, no matter where in the world one is.

I don't have a problem with Linux; it's really the UI experience that sets macOS apart. I'll certainly have a look at whether I can remove the functionality Apple implemented, and if that's the case, no worries. If not, then I'll just go somewhere else. macOS is already in a state that doesn't allow me to do part of my research/work, so it's not a big loss, I guess. A few years ago, I never thought I'd say such a thing. But times change.
 

Shadow Demon

macrumors member
Oct 24, 2018
92
236
Firstly, there’s absolutely no proof Apple is planning to implement end to end encryption for iCloud.

Secondly, Apple’s backdoor has access to your personal photos before they’re even encrypted, completely defeating the purpose of end to end encryption.

Thirdly, it is not necessary to scan photos on your own device to prevent storing illegal content on iCloud. They can scan it on the iCloud servers. Nobody had a problem with that and nobody asked them for this dangerous backdoor that literally opens the door for mass surveillance on a scale the world has never seen before.
First, it is my hope that it will be implemented.
Second, right now they have the encryption keys and can see all my photos. Whoo-hoo. With end-to-end encryption, they will have access to the checksums of my data. With a match, only a low-resolution version will be available, since if the picture makes it to iCloud it will be encrypted and not accessible.
Third, it is necessary because of #2 above.

So yes, this is a reasonable theory, and a whole lot more credible than the conspiracy theories in this thread. In any case, I am good; I trust Apple. As for the rest of you, you've got options.
 
  • Haha
Reactions: sorgo †

Maconplasma

Cancelled
Sep 15, 2020
2,489
2,215
So? I asked you because I wanted to know if what you said was true, as that would make a difference in what I think.

So I'll ask again: do you have any proof that they will never turn anyone in to the authorities? (Or even anything that says they won't!) I have no proof they will, but that's certainly the direction they are going with this -- I know that from how our government works. The people in charge of this state (SC) just love to legislate and prosecute morality faults that they perceive.
Here's your proof. Right on the front page of MR, spoken directly by Apple. Now will you please move on to someone else you're looking to call out?
 

Attachments

  • Screen Shot 2021-08-08 at 3.05.06 PM.png
  • Haha
Reactions: sorgo †

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
Here's your proof. Right on the front page of MR, spoken directly by Apple. Now will you please move on to someone else you're looking to call out?
That does not prove what you said. What you said is they would never report you, and this quote just says they won't report you if it doesn't match.
 

Khedron

Suspended
Sep 27, 2013
2,561
5,755
The link below is an excellent analysis. If you trust Apple to do exactly what they said and no more, then all is good. Until it can be proven otherwise, the rest is conspiracy theory.


If you get arrested then never ask for a lawyer. Only people with something to hide do that. Just give them all your passwords and biometric data. Ideally before they even ask for it. After all if you didn’t trust the police you would have already moved to another state or country right?
 

Maconplasma

Cancelled
Sep 15, 2020
2,489
2,215
That does not prove what you said. What you said is they would never report you, and this quote just says they won't report you if it doesn't match.
If what doesn't match? Where are you getting these words? Show me where it says any of that in the quote I uploaded. I don't see any of those words. Also, your tone is quite aggressive, like you just want to argue to be right. Calm down or move on to someone else.
 
Last edited:
  • Haha
Reactions: sorgo †

Maconplasma

Cancelled
Sep 15, 2020
2,489
2,215
I wasn't challenging any of your posts in the first place for you to come at me. So if you have any alternative point then post it, otherwise....
Also, Apple is not going to turn anyone in to the authorities. So if they flag pics on your account, how else would you know unless they informed you? Some logic on your part would greatly help.

That does not prove what you said. What you said is they would never report you, and this quote just says they won't report you if it doesn't match.
Just went back to confirm my own words. Didn't use the word "never." You're making stuff up. But I sincerely hope that if someone is committing a crime, you would want Apple to report them to the authorities? Or do you want Apple to remain quiet and keep that type of person on the streets? 🙄
 
  • Haha
Reactions: sorgo †

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
If what doesn't match? Where are you getting these words? Show me where it says any of that in the quote I uploaded. I don't see any of those words. Also, your tone is quite aggressive, like you just want to argue to be right. Calm down or move on to someone else.
The picture, duh. I was just expanding on something you didn't post that actually makes a difference. Look at that link a few posts up and read *all* of what Apple says.
 
  • Like
Reactions: sorgo †

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
Just went back to confirm my own words. Didn't use the word "never." You're making stuff up. But I sincerely hope that if someone is committing a crime, you would want Apple to report them to the authorities? Or do you want Apple to remain quiet and keep that type of person on the streets? 🙄
You implied never.
 

Maconplasma

Cancelled
Sep 15, 2020
2,489
2,215
You implied never.
Nah, I didn't. Implied and actual words are two different things. It's just what you wanted to believe in order to be right. No worries, I understand perfectly now that you don't want Apple to even report the wrongdoers (the ones committing crimes) just because you don't like their process. Well, you most certainly don't have to tolerate what they are doing. It's your right. I hear Android is the most secure and private system in the business. Check it out. ?
 

jonblatho

macrumors 68030
Jan 20, 2014
2,529
6,241
Oklahoma
You want they should just accept kiddie porn on their servers?
They have for years; it’s not like user-uploaded CSAM has suddenly become an issue in 2021. In fact, Apple’s generally been behind the curve in adopting measures to combat it. Plus, has any file/photo storage service provider that makes a persistent and good-faith effort to report and destroy CSAM upon discovery been successfully sued for this, or is there any U.S. legislation pending, which would hold Apple liable for user-uploaded CSAM?

By law, Apple must report CSAM to NCMEC upon discovery, but I’ve yet to see any requirement that they must scan user-uploaded content for it. Some take issue with server-side CSAM scanning; personally, I don’t. But when Apple has a history of reenabling iCloud features, including iCloud Photos, after users had disabled them, and virtually all images on iOS somehow flow through the Photos app, I start to wonder why the line’s getting blurred regarding what gets processed where. This appears to add nothing constructive in any fight against CSAM over what’s already being done by Apple and others.

You offer some conjecture that Apple may begin E2E encrypting iCloud Photos, and this feature would better allow them to do so. The latter is true, but the former brings its own risks aside from CSAM, most notably the prospect of unexpected data loss when something goes wrong with iCloud E2E encryption setup and the user is endlessly presented with the option to erase everything — now including their entire photo library. Even setting that aside, Apple has offered absolutely no indication that it intends to offer this feature. If they’d announced such functionality alongside on-device CSAM scanning or otherwise indicated that it’s on the way, I’d be more forgiving with this line of argument…but absent that, sorry, no.

And before you rattle on with some ad hominem drivel about how I don’t understand math, am a shill for an Apple competitor, and/or am secretly trafficking in CSAM: I make my living in a STEM field, minored in mathematics for my undergraduate studies, I own and enjoy numerous Apple products and services, the only non-Apple consumer electronics device I currently own is an LG TV which is used exclusively with an Apple TV, and my photos have been in iCloud Photos and scanned server-side for CSAM for the past couple of years and I’ve yet to receive a knock on the door from cops. I think I’m fine on all three counts.
 

Khedron

Suspended
Sep 27, 2013
2,561
5,755
They have for years; it’s not like user-uploaded CSAM has suddenly become an issue in 2021. In fact, Apple’s generally been behind the curve in adopting measures to combat it. Plus, has any file/photo storage service provider that makes a persistent and good-faith effort to report and destroy CSAM upon discovery been successfully sued for this, or is there any U.S. legislation pending, which would hold Apple liable for user-uploaded CSAM?

By law, Apple must report CSAM to NCMEC upon discovery, but I’ve yet to see any requirement that they must scan user-uploaded content for it. Some take issue with server-side CSAM scanning; personally, I don’t. But when Apple has a history of reenabling iCloud features, including iCloud Photos, after users had disabled them, and virtually all images on iOS somehow flow through the Photos app, I start to wonder why the line’s getting blurred regarding what gets processed where. This appears to add nothing constructive in any fight against CSAM over what’s already being done by Apple and others.

You offer some conjecture that Apple may begin E2E encrypting iCloud Photos, and this feature would better allow them to do so. The latter is true, but the former brings its own risks aside from CSAM, most notably the prospect of unexpected data loss when something goes wrong with iCloud E2E encryption setup and the user is endlessly presented with the option to erase everything — now including their entire photo library. Even setting that aside, Apple has offered absolutely no indication that it intends to offer this feature. If they’d announced such functionality alongside on-device CSAM scanning or otherwise indicated that it’s on the way, I’d be more forgiving with this line of argument…but absent that, sorry, no.

And before you rattle on with some ad hominem drivel about how I don’t understand math, am a shill for an Apple competitor, and/or am secretly trafficking in CSAM: I make my living in a STEM field, minored in mathematics for my undergraduate studies, I own and enjoy numerous Apple products and services, the only non-Apple consumer electronics device I currently own is an LG TV which is used exclusively with an Apple TV, and my photos have been in iCloud Photos and scanned server-side for CSAM for the past couple of years and I’ve yet to receive a knock on the door from cops. I think I’m fine on all three counts.

So far Facebook is reporting 20 million CSAM cases a year compared to Apple’s 200.

So basically they’ve been actively protecting paedophiles for years while they work on a system that only addresses CSAM as an excuse to enhance their power over users.
 
  • Wow
Reactions: sorgo †

CSoren

macrumors newbie
Aug 9, 2020
8
17
I really hate the concept -- it really doesn't change what I want to own or not, but any talk from now on about Apple being for privacy is just marketing talk to be ignored.
I'm still confused why people are making a big deal about this. There's no user data attached to the photos, and Apple is scanning them without using any information tied to you, so how is it a problem?
 
  • Haha
Reactions: sorgo †

Maconplasma

Cancelled
Sep 15, 2020
2,489
2,215
I'm still confused why people are making a big deal about this. There's no user data attached to the photos, and Apple is scanning them without using any information tied to you, so how is it a problem?
It's not a problem, at least not for innocent people. Some come here only to post the worst about Apple. Had this been Google doing this, there wouldn't be a discussion anywhere on the internet about it. Apple is still the top tech company when it comes to privacy, and anyone TRULY upset about this is perhaps exactly who Apple is targeting. Some people despise Apple just because the company is successful from the ground up and other companies constantly follow Apple rather than lead the industry. As soon as any news about Apple is posted, people take the opportunity to bash the company, even when it's really not a big deal. Same ole, same ole.
 

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
just because you don't like their process.
Yep, I don't like their process, and what it portends, and yes, that's my choice, and my right to complain.

I hear Android is the most secure and private system in the business. Check it out. ?
I have an active android phone too and know it quite well. I still prefer iOS, for now.

For inactive phones, I've had just about every phone OS. :) I'm an OS geek, what can I say -- it goes for phones as much as computers...
 

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
I'm still confused why people are making a big deal about this. There's no user data attached to the photos, and Apple is scanning them without using any information tied to you, so how is it a problem?
Because what you said isn't true. If the hash matches on the phone, it reports to Apple, along with your contact info and the picture(s) in question. Someone at Apple looks at it, including the picture, to see if it's a false positive -- if it isn't, in their opinion, it gets reported to the authorities, and your (iCloud) account gets disabled.
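Roughly, and in my own words rather than Apple's, the flow I'm describing looks like this Swift sketch. A plain SHA-256 digest stands in for Apple's perceptual NeuralHash, and every name here is made up.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch of the described flow: the hash comparison runs on the
// device before upload, and only a match produces a report that leaves the
// phone, carrying the account's contact info and the flagged picture(s) for
// a human reviewer. Not Apple's actual code; SHA-256 replaces NeuralHash here.
struct DeviceReport {
    let contactInfo: String
    let matchedImages: [Data]
}

func scanBeforeUpload(images: [Data],
                      knownHashes: Set<SHA256Digest>,
                      contactInfo: String) -> DeviceReport? {
    // Compare each photo's digest against the known-bad list on the device itself.
    let matches = images.filter { knownHashes.contains(SHA256.hash(data: $0)) }
    // Nothing matched: no report is generated at all.
    return matches.isEmpty ? nil : DeviceReport(contactInfo: contactInfo, matchedImages: matches)
}

// Example: these photos don't match anything, so no report leaves the device.
let knownBad: Set<SHA256Digest> = [SHA256.hash(data: Data("known-bad-example".utf8))]
let photos = [Data("vacation".utf8), Data("dog".utf8)]
print(scanBeforeUpload(images: photos, knownHashes: knownBad, contactInfo: "user@example.com") == nil)   // true
```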
 

Maconplasma

Cancelled
Sep 15, 2020
2,489
2,215
Because what you said isn't true. If the hash matches on the phone, it reports to Apple, along with your contact info and the picture(s) in question. Someone at Apple looks at it, including the picture, to see if it's a false positive -- if it isn't, in their opinion, it gets reported to the authorities, and your (iCloud) account gets disabled.
Where does Apple actually state this verbatim?
 

GrumpyCoder

macrumors 68020
Nov 15, 2016
2,127
2,707
Where does Apple actually state this verbatim?
Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images. Apple then manually reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC. If a user feels their account has been mistakenly flagged they can file an appeal to have their account reinstated.
Source: https://www.apple.com/child-safety/
 