
Will you leave the Apple ecosystem because of CSAM?


That does NOT mean the scan happens in transit, only that it's done on the ones marked for transit.
So you think even after the files are uploaded, your device is constantly going to be sniffing your photos and seeing if anything changed? Yeah, okay.

No, the hashing process is part of the iCloud upload pipeline, like Apple says. Therefore: no iCloud, no upload, no hashing, no vouchers.
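The gating being described ("no iCloud, no upload, no hashing, no vouchers") can be sketched in a few lines. This is a toy illustration, not Apple's actual code: every function name here is hypothetical, and the SHA-256 stand-in is nothing like NeuralHash, which is a perceptual hash robust to resizing and recompression.

```python
import hashlib

def perceptual_hash(photo: bytes) -> str:
    # Hypothetical stand-in for NeuralHash. A real perceptual hash maps
    # visually similar images to the same value; SHA-256 does not.
    return hashlib.sha256(photo).hexdigest()

def make_safety_voucher(digest: str) -> dict:
    # Hypothetical stand-in: the real voucher encrypts the match result
    # and a visual derivative so the server can't read it below threshold.
    return {"digest": digest}

def process_photo(photo: bytes, icloud_photos_enabled: bool):
    """Hashing runs only as part of the iCloud Photos upload pipeline."""
    if not icloud_photos_enabled:
        return None  # no hash computed, no voucher generated
    return make_safety_voucher(perceptual_hash(photo))
```

With iCloud Photos off, `process_photo` returns immediately and never touches the image, which is the behavior the white paper claims.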
 

Page 7

This feature runs exclusively as part of the cloud storage pipeline for images being uploaded to iCloud Photos and cannot act on any other image content on the device. Accordingly, on devices and accounts where iCloud Photos is disabled, absolutely no images are perceptually hashed. There is therefore no comparison against the CSAM perceptual hash database, and no safety vouchers are generated, stored, or sent anywhere.
That's not what the screenshot I posted said. The image I posted says that photos are scanned on device. Period. And the hash is stored "securely on the user's device".

Looks to me like Apple is talking out of both sides of their mouth.
 
That's not what the screenshot I posted said. The image I posted says that photos are scanned on device. Period. And the hash is stored "securely on the user's device".

Looks to me like Apple is talking out of both sides of their mouth.
I think you're just thinking of it the wrong way. No big deal though, you tried.
 
Serious questions regarding this line of thinking:

Since your photo will not be scanned unless you turn on iCloud Photos, isn't turning it on the same as 'making a choice'?

How is the choice being taken away if you 100% still have the choice to not turn on iCloud Photos and not have your pics scanned?

Whether the scanning is on your phone or in the cloud, the result is the same: your pics won't be scanned unless you have iCloud Photos turned on, so what is all this fuss about?

And I'm not looking for conspiracy-based answers. I would like an answer based on the available facts.
OK, the simple fact is that someone paid about $1,000 for a device that, at the point of purchase, they knew they could use with iCloud functionality in complete privacy.

That 100% privacy has now been removed, as they can't use the iCloud Photos functionality unless they agree to being scanned for CSAM; it's no longer the product you paid for.
 
OK, the simple fact is that someone paid about $1,000 for a device that, at the point of purchase, they knew they could use with iCloud functionality in complete privacy.

That 100% privacy has now been removed, as they can't use the iCloud Photos functionality unless they agree to being scanned for CSAM; it's no longer the product you paid for.
... or... just an idea here... don't have CSAM.
 

Page 7

This feature runs exclusively as part of the cloud storage pipeline for images being uploaded to iCloud Photos and cannot act on any other image content on the device. Accordingly, on devices and accounts where iCloud Photos is disabled, absolutely no images are perceptually hashed. There is therefore no comparison against the CSAM perceptual hash database, and no safety vouchers are generated, stored, or sent anywhere.
Page 12:

"To make sure Apple's servers do not have a count of matching images for users below the match threshold, the on-device matching process will, with a certain probability, replace a real safety voucher that's being generated with a synthetic voucher that only contains noise. This probability is calibrated to ensure the total number of synthetic vouchers is proportional to the match threshold. Crucially, these synthetic vouchers are a property of each account, not of the system as a whole. For accounts below the match threshold, only the user's device knows which vouchers are synthetic; Apple's servers do not and cannot determine this number, and therefore cannot count the number of true positive matches.

The code running on the device will never let Apple servers know the number of synthetic vouchers directly; this claim is subject to code inspection by security researchers like all other iOS device-side security claims. Only once an account exceeds the match threshold of true matches against the perceptual CSAM hash database can Apple servers decrypt the contents of the corresponding safety vouchers and obtain the exact number of true matches (always in excess of the match threshold), and the visual derivatives that correspond to those vouchers. In other words, even though the creation of synthetic vouchers is a statistical protection mechanism, it is not a traditional noise-based approach: under this protocol, it is impossible for servers to distinguish synthetic vouchers from real ones unless the number of true positive (non-synthetic) matches cryptographically exceeds the match threshold."

It may not be able to decrypt the vouchers, but it'll try once real + synthetic vouchers exceed the threshold. I still don't understand why they took this approach instead of running the comparison entirely on-server for photos that are shared. Why open Pandora's box when they already hold the decryption keys for iCloud Photos?
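As a toy model of the threshold scheme quoted above: all numbers and names here are hypothetical, and where the real protocol enforces the threshold with threshold secret sharing, this sketch models the gate with an explicit check.

```python
import random

THRESHOLD = 30  # hypothetical; Apple's published figure is on this order

def device_vouchers(true_matches: int, library_size: int = 1000,
                    synth_prob: float = 0.02, seed: int = 0) -> list:
    """Every upload gets a voucher; some non-matching photos also yield
    synthetic noise vouchers so the server can't count true matches."""
    rng = random.Random(seed)
    vouchers = [{"real": True} for _ in range(true_matches)]
    for _ in range(library_size - true_matches):
        if rng.random() < synth_prob:
            vouchers.append({"real": False})  # noise-only voucher
    return vouchers

def server_view(vouchers: list):
    """Stand-in for the server. In the real protocol the 'real' flag is
    encrypted, and decryption only becomes possible once enough true
    matches exist; here that cryptographic gate is an if-statement."""
    true_count = sum(v["real"] for v in vouchers)
    if true_count >= THRESHOLD:
        return true_count  # decryptable: exact count revealed
    return None  # below threshold: synthetic and real look identical
```

Below the threshold the server learns nothing about the split between real and synthetic vouchers; only above it does the exact true-match count (always in excess of the threshold) become readable.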
 
OK, the simple fact is that someone paid about $1,000 for a device that, at the point of purchase, they knew they could use with iCloud functionality in complete privacy.

That 100% privacy has now been removed, as they can't use the iCloud Photos functionality unless they agree to being scanned for CSAM; it's no longer the product you paid for.
Except the contents of the picture remain unknown to anyone, and nothing leaves the phone unless it is child porn, so that still seems like 100% privacy to me (except for a few hypothetical perverts with 30+ illegal images on their phone).
 
I posted all the facts. The part that isn't fact is the claim that Apple is going to switch scanning on full time whether you have iCloud enabled or not; that is not a fact at all.
Not at this point in time, but likely in a future update. Exposure Notifications only work if you have a State app installed. Oh wait...

From https://support.apple.com/en-us/HT210393
iOS 13.7 lets you opt-in to the COVID-19 Exposure Notifications system without the need to download an app.
 
Yep, I'll go back to only local storage. Then my only threat vectors are burglary and warrant-in-hand LEOs.

My only real concern is getting my info out of the hands of corporations. We’ve never been able to stop the government, but we can stop for-profit companies from collecting, trafficking, buying and selling our data.
 
... or... just an idea here... don't have CSAM.
Oh FFS... I don't believe anyone here is truly worried about being caught with CP on their phones.

It's the PRINCIPLE of the situation. This is SURVEILLANCE. Plain and simple. 100%. No other word for it.

Let's replace the word CSAM in your sentence with things that are illegal in other countries (or even the U.S.):
"... or... just an idea here... don't have any gay photos" (Russia, Nigeria, Jamaica)
"... or... just an idea here... don't have any pictures of anal sex with your wife" (many states in the US)
"... or... just an idea here... don't have any pot in your house" (still a federal offense)
"... or... just an idea here... don't have any pictures of Winnie the Pooh" (China)
"... or... just an idea here... don't have any pictures of the Dalai Lama" (China)
"... or... just an idea here... don't have any Soviet Union memes" (China)
"... or... just an idea here... don't have any pictures of the Sistine Chapel" (Vatican City, Rome)
"... or... just an idea here... don't have any photos inside Westminster Abbey" (UK)
"... or... just an idea here... don't have any photos of the Eiffel Tower after dark" (France)

Do you see how simple it is to have illegal photos on your phone?
 
Oh FFS... I don't believe anyone here is truly worried about being caught with CP on their phones.

It's the PRINCIPLE of the situation. This is SURVEILLANCE. Plain and simple. 100%. No other word for it.

Let's replace the word CSAM in your sentence with things that are illegal in other countries (or even the U.S.):
"... or... just an idea here... don't have any gay photos" (Russia, Nigeria, Jamaica)
"... or... just an idea here... don't have any pictures of anal sex with your wife" (many states in the US)
"... or... just an idea here... don't have any pot in your house" (still a federal offense)
"... or... just an idea here... don't have any pictures of Winnie the Pooh" (China)
"... or... just an idea here... don't have any pictures of the Dalai Lama" (China)
"... or... just an idea here... don't have any Soviet Union memes" (China)
"... or... just an idea here... don't have any pictures of the Sistine Chapel" (Vatican City, Rome)
"... or... just an idea here... don't have any photos inside Westminster Abbey" (UK)
"... or... just an idea here... don't have any photos of the Eiffel Tower after dark" (France)

Do you see how simple it is to have illegal photos on your phone?

There's no evidence that what you say will happen. Literally none. Speculate all you want. It's just your crazy theory.
 
I wouldn't say it's likely. I'd say that's purely speculation.
Yes, but we were here arguing the same semantics a year ago over contact tracing. "No need to worry. It's only an API and requires a 3rd party app to function". I was here trying to warn folks that the API was only a start. Well, 13.7 comes along and guess who was right.
 
Yes, but we were here arguing the same semantics a year ago over contact tracing. "No need to worry. It's only an API and requires a 3rd party app to function". I was here trying to warn folks that the API was only a start. Well, 13.7 comes along and guess who was right.
And it's opt-in right?
 
Serious questions regarding this line of thinking:

Since your photo will not be scanned unless you turn on iCloud Photos, isn't turning it on the same as 'making a choice'?

How is the choice being taken away if you 100% still have the choice to not turn on iCloud Photos and not have your pics scanned?

Whether the scanning is on your phone or in the cloud, the result is the same: your pics won't be scanned unless you have iCloud Photos turned on, so what is all this fuss about?

And I'm not looking for conspiracy-based answers. I would like an answer based on the available facts.

Maybe.

More than a few times I've had settings change as the result of an OS update, normal and beta alike.
Not an uncommon occurrence.
 
More than a few times I've had settings change as the result of an OS update, normal and beta alike.
Not an uncommon occurrence.
I agree with you on this. That should not happen. If you have a setting turned off, it should not turn itself back on after an update. Windows does the same thing, and it's annoying as hell.

Heck, Windows will even delete software it deems unnecessary, even when it's crucial software we need for work.
 