
Do you feel "What happens on your iPhone stays on your iPhone" is false advertising?

  • Yes: 157 votes (68.0%)
  • No: 74 votes (32.0%)
  • Total voters: 231
Nope, the announcement this week was that the scanning will happen on your device. But, yes, it only happens if you have iCloud Photos turned on. That's simply a policy decision though, and not some sort of technical limitation; it could change at any time to run whether you're using iCloud or not.

Very little on the iPhone is a technical limitation. Apple can almost always just change the software.
 
Consider this: should the makers of indoor security cameras for consumer home use, like Logitech, Eufy, Ring, etc., be able to monitor those cameras for instances of domestic violence, illegal drug use, child abuse, etc., because the video is stored on their servers via a cloud service? That will be the next step.

It wouldn't be illegal to report it to the police.
What happened to the presumption of innocence?

You are presumed innocent until the detection system says you have too many matching photos and a manual review at Apple agrees with the system's assessment.
 
It is false advertising, and I would not be surprised if a lawsuit comes in the future. Hopefully, it goes to the Supreme Court.

"At Apple, we believe privacy is a fundamental human right"? If that's the case, why scan an iPhone without the user's authorization?

Apple will ask for your authorisation. If you deny, you will not be able to use iCloud Photo Library.
 
"Due to an alarm for flagged photos in your library, we had to review the following pictures (including copies of 35 very private photos of your wife). Unfortunately, they matched the hashes in our database, which usually only happens for one in a trillion photos, so we checked them and found them legit. We have added a supplementary hidden stamp to them to avoid further checks. They are safe to use now."

It's 1 in 1 trillion accounts per year, not photos.

Also Apple will only review the matching photos, not every photo in your iCloud Photo Library.
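
For intuition, here is a rough back-of-the-envelope sketch of why a match threshold drives the per-account false-flag rate so low. The per-photo false-match rate and library size below are made-up assumptions for illustration, not Apple's published parameters; only the roughly-30-match review threshold echoes the figure discussed in this thread.

```python
# Back-of-the-envelope sketch of why a match threshold matters.
# ASSUMPTIONS (illustrative only, not Apple's published numbers):
#   p = per-photo false-match probability against the hash database
#   n = photos uploaded to iCloud Photos by one account in a year
from math import comb

p = 1e-6          # assumed per-photo false-match rate (hypothetical)
n = 10_000        # assumed photos per account per year (hypothetical)
threshold = 30    # roughly the match count mentioned for triggering review

# Dominant term of the binomial tail: the probability of exactly `threshold`
# independent false matches. Larger counts are rarer still, so this gives an
# order-of-magnitude estimate of P(at least `threshold` false matches).
p_flag = comb(n, threshold) * p**threshold * (1 - p) ** (n - threshold)
print(f"chance of a false flag with these toy numbers: ~{p_flag:.1e}")
```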
 
So let's say I am a dictator. I want everyone to drink cola; people who drink just plain water are subhuman to me. I know you are a political opponent, or whatever, and I know this because you went public with your disagreement in a viral video. I take your picture/video screenshot and/or your voice from the interview, make a hash of it, and Apple will tell me if you are one of their users, and if you are, what your name is, where you live, and so on.

First, NeuralHash doesn't work on audio. It could work for video, but it's very hard to get these algorithms to work effectively for both still images and a series of still images; they would probably have to create a modified version for video. For audio they would need a different neural network algorithm entirely.

The CSAM detection system is extremely poor at discovering people who drink water. You can't just hash a photo of someone drinking water and tell the system to catch similar photos.

Let's say there exist 10 million images of people drinking water, holding water bottles, etc. This dictator gets someone to take 100 such pictures of water drinking and forces Apple to put those 100 hashes into its database.

The probability that any of the 10 million images will match those 100 is pretty small, since the system was designed to match near-duplicates of specific images, not merely similar pictures.
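
To make that concrete, here is a toy sketch of how hash-based matching behaves. It uses a crude hand-rolled "average hash", not NeuralHash, and the distance cutoff is an arbitrary assumption; the point is only that a photo is flagged when it is a near-duplicate of one specific database image, not when it merely shows the same kind of scene.

```python
# Toy perceptual-hash matcher (a crude "average hash", NOT NeuralHash).
# It illustrates the matching logic: a photo is flagged only when its hash
# is within a few bits of the hash of one specific database image, i.e. a
# near-duplicate of that exact picture, not the same general subject.

def average_hash(pixels_8x8):
    """pixels_8x8: 64 grayscale values (0-255) from a downscaled image.
    Returns a 64-bit hash: one bit per pixel, set if above the mean."""
    mean = sum(pixels_8x8) / len(pixels_8x8)
    bits = 0
    for value in pixels_8x8:
        bits = (bits << 1) | (1 if value > mean else 0)
    return bits

def hamming_distance(a, b):
    return bin(a ^ b).count("1")

def is_match(photo_hash, database_hashes, max_distance=5):
    """Flag only near-duplicates: hashes differing in at most a few bits."""
    return any(hamming_distance(photo_hash, h) <= max_distance
               for h in database_hashes)

# Two unrelated photos of "someone drinking water" produce very different
# pixel grids, so their hashes typically differ in roughly half of their
# 64 bits and neither one matches a database built from the other.
```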
 
That is wrong, and you know it.

If iPhone users were presumed innocent, the assessment would not be carried out in the first place.
Users ARE assumed guilty until the assessment proves otherwise.

Well, if being presumed innocent means no check may ever be performed, then you're not presumed innocent in plenty of other situations in life either:

* You have to prove you should be allowed to enter the US by having someone check your passport, ID or similar
* You have to prove you are over 21 to be served alcohol by showing ID
* You have to prove you are allowed to drive when renting a car by having someone check your driver's license
* Your belongings have to be searched to prove that you aren't in possession of something belonging to your employer
 
The difference is: if they are doing it on their servers, I cannot stop them from doing it. That is OK with me; I do not have to use their server services.

But now they are scanning "on device" as a default activity and retrieving my private things if a certain threshold is triggered. So my privacy is gone. If my government decides to introduce an "on device" scan regarding political activities, I am lost...

You can't really stop Apple from doing it on your phone either. They could at any moment retrieve everything on your phone using existing code already on your iPhone, or by making a very small change to it.
 
Well, if being presumed innocent means no check may ever be performed, then you're not presumed innocent in plenty of other situations in life either:

* You have to prove you should be allowed to enter the US by having someone check your passport, ID or similar
* You have to prove you are over 21 to be served alcohol by showing ID
* You have to prove you are allowed to drive when renting a car by having someone check your driver's license
* Your belongings have to be searched to prove that you aren't in possession of something belonging to your employer
Yeah, if everyone were always presumed innocent, there wouldn't be any safety checks at all.
 
Well, if being presumed innocent means no check may ever be performed, then you're not presumed innocent in plenty of other situations in life either:

* You have to prove you should be allowed to enter the US by having someone check your passport, ID or similar
* You have to prove you are over 21 to be served alcohol by showing ID
* You have to prove you are allowed to drive when renting a car by having someone check your driver's license
* Your belongings have to be searched to prove that you aren't in possession of something belonging to your employer
Wrong again. Nothing on that bullet-pointed list applies when I'm in MY home. Plus, showing ID is not being searched. Totally different thing.
 
You can't really stop Apple from doing it on your phone either. They could at any moment retrieve everything on your phone using existing code already on your iPhone, or by making a very small change to it.
They can do a lot of things. I cannot control that, but it might be discovered that they are cheating me.

With iOS 15 they are communicating that EVERY picture I upload to my iCloud Photo Library is scanned and maybe reviewed by default. I am assumed guilty and Apple Control is checking, because Apple Control assumes I am a liar.
 
They can do a lot of things. I cannot control that, but it might be discovered that they are cheating me.

With iOS 15 they are communicating that EVERY picture I upload to my iCloud Photo Library is scanned and maybe reviewed by default. I am assumed guilty and Apple Control is checking, because Apple Control assumes I am a liar.
I think it has nothing to do with you. They probably legally have to scan for CSAM since EVERY major company does it.
 
I think it has nothing to do with you. They probably legally have to scan for CSAM since EVERY major company does it.

Then they should do it on the server.

Nobody at Apple has confirmed that E2EE for iCloud is coming and that it's part of the story here.

Bloggers/vloggers/podcasters don't count.

The CSAM on-device scanning rollout needs to be stopped for now, pending more discussion, debate, and explanation.
There's no rush. Apple has been largely ignoring CSAM for years.

Not to mention, such a substantial change shouldn't just come out of left field right before major new OS versions ship with it "in there". Not cool. You lose trust, big time, doing things this way.
 
That is utter and complete BS. That's like saying if a business monitors their security cameras for illegal activity and reports it to the police, they are "acting as an agent of law enforcement." No, they're simply reporting a crime they've observed.
No, that’s more like whoever built and sold you your house bugging it to monitor for a particular illegal activity, and then reporting you to the police when they find something.

There seem to be two groups of people regarding this issue. There are those who think only in the present and believe you have nothing to worry about as long as you don’t download CSAM, and there are those who look ahead and realize what this can turn into, and that’s what we’re worried about.
 
No, that’s more like whoever built and sold you your house bugging it to monitor for a particular illegal activity, and then reporting you to the police when they find something.

Not even close. Apple is not wanting to know what's on YOUR phone - they're wanting to know whether you upload illegal images to THEIR servers.

There seem to be two groups of people regarding this issue. There are those who think only in the present and believe you have nothing to worry about as long as you don’t download CSAM, and there are those who look ahead and realize what this can turn into, and that’s what we’re worried about.

Slippery slope fallacy.
 
Not even close. Apple is not wanting to know what's on YOUR phone - they're wanting to know whether you upload illegal images to THEIR servers.



Slippery slope fallacy.
That is completely incorrect. If all Apple cared about was whether you were uploading the images to their servers, then they would simply remove the images and prevent you from using their services so you can’t do it again. No, they’re reporting you to the authorities if they find something because they do care about what’s on your phone.
 
That is completely incorrect. If all Apple cared about was whether you were uploading the images to their servers, then they would simply remove the images and prevent you from using their services so you can’t do it again. No, they’re reporting you to the authorities if they find something because they do care about what’s on your phone.

That theory holds no water; otherwise the scanning process wouldn't be tied exclusively to iCloud Photos. If you don't use iCloud for photos, you could have 100,000 CSAM images on your phone and Apple would be none the wiser.
 
That is completely incorrect. If all Apple cared about was whether you were uploading the images to their servers, then they would simply remove the images and prevent you from using their services so you can’t do it again. No, they’re reporting you to the authorities if they find something because they do care about what’s on your phone.
What is the likelihood that a collection of innocent images in your iCloud library matches a collection of images stored in the known CSAM database? 1 in 1 trillion. That scares you? You have a better chance of being arrested for literal murder while innocent.
 
What is the likelihood that a collection of innocent images in your iCloud library matches a collection of images stored in the known CSAM database? 1 in 1 trillion. That scares you? You have a better chance of being arrested for literal murder while innocent.
I addressed this a few posts up. Nobody is worried about being falsely flagged.
 
You fall into the first group.
If you're not falsely flagged, then all of your things remain private. What's the issue here? Are you afraid you'll be that 1 in a trillion who somehow has 30+ innocent photos that just so happen to match 30 photos in a known CSAM database and then has those 30 photos reviewed by Apple? Then, if they truly are innocent photos, literally nothing happens.
 