
Do you feel "What happens on your iPhone stays on your iPhone" is false advertising?

  • Yes: 157 votes (68.0%)
  • No: 74 votes (32.0%)
  • Total voters: 231

hans1972

Suspended
Nope, the announcement this week was that the scanning will happen on your device. But yes, it only happens if you have iCloud Photos turned on. That's simply a policy decision, though, not some sort of technical limitation; it could change at any time to run whether you're using iCloud or not.

Very little on the iPhone is a technical limitation. Apple can almost always just change the software.
 

hans1972

Suspended
Consider this: should the makers of indoor security cameras for consumer home use (Logitech, Eufy, Ring, etc.) be able to monitor those cameras for instances of domestic violence, illegal drug use, child abuse, etc., because the video is stored on their servers via a cloud service? That will be the next step.

It wouldn't be illegal to report it to the police.
What happened to the presumption of innocence?

You are presumed innocent until the detection says you have too many matching photos and a manual review at Apple agrees with the system's assessment.
 

hans1972

Suspended
It is false advertising, and I would not be surprised if a lawsuit comes at some point in the future. Hopefully, it goes to the Supreme Court.

"At Apple, we believe privacy is a fundamental human right"? If that's the case, why scan an iPhone without the user's authorization?

Apple will ask for your authorization. If you decline, you will not be able to use iCloud Photo Library.
 

hans1972

Suspended
Due to an alert for flagged photos in your library, we had to review the following pictures (including copies of 35 very private photos of your wife). Unfortunately, they matched the hashes in our database, which usually happens for only one in a trillion photos, so we checked them and found them legit. We have added a supplementary hidden stamp to them to avoid further review. They are safe to use now.

It's 1 in 1 trillion accounts per year, not photos.

Also, Apple will only review the matching photos, not every photo in your iCloud Photo Library.
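A rough sketch of that claimed flow. All the type and function names here are made up for illustration, not Apple's API; in the real design, private set intersection and threshold secret sharing keep match results cryptographically hidden from both the device and Apple until the per-account threshold is crossed:

[CODE]
// Illustrative model only; every name below is hypothetical.
struct SafetyVoucher {
    let photoID: String
    let isMatch: Bool   // cryptographically hidden below the threshold
}

let reviewThreshold = 30   // Apple's stated initial match threshold

func photosEligibleForHumanReview(_ vouchers: [SafetyVoucher]) -> [String] {
    let matched = vouchers.filter { $0.isMatch }
    // Below the threshold, no vouchers can be decrypted or reviewed.
    guard matched.count >= reviewThreshold else { return [] }
    // Even above it, only the matching photos become reviewable,
    // never the rest of the library.
    return matched.map { $0.photoID }
}
[/CODE]

Nothing in this sketch argues the policy is right or wrong; it just mirrors the two properties Apple has documented: a threshold before any review, and review limited to matches.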
 
  • Like
Reactions: dk001

hans1972

Suspended
So let's say I am a dictator. I want everyone to drink cola; people who drink plain water are subhuman to me. I know you are a political opponent, or whatever, and I know this because you went public with your disagreement in a viral video. I take your picture, a video screenshot, and/or your voice from the interview, make a hash file of it, and Apple will tell me if you are one of their users and, if you are, what your name is, where you live, and so on.

First, NeuralHash doesn't work on audio. It could work for video, but it's very hard to get these algorithms to work effectively for both still images and a series of still images; they would probably have to create a modified version for video. For audio, they would need a different neural network algorithm.

The CSAM detection system is extremely poor at discovering people who drink water. You can't just hash a photo of someone drinking water and tell the system to catch similar photos; it only matches near-duplicates of specific known images, not a general category of content.

Let's say there exist 10 million images of people drinking water, holding water bottles, etc. This dictator gets someone to take 100 such pictures of water drinking and forces Apple to put those 100 hashes into their database.

The probability that any of the 10 million images will match those 100 is pretty small, since the system was designed to avoid flagging merely similar pictures.
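To see why, here is a minimal sketch using an average hash as a simplified stand-in for NeuralHash (whose actual network isn't public): a perceptual hash only lands within the match threshold for near-duplicates of the same source image, so two different photos of the same kind of scene produce unrelated hashes.

[CODE]
// Simplified stand-in for a perceptual hash; NOT Apple's NeuralHash.
// Input is assumed to be 64 grayscale values (an 8x8 thumbnail).
func averageHash(_ pixels: [UInt8]) -> UInt64 {
    precondition(pixels.count == 64, "expected an 8x8 thumbnail")
    let mean = pixels.reduce(0) { $0 + Int($1) } / 64
    var hash: UInt64 = 0
    for (i, p) in pixels.enumerated() where Int(p) > mean {
        hash |= UInt64(1) << i   // set bit i when pixel is brighter than average
    }
    return hash
}

// Number of differing bits between two hashes.
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}

// A "match" requires the hashes to be nearly identical bit for bit,
// which in practice only happens for copies of the same image.
func matches(_ a: UInt64, _ b: UInt64, threshold: Int = 4) -> Bool {
    hammingDistance(a, b) <= threshold
}
[/CODE]

Two unrelated photos of someone drinking water will typically differ in dozens of the 64 bits, far past any sane threshold. That's the point: a database entry catches copies of one specific image, not the subject matter.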
 

hans1972

Suspended
That is wrong, and you know it.

If iPhone users were presumed innocent, the assessment would not be carried out in the first place.
Users ARE assumed guilty until the assessment proves otherwise.

Well, if being presumed innocent meant no check could ever be required, then you would not be presumed innocent in plenty of other situations in life either:

* You have to prove you should be allowed to enter the US by having someone check your passport, ID, or similar
* You have to prove you are over 21 to be served alcohol by showing ID
* You have to prove you are allowed to drive when renting a car by having someone check your driver's license
* Your belongings may be searched to prove you aren't in possession of something belonging to your employer
 

hans1972

Suspended
The difference is: if they are doing it on their servers, I cannot stop them from doing it. That is OK with me; I do not have to use their server services.

But now they are scanning "on device" as a default activity and retrieving my private things if a certain value is triggered. So my privacy is gone. If my government decides to introduce an "on device" scan regarding political activities, I am lost...

You can't really stop Apple from doing it on your phone either. They could at any moment retrieve everything on your phone using existing code already on your iPhone, or by making a very small change to it.
 

Jayson A

macrumors 68030
Well, if being presumed innocent meant no check could ever be required, then you would not be presumed innocent in plenty of other situations in life either:

* You have to prove you should be allowed to enter the US by having someone check your passport, ID, or similar
* You have to prove you are over 21 to be served alcohol by showing ID
* You have to prove you are allowed to drive when renting a car by having someone check your driver's license
* Your belongings may be searched to prove you aren't in possession of something belonging to your employer
Yeah, if everyone were always presumed innocent, there wouldn't be any safety checks at all.
 

09872738

Cancelled
Well, if being presumed innocent meant no check could ever be required, then you would not be presumed innocent in plenty of other situations in life either:

* You have to prove you should be allowed to enter the US by having someone check your passport, ID, or similar
* You have to prove you are over 21 to be served alcohol by showing ID
* You have to prove you are allowed to drive when renting a car by having someone check your driver's license
* Your belongings may be searched to prove you aren't in possession of something belonging to your employer
Wrong again. Not one thing on that bulleted list applies when I'm in MY home. Plus, showing ID is not being searched. That's a totally different thing.
 

axantas

macrumors 65816
You can't really stop Apple from doing it on your phone either. They could at any moment retrieve everything on your phone using existing code already on your iPhone or making a very small change to it.
They can do a lot of things. I cannot control it, but it might be discovered that they are cheating me.

With iOS 15 they are communicating that EVERY picture I upload to my iCloud Photo Library is scanned and maybe reviewed by default. I am assumed guilty, and Apple Control is checking it because Apple Control assumes I am a liar.
 

Jayson A

macrumors 68030
They can do a lot of things. I cannot control it, but it might be discovered that they are cheating me.

With iOS 15 they are communicating that EVERY picture I upload to my iCloud Photo Library is scanned and maybe reviewed by default. I am assumed guilty, and Apple Control is checking it because Apple Control assumes I am a liar.
I think it has nothing to do with you. They probably legally have to scan for CSAM since EVERY major company does it.
 
  • Haha
Reactions: dk001

turbineseaplane

macrumors P6
I think it has nothing to do with you. They probably legally have to scan for CSAM since EVERY major company does it.

Then they should do it on the server.

Nobody at Apple has confirmed that E2EE for iCloud is coming, or that it's part of the story here.

Bloggers/vloggers/podcasters don't count.

The on-device CSAM scanning rollout needs to be stopped for now, pending more discussion, debate, and explanation.
There's no rush. Apple has been largely ignoring CSAM for years.

Not to mention, such a substantial change shouldn't just come out of left field right before major new OS versions release with it "in there." Not cool. You lose trust, big time, doing things this way.
 

Channan

macrumors 68030
That is utter and complete BS. That's like saying if a business monitors its security cameras for illegal activity and reports it to the police, it is "acting as an agent of law enforcement." No, it's simply reporting a crime it observed.
No, that's more like whoever built and sold you your house bugging it to monitor for a particular illegal activity, and then reporting you to the police when they find something.

There seem to be two groups of people on this issue. There are those who think only in the present and believe you have nothing to worry about as long as you don't download CSAM, and there are those who look ahead and realize what this can turn into. The latter is what we're worried about.
 

usagora

macrumors 601
No, that's more like whoever built and sold you your house bugging it to monitor for a particular illegal activity, and then reporting you to the police when they find something.

Not even close. Apple doesn't want to know what's on YOUR phone; they want to know whether you upload illegal images to THEIR servers.

There seem to be two groups of people on this issue. There are those who think only in the present and believe you have nothing to worry about as long as you don't download CSAM, and there are those who look ahead and realize what this can turn into. The latter is what we're worried about.

Slippery slope fallacy.
 

Channan

macrumors 68030
Not even close. Apple doesn't want to know what's on YOUR phone; they want to know whether you upload illegal images to THEIR servers.



Slippery slope fallacy.
That is completely incorrect. If all Apple cared about was whether you were uploading the images to their servers, then they would simply remove the images and prevent you from using their services so you can’t do it again. No, they’re reporting you to the authorities if they find something because they do care about what’s on your phone.
 

usagora

macrumors 601
That is completely incorrect. If all Apple cared about was whether you were uploading the images to their servers, then they would simply remove the images and prevent you from using their services so you can’t do it again. No, they’re reporting you to the authorities if they find something because they do care about what’s on your phone.

That theory holds no water; otherwise, the scanning process wouldn't be tied exclusively to iCloud Photos. If you don't use iCloud for photos, you could have 100,000 CSAM images on your phone and Apple would be none the wiser.
 

Jayson A

macrumors 68030
That is completely incorrect. If all Apple cared about was whether you were uploading the images to their servers, then they would simply remove the images and prevent you from using their services so you can’t do it again. No, they’re reporting you to the authorities if they find something because they do care about what’s on your phone.
What is the likelihood of a collection of innocent images in your iCloud library exactly matching a collection of images in the known CSAM database? 1 in 1 trillion per account per year. That makes you scared? You have a better chance of being arrested for literal murder while innocent.
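As a back-of-envelope check on why a match threshold makes accidental flagging so unlikely (the per-image false-match rate below is a made-up assumption for illustration, not a published Apple figure):

[CODE]
import Foundation

let perImageFalseMatchRate = 1e-6   // hypothetical, for illustration only
let librarySize = 10_000            // photos in the account
let threshold = 30                  // Apple's stated initial threshold

// log of the binomial coefficient C(n, k), via log-gamma to avoid overflow
func logChoose(_ n: Int, _ k: Int) -> Double {
    lgamma(Double(n) + 1) - lgamma(Double(k) + 1) - lgamma(Double(n - k) + 1)
}

// Probability of exactly `threshold` independent false matches,
// the dominant term of the tail probability at these rates.
let logP = logChoose(librarySize, threshold)
    + Double(threshold) * log(perImageFalseMatchRate)
    + Double(librarySize - threshold) * log(1 - perImageFalseMatchRate)

print("log10(P) ≈ \(logP / log(10))")   // about -92: effectively impossible
[/CODE]

Even with a per-image false-match rate far worse than anything Apple has claimed, requiring 30 independent hits in one library drives the compound probability to absurdity.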
 

Channan

macrumors 68030
What is the likelihood of a collection of innocent images in your iCloud library exactly matching a collection of images in the known CSAM database? 1 in 1 trillion per account per year. That makes you scared? You have a better chance of being arrested for literal murder while innocent.
I addressed this a few posts up. Nobody is worried about being falsely flagged.
 
  • Like
Reactions: Jemani

Jayson A

macrumors 68030
You fall into the first group.
If you're not falsely flagged, then all of your things remain private. What's the issue here? Are you afraid you'll be that 1 in a trillion who somehow has 30+ innocent photos that just so happen to match 30 photos in a known CSAM database, and then has those 30 photos reviewed by Apple? Even then, if they truly are innocent photos, literally nothing happens.
 