
Do you feel "What happens on your iPhone stays on your iPhone" is false advertising?

  • Yes
    Votes: 157 (68.0%)
  • No
    Votes: 74 (32.0%)
  • Total voters: 231

dszakal

macrumors member
Original poster
Aug 22, 2020
33
266
Let's see the real size of that "screeching minority," and let's see what percentage believes that "1 in 3 trillion false positives" and "slightly edited images are also detected" can both be true at the same time.
 
  • Haha
Reactions: alpi123

Cobold

macrumors 6502a
Sep 16, 2014
817
1,179
Dieburg, Germany
I still feel it's true, because scanning of photos is done in iCloud and only applies if you use iCloud Photos and thus upload your photos from your device to the cloud (where they're scanned).

No scanning on your device - so still true advertising in my opinion.
 

rorschach

macrumors 68020
Jul 27, 2003
2,299
1,977
I still feel it's true, because scanning of photos is done in iCloud and only applies if you use iCloud Photos and thus upload your photos from your device to the cloud (where they're scanned).

No scanning on your device - so still true advertising in my opinion.
Nope, the announcement this week was that the scanning will happen on your device. But, yes, it only happens if you have iCloud Photos turned on. That's simply a policy decision, though, not some sort of technical limitation; it could change at any time to run whether you're using iCloud or not.
 

MrTSolar

macrumors 6502
Jun 8, 2017
369
444
It’s a connected device, so of course not. What happens on my iPod nano stays there until I plug it into iTunes. Then, diagnostic logs and song plays are synced back and updated in my main library.

My read of the documents suggests the scanning is done by your phone and the results are then uploaded to iCloud. It hasn't been said outright that this only applies if you use iCloud Photos. Apple's wording just assumes that you do (like the techs at their stores just assume you have an iCloud backup). Apple, see my signature?
 
  • Like
Reactions: retta283

usagora

macrumors 601
Nov 17, 2017
4,869
4,456
Of course it's still true. Apple cannot see any of the scanning info on your phone. The only time they would ever see any scan results is if you have multiple CSAM images on your device and you attempt to upload those to iCloud. You never had any true privacy on iCloud to begin with, as Apple can access all your files there if they so desire. Read the iCloud legal document on Apple's website.

Ironically, things are now going to be even more private than before, yet many of you are acting like this is a step backwards because, again, you are interpreting things through a distorted lens.
 
Last edited by a moderator:

Mega ST

macrumors 6502
Feb 11, 2021
370
512
Europe
For the time being, I have disabled auto-update until I better understand what this next update is about. And I will not update if I don't feel well enough informed to decide.

I oppose child porn, but why should everybody's phone be inspected for it, and not just the small number of idiots' phones? What is the legal basis for this? And I am totally confused by this "coded data sent to my phone" thing needed to check whether my photos are on child porn lists. They want to send coded child porn to my phone? Never!
This is really creating a very basic trust issue for me with Apple and its privacy promises.

Again, I have nothing to hide, but that is no reason to access my data just because they feel like it. There is a legal system with laws and institutions to protect us from crime. Apple is not required to be another self-declared morality police, or whatever they want to be now. And it opens many backdoors to search for other things.

This feels like the wrong way to go and needs a lot of debate before being launched.
 
Last edited:

usagora

macrumors 601
Nov 17, 2017
4,869
4,456
For the time being I disabled my auto updater until I understand better what this next update is about. And I will not update if I don't feel well enough informed to decide.

I am opposing child porn but why should everybody's phone be inspected for it not just the small number of idiot's phones? What is the legal base for this? And I am totally confused by this "coded data send to my phone" thing needed to compare if my photos are on child porn lists. They want to send coded child porn to my phone? Never!
This is really creating some very basic issue for me towards Apple and trusting them for privacy.

Again I have nothing to hide but this is no reason to access my data just because they feel like it.

You are confused. In essence, they are comparing the digital "fingerprints" of known CSAM images (NOT the visible images themselves) against fingerprints of the images on your phone. And this is all happening on your device, so Apple can't see it. The only time Apple will know anything is if you have multiple CSAM images on your phone that are detected and you then try to upload them to iCloud.
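(Purely as an illustration of the "fingerprint, not the photo" idea: the toy Python below uses an ordinary cryptographic hash and made-up placeholder data. Apple's actual system uses a perceptual hash, NeuralHash, that tolerates small edits, plus cryptography that isn't shown here. The only point is that what gets compared is a derived fingerprint, never the visible image.)

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Reduce an image to a fixed-size fingerprint; the pixels themselves are never kept."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of fingerprints of known images (placeholder bytes, for illustration).
KNOWN_FINGERPRINTS = {fingerprint(b"placeholder bytes of a known image")}

def is_known(image_bytes: bytes) -> bool:
    """Membership test against the fingerprint set, not a visual inspection of the photo."""
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS

print(is_known(b"bytes of an ordinary holiday photo"))  # False: no match, nothing to report
```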

Also, you are naïve/sheltered if you think we're talking about a "small number of idiots." CSAM is absolutely RAMPANT, and plenty of the people trading it are iPhone users.
 

Mega ST

macrumors 6502
Feb 11, 2021
370
512
Europe
I have no child porn photos, never have, and that statement should be enough. As an innocent citizen, I don't need to prove it by giving them constant private access to my phone. This entire approach lacks respect in many ways. They sell hardware and software; that alone would NOT entitle them to come to my home and inspect my books for comparison. If a judge reviews the matter and finally decides I own something forbidden, the authorities would be free to act, but not Apple.
 

dk001

macrumors demi-god
Oct 3, 2014
11,141
15,494
Sage, Lightning, and Mountains
Of course it's still true. Apple cannot see any of the scanning info on your phone. The only time they would ever see any scan results is if you have multiple CSAM images on your device and you attempt to upload those to iCloud. You never had any true privacy on iCloud to begin with, as Apple can access all your files there if they so desire. Read the iCloud legal document on Apple's website.

I understand the technology is complex, but some of you really lack basic reading comprehension skills. Either that, or you’re so eager to find evidence of conspiracy or corporate overreach, that you interpret everything through that distorted lens.

ironically, things are now going to be even more private than before, yet many of you are acting like this is a step backwards, because, again, you are interpreting things through a distorted lens.

Sorry for the laugh but you are talking A and many of us are talking B.
Two major concerns:
1. Since when has Apple become a LEO? We could even go down the whole rabbit hole about $$ recomp via marketing for future sales and Constitutionality but let’s not.
2. With this infrastructure in place, how long until it is used for other “searches” at the behest of Apple or a State entity?

Not saying this is wrong, it just needs a whole lot more thought put into warrantless surveilling.
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,456
Sorry for the laugh but you are talking A and many of us are talking B.
Two major concerns:
1. Since when has Apple become a LEO? We could even go down the whole rabbit hole about $$ recomp via marketing for future sales and Constitutionality but let’s not.
2. With this infrastructure in place, how long until it is used for other “searches” at the behest of Apple or a State entity?

Not saying this is wrong, it just needs a whole lot more thought put into warrantless surveilling.

sigh...ONCE AGAIN, no one (as in a person or agency of people) is "surveilling" your device. ALL that's happening is that CSAM on your phone is being marked as such, so if you upload a bunch of it to iCloud, you're going to get caught. If you don't download CSAM to your phone AND upload it to iCloud, then your life will not change between iOS 14 and iOS 15.

Also, Apple is not acting as law enforcement here. They are simply reporting CSAM, which is the only right thing to do. What the police do with that report is up to them, not Apple.
 
Last edited by a moderator:

Mega ST

macrumors 6502
Feb 11, 2021
370
512
Europe
Apple runs a program on my iPhone to inspect my pictures. If the Apple-owned software thinks it has found something, it informs Apple. So this is far from happening on my phone only. They use my hardware, software, and bandwidth for their marketing-driven morality, without my consent.

If it is so easy to opt out, the bad guys will just do that, and the whole honorable majority will have their phones inspected for no reason from now on?

I think this is legally on pretty thin ice.

What I think would be better is to send the parents a copy of every picture a child's iPhone sends to anybody, so they would know what is being sent. And maybe secretly tag these pictures to be able to recall them or destroy the files somehow if needed. The perfect solution would be if the parents could click on "follow, find and delete" afterwards.
 
Last edited:

usagora

macrumors 601
Nov 17, 2017
4,869
4,456
You’re confusing a passive activity with active participation…

No, I'm talking about actively monitoring surveillance cameras for crime, such as theft of merchandise or other property. Apple is doing the same thing except it's very private because they ONLY know about something if it's illegal - they don't view anything else. They have a right to know if illegal material is being uploaded to their servers and they have a right to report that to law enforcement. Not sure what's so hard to understand here.
 

dk001

macrumors demi-god
Oct 3, 2014
11,141
15,494
Sage, Lightning, and Mountains
sigh...ONCE AGAIN, no one (as in a person or agency of people) is "surveilling" your device. ALL that's happening is CSAM on your phone is being marked as such so if you upload a bunch of it to iCloud, you're going to get caught. If you don't download CSAM to your phone AND upload them to iCloud, then your life will not change between iOS14 and 15.

Also, Apple is not acting as law enforcement here. They are simply reporting CSAM, which is the only right thing to do. What the police do with that report is up to them, not Apple.

Reread my post.

I AM NOT in disagreement. I AM saying it needs a lot more discussion before something like this is implemented. Your description of HOW does not preclude misuse, nor does it answer the question of constitutional legality.
 
Last edited by a moderator:
  • Like
Reactions: The1andOnly

axantas

macrumors 65816
Jun 29, 2015
1,003
1,407
Home
My answer is simply YES. Everyone is getting scanned, because everyone COULD do something wrong with their photos. So every single user becomes a suspect and has to prove innocence by being scanned without a positive result.

Welcome to 1984 - Big Apple is watching you - be careful what you are doing, or you could get into big trouble.

I fully support the fight against these photos, but I am not willing to be checked continuously to see whether I am acting the right way or not. It is as if Apple were telling me, "we believe you, but you might be lying, so we will check."

For the first time I am thinking about leaving the platform. Readying my printer for paper boarding passes, looking for my plastic credit cards...

I am heavily disappointed and very concerned. No, Mr. Cook - simply no. I used to call you Tim, but I will revert to "Mr. Cook", as I do not want to be too familiar...
 

icanhazmac

Contributor
Apr 11, 2018
2,912
11,206
That is utter and complete BS. That's like saying if a business monitors their security cameras for illegal activity and reports it to the police, they are "acting as an agent of law enforcement." No, they're simply reporting a crime they've observed.

Wow, that is a stretch! There is a huge difference between a business monitoring video from surveillance cameras they own on property they own/rent/lease and what Apple is doing on a device I own.

Consider this: should the makers of indoor security cameras for consumer home use, like Logitech, Eufy, Ring, etc., be able to monitor those cameras for instances of domestic violence, illegal drug use, child abuse, etc., because the video is stored on their servers via a cloud service? That will be the next step.
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,456
If you don’t understand the difference between passive and active participation, not sure what to tell you other than you're entitled to your opinion. In keeping with your flawed store monitoring analogy, it’s more akin to all customers being searched as they leave the store and if something “suspicious“ is found the customer is turned over to the waiting detail officer. The store didn’t arrest anyone and if you didn’t want to be searched, you obviously didn’t have to enter the store.

Setting aside the bogus analogy, the issue at hand is the fact that the scanning has been pushed to the device level rather than remaining at the service level. As I originally said, Apple is going to have to explain that one better and simple mantra of “privacy” isn’t going to be sufficient.

LOL, no, my analogy is completely apt here. The store is reporting illegal activity in connection with THEIR store, not unrelated illegal activity they find by searching customers. Same with Apple. Someone is attempting to use THEIR servers to distribute illegal material.

So basically you're saying Apple needs to explain why they moved the scanning to the device level, but you won't accept the actual explanation :rolleyes: OF COURSE it's more private, since Apple can't see what's on your phone, but they can see everything that's in the cloud.
 
  • Like
Reactions: NBKindaGirl

usagora

macrumors 601
Nov 17, 2017
4,869
4,456
Wow, that is a stretch! There is a huge difference between a business monitoring video surveillance from cameras they own on property they own/rent/lease and what Apple is doing on a device I own.

Consider this, should the makers of indoor security cameras, like Logitech, Eufy, Ring, etc, for consumer home use be able to monitor those cameras for instances of domestic violence, illegal drug use, child abuse, etc. because the video is stored on their servers via a cloud service? That will be the next step.

Not a stretch at all, though obviously there's no perfect analogy since this is sort of a unique case involving a technology ecosystem. Again, Apple won't see an iota of scan data from your phone unless you attempt to upload a certain number of illegal images to their server. You're acting like they're actively viewing every photo on your phone. So really what Apple is doing is FAR more private than my store analogy!

The flaw with your comparison to Ring, etc. is that there's no way to detect such videos without a human having to review each and every video. Not only is that impractical, it's also a violation of privacy, which is NOT what Apple is doing, since they are only alerted to illegal images uploaded to their servers rather than having people peruse all of your photos looking for something.
 
  • Like
Reactions: Knowlege Bomb

Weisswurstsepp

macrumors member
Jul 25, 2020
55
63
Of course it's still true. Apple cannot see any of the scanning info on your phone. The only time they would ever see any scan results is if you have multiple CSAM images on your device and you attempt to upload those to iCloud. You never had any true privacy on iCloud to begin with, as Apple can access all your files there if they so desire. Read the iCloud legal document on Apple's website.

I understand the technology is complex, but some of you really lack basic reading comprehension skills. Either that, or you’re so eager to find evidence of conspiracy or corporate overreach, that you interpret everything through that distorted lens.

ironically, things are now going to be even more private than before, yet many of you are acting like this is a step backwards, because, again, you are interpreting things through a distorted lens.

Apple's CSAM detection mechanism is much more than a simple comparison of hash codes (in fact, "hash" isn't even quite the right word here, as these "hashes" are more like keys which encode specific properties of the image so it can be processed by ML). What it does is scan and correlate image properties to create a similar key, and the specific properties in this key are then weighted against keys from the CSAM database. If an (undisclosed) number of parameters is within an (undisclosed) threshold, then the system assumes the image content is similar (*not* identical) and the photo is flagged as a match.

If the number of matching images exceeds (another undisclosed) threshold, then the account gets flagged.
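(To make the two-threshold idea concrete, here is a much-simplified Python sketch. Everything in it is invented for illustration: the Hamming-distance comparison stands in for the weighted key comparison described above, and both threshold values are arbitrary. Apple's real pipeline uses NeuralHash plus cryptographic machinery such as private set intersection and threshold secret sharing, none of which appears here.)

```python
# Simplified two-threshold matching sketch; NOT Apple's actual code or parameters.
from typing import List

SIMILARITY_THRESHOLD = 8   # hypothetical: max differing bits for an image to count as a match
ACCOUNT_THRESHOLD = 30     # hypothetical: matches needed before the account is flagged

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit perceptual hashes."""
    return bin(a ^ b).count("1")

def is_match(image_hash: int, database: List[int]) -> bool:
    """First threshold: is this image 'close enough' to any known hash in the database?"""
    return any(hamming(image_hash, known) <= SIMILARITY_THRESHOLD for known in database)

def account_flagged(image_hashes: List[int], database: List[int]) -> bool:
    """Second threshold: flag the account only once enough images have matched."""
    matches = sum(1 for h in image_hashes if is_match(h, database))
    return matches >= ACCOUNT_THRESHOLD
```

Note that nothing in such a structure restricts the database to CSAM; the matching code would behave exactly the same against whatever list of keys it is handed, which is the point made further down about the system looking for anything it's told to look for.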

Now, since you seem to think this is all A-OK, let's address the issues here.

First of all, there is the reliability of the system. Apple states an error rate of less than 1 in a trillion scans. That sounds very low at first, but it really isn't. First, it's a per-scan figure, and users tend to have more than one photo. Second, it's a calculated figure under the assumption that there are no bugs or security holes. Third, and most importantly, it's a figure based on a statistical analysis, which means that in reality the system may mis-identify much more often (a lot more often, in fact), and because of how image analysis works, it's likely that this will happen with images which share certain properties (for example, a series of photos taken by the same user).
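(A quick back-of-the-envelope illustration of that first point, assuming, as above, that the 1-in-a-trillion figure really is a per-scan probability; the library size and user count below are rough guesses, not Apple's numbers.)

```python
# Hypothetical numbers, purely to show how a tiny per-scan rate compounds.
p = 1e-12                 # assumed per-scan false-match probability
photos_per_user = 20_000  # assumed size of one user's photo library
users = 1_000_000_000     # assumed number of iCloud Photos users

# Chance that at least one photo in a single user's library false-matches:
per_user = 1 - (1 - p) ** photos_per_user
print(f"per-user chance of at least one false match: {per_user:.1e}")  # ~2.0e-08

# Expected number of false matches across the whole user base:
print(f"expected false matches overall: {p * photos_per_user * users:.0f}")  # ~20
```

Even under these optimistic assumptions you'd expect around twenty false matches worldwide, and the picture gets much worse if false matches are correlated across similar-looking photos, which is exactly the problem with a series of photos taken by the same user.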

Then of course there is the fact that pretty much all of Apple's software has bugs (lots of them), and it would be naive to expect this functionality to be the single exception. This is in addition to the fact that image processing and face recognition have a long and solid track record of failing to identify people and objects in photos reliably (and the track record of correctly identifying CSAM with ML algorithms is even worse).

Of course, there's still the second step, which is a manual check by a human Apple employee. However, this doesn't really solve the issue. For one, according to Apple, the human check is performed based on the image keys, i.e. Apple's support staff won't see the photos. It is therefore questionable whether an assessment by Apple's staff will come to a different result than the ML algorithm that made the classification. And even if that weren't the case, it's pretty naive to trust a company that already ruined the life of a teenager after "screening evidence":

https://www.theregister.com/2021/05/29/apple_sis_lawsuit/

If you really want to trust the same people to "do the right thing" when you're flagged as a serial pedophile, then be my guest.

Now, you might still wonder what all the commotion is about, since everyone else does the same, right? And it's correct that Google, Microsoft, and most other cloud providers perform similar scans on images that people upload to their cloud space. In fact, even Apple has done the same with iCloud. And they all have teams which look at images that get flagged by the system, and that's OK, since it's what you agreed to in the T&Cs.

What's new is that Apple now moves the detection process from the cloud to the user's device, which comes with its own problems.

For one, once the system is in place there is little to prevent it from scanning all photos (not just the ones that are uploaded to iCloud), or other content. The algorithm doesn't care; it will scan anything it's pointed at and look for anything it's told to look for. Even if you trust Apple to "do the right thing" (which is extremely naive, considering that Apple is happy to rat out its Chinese users to the regime in China as long as the profits stay high), there's still the risk that the mechanism will be abused by third parties. Which isn't really that far-fetched if you have been following the news.

In addition, because the manual check is now performed without looking at the images in question, it's bound to fail at a rate similar to the automatic flagging. Which is bad, as anyone who has ever been falsely accused of being a pedophile can tell you. It's one of the very few things where rumors alone can ruin a person's life.

Lastly, there's the question as to "why". Apple claims that this protects user privacy, but that is completely false. First of all, it's a mechanism that accuses the account owner of storing child porn images, an accusation that requires further investigation, and that investigation cannot happen without the identity of the account owner and the evidence (the images in question) being known. With the new system, even in the second step of manual checking, Apple can identify the account owner. So there goes your privacy. Of course, you could argue that if you're accused of a crime (even more so one as horrible as child abuse), then privacy is less important than investigating the crime, and that's OK (I agree with this). However, that is equally true of server-side scanning as performed by Google and others, and until now by Apple.

This, however, raises the question of why we even need on-device scanning if the only intention is to check photos that are going to be uploaded to iCloud for CSAM. The same result can be achieved with the existing server-side scanning. And people are fine with server-side scanning, since they know that the information they upload is shared with the cloud provider, which maintains a separation between data on the device and data in the cloud.

And since the very same can already be achieved with existing server-side scanning, the logical conclusion must be that the reasons for introducing on-device scanning go beyond the search for evidence of CSAM. And people are rightfully angry that a phone manufacturer like Apple, which touts its credentials as protector of its users' privacy at every opportunity, is installing a facility which allows the remote search of its users' devices, and does so with no control by the users themselves.

Apple's actions leave a very bad taste and raise a lot of questions. You really have to be willfully ignoring what has been going on in recent years to think that this is all just a big nothing-burger.
 
Last edited by a moderator: