Do you feel "What happens on your iPhone stays on your iPhone" is false advertising?

  • Yes: 157 votes (68.0%)
  • No: 74 votes (32.0%)
  • Total voters: 231

usagora

macrumors 601
Nov 17, 2017
4,869
4,456
Apple has always had access to the photos in your iCloud Photo Library, and it will continue to have that access after this change. Apple hasn't been completely transparent about the details, but most seem to agree they've already been performing CSAM checks on iCloud Photo Library uploads since at least 2019.

Yes, so this is a move to make the scan more privacy-minded, which is a good thing.
 

icanhazmac

Contributor
Apr 11, 2018
2,895
11,155
The flaw with your comparison to Ring, etc. is that there's no way to detect such videos without a human having to review each and every video. Not only is that impractical, but it's also a violation of privacy, which is something Apple's approach is NOT, since they are only alerted to illegal images uploaded to their servers rather than having people peruse all of your photos looking for something.

Wrong. Any of these manufacturers could make use of "AI" and have device-level monitoring of the audio from the cameras listen for certain sounds or phrases, sending a notification only if it hears X number of trigger sounds/phrases. It's 100% the same thing, as all of the video and audio is being sent to their individual cloud servers.
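
To make the hypothetical concrete, here is a minimal sketch in Swift of the trigger-counting scheme being described. Everything in it (the phrases, the threshold, the notification call) is invented for illustration; no vendor is known to ship this.

```swift
import Foundation

// Hypothetical on-device monitor: count recognized trigger phrases
// and notify the vendor only once a threshold is crossed. All names
// and values here are invented for illustration.
final class TriggerMonitor {
    private let triggerPhrases: Set<String> = ["glass breaking", "help me"]
    private let threshold = 3
    private var hits = 0

    // Fed by a hypothetical on-device speech/sound recognizer.
    func process(recognized: String) {
        guard triggerPhrases.contains(recognized.lowercased()) else { return }
        hits += 1
        if hits >= threshold {
            notifyVendor(count: hits)
            hits = 0
        }
    }

    private func notifyVendor(count: Int) {
        // Stand-in for an alert sent to the vendor's servers.
        print("Heard \(count) trigger sounds; alerting vendor.")
    }
}
```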
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,456
Wrong. Any of these manufacturers could make use of the same "AI" and have device-level monitoring of the audio from the cameras listen for certain sounds or phrases, sending a notification only if it hears X number of trigger sounds/phrases. It's 100% the same thing, as all of the video and audio is being sent to their individual cloud servers.

I'm sorry, but that's not the same thing at all as comparing file hash info. There are SO many different contexts in which things can be said, or it could be someone quoting what someone else said. No way in hell is that going to be a viable option, and it would be nowhere near as effective and accurate as what Apple is doing here.
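
For context on what "comparing file hash info" means here: an exact hash check is a context-free equality test, as in the sketch below. Apple's system reportedly uses a perceptual "NeuralHash" rather than a cryptographic hash, but the point about determinism versus interpreting speech still applies. The blocklist entries are invented placeholders.

```swift
import CryptoKit
import Foundation

// Exact hash comparison is a context-free equality test. The digests
// in this blocklist are invented placeholders, not real entries.
let blockedDigests: Set<String> = ["placeholder-digest-1", "placeholder-digest-2"]

func isBlocked(_ fileData: Data) -> Bool {
    let digest = SHA256.hash(data: fileData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return blockedDigests.contains(hex)  // matches exactly or not at all
}
```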
 
  • Like
Reactions: huanbrother

usagora

macrumors 601
Nov 17, 2017
4,869
4,456
Moving the CSAM check to the device level doesn't do anything meaningful with regard to privacy, since Apple already has access to the raw photo as part of the user's iCloud Photo Library. It's just a guess on my part, but based on the way they describe the voucher mechanism implementation and the fact that they've enabled it at the device level, they are most likely going to try to expand this to the iMessage service at some point in the future. Since iMessages are end-to-end encrypted, the check has to be performed on the individual devices.

So you'd rather Apple be monitoring everything on the cloud, where they have access to all scan data, than on your phone, where they don't see the scan data?
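
The voucher mechanism referenced in the quote above is what makes this question meaningful. Here is a toy model of the published flow, in Swift; the real design uses private set intersection and threshold secret sharing, none of which is implemented here, and the threshold value is illustrative.

```swift
import Foundation

// Toy model of the voucher flow: the device attaches an encrypted
// "safety voucher" to each upload, and the server can only act once
// matches cross a threshold. The cryptography is faked so that only
// the control flow is visible.
struct SafetyVoucher {
    let payload: Data     // opaque to the server below the threshold
    let isMatch: Bool     // in the real design the server cannot read this directly
}

func makeVoucher(photo: Data, blockedHashes: Set<Int>) -> SafetyVoucher {
    let hash = photo.hashValue        // stand-in for a perceptual hash
    return SafetyVoucher(payload: photo, isMatch: blockedHashes.contains(hash))
}

// Server side: nothing is actionable until the match count crosses
// the threshold (the value here is illustrative, not Apple's).
func serverReview(_ vouchers: [SafetyVoucher], threshold: Int = 30) {
    let matches = vouchers.filter { $0.isMatch }.count
    print(matches >= threshold
        ? "Threshold crossed: \(matches) matches escalated to human review."
        : "Below threshold: the server learns nothing about individual photos.")
}
```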
 
Last edited by a moderator:

icanhazmac

Contributor
Apr 11, 2018
2,895
11,155
I'm sorry, but that's not the same thing at all as comparing file hash info.

Yeah, it actually is.

Let's try this on for size... many manufacturers are developing facial recognition AI. What is to stop all of the previously mentioned camera companies from using a device-level search for hashes of known criminal faces? Then let's not just stop at criminals; how about political opponents, etc.? All 100% the same thing.
 

AltecX

macrumors 6502a
Oct 28, 2016
550
1,391
Philly
That is utter and complete BS. That's like saying if a business monitors their security cameras for illegal activity and reports it to the police, they are "acting as an agent of law enforcement." No, they're simply reporting a crime they've observed.
Then let them monitor it when it hits their cloud. Until then it's on MY device, and that is akin to surveillance in MY house. They can monitor it once it has left MY house (phone) and is in their house (servers). NOT before.
 

Apple_Robert

Contributor
Sep 21, 2012
35,600
52,360
In a van down by the river
You're just babbling at this point... Moving the CSAM check to the device level doesn't do anything meaningful with regard to privacy, since Apple already has access to the raw photo as part of the user's iCloud Photo Library. It's just a guess on my part, but based on the way they describe the voucher mechanism implementation and the fact that they've enabled it at the device level, they are most likely going to try to expand this to the iMessage service at some point in the future. Since iMessages are end-to-end encrypted, the check has to be performed on the individual devices.
I can see Apple expanding the scanning in iMessage to include text phrases that a predator seeking to lure a child may use.

I agree with you that Apple didn't have to do on-device scanning unless they are seeking to expand it to their other communication apps on iPhone, iPad, and Mac (and potentially to require it of third parties). The reason for on-device is Apple control.
 

forrie

macrumors regular
Mar 6, 2008
169
144
I wonder about potential legal challenges to this, as it applies to personal devices and to general privacy rights (i.e., non-cloud). One of the most personal items we have today is our cell phone. Even with aggregated data analysis (i.e., tagging or "hashes" of photos), there is still access to that same data, in this case without consent. That latter part is significant.

What I have found over the years is that there appears to be an uptick in using child porn or human trafficking as the basis for invasive actions that I surmise have a much different agenda and provide for broader abuses of privacy.

Getting the public riled and angry about abuse is an easy way to gain psychological pliancy. This tactic is classic "perception management" used in information warfare, etc.

I just find this move by Apple to be very perplexing and disappointing.

The EFF has chimed in with their take on this:

Apple's Plan to "Think Different" About Encryption Opens a Backdoor to Your Private Life

If there is a larger legal challenge to this, the EFF will most certainly need to get involved.

That Apple continues to hold and control the encryption keys to all iCloud content is yet another obvious measure (or failure) on their part, likely part of an intentional design strategy, and it is not optional.
 

AstonSmith

macrumors regular
Nov 2, 2016
104
88
UK
The whole thing feels a bit like this to me: it's like a police officer coming into your house while you're at work, rummaging around your drawers to check for contraband, and leaving before you get home. You might not ever know it happened because you have no contraband, but that doesn't make it right.

As for the thing itself, I get some areas, don't understand others and still have questions.

  • How big is the database of hashes?
  • How often will the phone (or other device) download this file?
  • When will the phone scan the photo library for matches?
  • How much battery power, RAM, and CPU will be collectively wasted on this process? (Which will also indirectly increase CO2 emissions.)
  • What exactly are those "safety vouchers" in lay terms?
  • Will someone be able to extract the program that scans the pictures and independently test it?
 

AltecX

macrumors 6502a
Oct 28, 2016
550
1,391
Philly
I wonder about potential legal challenges to this, as it applies to personal devices and to general privacy rights (i.e., non-cloud). One of the most personal items we have today is our cell phone. Even with aggregated data analysis (i.e., tagging or "hashes" of photos), there is still access to that same data, in this case without consent. That latter part is significant.

What I have found over the years is that there appears to be an uptick in using child porn or human trafficking as the basis for invasive actions that I surmise have a much different agenda and provide for broader abuses of privacy.

Getting the public riled and angry about abuse is an easy way to gain psychological pliancy. This tactic is classic "perception management" used in information warfare, etc.

I just find this move by Apple to be very perplexing and disappointing.

The EFF has chimed in with their take on this:

Apple's Plan to "Think Different" About Encryption Opens a Backdoor to Your Private Life

If there is a larger legal challenge to this, the EFF will most certainly need to get involved.

That Apple continues to hold and control the encryption keys to all iCloud content is yet another obvious measure (or failure) on their part, likely part of an intentional design strategy, and it is not optional.
I suspect that the TOS for iCloud will be changed to include consent, so you can either cancel your iCloud or accept the new TOS.
 

forrie

macrumors regular
Mar 6, 2008
169
144
I'll guess that this has been in the planning stages for a long time, as it involves relationships with governmental agencies, including the logistics of data-sharing resources, etc. We can't rule out that Apple's response may have been partially shaped by pressure from those agencies; as per my earlier post, these actions often open doors for broader abuses of information (i.e., an undisclosed agenda). Wouldn't surprise me.

These details will likely never be willingly disclosed. But good questions. I do wonder.
 

forrie

macrumors regular
Mar 6, 2008
169
144
I suspect that the TOS for iCloud will be changed to include consent, so you can either cancel your iCloud or accept the new TOS.

That's something else that really irritates me. You sign up for a service under a ToS and AUP, but the vendor can change those terms of service whenever they want. So you become dependent on some architecture, they change the ToS, and you have no option but to comply or start all over again (if you can at all).

Your only recourse is to stop using the service, which doesn't always work with a large ecosystem such as Apple.

This won't be the end of it, but with a larger entity like Apple getting this type of attention, hopefully some limits will be enforced upon them.
 

Abazigal

Contributor
Jul 18, 2011
20,388
23,876
Singapore
Nope, the announcement this week was that the scanning will happen on your device. But, yes, it only happens if you have iCloud Photos turned on. That's simply a policy decision, though, not some sort of technical limitation; it could change at any time to run whether you're using iCloud or not.
Either way, you can't judge a person by what they may do in the future, only what they have already done.

So far, Apple's stance is that they will only scan photos destined to be uploaded to cloud storage, no different from what Microsoft and Google are already doing, so it seems fair to grade them along these lines. In this regard, Apple is actually behind the curve and playing catch-up to these companies.

We can argue until the cows come home about a thousand different hypothetical scenarios that could happen. While said technology could in theory be used for other purposes (say, scanning all photos regardless of whether iCloud is turned on or off), the problem with a slippery slope argument like this is that while I cannot prove it won't happen, you can't prove it will necessarily happen either (although the ability certainly is there).

In the end, we are just arguing past each other, and nobody is listening.
 
  • Like
Reactions: BigMcGuire

dk001

macrumors demi-god
Oct 3, 2014
11,128
15,478
Sage, Lightning, and Mountains
Either way, you can't judge a person by what they may do in the future, only what they have already done.

So far, Apple's stance is that they will only scan photos destined to be uploaded to cloud storage, no different from what Microsoft and Google are already doing, so it seems fair to grade them along these lines. In this regard, Apple is actually behind the curve and playing catch-up to these companies.

We can argue until the cows come home about a thousand different hypothetical scenarios that could happen. While said technology could in theory be used for other purposes (say, scanning all photos regardless of whether iCloud is turned on or off), the problem with a slippery slope argument like this is that while I cannot prove it won't happen, you can't prove it will necessarily happen either (although the ability certainly is there).

In the end, we are just arguing past each other, and nobody is listening.

Can you share where Google is doing on-device scanning?
I do know they provide tools to help combat this (https://protectingchildren.google/intl/en/), and I know they have been scanning shared images (Cloud, Gmail, etc.) for a number of years, but I have not heard of on-device scanning.
 

Abazigal

Contributor
Jul 18, 2011
20,388
23,876
Singapore
Can you share where Google is doing on-device scanning?
I do know they provide tools to help combat this (https://protectingchildren.google/intl/en/), and I know they have been scanning shared images (Cloud, Gmail, etc.) for a number of years, but I have not heard of on-device scanning.

Google scans photos you upload to Google Photos.

Apple scans photos you upload to iCloud right before they are uploaded (so the scanning occurs on-device), but only photos that were going to be uploaded anyway.

It’s the same outcome at the end of the day, even if the implementation differs.
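
A sketch of that "same outcome, different implementation" point, with matchesKnownImage as a hypothetical stand-in for the real matching routine:

```swift
import Foundation

// Hypothetical stand-in for the real matching routine.
func matchesKnownImage(_ photo: Data) -> Bool { false }

// Server-side model (as described for Google Photos): scan on arrival.
func serverReceived(_ photo: Data) {
    if matchesKnownImage(photo) {
        // ...flag for review
    }
    // ...store the photo
}

// Client-side model (as Apple describes it): scan only photos already
// queued for iCloud upload, immediately before they are sent.
func uploadQueuedPhoto(_ photo: Data, iCloudPhotosEnabled: Bool) {
    guard iCloudPhotosEnabled else { return }  // nothing runs if uploads are off
    let match = matchesKnownImage(photo)       // result travels as an encrypted voucher
    // ...send the photo plus a voucher derived from `match`
    _ = match
}
```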

I think the more vital question is: why is Apple choosing to go about this in such a controversial manner, when other companies have been doing it for years without people batting an eyelid? What's wrong with simply scanning iCloud content and calling it a day?

Even the revelation that Facebook had an army of employees manually vetting controversial images, to the point where many of them suffered mental breakdowns, largely went under the radar because, again, it's a given that whatever you put on the internet is no longer private, and, well, it's a problem that affected them, not us.

One possibility is that this is the first in a number of steps towards offering encrypted iCloud storage, so they can at least tell law enforcement that they are fairly confident there is no child pornography in their users' iCloud storage accounts (because there would otherwise be no real way of telling).

Granted, the most disconcerting part is the lack of control on the user's part and how this largely comes down to "trust us not to abuse this in the future". But there are still a few months before this becomes reality, so expect more information and context to surface, and keep an open mind.
 

Shirasaki

macrumors P6
May 16, 2015
16,250
11,745
Of course it's still true. Apple cannot see any of the scanning info on your phone. The only time they would ever see any scan results is if you have multiple CSAM images on your device and you attempt to upload them to iCloud. You never had any true privacy on iCloud to begin with, as Apple can access all your files there if they so desire. Read the iCloud legal document on Apple's website.

Ironically, things are now going to be even more private than before, yet many of you are acting like this is a step backwards because, again, you are interpreting things through a distorted lens.
Please come forward and be the pioneer: dismantle any form of encryption, outlaw encryption technology outside of the military and secret services, and encourage full, unconditional disclosure of any information held by any citizen.
 
  • Haha
Reactions: dk001

tomtattoo

macrumors 6502a
Mar 8, 2013
546
891
Berlin
Leaving aside the actual reason (child abuse), this whole thing is only the end of a long road we have all gone down together. Since the first allowed cookie at the end of the '90s, the first syncing of bookmarks (do it, it is so comfy!), the first upload to a cloud system, we have been going downhill. We are already in a hell of cookies and trackers, and we are bound into systems we cannot leave easily, so this step by Apple doesn't surprise me at all.
It will get worse and worse with no point of return, and the worst sci-fi movie is harmless compared to where we will all end up.
Yes, I'm pessimistic, but over the last 20 years I have so often been right with my "dark visions".
Just my 2 cents.
 

Polaroid

macrumors 65816
Oct 1, 2013
1,439
1,575
Why are people so worried? It's been done for years on other platforms. If you don't have child porn on your phone, then why are you worried? Yes, it could flag innocent photos, but surely if they were to investigate, they'd see they are innocent photos or family photos. It's the context. People need to chill.
 
  • Haha
Reactions: ratspg

matrix07

macrumors G3
Jun 24, 2010
8,226
4,894
Let's see the real size of that "screeching minority", and let's see what percentage believes that "1 in 3 trillion false positives" and "slightly edited images are also detected" can both be true at the same time.
I felt it was true, until this "scanning library" surfaced.
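
The tension named in the quote is a real engineering trade-off: catching "slightly edited" images requires comparing perceptual hashes with some tolerance, and false positives grow as that tolerance grows. A rough sketch follows; NeuralHash details are not public, so the 64-bit size and Hamming-distance comparison are assumptions for illustration.

```swift
import Foundation

// Perceptual hashes are compared by similarity, not equality.
// Hamming distance between two hypothetical 64-bit hashes:
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}

// tolerance = 0 behaves like an exact match: near-zero false positives,
// but a small edit changes the hash and evades detection. Raising the
// tolerance catches "slightly edited" images but admits more
// coincidental matches, so both headline claims hinge on where this
// line is drawn (plus the per-account match threshold).
func isNearMatch(_ photoHash: UInt64, _ knownHash: UInt64, tolerance: Int) -> Bool {
    hammingDistance(photoHash, knownHash) <= tolerance
}
```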
 