
Do you feel "What happens on your iPhone stays on your iPhone" is false advertising?

  • Yes: 157 votes (68.0%)
  • No: 74 votes (32.0%)
  • Total voters: 231

Dkka1

Cancelled
Feb 28, 2019
190
309
Privacy on iOS goes out the window, at the latest, as soon as you install any third-party app from the App Store. Apple does not prohibit the use of many third-party SDKs. For instance, you can find out a lot about users with tools such as "CleverTap", and then there are the tracking specialists like Instagram or TikTok. Seriously, just search for some dog-related things on the web or on YouTube, and a few hours later you will get a lot of TikTok videos about funny dogs or ads for dog toys on Instagram. Our couch in the living room is also falling apart, and I have been talking about it with my BF on WhatsApp; now all I am getting is ads for sofas on Instagram. It is creepy.
Sofa and washing machine vendors tracking your interests is not creepy. It's how the internet works + how Apple sells the boogeyman.

Apple deploying mass surveillance code that does them no monetary favors during China's tech crackdown: that's creepy.
 
  • Like
Reactions: dk001 and Samdh90

marzfreerider

macrumors 6502
Jun 13, 2014
367
254
Germany
As with any company, it's just a marketing gimmick: give people the feeling you really care about their privacy and all that. Apple doesn't care one way or the other as long as they make money. I have been using lots of different Apple products since 2007 and really enjoy them, but in no way do I believe everything they say.
 
  • Like
Reactions: jerrytouille

Jemani

macrumors regular
Feb 15, 2012
129
61
Sigh... ONCE AGAIN, no one (as in a person or agency of people) is "surveilling" your device. ALL that's happening is that CSAM on your phone is being marked as such, so if you upload a bunch of it to iCloud, you're going to get caught. If you don't both download CSAM to your phone AND upload it to iCloud, then your life will not change between iOS 14 and 15.

Also, Apple is not acting as law enforcement here. They are simply reporting CSAM, which is the only right thing to do. What the police do with that report is up to them, not Apple.
Why are you assuming that I have CSAM on my phone? I find that rude and insulting. Everyone should be innocent until proven guilty.
 
  • Like
Reactions: zakarhino

Jemani

macrumors regular
Feb 15, 2012
129
61
To be fair, this is being done “on your phone”
This is the problem. I don't want Apple using my battery to do their surveillance. It is my battery life they are using for this. If I am at 10% battery and want to upload wedding photos to iCloud Photos so they are stored, this kicks in and drains it further. Thanks, Apple.
 

zakarhino

Contributor
Sep 13, 2014
2,610
6,961
It's always been false advertising, as many privacy and security advocates have been saying for years, but only now are we starting to see other people wake up to the reality of the situation.

The reality is that this new form of scanning is unlike anything that has been implemented before. MacRumors authors and other media outlets that fail to understand the technology claim "this is nothing new, it's been done on other platforms for years!", but that's only true if you ignore the most important detail of the tech, which is that it runs on device. Because Apple and others have duped the public into believing that anything labeled "on device" is automatically indicative of privacy, they're allowing people to think this is somehow a privacy-respecting solution when in fact it's the opposite.

Prior to this technology, "what happens on your iPhone stays on your iPhone" was mostly true if you turned off iCloud services, but that's no longer the case given this tech's ability to operate and report back to Apple/authorities even with iCloud services turned off. Underlining "ability" is critical here, because Apple claiming to scan only photos that are about to be uploaded to iCloud is irrelevant if switching a single flag or adding a few lines of code is all it takes to 1) scan all photos, regardless of whether they're being uploaded to iCloud or came from an end-to-end encrypted chat service like Signal, and 2) scan for things beyond CSAM, which is already standard practice in some authoritarian countries.
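To make that concrete, here is a minimal sketch in Swift of how narrow the gap is between "scan only iCloud-bound photos" and "scan everything". All the names (Photo, ScanPolicy, photosToScan) are invented for illustration and have nothing to do with Apple's actual code; the point is only that the scanner stays identical and just the policy check changes.

```swift
import Foundation

// Hypothetical sketch only: Photo, ScanPolicy, and photosToScan are invented names
// for illustration and do not reflect Apple's real implementation.
struct Photo {
    let id: Int
    let pendingICloudUpload: Bool
}

enum ScanPolicy {
    case onlyICloudUploads   // the behaviour Apple describes today
    case everything          // what a small policy change could enable
}

// The hash-matching scanner itself would never need to change; only the
// selection of photos handed to it does.
func photosToScan(in library: [Photo], policy: ScanPolicy) -> [Photo] {
    switch policy {
    case .onlyICloudUploads:
        return library.filter { $0.pendingICloudUpload }
    case .everything:
        return library
    }
}

let library = [Photo(id: 1, pendingICloudUpload: true),
               Photo(id: 2, pendingICloudUpload: false)]
print(photosToScan(in: library, policy: .onlyICloudUploads).count) // 1
print(photosToScan(in: library, policy: .everything).count)        // 2
```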

This entire situation underlines why it's so important for the public to have a baseline understanding of why privacy matters, how privacy is implemented, how encryption works, etc. It's a downright tragedy that Apple customers STILL think Apple's claim of "end-to-end encrypted" iCloud services means Apple can't access your stuff.

Some have claimed this is the first step in a larger move to finally, truly encrypt iCloud backups and iCloud Photos, and therefore a worthy tradeoff. That's horse s***. There is absolutely nothing to stop Apple from implementing full zero-access encryption for all iCloud services WITHOUT client-side content scanning. There's no legal requirement at all in the USA: they are only required to report CSAM if they come across it on their servers; they're not even required to scan for it in the first place. If all iCloud Photos were truly encrypted, with no way for Apple to possibly access them, that would legally clear them of any responsibility; all they would ever be able to see are blobs of encrypted data. Implementing end-to-end encryption but scanning content before/after it gets sent is the equivalent of an authoritarian saying "You can have a private, encrypted conversation, but only if I get to see the contents" (which is what the EU is actively considering in parliament and the USA is advocating for), or a thief declaring "You can have a lock on your door, but only if I have a copy of the key to open it."
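The "lock with a copy of the key" analogy maps directly onto code: if a client-side scan runs on the plaintext before encryption, the end-to-end guarantee only ever covers what the scanner chose not to report. A rough, purely illustrative sketch (invented names, placeholder crypto, not Apple's published design):

```swift
import Foundation

// Placeholder stand-ins; not real cryptography or real matching logic.
func matchesOpaqueHashList(_ data: Data) -> Bool { false }
func encrypt(_ data: Data) -> Data { data }

// The scan sees the plaintext before any encryption happens, so whatever it
// reports sits outside the "end-to-end encrypted" guarantee by construction.
func scanThenEncrypt(_ plaintext: Data, report: (Data) -> Void) -> Data {
    if matchesOpaqueHashList(plaintext) {
        report(plaintext)
    }
    return encrypt(plaintext) // the server still only ever stores ciphertext
}

let ciphertext = scanThenEncrypt(Data("photo-bytes".utf8)) { _ in
    print("match reported before encryption ever ran")
}
print(ciphertext.count)
```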
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,456
Why are you assuming that I have CSAM on my phone? I find that rude and insulting. Everyone should be innocent until proven guilty.

Obviously the "you" in my post was the generic usage of the pronoun meaning anybody, not you personally.
 

Chucks4me

macrumors member
May 12, 2008
60
111
Section 230 needs to go or be rewritten.

47 U.S. Code § 230 - Protection for private blocking and screening of offensive material

The limited liability needs to go, and tech companies need to be fully liable for misuse of their services.
For some reason, it makes me think the picture screening is just the beginning of something much bigger.
It is the easiest thing to implement, as some people will support it.
 
  • Like
Reactions: dk001

MrTSolar

macrumors 6502
Jun 8, 2017
369
444
It's always been false advertising, as many privacy and security advocates have been saying for years, but only now are we starting to see other people wake up to the reality of the situation.

The reality is that this new form of scanning is unlike anything that has been implemented before. MacRumors authors and other media outlets that fail to understand the technology claim "this is nothing new, it's been done on other platforms for years!", but that's only true if you ignore the most important detail of the tech, which is that it runs on device. Because Apple and others have duped the public into believing that anything labeled "on device" is automatically indicative of privacy, they're allowing people to think this is somehow a privacy-respecting solution when in fact it's the opposite.

Prior to this technology, "what happens on your iPhone stays on your iPhone" was mostly true if you turned off iCloud services, but that's no longer the case given this tech's ability to operate and report back to Apple/authorities even with iCloud services turned off. Underlining "ability" is critical here, because Apple claiming to scan only photos that are about to be uploaded to iCloud is irrelevant if switching a single flag or adding a few lines of code is all it takes to 1) scan all photos, regardless of whether they're being uploaded to iCloud or came from an end-to-end encrypted chat service like Signal, and 2) scan for things beyond CSAM, which is already standard practice in some authoritarian countries.

This entire situation underlines why it's so important for the public to have a baseline understanding of why privacy matters, how privacy is implemented, how encryption works, etc. It's a downright tragedy that Apple customers STILL think Apple's claim of "end-to-end encrypted" iCloud services means Apple can't access your stuff.

Some have claimed this is the first step in a larger move to finally, truly encrypt iCloud backups and iCloud Photos, and therefore a worthy tradeoff. That's horse s***. There is absolutely nothing to stop Apple from implementing full zero-access encryption for all iCloud services WITHOUT client-side content scanning. There's no legal requirement at all in the USA: they are only required to report CSAM if they come across it on their servers; they're not even required to scan for it in the first place. If all iCloud Photos were truly encrypted, with no way for Apple to possibly access them, that would legally clear them of any responsibility; all they would ever be able to see are blobs of encrypted data. Implementing end-to-end encryption but scanning content before/after it gets sent is the equivalent of an authoritarian saying "You can have a private, encrypted conversation, but only if I get to see the contents" (which is what the EU is actively considering in parliament and the USA is advocating for), or a thief declaring "You can have a lock on your door, but only if I have a copy of the key to open it."
Agreed. What use is E2E-encrypted iCloud if some secret blacklist of hashes can search for whatever it wants before the encryption kicks in? By the way, I think folks like usagora are missing the fact that a third party is providing Apple with the blacklist of hashes. Apple cannot legally possess known CSAM material to generate the hashes from; they have to be provided with hashes by a relevant authority. That's where the abuse potential lies.

I also share the fear that a future update of iOS will make this work without iCloud. The premise of it all just deeply concerns me as a US citizen.
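A rough way to see that concern in code (this sketch uses SHA-256 from CryptoKit purely for illustration; Apple's system uses a perceptual hash called NeuralHash, which is designed so near-duplicates still match, but the structural point is the same): the device only ever receives bare digests from an outside authority, and nothing in them tells the matching code, or Apple, or the user, what images they were actually derived from.

```swift
import Foundation
import CryptoKit

// Illustration only: real CSAM matching uses a perceptual hash, not SHA-256.
func digest(of data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// The "blacklist" exactly as the device would receive it: opaque hex strings
// supplied by a third party. The client has no way to audit what they represent.
let suppliedHashList: Set<String> = [
    digest(of: Data("example-image-bytes".utf8)) // stand-in entry
]

func isFlagged(_ imageData: Data) -> Bool {
    suppliedHashList.contains(digest(of: imageData))
}

print(isFlagged(Data("example-image-bytes".utf8))) // true
print(isFlagged(Data("my-holiday-photo".utf8)))    // false
```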
 

MrTSolar

macrumors 6502
Jun 8, 2017
369
444
Sigh... ONCE AGAIN, no one (as in a person or agency of people) is "surveilling" your device. ALL that's happening is that CSAM on your phone is being marked as such, so if you upload a bunch of it to iCloud, you're going to get caught. If you don't both download CSAM to your phone AND upload it to iCloud, then your life will not change between iOS 14 and 15.

Also, Apple is not acting as law enforcement here. They are simply reporting CSAM, which is the only right thing to do. What the police do with that report is up to them, not Apple.
Apple's cause is good, but HOW they are doing it is all wrong. Having software do the work of constantly looking for illegal content doesn't justify it in the least. The potential for abuse by state actors is ripe, and even Apple has no way to verify that the hashes being searched for are, in fact, CSAM.

How would you feel if your phone auto-reported you every time you broke the speed limit while driving? How about when your phone reports a false positive of running a red light? That would provide more immediate benefit, since it would make driving safer, but who in their right mind would actively advocate for it? It's a ridiculous method of fighting those (dramatically more widespread) cases of breaking the law. A more effective and more private way to cut down on inattentive driving would be to have the phone disable itself when moving faster than 10 MPH: no reporting to any outside server or authority needed, emergency dialing only while moving. When one is piloting 3,000+ lbs. of steel at high speeds, one should be focused on the task of driving. Distracted driving from cell phone usage injures roughly 390,000 people each year in the US and causes about 25% of all auto collisions; near misses are astronomically more common than that.

It's why I don't have a connected car. I don't need my car's manufacturer selling my driving data to my insurance company and raising my rates because I accelerate and brake hard, usually to avoid collisions with other inattentive drivers who run red lights or brake at the last second. I get penalized because an algorithm thinks I'm a reckless driver, when in reality I'm just doing what I can to PREVENT collisions. Why would I want a device that's constantly looking for something to get me in trouble?
 

steve62388

macrumors 68040
Apr 23, 2013
3,100
1,962
One possibility is that this is the first in a number of steps towards offering encrypted iCloud storage, and they can at least tell law enforcement that they are fairly confident that there is no child pornography on their users’ iCloud storage accounts (because there would otherwise be no real way of telling).
Encrypting iCloud accounts would be handy. It has long bothered me that they are not encrypted, but I have had to weigh that against the sheer convenience of what iCloud offers, and so I have opted in.
 

axantas

macrumors 6502a
Jun 29, 2015
996
1,404
Home
Why are you assuming that I have CSAM on my phone? I find that rude and insulting. Everyone should be innocent until proven guilty.
Even if the "you" in the original post is generic in this case and not pointing at you personally: This is exactly, where it starts.

You are against it? You do not support it? So you have something to hide, and we need to check whether you are lying or not. We do not believe you. You need to prove your innocence. A dangerous development.
 
  • Like
Reactions: Jemani

Mega ST

macrumors 6502
Feb 11, 2021
368
510
Europe
Why does Apple do all this? Are they pressured by the government to do it? Do they think they are making the world better? Have they lost in court somehow? Is this about their China business?
I don't understand why they are ruining their reputation for (with all respect for the fight against child abuse) nothing.

I hope they abandon this. No search databases running on my phone, please.
 
Last edited:

lah

macrumors 6502
Mar 22, 2010
384
290
Agreed. What use is E2E-encrypted iCloud if some secret blacklist of hashes can search for whatever it wants before the encryption kicks in? By the way, I think folks like usagora are missing the fact that a third party is providing Apple with the blacklist of hashes. Apple cannot legally possess known CSAM material to generate the hashes from; they have to be provided with hashes by a relevant authority. That's where the abuse potential lies.

I also share the fear that a future update of iOS will make this work without iCloud. The premise of it all just deeply concerns me as a US citizen.

This is where we have to ask ourselves how much we trust Apple to properly implement their own program. Since Apple handles all the matches before anything is sent to said organizations, in the extreme scenario where the hash library is nefariously updated to catch other types of images, let's hope that trips some circuit breaker flagging whatever is causing the uptick in flagged images. I expect/hope that Apple would immediately stop the program while it gets sorted out. But again, this all comes down to how much we trust Apple. Let's hope that once it has started, they will provide quarterly updates on the program showing how effective it is.
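For what it's worth, the kind of "circuit breaker" being hoped for here could be as simple as the following. This is a purely hypothetical sketch with invented names, baseline, and multiplier; Apple has not published any such mechanism, which is rather the point.

```swift
import Foundation

// Hypothetical sketch: names, baseline, and multiplier are invented for illustration.
struct FlagRateBreaker {
    let baselinePerDay: Double    // flags per day expected under normal operation
    let tripMultiplier: Double    // how large a spike counts as anomalous
    var tripped = false

    mutating func record(flagsToday: Int) {
        // A sudden spike, e.g. after a bad or tampered hash-list update, halts
        // the pipeline until a human investigates what changed.
        if Double(flagsToday) > baselinePerDay * tripMultiplier {
            tripped = true
        }
    }
}

var breaker = FlagRateBreaker(baselinePerDay: 5, tripMultiplier: 10)
breaker.record(flagsToday: 3)     // an ordinary day
print(breaker.tripped)            // false
breaker.record(flagsToday: 500)   // the "nefarious update" scenario
print(breaker.tripped)            // true
```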
 

Jemani

macrumors regular
Feb 15, 2012
129
61
Even if the "you" in the original post is generic in this case and not pointing at you personally: This is exactly, where it starts.

You are against it? You do not support it? So you have something to hide, and we need to check whether you are lying or not. We do not believe you. You need to prove your innocence. A dangerous development.
It is a slippery slope. “We need to scrub your phone of the offensive images of the anti-government protest”. What is next?
 

Jayson A

macrumors 68030
Sep 16, 2014
2,671
1,935
It is a slippery slope. “We need to scrub your phone of the offensive images of the anti-government protest”. What is next?
Can't they already do all of this server-side (if they wanted to)? What's the difference here?
 
  • Like
Reactions: Mega ST

Jayson A

macrumors 68030
Sep 16, 2014
2,671
1,935
Why does Apple do all this? Are they pressured by the government to do it? Do they think they are making the world better? Have they lost in court somehow? Is this about their China business?
I don't understand why they are ruining their reputation for (with all respect for the fight against child abuse) nothing.

I hope they abandon this. No search databases running on my phone, please.
Don't update to iOS 15 then. Nobody is forcing you to do anything. Don't use iCloud or don't update to iOS 15 if you want things to stay the way they are.
 
  • Like
Reactions: hans1972

axantas

macrumors 6502a
Jun 29, 2015
996
1,404
Home
Can't they already do all of this server-side (if they wanted to)? What's the difference here?
The difference is: if they are doing it on their servers, I cannot stop them from doing it, and that is OK for me, because I do not have to use their server services.

But now they are scanning "on device" as a default activity and retrieving my private things if a certain value is triggered. So my privacy is gone. If my government decides to introduce an "on device" scan regarding political activities, I am lost...
 

Jayson A

macrumors 68030
Sep 16, 2014
2,671
1,935
The difference is: if they are doing it on their servers, I cannot stop them from doing it, and that is OK for me, because I do not have to use their server services.

But now they are scanning "on device" as a default activity and retrieving my private things if a certain value is triggered. So my privacy is gone. If my government decides to introduce an "on device" scan regarding political activities, I am lost...
But you CAN stop it on your device by not using iCloud Photo Library. Then there's no scanning, as the scanning only takes place WHILE the PHOTO is being UPLOADED to iCLOUD. Get it now?
 

axantas

macrumors 6502a
Jun 29, 2015
996
1,404
Home
But you CAN stop it on your device by not using iCloud Photo Library. Then there's no scanning, as the scanning only takes place WHILE the PHOTO is being UPLOADED to iCLOUD. Get it now?
What is the use of "on device" scanning, then?

They are scanning it. They are not uploading anything unless a certain (unknown) value is triggered. You are given "tickets" for matches. If this exceeds, let's say, 10 positive tickets, then your images are reviewed by Apple.

Why do I have that hash database on my device? If it only happens when I use iCloud, they could do it on their servers. But no: they are doing it on my phone. Why do we ALL have to carry around that database of hashes of illicit pictures?
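The "tickets" mechanism described above (Apple's documentation calls them safety vouchers, with a reported threshold of around 30 matches rather than 10) boils down to a count crossing a threshold. A minimal, purely illustrative sketch with invented names:

```swift
import Foundation

// Hypothetical sketch: names and the threshold value are illustrative, not Apple's code.
struct SafetyVoucher {
    let photoID: Int
    let matchedKnownHash: Bool
}

// Below the threshold nothing is supposed to be readable or reported; once the
// count of positive vouchers crosses it, the flagged account goes to human review.
func exceedsThreshold(_ vouchers: [SafetyVoucher], threshold: Int) -> Bool {
    vouchers.filter { $0.matchedKnownHash }.count >= threshold
}

let vouchers = (1...100).map { SafetyVoucher(photoID: $0, matchedKnownHash: $0 % 25 == 0) }
print(exceedsThreshold(vouchers, threshold: 30)) // false: only 4 positive vouchers
```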
 
Last edited:
  • Like
Reactions: Pummers

Jayson A

macrumors 68030
Sep 16, 2014
2,671
1,935
What is the use of "on device" scanning, then?

They are scanning it. They are not uploading anything unless a certain (unknown) value is triggered. You are given "tickets" for matches. If this exceeds, let's say, 10 positive tickets, then your images are reviewed by Apple.

Why do I have that hash database on my device? If it only happens when I use iCloud, they could do it on their servers. But no: they are doing it on my phone. Why do we ALL have to carry around that database of hashes of illicit pictures?
They're numbers and they can't be converted back into photos. I don't know what you're worried about.
 
  • Haha
Reactions: Pummers

axantas

macrumors 6502a
Jun 29, 2015
996
1,404
Home
They're numbers and they can't be converted back into photos. I don't know what you're worried about.
I do not care about these "numbers". It is about the fact that there is a routine on my device that calculates "numbers" from my images and compares them with the "numbers" of these illicit pictures, which I am suspected to own, like you. Let's hope you do not have too many images (the one-in-a-trillion false matches) that correspond to these "numbers" on your device.
 
Last edited:

PsykX

macrumors 68030
Sep 16, 2006
2,735
3,912
I still feel it’s true, because scanning of photos is done in iCloud and only applies if you use iCloud photos and thus upload your photos from your device to the cloud (where they‘re scanned).

No scanning on your device, so still true advertising in my opinion.
It's exactly the opposite: it's much worse if things are scanned in the cloud.
It's always better when things happen on-device, because there is no leak of information. Apple often emphasizes this in their keynotes.

For example, if Siri processes my requests on-device, not only will it be faster and not require being online, it should also NOT send either a transcript or an audio file of my voice for someone at Apple to listen to.
Because it's not a secret to anyone anymore: they do listen to our Siri requests.
And when people have "Hey Siri" enabled, they do hear audio snippets of people having sex and stuff like that.
 