
Do you feel "What happens on your iPhone stays on your iPhone" is false advertising?

  • Yes
    Votes: 157 (68.0%)
  • No
    Votes: 74 (32.0%)

  Total voters: 231
It is false advertising, and I would not be surprised if a lawsuit comes at some point in the future. Hopefully, it goes to the Supreme Court.

"At Apple, we believe privacy is a fundamental human right"? If that's the case, why scan an iPhone without the user's authorization?

 
Why are people so worried? It's been done for years on other platforms. If you don't have child porn on your phone, then why are you worried? Yes, it could flag innocent photos, but surely if they were to investigate, they'd see they are innocent photos or family photos. It's the context. People need to chill.
Dear Polaroid,
Due to an alarm for flagged photos in your library, we had to review the following pictures (including copies of 35 very private photos of your wife). Unfortunately, they matched the hashes in our database, which usually happens for only one in a trillion photos, so we checked them and found them legitimate. We have added a supplementary hidden stamp to them to avoid further checks. They are safe to use now.

Our apologies for any inconvenience.
We are happy to know you as a valued customer of our company.

Sincerely,
Apple Inc., Control Section
 
Why are people so worried? It's been done for years on other platforms. If you don't have child porn on your phone, then why are you worried? Yes, it could flag innocent photos, but surely if they were to investigate, they'd see they are innocent photos or family photos. It's the context. People need to chill.
Exactly the opposite. People should care when their privacy and rights are violated.

People should be outraged and protest if it happens. Because if not… look up Germany in the 1930s to find out what can happen when people are complacent
 
It is false advertising, and I would not be surprised if a lawsuit comes at some point in the future. Hopefully, it goes to the Supreme Court.

"At Apple, we believe privacy is a fundamental human right"? If that's the case, why scan an iPhone without the user's authorization?

No, because there was never a change in their stance on your PII. And if I were a fly on the wall when these conversations were taking place at management headquarters, I'd bet they didn't want to be accused of "aiding and abetting."
 
What people don't seem to understand is that this is not about child pornography.

The technique is basically this: anyone who wants to (or whom Apple permits) can send Apple hashes of pictures, text, voice, etc., and the new system will silently scan all iPhone users on the globe.

So let's say I am a dictator. I want everyone to drink cola; people who drink plain water are subhuman to me. I know you are a political opponent, and I know this because you went public with your disagreement in a viral video. I take your picture, a video screenshot, and/or your voice from the interview, make a hash of it, and Apple will tell me whether you are one of their users, and if you are, what your name is, where you live, and so on.

And Apple can tell me which of its users worldwide have your picture, text, audio, or video; I will know how many people know this person or are in contact with him.

With all this information, I can send my police or others to hunt you and all the others down. Or I do this silently, just to confirm the data, and then I send drones to shoot you from the sky or fire a rocket at your house.

This is not science fiction; all these techniques already exist. But until now, neither Apple nor Google had implemented the last puzzle piece to make this dictator's dream come true.

Apple was, so to speak, the last one who could have prevented global dictatorship. And they have decided to let this happen now. Once this is implemented, everybody will be a possible target of their government, or even of big tech companies if they take over more and more government tasks.

So you have to decide whether you want to pay for using a smartphone with your freedom, or throw your smartphone in the trash to keep your freedom.

We all must decide this now. It is elementary!
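The "scan against supplied hashes" step this post describes is, at its core, a fingerprint lookup: the device computes a hash of each item and checks it against a database someone hands it. A minimal Python sketch (all names and hashes here are hypothetical; Apple's actual system uses a perceptual "NeuralHash" with cryptographic blinding, not a plain SHA-256 set lookup):

```python
import hashlib

# Hypothetical database of flagged fingerprints pushed to the device.
blocked_hashes = {
    hashlib.sha256(b"known-flagged-image-bytes").hexdigest(),
}

def is_flagged(image_bytes: bytes) -> bool:
    """Return True if this content's fingerprint is in the supplied database."""
    return hashlib.sha256(image_bytes).hexdigest() in blocked_hashes

# The concern voiced above: the code cannot tell what kind of content the
# database describes -- whoever supplies the hashes decides what gets flagged.
print(is_flagged(b"known-flagged-image-bytes"))  # True
print(is_flagged(b"my-vacation-photo"))          # False
```

The lookup itself is content-neutral, which is exactly the worry raised here: the same mechanism works for any material whose hashes are added to the list.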
 
Google scans photos you upload to google photos.

Apple scans photos you upload to iCloud, right before they are uploaded (so the scanning occurs on-device), but only for photos that were going to be uploaded anyway.

It’s the same outcome at the end of the day, even if the implementation differs.

I think the more vital question is - why is Apple choosing to go about this in such a controversial manner, when other companies have been doing it for years without people batting an eyelid? What’s wrong with simply scanning iCloud content and calling it a day?

Even the revelation that Facebook had an army of employees manually vetting controversial images, to the point where many of them suffered mental breakdowns, largely went under the radar because, again, it's a given that whatever you put on the internet is no longer private, and well, it's a problem that affected them, not us.

One possibility is that this is the first in a number of steps towards offering encrypted iCloud storage, and they can at least tell law enforcement that they are fairly confident that there is no child pornography on their users’ iCloud storage accounts (because there would otherwise be no real way of telling).

Granted, the most disconcerting part is the lack of control on the user's part and how this largely comes down to "trust us to not abuse this in the future". But there are still a few months before this becomes reality, so expect more information and context to surface, and keep an open mind.

On-device vs. server-side.
Future Apple vs. others.

Not the same outcome. Google (and others) are looking at shared photos. Apple would be looking at shared and personal photos. I see Apple's move as pointing toward (future) scanning of Messages or other shared items that are encrypted on the device before sharing, in effect bypassing your device's encryption.

That is very concerning.
 
Why are people so worried? It's been done for years on other platforms. If you don't have child porn on your phone, then why are you worried? Yes, it could flag innocent photos, but surely if they were to investigate, they'd see they are innocent photos or family photos. It's the context. People need to chill.

Where on other platforms?
Server-side, yes. Device-side? Can you share a source showing this?
 
It is false advertising for sure. All kinds of information from your actual device is sent to Apple's servers, especially if you use services of other companies like Google. Your device sends data daily no matter what you do. Apple simply lies, and many are dumb enough to actually believe it and argue about it.
 
What people don't seem to understand is that this is not about child pornography.

The technique is basically this: anyone who wants to (or whom Apple permits) can send Apple hashes of pictures, text, voice, etc., and the new system will silently scan all iPhone users on the globe.

...

We all must decide this now. It is elementary!
The problem is even worse: Apple bases its case on these illicit activities and pictures, which absolutely do have to be fought. But stating that you do not support the scanning of your private content immediately pushes you into this position:

"So, you support these illegal activities?"
"If you have no illegal content, just let it happen."
"I do not want my private content being checked by some people."
"Ah, so you fear someone might discover something you want to hide."
...and so on.
 
I feel like this whole issue would go away if Apple just moved the scanning to the server-side. Those servers are part of Apple’s infrastructure, and they have a right to know and filter what’s being stored there. That’s what all the other services do anyway.
 
That statement was already misleading from the beginning, since it would only apply to an iPhone without a SIM card and with all methods of communication turned off, rendering it effectively useless. With the introduction of iOS 15, however, that statement becomes grossly fraudulent and criminal. Your iPhone will soon have a system designed to monitor and send your private photos to Apple, which really is the introduction of mass surveillance under the guise of protecting innocent children.
 
I’m curious if the people that are cool with this also wanted Apple to develop a back door for the FBI to unlock Syed Farook’s iPhone.
 
I don’t feel it’s false advertising, because nothing is leaving my iPhone that I didn’t consent to.

If I’m uncomfortable with my Photos being checked for CSAM content, then I can disable iCloud Photos. iCloud Photos aren’t something that stay on my iPhone. They get uploaded to the cloud, willingly.
 
I don’t feel it’s false advertising, because nothing is leaving my iPhone that I didn’t consent to.

If I’m uncomfortable with my Photos being checked for CSAM content, then I can disable iCloud Photos. iCloud Photos aren’t something that stay on my iPhone. They get uploaded to the cloud, willingly.
The problem is that Apple has developed a method to scan content on your device, before it is ever sent to the cloud. Now that this capacity exists, we just have to trust that it won’t be abused.

The whole reason they refused to develop a way to extract data from terrorists phones was because they argued once such a method exists, then we should assume it would be abused in the future. That’s why they only gave the FBI the cloud backups.

Scanning cloud storage has always been different from scanning content on your device. This is like a total reversal.
 
The problem is that Apple has developed a method to scan content on your device, before it is ever sent to the cloud. Now that this capacity exists, we just have to trust that it won’t be abused.

The whole reason they refused to develop a way to extract data from terrorists phones was because they argued once such a method exists, then we should assume it would be abused in the future. That’s why they only gave the FBI the cloud backups.

Scanning cloud storage has always been different from scanning content on your device. This is like a total reversal.

Apple has been scanning your photos for faces and objects for years, even if you don't use iCloud Photo Library. This data is not end-to-end encrypted and is accessible by Apple. I'm assuming you've trusted Apple not to abuse that feature, no?

What's stopping Apple from using their on-device machine learning to identify photos you might have saved of a political target? The reality is that nothing is. Apple and governments don't need this CSAM hashing technology if they want to abuse technology to oppress you, because so many things on the device are already not end-to-end encrypted, or not fully protected by your passcode.

For example, Messages in the Cloud is end-to-end encrypted, but you know what isn't? Your iCloud backup. And you know what's in your iCloud backup, available to Apple? The key to decrypt your Messages. That's exactly why I don't use iCloud Backup. I don't want Apple or anybody else to have any potential access to my messages, so I back up locally, password-protected.

I guess I don't see it as a reversal, because I never thought or believed Apple was trying to protect me from the government. Ever since it was revealed that they actively chose not to fully encrypt iCloud backups, I knew they were the same as every other tech company; they just decided to profit off the ecosystem instead of off advertising.
 
Privacy on iOS goes out the window as soon as you install any third-party apps from the App Store, at the latest. Apple does not prohibit the use of many third-party SDKs. For instance, tools such as "CleverTap" can find out a lot about users, and then there are the tracking specialists like Instagram or TikTok. Seriously, just search for some dog-related things on the web or on YouTube, and a few hours later you will get a lot of TikTok videos about funny dogs or dog-toy ads on Instagram. Our couch in the living room is also falling apart, and I have been talking about it with my BF on WhatsApp, and now all I am getting is ads for sofas on Instagram. It is creepy.
 
But this should be no license for Apple to invade my phone with their own separate software to search for things they consider forbidden.
After I pay for the phone, it is mine and my private area. I don't want to be searched and inspected by the phone manufacturer and its spyware without permission. And I don't want their search list of bad things on my phone.
In many countries, Apple would be violating laws, up to constitutional guarantees, by doing this. They said they will limit it to the US at first, but we will see.

Where is this all coming from? Why would Apple offend all its users and U-turn on privacy like this?
 
It is not false advertising. I want privacy. But I'm sorry, child pornography is a big deal, and it is so important that it is not funny. They are only checking whether your photos match a local database on your phone. If they do not, then no big deal. If they do, Apple gets a message. One false positive against hundreds of real ones would make me sleep better at night. The slippery-slope argument is problematic in itself; everything can be taken to an extreme. Could the database be used for bad things? Of course. But that does not mean the tech itself is bad. It needs to be in the right hands – perhaps under government regulation, by the US government for instance, which is better than nothing. We need privacy, but we can't allow certain people to hide behind closed doors. We have to find a way to ensure privacy without allowing horrific people to have it.
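The "match against a local database, with false positives tolerated" idea in this post amounts to a simple threshold check. A rough sketch (all names and the threshold value are illustrative; Apple's published design uses encrypted "safety vouchers" and threshold secret sharing rather than a plain counter, with a stated threshold of roughly 30 matches):

```python
MATCH_THRESHOLD = 30  # illustrative; Apple cited roughly 30 matches before review

def count_matches(photo_hashes, database):
    """Count how many of the device's photo fingerprints appear in the database."""
    return sum(1 for h in photo_hashes if h in database)

def should_escalate(photo_hashes, database):
    """Per the described design, a stray hit or two reports nothing;
    only crossing the threshold triggers human review."""
    return count_matches(photo_hashes, database) >= MATCH_THRESHOLD

# A few coincidental false positives stay well below the threshold:
db = {"hashA", "hashB", "hashC"}
print(should_escalate(["hashA", "unrelated1", "unrelated2"], db))  # False
```

This is why a single false positive, as the poster says, is "no big deal" under the described design: nothing is escalated until many independent matches accumulate.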
 
We all know what can happen when US police storm your apartment for an arrest. Remember the guy who was playing COD and was deliberately reported to the police by his opponent? The problem is: who gives Apple the right to determine who's good and who's bad, and the right to inform law enforcement? What if they get it wrong?

And today, Apple could use this CSAM excuse to kindly ask you to surrender your rights. Apple works with LOCAL governments worldwide; what if you're an expat staying in a country where your beliefs and ideologies are not tolerated? Do you get automatically turned in by Apple?
 
It is false advertising, and I would not be surprised if a lawsuit comes at some point in the future. Hopefully, it goes to the Supreme Court.

"At Apple, we believe privacy is a fundamental human right"? If that's the case, why scan an iPhone without the user's authorization?


Apple has probably always been referring to the US government's definition of "fundamental human rights":

as long as we can mess around with it, and no one else can.
 
When was the last time that ad even ran? As obnoxious as it was, it was true the last time I saw it in the market. You can't hold a company accountable now for messaging they no longer use.
 