Do you feel "What happens on your iPhone stays on your iPhone" is false advertising?

  • Yes: 157 votes (68.0%)
  • No: 74 votes (32.0%)
  • Total voters: 231
Nobody is afraid. People are simply not willing to be inspected by Apple on their own devices.
To be fair, Apple has no idea what's being inspected; they'll only become aware of any of your content if 30+ matches occur.

To put it quite simply, Apple has absolutely no knowledge of anything that's on your device UNTIL 30+ images are found to match the CSAM database. 99.99% of us have nothing to worry about, as we don't have anything like that on our devices, but if you're really worried, get rid of your smartphone altogether. It's for the best.
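For what it's worth, Apple's published design enforces that 30-image cutoff cryptographically with threshold secret sharing: each matching image's safety voucher carries one share of a decryption key, and with fewer than 30 shares the server mathematically cannot read anything. Here is a toy Shamir-style sketch of the idea in Python, a classroom illustration with made-up parameters, not Apple's actual code:

```python
import random

PRIME = 2**127 - 1          # field modulus for the toy scheme
THRESHOLD = 30              # matches needed before anything is readable

def make_shares(secret, n_shares, k=THRESHOLD):
    # Random polynomial of degree k-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = 123456789                          # stand-in "decryption key"
shares = make_shares(key, 50)            # one share per matching image
assert reconstruct(shares[:30]) == key   # 30 shares: key recovered
assert reconstruct(shares[:29]) != key   # 29 shares: still opaque
                                         # (holds except with negligible probability)
```

With 29 shares the server holds 29 points on a degree-29 polynomial, which pins down nothing about its value at zero; the 30th share is what flips the lock.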
 
They come onto my device, install their search database, scan my pictures, establish communications back to their HQ, may have humans inspect and flag my pictures if their software thinks they are in that database, and may even call the police if they decide my content is wrong, flagged, illegal, or whatever.
(I have no illegal content and never have; you could print all my pictures on postcards. I still oppose this.)
Who do they think they are? We have laws and a legal system in place; we don't need Apple spying on us on top of that.
 
  • Like
Reactions: dk001 and 09872738
There's a way out. Turn off iCloud Photos because the system only works WHILE your photo is being UPLOADED to iCloud. At that point, Apple has every right to scan the file being uploaded to THEIR server.

It's not like their scanning service is running in the background inspecting all of your photos for illegal stuff.
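In code terms, the claim is that the voucher logic exists only on the upload path, roughly like this (a sketch of my reading of Apple's technical summary, with invented names, not actual iOS code):

```python
def store_locally(photo):
    print("saved to device:", photo)      # always happens; no scan here

def create_safety_voucher(photo):
    # Placeholder: the real design computes a NeuralHash and a blinded
    # match payload here, which the server alone cannot interpret.
    return {"photo": photo, "match_payload": "<encrypted>"}

def upload_to_icloud(photo, voucher):
    print("uploaded:", photo, "with voucher")

ICLOUD_PHOTOS_ENABLED = False             # the opt-out: one toggle

def save_photo(photo):
    store_locally(photo)
    if ICLOUD_PHOTOS_ENABLED:             # voucher created ONLY on
        voucher = create_safety_voucher(photo)  # the iCloud upload path
        upload_to_icloud(photo, voucher)

save_photo("IMG_0001.jpg")                # prints only the local save
```

With the toggle off, the branch that builds and ships the voucher is never reached.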
 
They still install their database, and they likely scan my photos without consent to create hashes for comparison, in case I ever switch on iCloud.

I want a clear opt-out option that keeps my phone totally private from their database and inspection software.
 
Yes, there is an opt-out: don’t use iCloud Photo Library. Is it that hard to understand? The hash database will be on everyone’s phone regardless, but it won’t be visible to the user, and even if you could see it, it wouldn’t make any sense, because it’s literally 1s and 0s. Those hashes can’t be converted back into the original photos; otherwise it would be the biggest blunder of all time, with Apple putting child porn on everyone’s device. Haha

edit: Apple has been very clear that the hashes are only generated and checked during the upload process, so if you don’t want anything to happen, don’t upload photos to Apple.

They have every right to control the content you choose to store on their servers. Sorry.
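Since "literally 1s and 0s" keeps coming up: a hash is a lossy, fixed-size fingerprint you can compare but not invert. A toy average-hash makes the point (illustration only; Apple's NeuralHash is a learned perceptual hash, not this algorithm):

```python
def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255). Returns a bit string."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    # One bit per pixel: 1 if brighter than the average, else 0.
    return "".join("1" if p > avg else "0" for p in flat)

def hamming(a, b):
    # Number of differing bits; small distance means "probably same image".
    return sum(x != y for x, y in zip(a, b))

img = [[10, 200], [30, 220]]       # a 2x2 stand-in for a real photo
copy = [[12, 198], [33, 215]]      # the same shot, slightly re-encoded

print(average_hash(img))                               # "0101" -- just bits
print(hamming(average_hash(img), average_hash(copy)))  # 0 -> a match
```

Nothing in that bit string can be run backwards into the photo, which is the point of the "blunder" joke above.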
 
If you're not falsely flagged, then all of your things remain private. What's the issue here? Are you afraid you'll be that 1 in a trillion who somehow has 30+ innocent photos that just so happen to match 30 photos in a known CSAM database, and then has those 30 photos reviewed by Apple? Even then, if they truly are innocent photos, literally nothing happens. (See the rough math below.)
I already stated what the issue is. It’s not the current implementation. It’s what it could lead to.

After Apple releases this, what’s to stop them from scanning other types of content in the future? What’s to stop governments from pressuring Apple to search for anything they want?

I would so much rather Apple unlock someone’s phone due to a warrant than Apple scan everyone’s pictures without provocation.
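Picking up the "1 in a trillion" figure from the quote above: a quick back-of-the-envelope shows why a 30-match threshold makes accidental flagging so unlikely. The per-image false-match rate and library size below are hypothetical stand-ins; Apple published only the per-account bottom line, not these inputs:

```python
from math import comb

p = 1e-6       # ASSUMED chance one innocent photo falsely matches a hash
n = 10_000     # ASSUMED photos in a typical library
k = 30         # the match threshold

# P(at least k false matches out of n): binomial upper tail. The terms
# shrink so fast that summing up to k + 40 captures everything measurable.
p_flag = sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, k + 40))
print(f"{p_flag:.3e}")   # ~4e-93 for these inputs: astronomically small
```

The threshold does the heavy lifting: one coincidental match is merely unlikely; thirty of them on the same account is effectively impossible.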
 
  • Love
  • Haha
Reactions: Jemani and alpi123
Again. Apple has our backs. They still care about privacy. If they say they won’t abuse it, I believe them. You may not, but I do.
 
  • Like
Reactions: alpi123
Or… you know, they scan it but keep everything private until 30+ images are found to match CSAM, which means 99.99% of people won’t ever have any of their data known.
While I acknowledge the iOS user agreement points out that Apple is already scanning, I don't think Apple should be scanning anything on-device, as there is no legal need to. Apple is not held accountable for anything on the device. If they knowingly turn a blind eye to any illegal content on the iCloud server, that is a different matter.
 
But they aren’t scanning it unless you’re trying to upload the file to their servers, in which case they have the right to check it. They’re not going to have some background process constantly monitoring your photos. This happens ONLY at the time of the upload.
 
I know how the process works. There should be no on-device scanning at all. There is no reason for it on-device. That was my point. Scanning should be saved for iCloud where it can become a legal problem for Apple.

There is also no need to capitalize words. I can read just fine without capitals.
 
It’s for emphasis.

Also, Siri is moving to “on device” in iOS 15. Now your phone will be listening to you all on device? Whoa, invasion of privacy much? That stuff should only be processed on Apple’s servers! How do I even know it stops listening when I turn it off, and what if other governments force Apple to send those queries to the feds? Oh no, I’m so worried.
 
Since you can't refute an honest retort, you resort to ridicule, mocking me over points I didn't bring up, even though I was on point and civil in my previous post. If that is the best you have to offer, don't bother responding next time.
 
I just don’t understand why everyone thinks Apple just gave up on privacy. You’re telling me that everyone involved thought this was a good idea and nobody considered the possible bad consequences? It’s probably not something they brainstormed for an hour before saying, “Yup, let’s go through with it.”

Then again, I’m just speculating here, just like everyone speculating that Apple invaded your privacy.
 
  • Like
Reactions: alpi123
In my opinion, it is not logical to lump individual responses and positions together and then ask why everyone feels the same way and holds the exact same position.

I don't believe Apple has given up on privacy per se. However, I don't believe Apple has been consistent in this area when you compare its marketing and corporate interviews against its actual implementation, execution, and corporate response. Nor do I believe Apple is invading user privacy per se, because Apple has clearly stated its intent and purpose with OS updates, which many, many people don't bother to read; I think that is why so many of these same people see it as an invasion when it really isn't.

In my opinion, the offense Apple is looking for occurs when the material resides on the server. As such, that is where the scanning should occur, not on the device.

The argument that on-device scanning protects user privacy better than scanning in the cloud alone is simply not true. It is nothing more than marketing speak from Apple. Apple can gather plenty of pertinent information for a possible NCMEC report from the on-device scanning alone, as well as from the functioning and reporting of the OS itself. I believe the reason Apple is doing on-device scanning here is to create a more detailed tech trail of the user's care, custody, and control of CSAM as it relates to iCloud storage and the possible distribution of that material among other pedophiles.
 
  • Love
Reactions: Jemani
I respect that
 
While I acknowledge the iOS user agreement points out that Apple is already scanning, I don't think Apple should be scanning anything on-device, as there is no legal need to. Apple is not held accountable for anything on the device. If they knowingly turn a blind eye to any illegal content on the iCloud server, that is a different matter.
Say a stranger decided to send a nude pic to your child over iMessage; would you be okay with your child seeing it without any interference?

That's the on-device scanning. It doesn't send or show anything to Apple; it just prevents certain media from being immediately visible. Such images are blurred out until consent is given for them to be viewed, much like what Facebook, Instagram, Twitter, YouTube, etc. already do. Gosh.
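Roughly, the flow being described looks like this (Python pseudocode with invented names; the real classifier and UI belong to the OS and are not an API you can call):

```python
def classifier_score(image_bytes):
    # Stand-in for the on-device ML model; nothing leaves the phone.
    return 0.95 if b"skin" in image_bytes else 0.1   # toy heuristic

def show_blurred_thumbnail(img): print("showing blurred thumbnail")
def show_image(img): print("showing image")
def user_taps_view_anyway(): return False            # child declines

def receive_message_image(image_bytes, child_account):
    if child_account and classifier_score(image_bytes) > 0.9:
        show_blurred_thumbnail(image_bytes)          # hidden until consent
        if user_taps_view_anyway():
            show_image(image_bytes)
    else:
        show_image(image_bytes)

receive_message_image(b"...skin tones...", child_account=True)
# -> "showing blurred thumbnail", and nothing more until the child consents
```

Note that no branch here phones home; both the decision and the blur happen on the device.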
 
The on-device scanning also involves the other new measure from Apple, for which there is no opt-in option, as far as I am aware.

As to your question, I don't have a problem with the upcoming parental feature, since it is opt-in, other than thinking the age cutoff should be set higher than 13.
 
Don’t use iCloud Photos and nothing will be sent to Apple, just like it’s always been. Nobody is forcing you to give Apple your files.
 
I represent myself when I post. You keep retorting against positions I haven't taken, which tells me you see everyone else when you read my posts and end up answering me with a retort aimed at "everyone." That is a bad habit.
 
  • Love
Reactions: Jemani
I think it has nothing to do with you. They probably legally have to scan for CSAM since EVERY major company does it.

Not true.
What is not required is that companies actively seek out CSAM on their services:

“(f) Protection of Privacy.—Nothing in this section shall be construed to require a provider to—

(1) monitor any user, subscriber, or customer of that provider;
(2) monitor the content of any communication of any person described in paragraph (1); or
(3) affirmatively search, screen, or scan for facts or circumstances described in sections (a) and (b).”

 
  • Love
Reactions: Jemani