Sofa and washing machine vendors tracking your interests is not creepy. It's how the internet works, plus it's how Apple sells the boogeyman.
Privacy on iOS goes out the window as soon as you install any third-party apps from the App Store, at the latest. Apple does not prohibit the use of many third-party SDKs. For instance, you can find out a lot about users with tools such as "CleverTap", and then there are the tracking specialists like Instagram or TikTok. Seriously, just search for some dog-related things on the web or on YouTube, and a few hours later you will get a lot of TikTok videos about funny dogs or ads for dog toys on Instagram. Our couch in the living room is also falling apart and I have been talking about it with my BF on WhatsApp, and now all I am getting is ads for sofas on Instagram. It is creepy.
Why are you assuming that I have CSAM on my phone? I find that rude and insulting. Everyone should be innocent until proven guilty.
Sigh... ONCE AGAIN, no one (as in a person or agency of people) is "surveilling" your device. ALL that's happening is that CSAM on your phone is being marked as such, so if you upload a bunch of it to iCloud, you're going to get caught. If you don't download CSAM to your phone AND upload it to iCloud, then your life will not change between iOS 14 and iOS 15.
Also, Apple is not acting as law enforcement here. They are simply reporting CSAM, which is the only right thing to do. What the police do with that report is up to them, not Apple.
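For anyone trying to picture what that "marking" amounts to, here is a deliberately simplified sketch. It is not Apple's implementation: the real system uses a perceptual NeuralHash and blinded, encrypted matching, none of which is public API, so an ordinary SHA-256 digest and a plain set lookup stand in for them here, and all the names are made up.

```swift
import Foundation
import CryptoKit

// Simplified sketch, NOT Apple's implementation: the real system uses a
// perceptual "NeuralHash" and blinded matching, neither of which is public
// API. An ordinary SHA-256 digest stands in for the image fingerprint here.
struct UploadScreener {
    // Hypothetical list of known-CSAM fingerprints, supplied by a third party.
    let knownFingerprints: Set<String>

    func fingerprint(of imageData: Data) -> String {
        SHA256.hash(data: imageData)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    // In this sketch, the check runs only for photos queued for iCloud Photos upload.
    func isFlagged(_ imageData: Data) -> Bool {
        knownFingerprints.contains(fingerprint(of: imageData))
    }
}
```

Either way, the comparison itself is just a fingerprint lookup; the argument in this thread is about where it runs and who supplies the list.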
This is the problem. I don't want Apple using my battery to do their surveillance. It is my battery life they are using for this. If I am at 10% battery and I want to upload wedding photos to iCloud Photos so they are stored, this kicks in. Thanks, Apple.
To be fair, this is being done “on your phone”.
Why are you assuming that I have CSAM on my phone? I find that rude and insulting. Everyone should be innocent until proven guilty.
Agreed. What use is E2E encrypted iCloud if some secret blacklist of hashes can search for whatever it wants before the encryption kicks in? By the way, I think folks like usagora are missing the fact that some 3rd party is providing Apple with the blacklist of hashes. Apple cannot legally possess known CSAM material to generate the hashes from. They have to be provided with hashes from a relevant authority. That's where the abuse potential lies.
It's always been false advertising, as many privacy and security advocates have been saying for years, but only now are we starting to see other people waking up to the reality of the situation.
The reality is that this new form of scanning is unlike anything that has been implemented before. MacRumors authors and other media outlets that fail to understand the technology are claiming "this is nothing new, it's been done on other platforms for years!", but that's only true if you ignore the most important detail of the tech, which is that it runs on device. Because Apple and others have duped the public into believing that anything labeled "on device" is automatically indicative of privacy, they're allowing people to think that this is somehow a privacy-respecting solution when in fact it's the opposite.
Prior to this technology, "what happens on your iPhone stays on your iPhone" was mostly true if you turned off iCloud services, but that's no longer true given this tech's ability to operate and report back to Apple/authorities even with iCloud services turned off. Underlining "ability" is critical here, because Apple claiming to only scan photos that are about to be uploaded to iCloud is irrelevant if switching a single flag or adding a few lines of code is all it takes to 1) scan all photos, regardless of whether they're being uploaded to iCloud or came from an end-to-end encrypted chat service like Signal, and 2) scan for things beyond CSAM, which is already standard practice in some authoritarian countries.
This entire situation underlines why it's so important for the public to have a baseline understanding of why privacy matters, how privacy is implemented, how encryption works, etc. It's a downright tragedy that Apple customers STILL think that Apple's claim about "end-to-end encrypted" iCloud services means Apple can't access your stuff.
Some have claimed this is the first step in a larger move to finally, truly encrypt iCloud backups and iCloud Photos, and therefore a worthy tradeoff. That's horse s***. There is absolutely nothing to stop Apple from implementing full zero-access encryption for all iCloud services WITHOUT client-side content scanning. There's no legal requirement at all in the USA: they are only required to report CSAM if they come across it on their servers; they're not even required to scan for it in the first place. If all iCloud Photos were truly encrypted, with no way for Apple to possibly access them, that would legally clear them of any responsibility. All they would ever be able to see are blobs of encrypted data. Implementing end-to-end encryption but scanning content before/after it gets sent is the equivalent of an authoritarian saying "You can have a private, encrypted conversation, but only if I get to see the contents" (which is what the EU is actively considering in parliament and the USA is advocating for) or a thief commanding "You can have a lock on your door, but only if I have a copy of the key to open it."
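For what it's worth, "zero access" in this argument just means the key is generated and kept on the device, so the provider only ever stores ciphertext it cannot read. A minimal sketch of that idea, using CryptoKit's standard AES-GCM and hypothetical function names rather than anything Apple-specific:

```swift
import Foundation
import CryptoKit

// Minimal sketch of "zero access" storage: the key never leaves the device,
// so whatever the server stores is an opaque blob to the provider.
let deviceOnlyKey = SymmetricKey(size: .bits256)   // never uploaded anywhere

func encryptForUpload(_ photo: Data) throws -> Data {
    // The combined blob (nonce + ciphertext + tag) is what would be uploaded.
    try AES.GCM.seal(photo, using: deviceOnlyKey).combined!
}

func decryptAfterDownload(_ blob: Data) throws -> Data {
    try AES.GCM.open(AES.GCM.SealedBox(combined: blob), using: deviceOnlyKey)
}
```

Whether any scan happens before a step like this, or not at all, is exactly the dispute above.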
Apple's cause is good, but HOW they are doing it is all wrong. Having the software do the work of constantly looking for illegal content doesn't justify it in the least. The potential for abuse by state actors is ripe, and even Apple has no way to verify that the hashes being searched for are, in fact, CSAM.
Sigh... ONCE AGAIN, no one (as in a person or agency of people) is "surveilling" your device. ALL that's happening is that CSAM on your phone is being marked as such, so if you upload a bunch of it to iCloud, you're going to get caught. If you don't download CSAM to your phone AND upload it to iCloud, then your life will not change between iOS 14 and iOS 15.
Also, Apple is not acting as law enforcement here. They are simply reporting CSAM, which is the only right thing to do. What the police do with that report is up to them, not Apple.
Encrypting iCloud accounts would be handy. It's long bothered me that they are not, but I have had to weigh up the sheer convenience of what it offers and so have opted in.
One possibility is that this is the first in a number of steps towards offering encrypted iCloud storage, and they can at least tell law enforcement that they are fairly confident that there is no child pornography on their users’ iCloud storage accounts (because there would otherwise be no real way of telling).
Even if the "you" in the original post is generic in this case and not pointing at you personally: This is exactly, where it starts.Why are you assuming that I have CSAM on my phone? I find that rude and insulting. Everyone should be innocent until proven guilty.
Agreed. What use is E2E encrypted iCloud if some secret blacklist of hashes can search for whatever it wants before the encryption kicks in? By the way, I think folks like usagora are missing the fact that some 3rd party is providing Apple with the blacklist of hashes. Apple cannot legally possess known CSAM material to generate the hashes from. They have to be provided with hashes from a relevant authority. That's where the abuse potential lies.
I also share the fear that a future update of iOS will make this function without iCloud. The premise of it all just deeply concerns me as a US citizen.
I agree. However, they have cornered themselves. They cannot pull out of this, nor can they continue, without losing face.
I hope they abandon this. No search databases will work on my phone please.
It is a slippery slope. “We need to scrub your phone of the offensive images of the anti-government protest”. What is next?
Even if the "you" in the original post is generic in this case and not pointing at you personally: This is exactly where it starts.
You are against it? You do not support it? So you have something to hide, and we need to check whether you are lying or not. We do not believe you. You need to prove your innocence. A dangerous development.
Can't they already do all of this server-side (if they wanted to)? What's the difference here?
It is a slippery slope. “We need to scrub your phone of the offensive images of the anti-government protest”. What is next?
Don't update to iOS 15 then. Nobody is forcing you to do anything. Don't use iCloud, or don't update to iOS 15, if you want things to stay the way they are.
Why does Apple do all this? Are they pressured by the government to do it? Do they think they make the world better? Have they lost in court somehow? Is this about their China business?
I don't understand why they ruin their reputation for (with all respect for the fight against child abuse) nothing.
I hope they abandon this. No search databases will work on my phone please.
The difference is: If they are doing it on their servers, I cannot stop them from doing it. That is OK for me. I do not have to use their server services.
Can't they already do all of this server-side (if they wanted to)? What's the difference here?
But you CAN stop it on your device by not using iCloud Photo Library. Then there's no scanning, as the scanning only takes place WHILE the PHOTO is being UPLOADED to iCLOUD. Get it now?
The difference is: If they are doing it on their servers, I cannot stop them from doing it. That is OK for me. I do not have to use their server services.
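Both positions here are really about a gate in code. A hypothetical check like the following is roughly what "scanning only while uploading to iCloud" boils down to, and also why posters above worry it could be loosened later; the names and the SHA-256 stand-in are illustrative, not Apple's:

```swift
import Foundation
import CryptoKit

// Hypothetical illustration of the gate being argued about: in Apple's
// description the check runs only on the iCloud Photos upload path. The
// concern upthread is that this gate is a policy choice in code, not a
// technical limit.
func shouldFlagBeforeUpload(_ photo: Data,
                            iCloudPhotosEnabled: Bool,
                            knownFingerprints: Set<String>) -> Bool {
    guard iCloudPhotosEnabled else { return false }   // no iCloud Photos, no check
    let digest = SHA256.hash(data: photo)             // stand-in for NeuralHash
        .map { String(format: "%02x", $0) }
        .joined()
    return knownFingerprints.contains(digest)
}
```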
But now they are scanning "on device" as a default activity and retrieving my private things if a certain value is triggered. So my privacy is gone. If my government decides to introduce an "on device" scan regarding political activities, I am lost...
What is the use of "on device" scanning then?
But you CAN stop it on your device by not using iCloud Photo Library. Then there's no scanning, as the scanning only takes place WHILE the PHOTO is being UPLOADED to iCLOUD. Get it now?
They're numbers and they can't be converted back into photos. I don't know what you're worried about.
What is the use of "on device" scanning then?
They are scanning it. They are not uploading anything unless a certain (unknown) value is triggered. You are given "tickets" for matches. If this exceeds, let's say, 10 positive tickets, then your images are reviewed by Apple.
Why do I have that hash database on my device? If it only happens if I use iCloud, they could do it on their servers. But no: they are doing it on my phone. Why do we ALL have to carry around that database of hashes of illicit pictures?
I do not care about these "numbers"; it is about the fact that there is a routine on my device that calculates "numbers" for my images and compares them with the "numbers" of these illicit pictures, which I am suspected of owning, just like you are. Let's hope you do not have too many images (the one-in-a-trillion cases) that match these "numbers" on your device.
They're numbers and they can't be converted back into photos. I don't know what you're worried about.
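If it helps, the "tickets" being described here (Apple's term is safety vouchers, and the real scheme uses threshold secret sharing so nothing is readable below the threshold) boil down to counting matches against a limit. A toy sketch, with made-up names and the "let's say 10" from the post standing in for Apple's actual parameter:

```swift
// Toy sketch of the threshold idea: one "ticket" per positive match, and
// nothing is flagged for review until the count passes some limit.
struct MatchThreshold {
    let limit: Int       // e.g. the "let's say 10" above; not Apple's real value
    var tickets = 0      // one per positive fingerprint match

    mutating func record(match: Bool) {
        if match { tickets += 1 }
    }

    var reviewTriggered: Bool { tickets >= limit }
}

var counter = MatchThreshold(limit: 10)
counter.record(match: true)
print(counter.reviewTriggered)   // false until enough matches have accumulated
```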
It's exactly the opposite; it's much worse if things are scanned in the Cloud.
I still feel it's true, because scanning of photos is done in iCloud and only applies if you use iCloud Photos and thus upload your photos from your device to the cloud (where they're scanned). No scanning on your device, so still true advertising in my opinion.