Nobody is afraid. People are not willing to get inspected by Apple on their own device. It's offensive and intrusive and lacks basic respect.
> Nobody is afraid. People are not willing to get inspected by Apple on their own device.

To be fair, Apple has no idea what's being inspected; they'll only become aware of any of your content if 30+ matches occur.
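To make that 30-match mechanism concrete, here is a toy Swift sketch of threshold-gated review. Every name in it is invented; Apple's actual design uses private set intersection with threshold secret sharing, so the server cannot even see per-photo match bits below the threshold, but the visible gating behaves like this:

```swift
// Toy model: one voucher per uploaded photo. The server learns nothing
// about any photo until at least `threshold` vouchers match the database.
struct SafetyVoucher {
    let matchesKnownHash: Bool      // hidden cryptographically in the real system
    let encryptedDerivative: String // stand-in for the encrypted photo info
}

let threshold = 30

// Returns the decryptable items: an empty list below the threshold.
func reviewableContent(from vouchers: [SafetyVoucher]) -> [String] {
    let matching = vouchers.filter { $0.matchesKnownHash }
    guard matching.count >= threshold else { return [] } // below 30: nothing is revealed
    return matching.map { $0.encryptedDerivative }       // 30+: human review can begin
}

// 29 matching vouchers reveal nothing; the 30th unlocks review.
let almost = Array(repeating: SafetyVoucher(matchesKnownHash: true,
                                            encryptedDerivative: "blob"),
                   count: 29)
print(reviewableContent(from: almost).count) // 0
print(reviewableContent(from: almost + [SafetyVoucher(matchesKnownHash: true,
                                                      encryptedDerivative: "blob")]).count) // 30
```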
> They come on my device, install their search database, scan my pictures, establish communications to their HQ, might have humans inspect my photos and report them if their software thinks they match the database, and might even call the police if they think my content is wrong, flagged, illegal, or whatever. (I have no illegal content and never have had; you could print all my pictures on postcards and I would still oppose this.) Who do they think they are? We have laws and a legal system in place and don't need Apple to spy on us on top.

There's a way out: turn off iCloud Photos, because the system only works WHILE your photo is being UPLOADED to iCloud. At that point, Apple has every right to scan the file being uploaded to THEIR server.
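A minimal Swift sketch of that claim, with invented names (makeSafetyVoucher, syncToICloud); the point is only that the matching step lives inside the upload path, so with iCloud Photos off it is never reached:

```swift
import Foundation

// Invented stand-ins for the on-device matcher and the uploader.
func makeSafetyVoucher(for photo: Data) -> Data { Data() } // runs the hash match
func upload(_ photo: Data, voucher: Data) { /* send to iCloud */ }

func syncToICloud(photos: [Data], iCloudPhotosEnabled: Bool) {
    // With iCloud Photos off, no voucher is ever created and nothing leaves the device.
    guard iCloudPhotosEnabled else { return }
    for photo in photos {
        let voucher = makeSafetyVoucher(for: photo) // scanning happens here, at upload time
        upload(photo, voucher: voucher)
    }
}
```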
> They still install their database and likely search my photos without consent to create their hashes for comparison in case I switch on iCloud. I want a clear opt-out option to keep my phone totally private from their database and inspection software.

Yes, there is an opt-out: don't use iCloud Photo Library. Is that so hard to understand? The hash database will be on everyone's phone regardless, but it won't be visible to the user, and even if you could see it, it wouldn't make any sense, because it's literally 1s and 0s that can't be converted back into the original photos. Otherwise it would be the biggest blunder of all time, with Apple putting child porn on everyone's device. Haha
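The "1s and 0s" point is just the one-way property of hashes. A small Swift illustration using CryptoKit's SHA-256; note that Apple's database actually holds NeuralHash values, a perceptual hash rather than a cryptographic one, but the irreversibility argument is the same:

```swift
import CryptoKit
import Foundation

// Any input, however large, collapses to a fixed 32-byte digest.
let photoBytes = Data("pretend these bytes are a full-size photo".utf8)
let digest = SHA256.hash(data: photoBytes)
print(digest) // an opaque value; no algorithm can expand it back into the photo
```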
> If you're not falsely flagged, then all of your things remain private. What's the issue here? Are you afraid you'll be the one in a trillion who somehow has 30+ innocent photos that just so happen to match 30 photos in a known CSAM database, and then has those photos reviewed by Apple? Then, if they truly are innocent photos, literally nothing happens.

I already stated what the issue is. It's not the current implementation; it's what it could lead to. After Apple releases this, what's to stop them from scanning other types of content in the future? What's to stop governments from pressuring Apple to search for anything they want? I would much rather Apple unlock someone's phone under a warrant than have Apple scan everyone's pictures without provocation.

> I already stated what the issue is. It's not the current implementation; it's what it could lead to. […]

Again: Apple has our backs. They still care about privacy. If they say they won't abuse it, I believe them. You may not, but I do.
> B-but they still care about privacy!

If Apple cared about privacy, they wouldn't scan your private photos.
Or… you know, they scan it but keep everything private until 30+ images are found to match CSAM, which means 99.99% of people won't ever have any of their data known.

> Or… you know they scan it

Blah blah blah. Stopped reading there. Again, if they cared about privacy they wouldn't give a single sh*t what users have on their own devices.
> Or… you know they scan it but keep everything private until 30+ images are found to match CSAM […]

While I acknowledge that the iOS user agreement points out that Apple is already scanning, I don't think Apple should be scanning anything on-device, as there is no legal need to. Apple is not held accountable for anything on the device. If they knowingly turn a blind eye to illegal content on the iCloud server, that is a different matter.
> While I acknowledge that the iOS user agreement points out that Apple is already scanning, I don't think Apple should be scanning anything on-device […]

But they aren't scanning it unless you're trying to upload the file to their servers, in which case they have the right to check it. They're not going to have some background process constantly monitoring your photos. This happens ONLY at the time of the upload.
> But they aren't scanning it unless you're trying to upload the file to their servers […] This happens ONLY at the time of the upload.

I know how the process works. There should be no on-device scanning at all; there is no reason for it on-device. That was my point. Scanning should be saved for iCloud, where it can become a legal problem for Apple. There is also no need to capitalize words. I can read just fine without capitals.

> There is also no need to capitalize words. I can read just fine without capitals.

It's for emphasis. Also, Siri is moving to "on device" in iOS 15. Now your phone will be listening to you all on device? Whoa, invasion of privacy much? That stuff should only be processed on Apple's servers! How do I even know it stops listening when I turn it off, and what if other governments force Apple to send those queries to the feds? Oh no, I'm so worried.

> It's for emphasis. […]

Since you can't refute an honest retort, you resort to ridicule, mocking me over points I didn't bring up, even though I was on point and civil in my previous post. If that is the best you have to offer, don't bother responding next time.
> Since you can't refute an honest retort, you resort to ridicule […]

I just don't understand why everyone thinks Apple simply gave up on privacy. You're telling me that everyone involved thought this was a good idea and nobody considered that there could be bad consequences? It's probably not something they brainstormed for an hour before saying, yup, let's go through with it. Then again, I'm just speculating here, just like everyone is speculating that Apple invaded your privacy.

> I just don't understand why everyone thinks Apple simply gave up on privacy. […]

In my opinion, it is not logical to lump individual responses and positions together and then ask why everyone feels the same and holds the exact same position.

I don't believe Apple has given up on privacy per se. However, I don't believe Apple has been consistent in this area: its marketing and on-record corporate interviews have not matched its actual implementation, execution, and corporate response. Nor do I believe Apple is invading user privacy per se, because Apple has clearly stated its intent and purpose with the OS updates, which many, many people don't bother to read. I think that is why so many of these same people see it as an invasion when it really isn't.

In my opinion, the offense Apple is looking for resides on the server. As such, that is where the scanning should occur, not on-device.

The argument that on-device scanning protects user privacy better than scanning in the cloud alone is simply not true; it is nothing more than marketing speak from Apple. Apple can gather a lot of information pertinent to a possible NCMEC report from the on-device scanning alone, as well as from the functioning and reporting of the OS itself. I believe the reason Apple is doing on-device scanning here is to create a more detailed technical trail of the user's care, custody, and control of CSAM as it relates to iCloud storage and the possible distribution of that material among other pedophiles.

> In my opinion, it is not logical to lump individual responses and positions together […]

I respect that.
> While I acknowledge that the iOS user agreement points out that Apple is already scanning, I don't think Apple should be scanning anything on-device […]

Say a stranger decided to send a nude pic to your child's iMessage. Would you be okay with your child seeing it without any interference?
> Say a stranger decided to send a nude pic to your child's iMessage. Would you be okay with your child seeing it without any interference?

The on-device scanning also involves the other new action from Apple, for which there is no opt-in option, as far as I am aware.
That is on-device scanning, but it doesn't send or show anything to Apple; it just prevents certain media from being immediately visible. Images are blurred out until consent is given to view them, much like what Facebook, Instagram, Twitter, YouTube, etc. already do. Gosh.
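A short Swift sketch of that blur-until-consent flow as described (hypothetical names; the real Messages implementation is not public). The property being argued is that both the classifier verdict and the user's choice stay on the device:

```swift
// The classifier verdict and the user's choice both live locally;
// nothing about either is transmitted to Apple.
struct IncomingImage {
    let classifierFlaggedNudity: Bool // produced by the on-device model
    var viewerConsented = false       // set when the user taps through the warning
}

func isBlurred(_ image: IncomingImage) -> Bool {
    image.classifierFlaggedNudity && !image.viewerConsented
}

var photo = IncomingImage(classifierFlaggedNudity: true)
print(isBlurred(photo)) // true: shown blurred behind a warning
photo.viewerConsented = true
print(isBlurred(photo)) // false: revealed after consent
```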
> The on-device scanning also involves the other new action from Apple […]

Don't use iCloud Photos and nothing will be sent to Apple, just like it's always been. Nobody is forcing you to give Apple your files.

> Don't use iCloud Photos and nothing will be sent to Apple […]

As to your question, I don't have a problem with the upcoming parental feature, because it is opt-in, apart from thinking the age should be set higher than 13. And I represent myself when I post. You keep retorting against positions I haven't taken, which tells me you see everyone else when you read my posts and end up addressing them with a retort aimed at "everyone." That is a bad habit.
I think it has nothing to do with you. They probably legally have to scan for CSAM since EVERY major company does it.
They don't have to. Federal law (18 U.S.C. § 2258A(f), "Protection of Privacy") says that nothing in that section shall be construed to require a provider to:

"(1) monitor any user, subscriber, or customer of that provider;
(2) monitor the content of any communication of any person described in paragraph (1); or
(3) affirmatively search, screen, or scan for facts or circumstances described in sections (a) and (b)."