Let me ask a hypothetical.
If the government wanted to install a camera in your home, to make sure that your child wasn't being abused in any way, with the promise that nobody would ever look at your spouse walking around in their underwear, would you be OK with it?
That's what Apple is doing...
I guess you didn't know there's machine learning running over your photos all the time, finding pictures of cars and cats. But I guess that could never be used for any other type of image, eh?
This is your blind spot! You don't know what they are doing... You only know what they are telling you and have no idea what they are concealing. So they can do anything they want, and you have no way of knowing that they aren't.
Because now Apple is setting a precedent by asserting the right to search your personal device for incriminating evidence. This can't be compared to simple image analysis that helps organize your vacation photos. This is a search for criminal behavior, and the search is occurring on your private property.
Most people in modern western society acknowledge that a warrant would be needed for that type of thing, and that innocent citizens should not be subjected to warrantless search and seizure, especially when they haven’t been accused of or suspected of committing a crime.
Apple is a corporation led by unelected individuals with no public oversight. If they assert this right without any pushback, then Pandora’s box will be open.
No-one made any such argument; those are your words.
A company proposes a solution to an existing problem, and everyone runs for the hills screaming about loss of privacy. None of you (the people against it), as adults, can do everything or have the power to protect your kids yourselves, yet you're all outraged when a company offers a solution that requires services you already have enabled. Ironically, you'd probably then blame (or, in the USA, sue) the company because you didn't have iCloud on to protect your kids...
You've already submitted to accepting iCloud; it needs to be on to use Apple Pay, to use AirDrop, to save passwords to Keychain, etc.
In the '80s, TV had 'Charley Says' child-safety awareness programmes featuring Charley the cat, such as 'Charley says, always tell your mummy before you go off somewhere'. Do you think parents stopped their kids watching because they thought the big baddie TV companies were trying to control them???
lol settle down
I guess you must've missed where they said the scan only occurs while your photo is being uploaded to iCloud. Your local library isn't scanned at all until an upload occurs.
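As I read Apple's description, the flow is roughly the following. This is a minimal sketch with invented names (compute_safety_voucher and send_to_icloud are not Apple's actual APIs, and the hash here is just a placeholder), only to illustrate the claim that the matching step is tied to the iCloud Photos upload path:

```python
import hashlib

def compute_safety_voucher(photo_bytes: bytes) -> bytes:
    # Hypothetical stand-in for the on-device NeuralHash + safety voucher step.
    # A cryptographic hash is used purely as a placeholder here.
    return hashlib.sha256(photo_bytes).digest()

def send_to_icloud(photo_bytes: bytes, voucher: bytes) -> None:
    # Placeholder for the actual upload.
    print(f"uploading {len(photo_bytes)} bytes, voucher {voucher.hex()[:16]}...")

def upload_photo(photo_bytes: bytes, icloud_photos_enabled: bool) -> None:
    if not icloud_photos_enabled:
        # Per Apple's description: no iCloud Photos upload, no scan.
        return
    voucher = compute_safety_voucher(photo_bytes)
    send_to_icloud(photo_bytes, voucher)  # voucher is evaluated server-side

upload_photo(b"fake image bytes", icloud_photos_enabled=True)
```

If that description is accurate, turning iCloud Photos off keeps the matching step from ever running on your library.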
So far 25% of users here will move away from Apple and 75% will stay.
Have Apple stated they didn't have perceptual hashing code in iOS 14.x? Where?
Have Apple stated explicitly that privacy is the most important thing? Where?
I wasn't aware iOS and iPadOS had selective photo backup.
Not per se. But they've implied the code wouldn't be in there until the next major release. It still isn't all there--well, not that we know of--but part of it is. Bad optics, if nothing else.
Srsly? Have you been living under a rock? It's been one of Apple's big selling points over the competition. Heck, they're even trying to claim they're doing CSAM scanning on the devices to preserve privacy. (Which is about as tortured a piece of logic as I've ever seen.)
I voted not leaving, but I'm not comfortable with it. I'm not going to turn off iCloud Photos because I'm not worried about the feature as it currently stands, but I'm definitely concerned about what it could develop into in the future. When they start detecting other types of content on my phone is when I'll turn it off.

I think that's the best approach. No sense in losing sleep over it until there's an issue. Apple has always taken our privacy seriously.
So if I have Photo Backup on, by default all of my photos get scanned by this proposed solution.

It doesn't have selective backup, but it doesn't right now either, so whatever can be done to our photo libraries under this system could have been done all along in the cloud. What happens in the cloud we have no control over, including whatever software they're using to scan our stuff. With the software on-device, however, it can be picked apart and audited by an independent security group or groups to make sure it's doing exactly what they say it's doing. At least that's how I understood it. All of this information is in Apple's white papers.
Yeah, but isn't the whole point that Apple is going through all this trouble precisely to set themselves apart from Facebook and Google by not scanning your photos server-side?

If Apple's solution didn't put every user under the microscope and actually fought the root problem, we'd likely not have any issues. If someone is stupid enough to share known CSAM through iCloud, they'd be caught with just server-side scanning. What we object to is having the scanning system ON OUR DEVICES. That crosses lines that even Google wouldn't (since base Android is open-source, that can be verified). The only sure opt-out is to stop using Apple devices. I foresee the detection software becoming fully standalone on-device, whether you use iCloud or not, in a future update (because software never develops into something more powerful, right?)
I have no idea. @Jayson A
So if I have Photo Backup off, then manually copy photos to the iCloud, are they still scanned?
1. They have said the CSAM detection system would not be there until iOS 15, which is true.

But the underlying mechanism has reportedly been discovered in iOS as far back as 14.3. If true, then it has already been put on our phones with NO notice (I read all the release notes, and there was no mention). It won't be active until iOS 15, but the fact that it was part of an update without being mentioned isn't cool.
2. Again, if it's plentiful, please provide a statement with a source where Apple said it was the most important thing. A statement from Apple saying "We believe privacy is a fundamental human right" is not enough evidence.
It's not the half-naked Britney Spears you have to worry about... it's that vintage Traci Lords stuff.

Jesus, you people are scared.
I've formed a clearer opinion since my previous comment: I think I might actually have become too dependent on Apple's ecosystem.
I haven't got much against iOS or macOS or their services... something has just gone awry in my choices over my computing years.
Movies and music bought within Apple's services that I can't get out easily because of DRM. Movies are the worst. Music is doable, but I don't have 500 million CDs to burn my music library to.
Scanning for child pornography images that are in the CSAM registry is the least of my worries. I don't mind... it's the half-naked Britney Spears images I'm worried about, the kind that could land me in the hands of the CIA: stepping off the plane in the US, orange jumpsuit, chains around my legs and torso.
I wish Britney Spears could see me now. Using my flesh spear.
And yes, call me what you wish. I will keep using Apple's iCloud Photos, and I will suck on Apple's breasts until someone slices off my lips and cuts out my tongue.
Bottomfeeders... hah... believe in aliens too? UFOs? You're wack - all of you (meaning those who believe in aliens and UFOs, not all people everywhere and not everyone on this forum).
It's sometimes like hearing tears through a waterfall on this forum... so much crying from scared and often hostile babies who have a habit of inducing fear in the common people.
You're right. Maybe it's time (since I *DO* own the hardware) to find open-source phone software that will run on an iPhone.

I'm not going to read 23 pages, and it's likely been said already, but...
You do realize that you do not own iOS? You are merely granted a license. You are at the mercy of Apple. Good or bad, that's how licensing rolls.
Don't like it? Here is a roll of wire and a pile of sand... build your own.
Only partially true. See: [P] AppleNeuralHash2ONNX: Reverse-Engineered Apple NeuralHash, in ONNX and Python
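For anyone wondering what "perceptual hashing" actually means here: unlike a cryptographic hash, a perceptual hash is designed so that visually similar images produce similar hash values, and matching is done by distance rather than exact equality. Below is a toy sketch of the idea using a simple average hash in Python. This is NOT Apple's NeuralHash (which uses a neural network; see the AppleNeuralHash2ONNX repo linked above), and the file names and threshold are made up for illustration:

```python
# Toy perceptual hash (average hash), for illustration only.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Downscale to size x size grayscale, then set one bit per pixel
    depending on whether it is brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    # Number of differing bits; small distance = visually similar images.
    return bin(a ^ b).count("1")

# Hypothetical usage: compare a photo against a blocklist of known hashes.
# THRESHOLD and the file names are invented; real systems tune this carefully.
THRESHOLD = 5
known_hashes = {average_hash("known_image.jpg")}
candidate = average_hash("my_photo.jpg")
if any(hamming_distance(candidate, h) <= THRESHOLD for h in known_hashes):
    print("perceptual match")
```

Notice the part people are actually arguing about is the blocklist: the matching step flags whatever hashes are in it, and the user can't inspect what those hashes represent.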
I guess you missed Apple CEO Tim Cook's open letter on Apple's (alleged) dedication to privacy. Here ya go: https://appleinsider.com/articles/1...-privacy-policies-in-open-letter-to-customers