Once I get my eyesight back (went in for a follow-up eye exam this morning) I may have to add an Apple logo to Darth 
What happens on your iPhone, stays on your iPhone (unless you plan on uploading said content to Apple's servers, then it's no longer on your iPhone is it?)
> What happens on your iPhone, stays on your iPhone (unless you plan on uploading said content to Apple's servers, then it's no longer on your iPhone, is it?)

The statement on the poster was not conditional. (There's no "unless" or "except" or whatever in it.) It clearly reads "What happens on your iPhone, stays on your iPhone." The proposed scanning occurs on your iPhone. If the scanner gets a "hit," it sends the results off your iPhone. Therefore, in that case, what happens on your iPhone clearly does not stay on your iPhone.
> I am not worried about the CSAM scanning because other vendors have been doing this for years. Apple decides to do it and everyone freaks out...

Others are not privacy advocates and don't use privacy to build their image. That's the big difference.
> The statement on the poster was not conditional. (There's no "unless" or "except" or whatever in it.) It clearly reads "What happens on your iPhone, stays on your iPhone." The proposed scanning occurs on your iPhone. If the scanner gets a "hit" it sends the results off your iPhone. Therefore, in that case: What happens on your iPhone clearly does not stay on your iPhone.

The first pass happens on your phone, yes, but it only happens if you're uploading data to the cloud. So while the processing is done on your device, you are still uploading that data to the cloud regardless, so it's not part of the "stays on your phone" statement.
> The first pass happens on your phone yes, but it only happens if you're uploading data to the cloud, so while the processing is done on your device, you are still uploading that data to the cloud regardless, so it's not a part of the "stays on your phone" statement.

Your logic is faulty on two counts. First, Apple's claim was not conditional. It didn't claim "What happens on your iPhone, stays on your iPhone unless..." It asserted "stays on your iPhone," period. Second, the analysis happens on your phone, and the analysis may leave your iPhone. It can't possibly get any more the opposite of "stays on your iPhone."
> The first pass happens on your phone yes, but it only happens if you're uploading data to the cloud, so while the processing is done on your device, you are still uploading that data to the cloud regardless, so it's not a part of the "stays on your phone" statement.

Yes, the nice Apple police helps you put your shoes on before you go outside.
> Your logic is faulty on two counts: Again: Apple's claim was not conditional. It didn't claim "What happens on your iPhone, stays on your iPhone unless..." It asserted "stays on your iPhone," period. The analysis happens on your phone. The analysis may leave your iPhone. It can't possibly get any more the opposite of "stays on your iPhone."

With your logic, anything to do with iCloud doesn't stay on your phone. If everything "stayed on your iPhone," then you wouldn't have an internet connection at all. It would be an iPod.
You can argue night is day and day is night, what's white is black and what's black is white, until the end of time, but that won't make your arguments ever be true.
The content-matching process could have false positives, and malicious users could game the system to subject innocent users to scrutiny.
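To see why false positives are possible at all, here is a deliberately tiny, hypothetical sketch of a perceptual hash (an 8-bit "average hash" over raw brightness values). It is nothing like Apple's actual NeuralHash, but it illustrates the shared failure mode: perceptual hashes are designed to match near-duplicates, so visually different inputs can still collide.

```python
# Toy perceptual hash: one bit per pixel, 1 if the pixel is brighter
# than the image's mean brightness. Real systems use far larger hashes
# over learned features, but the collision risk is structurally similar.

def average_hash(pixels):
    """Return a tuple of bits: 1 where a pixel exceeds the mean, else 0."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

# Two clearly different "images" (brightness values 0-255)...
img_a = [200, 10, 200, 10, 200, 10, 200, 10]
img_b = [255, 90, 140, 0, 130, 20, 255, 95]

# ...that nonetheless produce the same hash: a false positive.
print(average_hash(img_a) == average_hash(img_b))  # → True
```

An adversary who can compute the hash function can also search for innocuous images that collide with flagged ones, which is the "gaming the system" concern.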
> Has there been a single credible security professional, organization, or group, or privacy group or organization, that's expressed support for Apple's plans?

No doubt the FBI loves it... but you're asking for credible organizations, so no - none.
> Yeah, but isn't the whole point that Apple is going through all this trouble precisely to set themselves apart from Facebook and Google by not scanning your photos server-side?

Apple already has systems in place to scan iCloud, and they have for years, just like everyone else. This is a step much further: scanning the devices themselves.
Currently, iOS scans your photos on-device to tag them, and this information does not leave your device.
At the same time, Apple doesn't scan the contents of your iCloud library, and evidently has no intention of doing so anytime soon.
Instead, they have gone out of their way to build a convoluted system that effectively scans your iCloud photos for child porn without actually scanning them in the same invasive manner that other companies currently do. And for what?
The only reason I can think of is that Apple is laying the ground for offering fully encrypted iCloud storage one day. They reportedly backed off on this initiative in 2018 at the behest of the FBI, and I am guessing that from Apple's POV, this proposed CSAM detection method offers users the best of both worlds: nobody can access your fully encrypted iCloud photos (if and when Apple does offer such a feature), and the data scanned from your photos never leaves your iPhone. But Apple probably still needs a way to convince law enforcement that it is not acting as a safe haven for child pornography, so as to get them off its back.
To me, while the scanning technically will take place on my device itself, it applies only to photos about to be uploaded to iCloud, so in terms of outcome there really is no difference whether it happens on my iPhone or on Apple's servers in the cloud. So while the technology may have the potential to be far more invasive, I believe the reality is that it will still end up being a lot less invasive than what Facebook and Google are currently doing to weed out child pornography.
I acknowledge that nobody really knows how such technology could evolve or be abused in the future, but is this not the very definition of a slippery slope argument? The key talking points aren't so much about CSAM detection as it stands today (primarily because there isn't much controversy to be found there) but rather the slippery slope claim that by launching CSAM detection now, Apple will find itself in compromising situations in the future. The fear is that countries will force Apple to search for non-CSAM material. This feels a bit like a straw man argument to me, not least because there is no mechanism in place for countries to have Apple use CSAM detection to start searching for non-CSAM material, and these articles talk around all of the safeguards Apple has built into CSAM detection as if they don't exist.
Even if they wanted to, there are probably far easier ways to "fix" a problematic individual than to frame them by tainting the US CSAM database (which entails so many steps that even a Mission: Impossible movie based on it would be deemed too implausible).
Of course, there is always the possibility that I am totally and completely wrong, and 2-3 years later, Apple still hasn't offered encrypted iCloud storage, and I am left wondering "So what was the point of it all really?"
> I never trusted Apple. I knew they were just good at marketing and lying. They’re doing a lot more behind the scenes that we don’t know about.

Trusting Apple is irrelevant. Either buy their products or don't.
> Trusting Apple is irrelevant. Either buy their products or don't.

Actually it’s completely relevant…to me NOT buying their products.
Except that giant Apple Privacy billboard I drove past for a bit.
I guess you missed Apple CEO Tim Cook's open letter on Apple's (alleged) dedication to privacy. Here ya go: https://appleinsider.com/articles/1...-privacy-policies-in-open-letter-to-customers
I guess you likewise missed this, too:
Only partially true. See: [P] AppleNeuralHash2ONNX: Reverse-Engineered Apple NeuralHash, in ONNX and Python
Apple has already begun putting the code into iOS as of iOS 14.3.
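According to the reverse-engineered AppleNeuralHash2ONNX pipeline, NeuralHash runs the image through a neural network to get a 128-dimensional float embedding, multiplies that by a fixed projection matrix, and keeps only the sign of each output to form a 96-bit hash. Below is a toy sketch of that final sign-bit (locality-sensitive hashing) step only; the dimensions, matrix values, and embedding here are all made up for illustration.

```python
# Sketch of the final stage of a NeuralHash-style scheme: project the
# embedding onto fixed hyperplanes and keep one sign bit per hyperplane.
# Toy sizes -- the real model reportedly uses 128 dims and 96 bits.
import random

random.seed(0)

EMBED_DIM, HASH_BITS = 8, 4  # hypothetical, not Apple's values

# Fixed random hyperplanes standing in for the model's projection matrix.
projection = [[random.gauss(0, 1) for _ in range(EMBED_DIM)]
              for _ in range(HASH_BITS)]

def neural_hash(embedding):
    """One bit per hyperplane: 1 if the dot product is non-negative."""
    return tuple(
        1 if sum(w * x for w, x in zip(row, embedding)) >= 0 else 0
        for row in projection
    )

# Nearby embeddings tend to fall on the same side of most hyperplanes,
# so similar images tend to share hash bits -- by design.
emb = [0.3, -1.2, 0.8, 0.1, -0.5, 0.9, -0.2, 0.4]
print(neural_hash(emb))
```

Because the sign bits throw away so much information, similar-looking images hash alike, which is the property the matching system relies on, and also why collisions against the reverse-engineered model were found so quickly.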
They have commitments to protecting privacy, and they have respect for customers. But nothing here says that privacy is the most important thing.
True. I'm phasing out all of my Apple products.
> Actually it’s completely relevant…to me NOT buying their products.

100%. Gotta do what’s right for you. I believe the majority of the potential hundreds of millions won’t care…but we all gotta do what makes us happy.
I'm glad to see it looks like some people were already starting to think about moving away from OSX, and this has brought that consideration back to the front of their minds.
That's the situation I'm in. My laptop is 5 years old now and a new purchase isn't too far away. The new architecture sounded intriguing, but I really hate the Big Sur interface changes. Aesthetically it just looks like ****, basically. Plus it's gotten pretty damned offensive with the level of lock-down and opaqueness in communication with Apple servers. And considering the current topic, this trend is only going to accelerate.
It's looking almost certain that sometime in 2022 or maybe even before that, I'll be getting a System76 laptop.
The good news for me is that I'm not an "ecosystem" guy. I don't give a damn about the phones at all, and rarely use mine for anything except old fashioned voice calls. I get my wife's hand-me-downs because that's how much I just don't care. The watch... never even considered it.
> It's not the half-naked Britney Spears you have to worry about... it's that vintage Traci Lords stuff....

Let's say Elizabeth Hurley, Emily Scott and Gemma Atkinson instead.