Of course they do, and likely more than anybody, but this is an ongoing thing, and I don't see anything stopping anybody from adding it.
Exactly, and let's be real: the largest market can dictate the terms if Apple wants access. They already do, storing data in data centers in those countries. So what's the difference? What's OK in one country won't be OK in another. The high-and-mighty attitude behind a do-gooder smokescreen offends me. If you want money over principle, say it! I respect that. The attitude of their own privacy head already showed what they think by implying "well, don't do illegal things," which is highly offensive considering that what is considered illegal differs from country to country. In some countries, homosexuality is illegal. The implication of an on-device mass scanning system is chilling. Of all tech companies, Apple should have known better.
A publicly traded company can change hands and management in a blink. What assurance do Apple users have that the next management team will keep the pinky promise? Look at Google, and how they deleted their own "don't be evil" motto.
I agree that the proof of whether Apple will hold the line is in China, and it will be very telling what happens there, but I disagree with the bit above. Apple has so far defended your ability to "do illegal stuff" on their phones by fighting the FBI; they are not willing to sacrifice your privacy, and if that means some bad stuff happens that they won't uncover, then so be it. Considering how Apple is willing to break the line by baking such a system into iOS itself, and then telling people to just not do illegal stuff, it's mind-boggling.
Next time it may be too late to do anything about it. That Apple is checking for CSAM (and nudity sent to minors) is purely a matter of policy. The mechanism that they intend to install on their customers' devices doesn't know what CSAM or porn is. All that's necessary to make it check for something else (dissent, homosexual images, etc.), or to check when the device isn't about to send something to iCloud, or to notify someone other than parents or Apple, is a configuration change and a different database. [T]his time I agree that this is something where your privacy should come second.
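To make that last point concrete, here is a rough sketch (purely illustrative, not Apple's actual code; the file names and function names are made up) of what a generic, database-driven matcher looks like. Nothing in it "knows" what CSAM is; what it flags is determined entirely by whichever hash list it is handed:

```python
# Illustrative sketch only, not Apple's implementation. SHA-256 stands in for
# a perceptual hash (e.g. NeuralHash) just to keep the example runnable.
import hashlib

def load_hash_database(path: str) -> set[str]:
    """Load a set of known image hashes supplied by configuration."""
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for an on-device perceptual hash of a photo."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_database(image_bytes: bytes, database: set[str]) -> bool:
    """True if this photo is in the configured database, whatever that database is."""
    return image_hash(image_bytes) in database

# Swapping the (hypothetical) "csam_hashes.txt" for, say, "dissent_hashes.txt"
# changes what gets flagged without touching a single line of the matching code:
# database = load_hash_database("csam_hashes.txt")
```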
You get that this scanning happens on device, right? It's a process that is invisible to the user, cannot identify any data other than known hash values, and whose results are not shared with anyone unless there are multiple hash matches to known CSAM. It's not in any way whatsoever the equivalent of cops showing up at your house. It's the equivalent of the cops showing up once a week at your house to check whether you have something to hide. You're guilty until proven innocent.
Surely you see the false equivalence between on-device scanning for known hash values within a single app (Photos) and "sharing all my data with Tim Cook on iCloud"? Why not submit your own personal data voluntarily to Apple? Put all your data into a shared folder on iCloud and share it with Tim Cook. You've got nothing to hide, no?
The current Apple already allowed things...
For example, if Apple were coerced into reporting every user with the Chinese Tank Man photo, they couldn't do it with the CSAM system without re-engineering the entire thing and rebuilding every user's safety vouchers
I don't think you have read much on how it works. According to Apple, the hash of the database itself will be published on Apple's website so owners and researchers can verify that the same database is used on every phone throughout the world. I didn't say swapping a database needed re-engineering; I said changing the threshold from 30 to 1 would, and it would because the threshold is baked into the encryption of the safety vouchers themselves when they are created on the phone. That's literally not how it works lol. If they were "coerced into reporting every user" (which, by the way, they were coerced in private by senators into implementing this system in the first place, so much for the "Apple will resist government demands" theory), they would simply use a different hash database for a given region. Swapping out a database remotely doesn't constitute "re-engineering." It's not outside the realm of possibility that China (for example) would request that Apple use their government-provided database instead of Apple's. What do you think Apple would say then? No?
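For anyone wondering why a threshold would be "baked in," here is a toy Shamir-style threshold secret sharing sketch. To be clear, this is not Apple's actual construction and every number in it is a placeholder; it only shows that the threshold is fixed when the shares are generated on the device, not a dial the server can turn afterwards:

```python
# Toy Shamir-style secret sharing over a prime field. Not Apple's construction;
# it just illustrates that the threshold is fixed at share-creation time.
import random

PRIME = 2**127 - 1  # a Mersenne prime, used as the field modulus

def make_shares(secret: int, threshold: int, count: int):
    """Split `secret` into `count` shares; any `threshold` of them recover it."""
    # Random polynomial of degree threshold-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, count + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0; with too few shares this yields garbage."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

# The *device* decides the threshold when it creates the shares:
shares = make_shares(secret=42, threshold=30, count=100)
assert recover(shares[:30]) == 42   # 30 matches: the secret is recoverable
assert recover(shares[:1]) != 42    # 1 match: garbage (with overwhelming probability)
```

In a scheme like this, moving the threshold from 30 down to 1 would mean regenerating the shares for every voucher already created on every phone, which, as I read it, is the re-engineering point being argued about above.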
They aren't scanning the device. They are scanning the images you are sending to iCloud. There's a huge difference and you know it. You do get that the device is the most private thing you've got and they are scanning it? You might as well install a camera in your bedroom, but don't worry, it'll only send videos if it thinks something illegal happened....
Yes, it's the same as the cops showing up: the device is looking for wrongdoing in one of your most private spaces. It assumes any picture you take is CSAM, why else would it scan it? Currently it doesn't care about you speeding or downloading a torrent, but are you sure it'll stop at CSAM pictures, or is this just another step toward Orwellian surveillance?
It's not "assuming" your pictures are CSAM. It's checking the hash values of pictures you upload to iCloud against a list of known hash values for CSAM. The check is done on your device, and if you don't have known CSAM, literally nothing gets sent to Apple. I'm really, really struggling to see the issue here. You do get that the device is the most private thing you've got and they are scanning it? You might as well install a camera in your bedroom, but don't worry, it'll only send videos if it thinks something illegal happened....
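Purely as a sketch of the flow being described in this thread (not Apple's code; the names and the threshold constant are placeholders): the check would run only against photos queued for iCloud upload, and anything below the match threshold would stay unreadable on the server side:

```python
# Placeholder sketch of the flow as described above, not Apple's implementation.
THRESHOLD = 30  # the publicly discussed match threshold

def count_matches(upload_queue, known_hashes, perceptual_hash):
    """Hash only the photos queued for iCloud upload and count database hits."""
    return sum(1 for photo in upload_queue if perceptual_hash(photo) in known_hashes)

def account_reviewable(match_count: int) -> bool:
    """Below the threshold the vouchers stay unreadable, so nothing is surfaced."""
    return match_count >= THRESHOLD
```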
You should have added an "I couldn't care less" option. I honestly couldn't care less about any of this. I don't do anything in my life or with my phone that would have me worrying much. There are far too many other areas of our day-to-day life that are tracked and monitored, going way beyond our phones. I don't think people truly realize just how little privacy most of us actually have.
I believe Apple does care about privacy, but I also feel that the question needs more nuance. Very interested to see whether people still believe that Apple is a privacy-oriented company. Yes, in this day and age it is impossible to have 100% privacy. But when you think about the competition, the other companies, do you still believe that Apple cares about privacy?
I think that trust--explicit or implicit, purposeful or subconscious--is part of our very decision to be online at all. I don't doubt there are a bazillion ways Apple could be searching our iCloud-uploaded photos for anything they like, without telling us, without having to release a new feature on our phones. But I trust that they're not doing that. Perhaps they could have done the same with this feature, just doing it on the server instead of building something on the phone, except for the publicity opportunity, or perhaps they've learned not to try to do it clandestinely. Or perhaps they actually do care about privacy. Next time it may be too late to do anything about it. That Apple is checking for CSAM (and nudity sent to minors) is purely a matter of policy. The mechanism that they intend to install on their customers' devices doesn't know what CSAM or porn is. All that's necessary to make it check for something else (dissent, homosexual images, etc.), or to check when the device isn't about to send something to iCloud, or to notify someone other than parents or Apple, is a configuration change and a different database.
Not yet anyway. Yes, but their woke agenda trumps it. Now, CSAM isn't exactly woke, but it shows they're willing to play cops on your device if they think they have the moral high ground. CSAM or not, I'm going to upgrade / buy a new iPhone, but I'll never look at Apple the same way again, and Android is suddenly an alternative I never would have considered before. It's the equivalent of the cops showing up once a week at your house to check whether you have something to hide. You're guilty until proven innocent.
I think this is also a key pivot point of the argument. Whether accepting of the current, immediately upcoming implementation or not, most of us agree that Apple needs to be extra careful keeping the concept in check. I think that trust--explicit or implicit, purposeful or subconscious--is part of our very decision to be online at all. I don't doubt there are a bazillion ways Apple could be searching our iCloud-uploaded photos for anything they like, without telling us, without having to release a new feature on our phones. But I trust that they're not doing that. Perhaps they could have done the same with this feature, just doing it on the server instead of building something on the phone, except for the publicity opportunity, or perhaps they've learned not to try to do it clandestinely. Or perhaps they actually do care about privacy.
Though I was dismayed when I heard about all of this, and I remain open to the arguments that you and others are making, I'm satisfied that Apple is doing its best on this, just like it did its best (along with Google, credit where it's due) on the contact tracing framework. I don't think this is a new thing, as if Apple didn't have the ability to do privacy-awful things before this. The real test will be how it stands up against governments around the world, and I'll be watching that nervously, but that worry exists with or without this implementation.
This is the first time your Apple devices have ever communicated with Apple's servers. Imo the question should've been "Do you care about having a black-box, on-device mass scanning system that can communicate with the cloud on your phone?"
The issue is not about CSAM; it's about the system baked into iOS itself.
And still, the "nothing to hide, nothing to fear" approach was championed by Joseph Goebbels, the Nazi Minister of Propaganda. You're aligned. Jimmy James, that has to be the biggest load of horse **** I've read all year. Thanks for the laugh.