Thank you. At least I wasn't told to go find it myself.
As the father of a child who was sexually abused at the age of 2 (yes, 2 years old) by a male, assisted by my daughter's own mother, I fully support any actions by Apple and others to trap these sick SOBs and get them locked up. BRAVO Apple!
"I associate this with the license plate readers police use on their cars. If you are not driving a stolen car or have outstanding warrants, then there would not be a match to any database."

Wow. That's a stretch. When you go out in public, there is no expectation of privacy. People expect that what is on their phone is private.
My concern is less with the scanning and more that once this tool exists it can be used to scan and compare against any database of image hashes. Then instead of Apple telling a government that wants to search for something "no, we can't search for that, because the tool does not exist," the response becomes "we could, but we won't." Then Apple gets served with a court order and a gag order, and suddenly they can use this tool to search for anything.
As I understand it, this is a purpose-built search tool, designed to search for one type of material only, and nothing else. It cannot be used as a general-purpose query tool by Apple. It can only change its behaviour with an OS update.
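To make that concrete, here is a toy sketch in Swift (all names and numbers are invented, nothing here is Apple's actual implementation): the matching code itself is fixed, and what it flags is determined entirely by the table of hashes it is handed, which is why both the reassurance and the concern above can be true at once.

```swift
// Toy sketch only: the matching logic is generic; *what* it finds is
// determined entirely by the data table it is given.
struct HashMatcher {
    // Opaque numbers. Swapping this set in an update changes what gets
    // flagged without changing a single line of the matching code.
    let knownHashes: Set<UInt64>

    func isMatch(_ photoHash: UInt64) -> Bool {
        knownHashes.contains(photoHash)
    }
}

// Hypothetical tables: the same code serves either one.
let childSafetyTable: Set<UInt64> = [0x1A2B, 0x3C4D]
let politicalImagesTable: Set<UInt64> = [0x5E6F, 0x7081]

let matcher = HashMatcher(knownHashes: childSafetyTable)
print(matcher.isMatch(0x1A2B)) // true
```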
Are you able to please link to anything specifically discussing this point?
I'm interested to read more about that particular point.
Thank you!
Well, that's what I understood after reading the post from John Gruber. It basically hashes a device's iOS photos with the same hash algorithm used for the authoritative database, so that the hashes can be compared. And as @cmaier explained earlier in this thread, it only generates the hash when photos are about to be uploaded to iCloud. I don't think iOS will scan all photos of a device automatically.
I agree with @cmaier's reasoning that this is likely the first step toward E2EE for iCloud, as it makes sense to me.
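A minimal sketch of the behaviour described above, with hypothetical names throughout (this is not Apple's pipeline, just the shape of the claim): hashing lives inside the iCloud upload step, so a photo that never gets queued for upload is never hashed.

```swift
import Foundation

// Hypothetical types and names, for illustration only.
struct Photo { let id: String; let pixels: Data }

// Stand-in for the real perceptual hash (NeuralHash); any photo-to-number
// function would do for the purposes of this sketch.
func neuralHash(of photo: Photo) -> UInt64 {
    UInt64(truncatingIfNeeded: photo.pixels.hashValue)
}

// The key point made above: hashing happens inside the upload step,
// so photos that are never uploaded to iCloud are never hashed at all.
func uploadToICloud(_ photo: Photo, knownHashes: Set<UInt64>) {
    let hash = neuralHash(of: photo)
    let flagged = knownHashes.contains(hash)
    // ...attach a safety voucher if flagged, then upload as usual...
    print("uploading \(photo.id), flagged: \(flagged)")
}
```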
So what confirms for you that it's limited in scope to one specific type of data or even type of photo?
Sounds like matching against a database... it really just depends on which database we are talking about.
Let's say the Chinese government would like to find photos of a specific type of thing they don't like.
You think this technology would be fundamentally unable to adapt to that demand? (Which Apple must honor, by the way.)
Build the "hammer" and all of a sudden lots of people are going to be compelling you to hit the "nails" they are looking for.
Well, China would have to let Apple know the detailed algorithm used to scan and generate the required hash values, and China would have to provide the database of hashes to compare against. Then Apple would have to add this algorithm to the OS and push an update to their supported devices. And I believe Apple would then issue a press release announcing such an update (like what they are doing now), since it would be required by law if Apple wants to operate in the Chinese market, much like what Apple is required to do in the US market with regard to the issue discussed in this topic.
"I did not miss your point, but you neglected to read what I said when I stated that Apple no doubt has consulted their legal team. You're acting like Apple is some mom & pop shop that doesn't know what they are doing. No major corporation is stupid enough not to consult their legal team before taking any action."

Lots of big companies consult their legal teams and still get in trouble for their actions. It will be the little employee whom Apple throws under the bus.
"Why would China bother, when they can just walk into the Chinese data center Apple has to use and demand everything in anybody's account? Wouldn't that be a lot easier?"

All China gets is the iCloud backup etc. unencrypted, which is basically what it has now. They'd rather demand Apple install spyware on Chinese iPhones than get the limited info in a backup. They are already mandating facial recognition on all smartphones to prevent kids from playing "too many hours" of games.
As far as I understand Apple's technical summary, the hashing algorithm - NeuralHash - is always the same. The hash number providers (currently NCMEC and some other child-safety organizations) run the NeuralHash algorithm in their own facilities and deliver the resulting hash numbers to Apple, without Apple ever seeing the pictures. That would mean adding new pictures to search for, including pictures of an entirely different subject matter, is as simple as the hash number providers running NeuralHash on whatever pictures they have and delivering the new numbers to Apple. Only the database of numbers in iOS needs to be updated, not the algorithm.

An interesting effect of this setup is that Apple does not really know what it is searching for. It merely gets a list of numbers, and has no means to check whether these numbers come from CSA pictures or pictures of other subject matter. Of course, Apple then has the manual review stage for flagged pictures, so Apple would notice if iPhones suddenly started flagging swastikas or whatever. (Though I suspect that in more totalitarian countries the manual review would be done by the state rather than Apple, or at least under state supervision.)

A particular conundrum I wonder about is how Apple's manual reviewers judge the pictures coming from the iPhones. As Apple has no access to the original CSA pictures from NCMEC, the reviewers will have to judge the iOS pictures solely by their content. Given the "semantic similarity" part of the matching algorithm, I could see Apple ending up with a lot of pictures where they really cannot be sure, e.g. whether the person in the picture is young enough for it to constitute CSA. Likewise, I do not know how blatant all the original CSA pictures are - are they all very obvious offences, or are some seemingly harmless when viewed in isolation, with NCMEC only knowing of their CSA status because they were part of a known series? I have also read claims that Apple reviewers will only see a low-resolution version of the pictures, which on the one hand may protect both the reviewers and the privacy of iPhone owners with mistakenly flagged harmless pictures, but on the other hand would compound the reviewing uncertainty. Apple will have to decide whether to err on the side of caution for the user, or for the authorities and let them sort it out.

This is what I think is a much better and a bit more insightful explanation of how the hash works. Had I read this earlier I'd have had a bit of a breather knowing it might not be as scary as it sounds, though the slippery-slope concern remains - but that's the same for all dangerous tools.
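A rough sketch of the division of labour described above, with invented names (not Apple's API): the provider hashes its own pictures in-house and ships only numbers; an update replaces the number list, never the algorithm, and a threshold gates escalation to manual review.

```swift
import Foundation

// --- Provider side (e.g. an NCMEC facility; the pictures never leave) ---
func buildHashList(pictures: [Data], neuralHash: (Data) -> UInt64) -> Set<UInt64> {
    Set(pictures.map(neuralHash)) // only these opaque numbers go to Apple
}

// --- Device side: fixed algorithm, swappable data ---
struct OnDeviceScanner {
    var hashList: Set<UInt64>   // shipped as data; Apple cannot tell what
                                // pictures these numbers came from
    let threshold: Int          // matches required before human review

    func action(forMatchCount count: Int) -> String {
        count >= threshold ? "escalate to manual review" : "do nothing"
    }
}
```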
Whoa... wait a minute.
You believe Apple would be required, by the Chinese government, to do a press release to announce a new way the Chinese government will be spying on its own people?
For the sake of fun, let's say that's true…
Do you actually believe it would be truthful information?
I hate to break it to you, but this is not how China works - lol.

I was saying that Apple does a press release, not the Chinese government.
Unlikely for Apple to store the databases of hash values, especially on iOS devices. The devices just compute the hashes; these will likely be sent to Apple's servers, which will check them against NCMEC's web services.
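A sketch of the split this post imagines, with an invented endpoint and response shape. Note this is the poster's hypothesis: Apple's technical summary describes a blinded hash database shipped on-device rather than a server-side lookup, so treat everything here as illustrative only.

```swift
import Foundation

// Invented URL and JSON shape: the device computes the hash locally,
// and only that number is sent for a server-side membership check.
func checkHashRemotely(_ photoHash: UInt64) async throws -> Bool {
    var request = URLRequest(url: URL(string: "https://example.invalid/check")!)
    request.httpMethod = "POST"
    request.httpBody = try JSONEncoder().encode(["hash": photoHash])
    let (data, _) = try await URLSession.shared.data(for: request)
    let reply = try JSONDecoder().decode([String: Bool].self, from: data)
    return reply["match"] ?? false // assume the server answers {"match": true/false}
}
```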
Right now it’s done, in those places, at the cloud level. Is that any better for you? At least if it’s done at the device level you can disconnect to prevent it.
But license plate numbers are obvious and fixed, and are public record. If someone enters a wrong license plate, it can easily be proven otherwise.