
turbineseaplane

macrumors P6
Mar 19, 2008
17,412
40,223
As a father of a child who was sexually abused at the age of 2 (yes, 2 years old) by a male, assisted by my daughter's own mother, I fully support any actions by Apple and others to trap these sick SOBs and get them locked up. BRAVO Apple!!!

My deepest sympathies are with you for having gone through that situation.
For the sake of debate - in your situation - assuming it happened "next year"...

How would "searching everyone's iPhone photos" have helped when your daughter's own mother was involved?
It's hard to see how that situation would have been prevented by rifling through everyone's photos.

Or are you speaking more from a place of wanting retribution?
 

cwosigns

macrumors 68020
Jul 8, 2008
2,266
2,744
Columbus, OH
I associate this with the license plate readers police use on their cars. If you are not driving a stolen car and do not have outstanding warrants, then there would not be a match in any database.
Wow. That's a stretch. When you go out in public, there is no expectation of privacy. People expect that what is on their phone is private.
 

rick3000

macrumors 6502a
May 6, 2008
648
298
West Coast
My concern is less with the scanning and more that once this tool exists it can be used to scan and compare against any database of image hashes. Then instead of Apple telling a government that wants to search for something, "No, we can't search for that, because the tool does not exist," the response is, "We could, but we won't." Then Apple gets served with a court order and a gag order, and suddenly they can use this tool to search for anything.
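
To make that concrete, here's a minimal hypothetical sketch (plain SHA-256 standing in for a perceptual hash like NeuralHash, and none of Apple's private-set-intersection machinery): the matching code itself has no idea what its database represents, so repointing it is just a data swap.

```python
# Toy sketch (not Apple's implementation): an exact-hash matcher compares
# numbers; it cannot know what those numbers stand for. Swapping the
# database makes the very same code search for something else entirely.
import hashlib

def photo_hash(photo: bytes) -> str:
    # Plain SHA-256 standing in for a perceptual hash like NeuralHash.
    return hashlib.sha256(photo).hexdigest()

def scan(photos: list[bytes], database: set[str]) -> list[int]:
    """Return indices of photos whose hashes appear in the database."""
    return [i for i, p in enumerate(photos) if photo_hash(p) in database]

photos = [b"<holiday.jpg bytes>", b"<protest.jpg bytes>"]

csam_db  = {photo_hash(b"<known abuse image bytes>")}   # today's database
other_db = {photo_hash(b"<protest.jpg bytes>")}         # tomorrow's?

print(scan(photos, csam_db))    # [] - no matches
print(scan(photos, other_db))   # [1] - same code, different database
```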
 

turbineseaplane

macrumors P6
Mar 19, 2008
17,412
40,223
My concern is less with the scanning and more that once this tool exists it can be used to scan and compare against any database of image hashes. Then instead of Apple telling a government that wants to search for something, "No, we can't search for that, because the tool does not exist," the response is, "We could, but we won't." Then Apple gets served with a court order and a gag order, and suddenly they can use this tool to search for anything.

The "tool" and the methodology is for sure the issue here.

Particularly when Apple has said and shown that complying with anything a jurisdiction wants is - as one would expect - par for the course and required.

I mean - you have to do what you're told by governments - but for gods sake don't build things into peoples phones that will help them exploit in ways they may never have even dreamed of.
 

quarkysg

macrumors 65816
Oct 12, 2019
1,247
841
My concern is less with the scanning and more that once this tool exists it can be used to scan and compare against any database of image hashes. Then instead of Apple telling a government that wants to search for something, "No, we can't search for that, because the tool does not exist," the response is, "We could, but we won't." Then Apple gets served with a court order and a gag order, and suddenly they can use this tool to search for anything.
As I understand it, this is a purpose-built search tool, designed to search for one type of material only, and nothing else. It cannot be used as a general-purpose query tool by Apple. It can only change its behaviour with an OS update.

Other than Linux and its associated open-source apps and utilities, where you have access to the source code, almost all OSes (mobile or otherwise) contain proprietary, closed-source components, utilities and apps that are black boxes to their users. Even commercial Linux distributions can ship with closed-source binaries, as long as those binaries do not use any GPL source code.

So, food for thought.
 

turbineseaplane

macrumors P6
Mar 19, 2008
17,412
40,223
As I understand it, this is a purpose-built search tool, designed to search for one type of material only, and nothing else. It cannot be used as a general-purpose query tool by Apple. It can only change its behaviour with an OS update.

Could you please link to anything that specifically discusses this point?
I'd be interested to read more about it.

Thank you!
 

quarkysg

macrumors 65816
Oct 12, 2019
1,247
841
Could you please link to anything that specifically discusses this point?
I'd be interested to read more about it.

Thank you!
Well, that's what I understood after reading the post from John Gruber. It basically scans a device's iOS photos using the hash algorithm of an authoritative database so that they can be compared. And as @cmaier explained in this post in this thread, it only generates the hash for photos just before they are uploaded to iCloud. I don't think iOS will scan all the photos on a device automatically.

I agree with @cmaier's reasoning that this is likely the first step towards E2EE for iCloud, as it makes sense to me.
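
As a rough illustration of that reading of Gruber's description - all names here are mine, not Apple's API, and the real system hides the match result inside encrypted safety vouchers, which this toy skips - the hash would be computed only as part of the upload path:

```python
# Hypothetical sketch of "hash only at the iCloud upload boundary".
# Not Apple's implementation: real NeuralHash is perceptual, and the
# match result is hidden from the device via cryptographic vouchers.
import hashlib

KNOWN_HASHES: set[str] = set()  # stand-in for the database shipped with the OS

def neural_hash(photo: bytes) -> str:
    return hashlib.sha256(photo).hexdigest()  # stand-in for NeuralHash

def upload_to_icloud(photo: bytes) -> dict:
    # Hashing happens here, as one step of the upload itself, so photos
    # that never leave the device are never hashed.
    voucher = {"matched": neural_hash(photo) in KNOWN_HASHES}
    return {"payload": photo, "safety_voucher": voucher}

# A photo merely sitting in the library is untouched; only this call hashes it:
packet = upload_to_icloud(b"<beach.jpg bytes>")
```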
 

turbineseaplane

macrumors P6
Mar 19, 2008
17,412
40,223
Well, that's what I understood after reading the post from John Gruber. It basically scans a device's iOS photos using the hash algorithm of an authoritative database so that they can be compared. And as @cmaier explained in this post in this thread, it only generates the hash for photos just before they are uploaded to iCloud. I don't think iOS will scan all the photos on a device automatically.

I agree with @cmaier's reasoning that this is likely the first step towards E2EE for iCloud, as it makes sense to me.

So what confirms for you that it's limited in scope to one specific type of data, or even one type of photo?

Sounds like matching against a database... it really just depends upon which database we are talking about.

Let's say -- the Chinese government, for example -- would like to find photos of a specific type of thing they don't like.
You think this technology would be fundamentally unable to adapt to that demand? (Which Apple must honor, by the way.)

Build the "hammer" and all of a sudden lots of people are going to be compelling you to hit the "nails" they are looking for.
 

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
So what confirms for you that it's limited in scope to one specific type of data, or even one type of photo?

Sounds like matching against a database... it really just depends upon which database we are talking about.

Let's say -- the Chinese government, for example -- would like to find photos of a specific type of thing they don't like.
You think this technology would be fundamentally unable to adapt to that demand? (Which Apple must honor, by the way.)

Build the "hammer" and all of a sudden lots of people are going to be compelling you to hit the "nails" they are looking for.

Why would China bother, when they can just walk into the Chinese data center Apple has to use and demand everything in anybody’s account? Wouldn’t that be a lot easier?
 

quarkysg

macrumors 65816
Oct 12, 2019
1,247
841
So what confirms for you that it's limited in scope to one specific type of data, or even one type of photo?

Sounds like matching against a database... it really just depends upon which database we are talking about.

Let's say -- the Chinese government, for example -- would like to find photos of a specific type of thing they don't like.
You think this technology would be fundamentally unable to adapt to that demand? (Which Apple must honor, by the way.)

Build the "hammer" and all of a sudden lots of people are going to be compelling you to hit the "nails" they are looking for.
Well, China would have to let Apple know the detailed algorithm used to scan and generate the required hash values, and China would have to provide the database to compare the hashes against. Then Apple would have to add this algorithm to the OS and push an update to their supported devices. And I believe Apple would then issue a press release announcing such an update (like what they are doing now), since it would be required by law if Apple wants to operate in the Chinese market - much like what Apple is required to do in the US market with regard to the issue discussed in this topic.
 

UH8183

macrumors member
Jul 27, 2021
30
15
I did not miss your point, but you neglected to read what I said when I stated that Apple has no doubt consulted their legal team. You're acting like Apple is some mom-and-pop shop that doesn't know what it's doing. No major corporation is stupid enough not to consult their legal team before taking any action.
Lots of big companies consult their legal teams and still get in trouble for their actions. It will be some little employee whom Apple throws under the bus.
 

Shirasaki

macrumors P6
May 16, 2015
16,263
11,765
Why would China bother, when they can just walk into the Chinese data center Apple has to use and demand everything in anybody’s account? Wouldn’t that be a lot easier?
All China gets is the iCloud backup etc. unencrypted, which is basically what it has now. They'd rather demand Apple install spyware on Chinese iPhones than get the limited info in a backup. They are already mandating facial recognition on all smartphones to prevent kids from playing "too many hours" of games.
 

turbineseaplane

macrumors P6
Mar 19, 2008
17,412
40,223
And I believe Apple would then issue a press release announcing such an update (like what they are doing now), since it would be required by law if Apple wants to operate in the Chinese market

Whoa... wait a minute.

You believe Apple would be required, by the Chinese government, to do a press release announcing a new way the Chinese government will be spying on its own people?

For the sake of fun, let's say that's true…
Do you actually believe it would be truthful information?

I hate to break it to you, but this is not how China works - lol
 

indychris

macrumors 6502a
Apr 19, 2010
703
1,527
Fort Wayne, IN
Why would China bother, when they can just walk into the Chinese data center Apple has to use and demand everything in anybody’s account? Wouldn’t that be a lot easier?

Many countries like China, Turkey and other full-blown or quasi-dictatorships have both the technology and teams of people who monitor countless terabytes of information as it occurs. China would 'bother' because it isn't a bother at all. It is actually much more efficient, and can cover infinitely more information than just a 'walk into the Chinese data center'.
 

Grey Area

macrumors 6502
Jan 14, 2008
433
1,030
Well, China would have to let Apple know the detailed algorithm used to scan and generate the required hash values, and China would have to provide the database to compare the hashes against. Then Apple would have to add this algorithm to the OS and push an update to their supported devices. And I believe Apple would then issue a press release announcing such an update (like what they are doing now), since it would be required by law if Apple wants to operate in the Chinese market - much like what Apple is required to do in the US market with regard to the issue discussed in this topic.
As far as I understand Apple's technical summary, the hashing algorithm - NeuralHash - is always the same. The hash number providers (currently NCMEC and some other child safety organizations) run the NeuralHash algorithm in their own facilities and deliver the resulting hash numbers to Apple, without Apple ever seeing the pictures. That would mean adding new pictures to search for, including pictures of an entirely different subject matter, is as simple as the hash number providers running NeuralHash on whatever pictures they have and delivering the new numbers to Apple. Only the database of numbers in iOS needs to be updated, not the algorithm.

An interesting effect of this setup is that Apple does not really know what it is searching for. It merely gets a list of numbers, and has no means to check whether these numbers come from CSA-pictures or pictures of other subject matters. Of course, Apple then has the manual review stage for flagged pictures, so Apple would notice if iPhones suddenly started flagging swastikas or whatever. (Though I suspect that in more totalitarian countries the manual review would be done by the state rather than Apple, or at least under state supervision.)

A particular conundrum I wonder about here is how Apple's manual reviewers judge the pictures coming from the iPhones. As Apple has no access to the original CSA-pictures from NCMEC, the Apple reviewers will have to judge the iOS pictures solely by their content. Given the "semantic similarity" part of the matching algorithm, I could see Apple ending up with a lot of pictures where they really cannot be sure, e.g. whether the person in the picture is young enough for it to constitute CSA. Likewise, I do not know how blatant all the original CSA-pictures are - are they all very obvious offences, or are some seemingly harmless when viewed in isolation, with NCMEC only knowing of their CSA-status because they were part of a known series? I have also read claims that Apple reviewers will only see a low-resolution version of the pictures, which on the one hand may be good for protecting both the reviewers and the privacy of iPhone owners with mistakenly flagged harmless pictures, but on the other hand would compound the reviewing uncertainty. Apple will have to decide whether to err on the side of caution for the user, or for the authorities, and let them sort it out.
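
If that reading is right, the division of labour might look something like this toy model (all names illustrative, not Apple's API): the algorithm is frozen in the OS, and only the list of numbers it is compared against ever changes.

```python
# Toy model of the division of labour described above (names illustrative,
# not Apple's API): the hash algorithm is fixed; only the database changes.
import hashlib

def neural_hash(image: bytes) -> str:
    """The fixed algorithm shipped in the OS (stand-in: SHA-256)."""
    return hashlib.sha256(image).hexdigest()

# 1. The provider (e.g. NCMEC) runs the same algorithm in its own facility...
provider_images = [b"<known image A>", b"<known image B>"]
hash_database = {neural_hash(img) for img in provider_images}

# 2. ...and delivers only the numbers. Apple never sees provider_images,
#    so it cannot tell what subject matter the numbers encode.
def on_device_check(photo: bytes, database: set[str]) -> bool:
    return neural_hash(photo) in database

# 3. Searching for new material = shipping a new `database`; neural_hash()
#    itself never changes.
print(on_device_check(b"<known image A>", hash_database))  # True
```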
 

Shirasaki

macrumors P6
May 16, 2015
16,263
11,765
As far as I understand Apple's technical summary, the hashing algorithm - NeuralHash - is always the same. The hash number providers (currently NCMEC and some other child safety organizations) run the NeuralHash algorithm in their own facilities and deliver the resulting hash numbers to Apple, without Apple ever seeing the pictures. That would mean adding new pictures to search for, including pictures of an entirely different subject matter, is as simple as the hash number providers running NeuralHash on whatever pictures they have and delivering the new numbers to Apple. Only the database of numbers in iOS needs to be updated, not the algorithm.

An interesting effect of this setup is that Apple does not really know what it is searching for. It merely gets a list of numbers, and has no means to check whether these numbers come from CSA-pictures or pictures of other subject matters. Of course, Apple then has the manual review stage for flagged pictures, so Apple would notice if iPhones suddenly started flagging swastikas or whatever. (Though I suspect that in more totalitarian countries the manual review would be done by the state rather than Apple, or at least under state supervision.)

A particular conundrum I wonder about here is how Apple's manual reviewers judge the pictures coming from the iPhones. As Apple has no access to the original CSA-pictures from NCMEC, the Apple reviewers will have to judge the iOS pictures solely by their content. Given the "semantic similarity" part of the matching algorithm, I could see Apple ending up with a lot of pictures where they really cannot be sure, e.g. whether the person in the picture is young enough for it to constitute CSA. Likewise, I do not know how blatant all the original CSA-pictures are - are they all very obvious offences, or are some seemingly harmless when viewed in isolation, with NCMEC only knowing of their CSA-status because they were part of a known series? I have also read claims that Apple reviewers will only see a low-resolution version of the pictures, which on the one hand may be good for protecting both the reviewers and the privacy of iPhone owners with mistakenly flagged harmless pictures, but on the other hand would compound the reviewing uncertainty. Apple will have to decide whether to err on the side of caution for the user, or for the authorities, and let them sort it out.
This is, I think, a much better and more insightful explanation of how the hash works. Had I read this earlier I'd have had a bit of a breather, knowing it might not be as scary as it sounds - my slippery-slope stance remains, but that's the same for all dangerous tools.
 

quarkysg

macrumors 65816
Oct 12, 2019
1,247
841
whoa...wait a minute.

You believe Apple would be required, by the Chinese government, to do a press release to announce a new way the Chinese government will be spying on its own people?

For the sake of fun let’s say that’s true…
Do you actually believe it would be truthful information?

I hate to break it to you but this is not how China works - lol
I was saying that Apple does the press release, not the Chinese government.
 

Mac4Mat

Suspended
May 12, 2021
168
466
I applaud Apple's policy on privacy, often standing up against tech companies and others, but I think they have really shot themselves in the foot with the latest photo-scanning situation - although I wouldn't call it a scandal, as they have not done this surreptitiously but openly, thereby hopefully giving themselves time to reflect on this very poor decision.

I cannot criticise them for their wish to rid the Internet of child abuse, and I doubt many could. However, in this instance they have really wrong-footed themselves, as in one fell swoop they bring into question any concern Apple has for the privacy of data, even after making so many attempts to safeguard that data from others.

Personally I would have no objection to Apple scanning pictures, but Apple should have realised what a blunder it was to even go there, because it's rather like the old adage, where I have amended the wording to reflect the Apple situation and where no disrespect is intended to the original author or any organisation or religion:

First they came for the 'suspect' Children's Pictures
And I did not speak out
Because I was not a Child Abuser

Then they came for the 'suspect' Adult pictures
And I did not speak out
Because my pictures were not those

Then they came for 'suspect' Animal Abuse pictures
And I did not speak out
Because I was not an Animal Abuser

Then they came for 'suspect' Law Breakers
And I did not speak out
Because I was not a Law Breaker

Then they came to control Everyone's Data
And there was no one left who could speak out
Not even Me!
 

quarkysg

macrumors 65816
Oct 12, 2019
1,247
841
As far as I understand Apple's technical summary, the hashing algorithm - NeuralHash - is always the same. The hash number providers (currently NCMEC and some other child safety organizations) run the NeuralHash algorithm in their own facilities and deliver the resulting hash numbers to Apple, without Apple ever seeing the pictures. That would mean adding new pictures to search for, including pictures of an entirely different subject matter, is as simple as the hash number providers running NeuralHash on whatever pictures they have and delivering the new numbers to Apple. Only the database of numbers in iOS needs to be updated, not the algorithm.

An interesting effect of this setup is that Apple does not really know what it is searching for. It merely gets a list of numbers, and has no means to check whether these numbers come from CSA-pictures or pictures of other subject matters. Of course, Apple then has the manual review stage for flagged pictures, so Apple would notice if iPhones suddenly started flagging swastikas or whatever. (Though I suspect that in more totalitarian countries the manual review would be done by the state rather than Apple, or at least under state supervision.)

A particular conundrum I wonder about in this is how Apple's manual reviewers judge the pictures coming from the iPhones. As Apple has no access to the original CSA-pictures from NCMEC, the Apple reviewers will have to judge the iOS-pictures solely by their content. Given the "semantic similarity" part of the matching algorithm, I could see Apple ending up with a lot of pictures where they really cannot be sure, e.g. is the person in the picture young enough for it to constitute CSA. Likewise, I do not know how blatant all the original CSA-pictures are - are they all very obvious offences, or are some seemingly harmless when viewed in isolation, and NCMEC only knows of their CSA-status because they were part of a known series? I have also read claims that Apple reviewers will only see a low resolution version of the pictures, which on the one hand may be good to protect both the reviewers and the privacy of iPhone-owners with mistakenly flagged harmless pictures, but on the other hand it would compound the reviewing uncertainty. Apple will have to decide whether to err on the side of caution for the user, or for the authorities and let them sort it out.
It's unlikely that Apple will store the database of hash values, especially on iOS devices. The devices just compute the hashes; they'll likely be sent to Apple's servers, which will just check the hashes against NCMEC's web services.
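
Sketching the split this post speculates (to be clear: Apple's technical summary actually describes shipping a blinded hash database on-device, so treat this purely as a what-if):

```python
# What-if sketch of a client-computes / server-checks design. Hypothetical:
# Apple's documented design keeps a blinded database on the device instead.
import hashlib

SERVER_DB: set[str] = set()  # hashes a server might fetch from NCMEC

def device_hash(photo: bytes) -> str:
    # On-device step: compute the hash only (stand-in for NeuralHash).
    return hashlib.sha256(photo).hexdigest()

def server_check(h: str) -> bool:
    # Server-side step: membership test against the provider's list.
    return h in SERVER_DB

flagged = server_check(device_hash(b"<upload.jpg bytes>"))
```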
 

leman

macrumors Core
Oct 14, 2008
19,522
19,679
Right now it’s done, in those places, at the cloud level. Is that any better for you? At least if it’s done at the device level you can disconnect to prevent it.

Cliff, come on, you are a lawyer. You know that these things are fluid.

I mean, I totally trust Apple when they say that they are only using it to check iCloud uploads (and frankly, it is indeed a more privacy-oriented solution than server-side checking), and when they say that they use NCMEC databases. However, why should I trust it to remain this way? Client-side scanning sets a dangerous precedent - once the system is in place, it is trivial to extend it to other purposes. What if Tim has an accident in a year or so and a new Apple CEO is more open to making a deal with totalitarian regimes in exchange for market access and tax benefits? Or if they decide to extend the scanning to your private pictures that are not even stored in the cloud (which would be a trivial thing once the framework is in place)? Or, even further, to scan all the image data that goes through the APIs? Or what if a new governing body comes to power that redefines what "child endangerment" means (there is already the example of a Hungarian law that makes "gay propaganda" a criminal offence)? These are the real issues. It's not what we have now, it's what becomes easily possible in the future.

For now, I am not affected by these changes. I live in Europe, and even though there is a lot of pressure to make surveillance more prevalent, European politicians are generally much more privacy-oriented and sane than their colleagues in the USA. But they are also massively incompetent, and I worry that these developments will nudge our legislation in the wrong direction.

And let's not even start with the philosophical and moral part of the issue, as this kind of technology comes very close to violating the presumption of innocence.
 

ian87w

macrumors G3
Feb 22, 2020
8,704
12,638
Indonesia
I associate this with the license plate readers police use on their cars. If you are not driving a stolen car and do not have outstanding warrants, then there would not be a match in any database.
But license plate numbers are obvious and fixed, and are public record. If someone puts in a wrong license plate, it can easily be proven otherwise.

The database used here is a black box: nobody knows what algorithm and data were used to train it. If there are errors, nobody would know.
 

leman

macrumors Core
Oct 14, 2008
19,522
19,679
Well, there is probably a fun way to protest these changes. If enough of our American friends flooded their iCloud photos with pictures of cats that had been digitally manipulated to match known hashes in the NCMEC database, this would overload the human-review part of the system and likely lead Apple to abandon this enterprise.
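
For the curious, the attack would look roughly like this toy sketch - a deliberately weak 64-pixel "perceptual hash" with a fixed threshold, nothing like the real NeuralHash, which would be far harder to collide:

```python
# Toy collision demo: nudge an image until its toy hash equals a target.
# Purely hypothetical; the real NeuralHash is far harder to manipulate.
import random

def toy_hash(pixels: list[int]) -> int:
    """One bit per pixel of an 8x8 grayscale thumbnail (fixed threshold)."""
    bits = 0
    for p in pixels:                  # 64 pixels -> 64-bit hash
        bits = (bits << 1) | (1 if p > 127 else 0)
    return bits

def perturb_to_match(pixels: list[int], target: int) -> list[int]:
    """Flip each pixel just across the threshold until the hash matches."""
    out = list(pixels)
    for i in range(64):
        want = (target >> (63 - i)) & 1   # bit this pixel must produce
        have = 1 if out[i] > 127 else 0
        if want != have:
            # A real attack would minimise the visible change; the toy just
            # clamps the pixel to whichever side of the threshold is needed.
            out[i] = 130 if want else 125
    return out

cat = [random.randrange(256) for _ in range(64)]     # stand-in cat picture
target = 0xDEADBEEFDEADBEEF                          # pretend "known" hash
doctored = perturb_to_match(cat, target)
assert toy_hash(doctored) == target                  # collision achieved
```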
 