I suspect many people simply weren't aware that Google was already doing such a thing, or have become numb to it because you lose all expectations of privacy with a company like Google. As with many things, it only becomes a problem when Apple does it, but I suppose this is a happy problem of sorts. Hopefully it will spur more discussion in this area, not just for Apple but for the entire industry.

You can’t condemn only Apple while giving every other tech company a free pass in this area.

Don't know, or don't care, because it's exactly what they expect from Google. This is a much bigger story because it's Apple, the privacy company, doing it. Overall I am surprised they've gone ahead and done this, because the idea that any image flagged in your library, for whatever reason, is made available for a human to review is more than a bit creepy.
 
Don't know, or don't care, because it's exactly what they expect from Google. This is a much bigger story because it's Apple, the privacy company, doing it. Overall I am surprised they've gone ahead and done this, because the idea that any image flagged in your library, for whatever reason, is made available for a human to review is more than a bit creepy.

The image is not reviewed by a human. The hash values are.
 
No, because they agree to it for you. They don’t ask or even tell you about it. And since it’s the carrier doing it, and technically theirs until they hand it off, they can get away with it.

I think you're confused about something, because EVERY time I've ordered a new phone from a carrier and have it shipped to me, I ALWAYS have to set up iOS and agree to the terms.
 
It may change some behaviour if people recognise that the chances of being caught in possession have increased now that Apple is implementing this technology. Any reduction in the trade of this material is better than none.

But as I said in the sections you snipped out, I equally recognise the dangers of how this technology may be adapted and extended in future to meet the demands of governments around the world.

I doubt it was an easy judgement call for Apple to go ahead and do this. I'm sure there were many heated meetings in which people on both sides debated ferociously where this may lead.
I focused on that statement because I worry we will see an increase in content and, by extension, more children being victimized. People supporting this content aren't rebelling against their parents, and they are about as likely as a heroin addict to quit just because the risk of being caught has increased.
 
I think you're confused about something, because EVERY time I've ordered a new phone from a carrier and have it shipped to me, I ALWAYS have to set up iOS and agree to the terms.
Why would I order it? I walk into the store and buy it same day.
 
The image is not reviewed by a human. The hash values are.

not or never?

edit: sounds like it may be never based on another thread. That's quite impressive, and would certainly eliminate any of my concerns about what Apple is doing here in this specific instance.
 
Why would I order it? I walk into the store and buy it same day.

You missed my point. You were arguing the carrier MUST set up the phone before it leaves their possession. Obviously my experience proves that is not true, as they send me the phone without iOS set up yet. So either you're confused or someone at the store is feeding you a line of BS.
 
Pretty scary that technology like this even exists, let alone that Apple is enabling it without strict user permission. As many others have outlined, it creates really dangerous precedents on multiple levels of privacy rights. The backdoors Apple was essentially against just months prior are now being marketed as child-safety protections. Sure, the technology is different, but functionally it's exactly the same. Apple is now building backdoors into end users' phones. It's that simple.

It makes me sad (and terrified) that Apple is doing this. A trusted company turning. I think someone up thread said it best, we were so impressed with Apple fighting big brother, we failed to see Apple becoming the biggest brother of all.

A casual look through infrastructure history reveals the motto: build it and they shall come. Build prisons, you'll find criminals. Build plantations, you'll find slaves. Build ghettos, you'll find the impoverished. Build surveillance systems, you'll find "criminals".

Apple has said privacy is a human right. This technology has all the trademarks of human rights violations.
 
Pretty scary that technology like this even exists, let alone that Apple is enabling it without strict user permission. As many others have outlined, it creates really dangerous precedents on multiple levels of privacy rights. The backdoors Apple was essentially against just months prior are now being marketed as child-safety protections. Sure, the technology is different, but functionally it's exactly the same. Apple is now building backdoors into end users' phones. It's that simple.

It makes me sad (and terrified) that Apple is doing this. A trusted company turning. I think someone up thread said it best, we were so impressed with Apple fighting big brother, we failed to see Apple becoming the biggest brother of all.

A casual look through infrastructure history reveals the motto: build it and they shall come. Build prisons, you'll find criminals. Build plantations, you'll find slaves. Build ghettos, you'll find the impoverished. Build surveillance systems, you'll find "criminals".

Apple has said privacy is a human right. This technology has all the trademarks of human rights violations.

There is no back door at all. This is a mechanism on the phone itself that creates a hash value and compares it to a list of hash values. And only if you have iCloud syncing turned on - in which case Apple already has all your photos. If you care about this alleged back door (which is not a back door), why would you leave the front door wide open (iCloud syncing)?
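For anyone trying to picture the flow being described, here is a minimal sketch of the hash-and-compare idea. The names are made up and an ordinary cryptographic hash stands in for Apple's perceptual NeuralHash, so this is the shape of the logic rather than Apple's actual code: nothing runs unless iCloud Photos is on, and all that gets compared is a hash.

```python
# Illustrative sketch only -- hypothetical names, SHA-256 standing in for a
# perceptual hash. Apple's real pipeline is more involved; this just shows the
# gating on iCloud Photos and the comparison against a list of known hashes.
import hashlib

# Placeholder entries; a real deployment ships a database of hashes of known images.
KNOWN_HASHES = {"placeholder_hash_1", "placeholder_hash_2"}

def check_before_upload(photo_bytes: bytes, icloud_photos_enabled: bool) -> bool:
    """Return True if this photo's hash appears in the known-hash list."""
    if not icloud_photos_enabled:
        return False  # as described above: iCloud Photos off, nothing is checked
    digest = hashlib.sha256(photo_bytes).hexdigest()  # stand-in for a perceptual hash
    return digest in KNOWN_HASHES
```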
 
Boycotted in favor of… what exactly?
The Google pixel 6 series?
Lmao get serious for a second, 99.99999999999999% of people will never ever know this feature exists, unless they’re doing something wrong
Is it wrong for two consenting adults, sometimes long distance, to share some adult content? And is it 100% guaranteed that those images cannot be mistaken for the child kind? As someone mentioned, this also catches color changes, croppings and other adjustments to the picture. Otherwise a hash check is pointless if someone just changes one pixel.
 
There will always be a case made by people in power to invade people's privacy to stop something bad from happening, and they will abuse it. People trusted Apple to protect their privacy; once trust is broken it's hard to regain. What's next, they'll use this as an excuse to monitor the cameras in your house that are being uploaded to HomeKit "secure" video?
 
Is it wrong for two consenting adults, sometimes long distance, to share some adult content? And is it 100% guaranteed that those images cannot be mistaken for the child kind? As someone mentioned, this also catches color changes, croppings and other adjustments to the picture. Otherwise a hash check is pointless if someone just changes one pixel.
image hashing is easily able to avoid such mistakes
 
image hashing is easily able to avoid such mistakes
And that is a serious limitation. What if the image is cropped? Single pixel modified? Rotated? Manipulated in any way? Direct hash matching won't be good either.
 
So, apparently just saving the wrong picture in your photo library is enough for your account to be flagged.

This cannot be right. I see so many ways this could go wrong and people could get easily framed for committing a crime that they didn’t commit.

As an example: what if somebody sent you a nude from a girl that is supposed to be 18, but instead it turns out to be 15 and that image was flagged as child pornography?
Just by saving such a picture to your photo library your account is going to be reported.

Also keep in mind that there are other things that are considered equally illegal, such as storing copyrighted material on your computer without having a license to keep a private copy. How long until Apple scans your whole hard drive and reports you because you have a copy of an old movie sitting somewhere in your filesystem?
I understand the core of your comment, but I don't think someone should save a picture when they find out that person is 15!
 
And that is a serious limitation. What if the image is cropped? Single pixel modified? Rotated? Manipulated in any way? Direct hash matching won't be good either.

I am guessing you’re not involved in the technology of image hashing. Rotations, mirroring, cropping, alpha channel changes, pixel changes, etc. are all handled by mathematics. If you change it enough, then it won’t match, and you get false negatives. But false positives are incredibly difficult to produce, occurring extremely rarely (1 in a trillion, for example, so long as the hash length is long enough.)
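To make that concrete, here is a toy "average hash" in Python. It is nowhere near the strength of NeuralHash or PhotoDNA, and every number in it is made up for illustration, but it shows why a perceptual hash shrugs off a one-pixel edit where a cryptographic hash of the file would change completely.

```python
# Toy average-hash (aHash) sketch: the hash records whether each region is brighter
# than the image's mean, not exact pixel values, so tiny edits leave it unchanged.
# Real systems (NeuralHash, PhotoDNA) are far more sophisticated than this.

def average_hash(pixels):
    """pixels: 2D list of grayscale values; returns a bit string."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming(a, b):
    """Number of bit positions where the two hashes differ."""
    return sum(x != y for x, y in zip(a, b))

original = [[10, 200, 30, 220],
            [15, 210, 25, 215],
            [12, 205, 28, 225],
            [11, 198, 27, 230]]

# Copy with one pixel nudged: a cryptographic hash of the file would be completely
# different, but the perceptual hash stays the same (or nearly so).
tweaked = [row[:] for row in original]
tweaked[0][0] = 14

print(hamming(average_hash(original), average_hash(tweaked)))  # 0 -> still the same image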
 
Apple's method here is far more private than that used by others in Big Tech; it's hilarious to read all this controversy from people who clearly didn't read the article. I guess that's what happens when the American education system is a joke.

I just assume that since Google was doing this sort of thing for a long time, all the kiddie porn traders moved to iOS, and now they are worried.
 
I am guessing you’re not involved in the technology of image hashing. Rotations, mirroring, cropping, alpha channel changes, pixel changes, etc. are all handled by mathematics. If you change it enough, then it won’t match, and you get false negatives. But false positives are incredibly difficult to produce, occurring extremely rarely (1 in a trillion, for example, so long as the hash length is long enough.)
Oh I understand, I am leading you to the core argument. If all those modifications can still cause a match to the original uncropped, unmodified image, then explain to me how a consenting adult image is guaranteed to not be a false positive?
 
Apple's method here is far more private than that used by others in Big Tech; it's hilarious to read all this controversy from people who clearly didn't read the article. I guess that's what happens when the American education system is a joke.
Happens here daily. Sometimes I think they actually do read the article but react purposely against Apple just because.
 
Oh I understand, I am leading you to the core argument. If all those modifications can still cause a match to the original uncropped, unmodified image, then explain to me how a consenting adult image is guaranteed to not be a false positive?

Because image recognition algorithms rely on the relationship between image elements, and not on the precise RGBA value of any particular pixel. Two nodal graphs are identical even if they are physically rearranged. They are designed to avoid false positives at the cost of false negatives. I’m happy to go into the detailed math of some of these algorithms, but I doubt it would change your preconceived notion that somehow this is nefarious.

In the real world these digital fingerprinting algorithms are used every day and are highly successful. They are used for all sorts of things - for example, similar algorithms are used to identify songs. So the concert and album versions of a song are recognized to be the same song, but other songs by the same band are not captured. Similar algorithms are used for video, so that removal of frames, mirror imaging, etc. do not evade recognition, but Charlie and the Chocolate Factory is not confused with The Godfather. There are companies you’ve never heard of that sell very effective software to studios, etc. to accomplish these things.
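A rough way to see that trade-off: matching is usually a distance comparison against a threshold, and where you set the threshold decides whether you err toward false negatives (heavily edited copies slipping through) or false positives (innocent look-alikes matching). The numbers below are invented purely for illustration.

```python
# Illustrative only: a tighter Hamming-distance threshold trades false positives
# (innocent near-matches) for false negatives (edited copies slipping through).

def is_match(distance_bits: int, threshold: int) -> bool:
    return distance_bits <= threshold

# Hypothetical distances for a 96-bit perceptual hash:
same_image_recompressed = 3    # small distance: survives re-encoding, minor crops, etc.
unrelated_but_similar = 31     # a different photo that merely looks alike

strict_threshold = 8
print(is_match(same_image_recompressed, strict_threshold))  # True  -> true positive
print(is_match(unrelated_but_similar, strict_threshold))    # False -> look-alike rejected
```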
 
Let me ask this question, then add a bit of a San Bernardino twist to it (Resident lawyers, chime in).

Let's assume the following:
  • The data is at rest on Apple's servers,
  • The data is indeed encrypted, and
  • A given suspect has iCloud Photos disabled.
The hashes for these pictures would have to be taken while the data is in an unencrypted format, then matched, then the data encrypted. Therefore, no human will ever see the actual picture, because they won't have the key or algorithm used to encrypt the data to decrypt it, especially if it were a two-way encryption method.

By extension, it would be nearly impossible for a hash of an encrypted file to match a hash of an unencrypted file. But let's say that one does. A person couldn't get a conviction, let alone an indictment, on a hash, because there is no way that a hash is the actual evidence that is needed, correct? I mean, a hash isn't the picture, and the picture is the proof, right?
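On that first point: a hash computed over ciphertext has no relationship to a hash computed over the same file's plaintext, which is why any matching has to happen before (or outside of) the encryption. A toy demonstration, using a deliberately silly XOR "cipher" purely to make the point:

```python
import hashlib

def toy_encrypt(data: bytes, key: int) -> bytes:
    # single-byte XOR, only to illustrate that ciphertext bears no resemblance to plaintext
    return bytes(b ^ key for b in data)

photo = b"...raw image bytes would go here..."
print(hashlib.sha256(photo).hexdigest()[:16])                     # digest of the plaintext
print(hashlib.sha256(toy_encrypt(photo, 0x5A)).hexdigest()[:16])  # completely unrelated digest
```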

Additionally, should the investigators need that proof, a person would still have the 4A and 5A to rely on, correct? The investigators would still need the phone if they were looking for those pictures, because the hashes won't work alone. As in the San Bernardino case, the suspects would not unlock the phone (they couldn't; they were dead, IIRC), the government had no way to unlock the phone, and Apple refused to help them unlock it.

If that were the case, and all of the above were true, then we're back to where we already are right now.

And then to add the Phil Zimmerman/PGP twist to it, would it be criminal if a method used to counter this outright were discovered at the code level, and, oh... I don't know... added to a jailbreak should one be found for iOS 15?

BL.
 
Because image recognition algorithms rely on the relationship between image elements, and not on the precise RGBA value of any particular pixel. Two nodal graphs are identical even if they are physically rearranged. They are designed to avoid false positives at the cost of false negatives. I’m happy to go into the detailed math of some of these algorithms, but I doubt it would change your preconceived notion that somehow this is nefarious.

In the real world these digital fingerprinting algorithms are used every day and are highly successful. They are used for all sorts of things - for example, similar algorithms are used to identify songs. So the concert and album versions of a song are recognized to be the same song, but other songs by the same band are not captured. Similar algorithms are used for video, so that removal of frames, mirror imaging, etc. do not evade recognition, but Charlie and the Chocolate Factory is not confused with The Godfather. There are companies you’ve never heard of that sell very effective software to studios, etc. to accomplish these things.
And there is no possible way a similarly arranged photo with an older/legal subject can be mistaken? If it truly can detect croppings, color changes, and manipulations, then there is still a possibility that a similar-style picture with a legal subject can be mistaken.

And seriously, I don't find your tone very helpful.

"but I doubt it would change your preconceived notion that somehow this is nefarious."

Did I ever state that somewhere? If I am missing something crucial to this technology, by all means. But just don't assume I will just cover my ears.
 
The image is not reviewed by a human. The hash values are.

That is NOT correct. The hash values are automatically compared to a list of hash values of known child abuse images. If there are matches for multiple photos in your library (more than one; they don't reveal how many it takes to get flagged), a human being will be allowed to manually view the actual images in your iCloud account to be 100% sure it isn't a fluke error with the hash code and the images actually are criminal child abuse images. At that point Apple legal will formally contact law enforcement and they will take it from there. Apple claims there is a 1 in a TRILLION chance that the hashes will be wrong, and a human will catch that error (preventing false positives going to law enforcement), so essentially they claim there is no way a human will ever lay eyes on your photos UNLESS they contain illegal images, in which case... yes, some unlucky employee at Apple with the worst job ever will see them.

I have no problem with the technology; it seems thought through and secure. I AM worried about the slippery slope, because in theory, what is next? What if Apple / the government / etc. suddenly decides they want to look for OTHER types of photos (political, etc.)? They will already have all your images encoded as hashes, and it would be simple to compare against other known photos... like a meme of a president, a stolen photo of a product, and on and on. They will say they won't... but the best way to ensure that is to make it impossible, and now they have built the back door and are asking us to trust that they won't use it "except for this" understandably heinous thing, child abuse.
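For what it's worth, the threshold part of that description boils down to something like the sketch below (the real threshold isn't public, so the number is invented). It also shows the slippery-slope worry in code form: nothing in the matching logic knows or cares what the hash list actually contains, so the same machinery would work against whatever list it is handed.

```python
# The real threshold is not public; 5 is an invented number for illustration only.
THRESHOLD = 5

def should_escalate(matched_hash_count: int) -> bool:
    """Human review is triggered only after multiple independent hash matches."""
    return matched_hash_count >= THRESHOLD

# Note: nothing here depends on *what* is in the hash database -- swap the list
# and exactly the same mechanism matches a different category of images.
print(should_escalate(1))  # False -- a single match never surfaces an account
print(should_escalate(6))  # True  -- past the threshold, manual review kicks in
```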
 
Apple is concerned for child safety, yet they steadfastly refuse to implement a volume limit for iOS devices.

Volume-limiting headphones usually have a defeat switch, so they're worthless. And/or they only limit to 95 dB, which is far too loud.

Volume limit also means speakers. My son is autistic and unable to comply, so he often blasts at full volume. This is not popular in restaurants or airplanes or when I'm trying to work.

The Volume Sanity app was great, until my son learned he could just kill it.
 