This is from the EFF:

Pretty good article all should read.

It's not a case of "oh, you think child porn is good." NO ONE here is saying that or would agree.

It is more "what comes next?" Also remember, not everyone lives in the US with our laws and freedoms. What if China tomorrow writes a law that possession of any photo mocking their president is illegal? Maybe it's life in prison; maybe a harsher crime than CSAM. They tell Apple: you can scan for these and alert our authorities, or don't sell phones here. We know you can do it.

How about Myanmar or Belarus insisting they get to see iCloud photos of political dissidents? Those are terrorists to their governments.

Or what if the US government mandates scanning for illegal drug photos or other types of illegal things? How about photos of terrorist material like bombs or bomb-making writings?

Let's not forget the old Patriot Act, because "9/11 bad, and unless you're a terrorist too, you should have nothing to hide." How many freedoms did people give up, and in exchange for what? I'd LOVE to see real data on how many actual real-life terrorist plots were foiled under the Patriot Act clauses in the name of privacy invasions of citizens. My guess: very, very few stemmed from that law (if any; there's no way to confirm or deny that).

The issue is that once you give an inch, they will take a mile. Scanning for one type of illegal photo may lead to scanning for photos of anything illegal, and bye-bye privacy.

And no, this isn't the fake "if you're doing nothing wrong, you have nothing to worry about" argument.

The other stuff, like the Messages feature, is great if it's a kid's account, blocking and warning about photos sent. That only affects a child's account. (Though I do question the "end-to-end encryption, we cannot see your content" part then.) But scanning all of our accounts looking for the needle in the haystack is a slippery slope.

That's the dangerous part to me: where does this lead next? You're right, I have nothing to hide and nothing illegal; that doesn't mean I shouldn't be concerned about what else this could be used for down the line once they have that door.
 
This right here...someone gets it...
I get it. The problem for me is that, for the first time that we know of, Apple will install a hidden surveillance system on every device without user approval, treating every iOS user as guilty before proving their innocence. This system exists only to connect to law enforcement. Remember when authorities used "it's about the children" to make major intrusions into privacy and surveillance? This framework can be used for other purposes unrelated to protecting children. Will Apple close up shop in a country that demands use of this newly constructed framework? So far the big three don't have any credibility on that front. Is it better than how Microsoft does it, or Google? Maybe initially, but the end result is the same. The usual comeback is "don't view child porn and you have nothing to worry about," which translates to "trust us." Trust the 5, 9, or 14 Eyes; what could possibly go wrong.
 
 
You missed my point. You were arguing the carrier MUST set up the phone before it leaves their possession. Obviously my experience proves that is not true, as they send me the phone without iOS set up yet. So either you're confused or someone at the store is feeding you a line of BS.
Or maybe buying a phone online isn't the same as in a store.

All that matters is that there is a way to buy a phone without agreeing to the T&C. Tens of thousands of people a day buy phones in retail stores, and if any of them don't see the T&C, then it should be on Apple to prove anyone saw them.
 
I wonder just how “fuzzy” the hashing logic is. If any change to a pixel changes the hash, then this system is completely useless and easy to defeat.

Chrome does something similar with color profiles (range and frequency) of websites to detect known phishing sites.


Edit:
So it sounds pretty fuzzy to me. Apple's summary:

NeuralHash is a perceptual hashing function that maps images to numbers. Perceptual hashing bases this number on features of the image instead of the precise values of pixels in the image.
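Apple hasn't published NeuralHash itself (per the linked summary, a neural network produces an image descriptor that is then reduced to a compact hash), but a toy perceptual hash makes the "fuzzy" part concrete. Here is a minimal average-hash sketch; purely illustrative, and in no way Apple's algorithm:

```python
# Toy "average hash" (aHash) -- illustrative only, NOT Apple's NeuralHash.
# It captures the same idea: the hash depends on coarse image features,
# so small pixel-level edits leave most bits unchanged.
from PIL import Image  # pip install Pillow

def average_hash(path, size=8):
    """Downscale to size x size grayscale; one bit per pixel,
    set if the pixel is brighter than the image mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

def hamming_distance(a, b):
    """How many bits differ between two hashes."""
    return bin(a ^ b).count("1")
```

Re-saving a JPEG, resizing, or tweaking a few pixels shifts only a few of the 64 bits, so a match is declared when the Hamming distance falls below some threshold rather than requiring exact equality.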
 
This is extremely disturbing, to the point where I might have to turn the ship 180 and ditch.

My wife, grandparents, and I probably have dozens of naked pictures of our 1-year-old daughter. Just the other day we took a video of her after her bath, trying to walk in my shoes. Is this going to be classed as ‘abusive or exploitative’?

Just the other month my wife was scanning boxes of old family photos where I’m pretty sure some were of her and her brothers naked.

I do not want anyone seeing private and completely harmless family photos without my express permission.

What on earth is Apple thinking? Do they really think people involved in heinous acts like this are uploading them to iCloud? This is so smooth-brained it must have been suggested by Eddie Cue.
 
This is extremely disturbing, to the point where I might have to turn the ship 180 and ditch.

My wife, grandparents, and I probably have dozens of naked pictures of our 1-year-old daughter. Just the other day we took a video of her after her bath, trying to walk in my shoes. Is this going to be classed as ‘abusive or exploitative’?

Just the other month my wife was scanning boxes of old family photos where I’m pretty sure some were of her and her brothers naked.

I do not want anyone seeing private and completely harmless family photos without my express permission.

What on earth is Apple thinking? Do they really think people involved in heinous acts like this are uploading them to iCloud? This is so smooth-brained it must have been suggested by Eddie Cue.

Again, your personal pictures won't play a major part in this, as what would happen is that the hashes created from your pictures would be compared against a database of hashes of known child abuse images.

That is at least how I have read the article. You already know that your pictures are not going to be part of those, especially seeing that you took them personally and they were never distributed anywhere. So it's safe to say that your personal pictures are safe.
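To make that comparison step concrete, here is a rough sketch with made-up placeholder hashes. This is not Apple's implementation; per the technical summary, the real matching happens through private set intersection, so the device never learns the database contents and the server learns nothing about non-matching photos:

```python
# Hypothetical sketch of hash-database matching -- NOT Apple's code.
# The hash values and names below are invented for illustration.
KNOWN_BAD_HASHES = {0x9F3A21C4D0E177B2, 0x4B77D0E19F3A21C4}  # made up

def count_matches(photo_hashes):
    """Count how many of the user's photo hashes appear in the
    known-image database; non-matching photos reveal nothing."""
    return sum(1 for h in photo_hashes if h in KNOWN_BAD_HASHES)
```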

What is concerning everyone is the slippery slope effect and how that slippery slope affects any privacy or security rights people have.

BL.
 
Except they're not "scanning" photos, they're matching hashes from the NCMEC's database.
Did you intentionally misunderstand my post or did you simply ignore everything I wrote, just to bring home your “correction”?

Today they may “only” match hashes from the NCMEC’s database, but what if someone in the future uses the procedure to match something entirely different, in order to scan (if this is the wrong word, I beg your pardon, as I’m not a native speaker) for persons who can be identified using a different hash, one that represents e.g. rainbow-colored watch bands…?
 
As literally pointed out earlier in this thread, other companies have already been doing this for decades!!!
Do you think Google Photos users are worried about people storming into their house, killing them over false positives? Of course not.
Completely ridiculous argument. Apple's not losing billions over anything; they're not the first huge company to implement this.
So because there is precedent, everything is fine? “Eat **** - millions of flies can’t be wrong“ …
 
That's definitely too far, Apple. The more entries in the database, the more false positives, and the greater the chance that a human reviewer will see innocent nudes or baby photos.
 
As literally pointed out earlier in this thread, other companies have already been doing this for decades!!!
Do you think Google Photos users are worried about people storming into their house, killing them over false positives? Of course not.
Completely ridiculous argument. Apple's not losing billions over anything; they're not the first huge company to implement this.
That's why I don't use Google Photos and Google Mail for anything serious.
 
Again, your personal pictures won't play a major part in this, as what would happen is that the hashes created from your pictures would be compared against a database of hashes of known child abuse images.

That is at least how I have read the article. You already know that your pictures are not going to be part of those, especially seeing that you took them personally and they were never distributed anywhere. So it's safe to say that your personal pictures are safe.
A computer can “see” matching hashes where a human observer would clearly see two different images. Or your private photos may have been compromised in some data leak and used as the basis for a Photoshop job, which was spread on certain websites and eventually ended up as a hash in the database.

And when there is an alarm, a human being will verify it (according to the article). That is only possible, though, by actually viewing the real picture.

Apple’s usual “security by obscurity” approach is not helpful for evaluating the robustness of the hash matching in the first place. So it’s completely unclear how often that may happen, especially as the user would not be informed about a “false alarm”.

The ”slippery slope” is just the negative icing on the cake …
 
That is NOT correct. The hash values are automatically compared to a list of hash values of known child abuse images. If there are multiple matches across multiple photos in your library (more than one; they don't reveal how many it takes to get flagged), a human being will be allowed to manually view the actual images in your iCloud account to be 100% sure it isn't a fluke error with the hash code and the images actually are criminal child abuse images. At that point Apple legal will formally contact law enforcement and they will take it from there. Apple claims there is a 1 in a TRILLION chance that the hashes will be wrong, and a human will catch that error (preventing false positives going to law enforcement), so essentially they claim there is no way a human will ever lay eyes on your photos UNLESS they contain illegal images, in which case... yes, some unlucky employee at Apple who has the worst job ever will see them.

I have no problem with the technology; it seems thought through and secure. I AM worried about the slippery slope, because in theory, what comes next? What if Apple / the government / etc. suddenly decides they want to look for OTHER types of photos (political, etc.)? They will already have all your images encoded as hashes, and it would be simple to compare against other known photos... like a meme of a president, a stolen photo of a product, and on and on. They say they won't... but the best way to ensure that is to make it impossible, and now they have built the back door and are asking us to trust that they won't use it except for this understandably heinous thing, child abuse.

The images are only reviewed by a human if you have multiple images (beyond some unknown threshold N), and each has already been determined to be a match. That would be quite a run of bad luck: multiple images, each somehow magically confused with kiddie porn…
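As a sketch of the gate being described, with an invented placeholder for N (Apple has not disclosed the real threshold, and the actual design uses threshold secret sharing so even Apple cannot count matches below it):

```python
# Hypothetical threshold gate -- the constant below is a placeholder,
# not Apple's real (undisclosed) value.
MATCH_THRESHOLD_N = 10  # invented for illustration

def escalate_to_human_review(match_count):
    """A reviewer only ever sees anything once an account crosses
    the threshold; a single chance collision never triggers review."""
    return match_count >= MATCH_THRESHOLD_N
```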
 
Or maybe buying a phone online isn't the same as in a store.

All that matters is that there is a way to buy a phone without agreeing to the T&C. Tens of thousands of people a day buy phones in retail stores, and if any of them don't see the T&C, then it should be on Apple to prove anyone saw them.

There's no logical difference, therefore I conclude, as I said before, that you are confused or have been misled. Besides, by your logic, any prosecutions arising from illegal images discovered by this scanning and flagging by Apple would be thrown out of court, because how could anyone ever "prove" who actually clicked "agree" on the terms of the software update? Yeah, I'm sure Apple is going through all this trouble for nothing :rolleyes:
 
[screenshot from the article]

I find it hilarious that I simply post a screenshot from the article and get all these negative reactions. WTF is wrong with some of you people? 🤣
Same company that had to limit phone performance because its batteries were substandard, or had bad keyboards, or bad display connectors, etc., etc.

If you actually believe that error rate, I have a bridge to sell you.

And for anyone claiming they aren't scanning photos, just hashes... how is that hash made? By scanning photos.

And finally, how can Apple claim anything is encrypted if they are able to scan it?
 
  • Like
Reactions: ssgbryan
"Visually similar" meaning the hash can't be defeated by cropping, changing to black and white, etc. It doesn't mean "both pictures have people in them". Here's the paper: https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf
But it can; it all depends on the threshold for "visually similar". The summary you linked doesn't make Apple sound better. For starters, the non-matching example images are trees vs. a cityscape. Their example is something that no one would think is similar.
 
The image is not reviewed by a human. The hash values are.
Apple's method here is far more private than that used by others in Big Tech. It's hilarious to read all this controversy from people who clearly didn't read the article. I guess that's what happens when the American education system is a joke.
Just because it's better doesn't mean it's good. Do you try to pick turds up by the clean end?
 
Same company that had to limit phone performance because its batteries were substandard, or had bad keyboards, or bad display connectors, etc., etc.

If you actually believe that error rate, I have a bridge to sell you.

And for anyone claiming they aren't scanning photos, just hashes... how is that hash made? By scanning photos.

And finally, how can Apple claim anything is encrypted if they are able to scan it?

Apple only claimed that the transmission is encrypted end to end, i.e., until it reaches them. They never really claimed that it is encrypted at rest.

Case in point: are the pictures on your phone encrypted? Are they sitting in the Secure Enclave? If not, they are unencrypted.

BL.
 
My bomb making and gunsmithing guides are cool though right? Along with photos of various addresses?
 
There's no logical difference, therefore I conclude, as I said before, that you are confused or have been misled. Besides, by your logic, any prosecutions arising from illegal images discovered by this scanning and flagging by Apple would be thrown out of court, because how could anyone ever "prove" who actually clicked "agree" on the terms of the software update? Yeah, sure :rolleyes:
All I have to do to prove my point is show evidence of one example where someone could not have reasonably read the T&C. That's it. You have to prove that everyone had a chance to read it. You can't "prove" that. I can.
 
Same company that had to limit phone performance because its batteries were substandard, or had bad keyboards, or bad display connectors

You're comparing manufacturing defects to a scanning technology? Ok...

But heck, even if the error rate were ten times higher than what they claim (1 in 100 billion), it would still be excellent.
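A quick back-of-the-envelope, taking the stated rates at face value and assuming on the order of a billion iCloud accounts (the account figure is my assumption, not from the article):

```python
# Expected falsely flagged accounts per year under each rate.
accounts = 1_000_000_000                 # assumed ~1 billion accounts
claimed = accounts / 1_000_000_000_000   # 1-in-a-trillion claim
ten_x_worse = accounts / 100_000_000_000 # 1 in 100 billion

print(claimed)      # ~0.001 falsely flagged accounts per year
print(ten_x_worse)  # ~0.01 falsely flagged accounts per year
```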
 
All I have to do to prove my point is show evidence of one example where someone could not have reasonably read the T&C. That's it. You have to prove that everyone had a chance to read it. You can't "prove" that. I can.
I got you the first time. So you're saying that Apple is going through all this trouble for nothing: not just implementing the scanning technology, but all the legal cost of developing those terms of service (not just for this upcoming update, but for all time past, present, and future) was for nothing, because any case arising from something authorized in the terms of service would be thrown out of court unless Apple had an actual video showing the defendant tapping "agree".

I don't think so.
 