I read the article and still have a concern. It's a slippery slope from identifying hashes in a database to scanning other photos on your iPhone, to detecting speeding and drunk-driving patterns (not that that would be a bad thing).
I agree; I’m not saying that we shouldn’t have concerns about this. It opens doors to privacy invasion with less altruistic purposes than fighting child abuse. I just think we have to clearly understand what this is (and what this isn’t) about.
 
The same thing can happen right now - if they are downloading the image, it had to come from someplace, and that may be monitored.

How about using a strong passcode on your phone?
This seems a little too "Red Light Camera" automated law enforcement for my taste. If this risk is suddenly 100% my responsibility, I think it's time to reevaluate how much we really need to have smartphones.
 
Comments on this post are half of the reason why fake news spreads. Sometimes, organisations spread sensationalist headlines to increase virality.

But sometimes, people don't even read the damn article - ONLY THE HEADLINE. @arn and others have been pretty clear in some responses, "the ONLY images being flagged are those whose hashes match a database that is maintained by those who combat child exploitation."
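To make that concrete, here is a minimal sketch of what "flag only on a database match" means, assuming a simple set-membership check in Swift. Apple's actual system uses a perceptual hash (NeuralHash) with blinded, on-device matching, so the SHA-256 digest and the `knownHashes` placeholder below are illustrative stand-ins, not Apple's implementation.

```swift
import CryptoKit
import Foundation

// Hypothetical list of known-image digests (in reality supplied by
// child-safety organizations and far larger); the value is a placeholder.
let knownHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

// Returns true only when the photo's digest appears in the known-hash list;
// nothing about the photo's content is otherwise classified or inspected.
func matchesKnownDatabase(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

// A brand-new family photo hashes to a value that is not in the list,
// so it is never flagged.
print(matchesKnownDatabase(Data("holiday picnic photo".utf8)))   // false
```

The real perceptual hash is designed to also match re-encoded or lightly edited copies of a known image, which a plain SHA-256 would not, but the control flow is the same: compare against a fixed list, flag only on a match.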
 
This is the beginning of the end of any hope that Apple would continue to differentiate themselves as an entity interested in protecting their users’ privacy. No, they don’t monitor our devices for the purpose of selling us stuff like FB, Google, etc., but they do monitor for the purpose of identifying “good” or “bad” people or actions? Of course, they start with a use case like child abuse, since you’d have to be a monster to argue against it. But this sets a precedent and would begin to normalize monitoring actions for further use cases.

Also, I noticed that MacRumors didn’t run anything on this until Apple themselves “previewed” the feature. They’ll repost some rumor about the next round of emoji in a matter of minutes, but on a topic like this, covered by the FT, The Verge, CNN, and many others, they stay quiet? I can only assume that Apple has worked with publishers like MacRumors to control the messaging on this. Nothing reminds me more of Big Brother than steps to erode personal privacy while making sure that they control all communication around that erosion.

This needs to be stopped.

So you prefer that Apple allow its iCloud network and infrastructure to be used to commit crimes?
 
So if I understand correctly, the communications one (suspected photos attached to messages) is a parental control thing; I think that is actually OK.
The comparison of iCloud photos to a known database (which I'm guessing is controlled by some government authority) seems to be a good thing too.

So, as opposed to many here, I think this is actually a good move...
 
This seems a little too "Red Light Camera" automated law enforcement for my taste. If this risk is suddenly 100% my responsibility, I think it's time to reevaluate how much we really need to have smartphones.
It’s always been “100% your responsibility” to not possess child pornography. This is not a change, other than increasing the likelihood you will be caught. And if you do possess such images, I want you to be caught.
 
I hope these coders realize that in other countries, taking pictures of children running around in the grass without any clothes on is normal during a holiday or family picnic; Europeans and Russians do this.
Sure, they may not match a known hash, but still.

I had to tell my friend from Russia to please be careful what they send me from their picnics and holidays, and they were like, what?!? Westerners are weird.
 
The only people who should be opposed to these specific features are people who are committing crimes, and people who don’t care about the safety of their children.
That’s a pretty right-wing view of the issue, and something a cop would say. You think all the people in jail are actually guilty? Do you think (in the USA) people on death row (or, worse, those already executed) are all guilty? Innocent people get their lives ruined all the time! Is that worth it to catch horrible people? Most people say ‘yes’… until THEY are wrongfully accused.

I’m sure this can be done well. It seems that they only look for *known* child exploitation images to find matches, so that’s fine. As long as their false-positive rate is zero and no third party gets to ‘review them manually’, I have no problem with this.

Let’s just wait until all the facts are known, the dust has settled, and people (on both ‘sides’) just calm down. As a medical specialist I have many (deidentified) medical photos, some of children… secure & encrypted in iCloud so nobody can see them but me. I don’t want Apple to damage this trust.
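On the false-positive point, Apple's technical summary describes an account-level threshold: matches are not even surfaced for human review until an account accumulates a certain number of them. The Swift sketch below only illustrates that idea; the `MatchTracker` type and the threshold value of 30 are assumptions for the example, not Apple's code.

```swift
// Illustrative threshold-based flagging, under the assumptions stated above.
struct MatchTracker {
    var matchCount = 0
    let threshold: Int

    mutating func record(isMatch: Bool) {
        if isMatch { matchCount += 1 }
    }

    // Nothing is surfaced for review until the account crosses the threshold,
    // so a handful of accidental matches never triggers anything on its own.
    var exceedsThreshold: Bool { matchCount >= threshold }
}

var tracker = MatchTracker(threshold: 30)   // assumed value, for illustration
tracker.record(isMatch: true)               // a single false positive
print(tracker.exceedsThreshold)             // false -- no human review yet
```

A threshold does not make the per-image false-positive rate zero; it is the mechanism that keeps a stray mismatch from ever reaching a reviewer.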
 
If you possess child porn, you are guilty. It’s not that complicated.

It’s something a parent would say, not something a cop would say.
 
One thing everyone has to realize: you do not own your data in the Cloud.

Yes, you heard me right.

When you upload your data to any cloud service, including iCloud, that data sits on the provider's servers, putting them in possession of it. So when any federal entity is investigating you or anyone else, they do not require a warrant to access your data there; because the data is held by a 3rd party, that 3rd party is not privy to any warrant, and only a subpoena from a clerk of the court is needed.

Oh, PSA: any lawyer is a clerk of the court. They can create their own subpoena and execute it (it does not need to be signed off by the court) and deliver it to that 3rd party for whatever purposes are listed in that subpoena, including providing any and all data needed for their investigation. They do not need to go directly to you for that. A report on this was posted in PRSI roughly 8 years ago.


How that is relevant here: if such abusive photos or worse are located on Apple's servers, Apple could possibly be charged with harboring such photos, exposing them to legal liability, along with being subpoenaed to answer questions about who uploaded those pictures, when, etc. Then the investigators could go after the person who did it. In the end, Apple gets dinged on legal charges, as well as the uploader. With this, Apple is looking at how to get out of those legal issues.

Again, I'm not saying whether it is morally or ethically right or wrong, but they also are looking after themselves with this.

BL.

Your post sets up the tragic punchline: this system scans for things on YOUR phone, not just in iCloud, thereby refuting the whole “Apple will get in trouble if these images are on their servers” thing. We already knew iCloud photos and documents were scanned, but doing it on-device, even when iCloud is disabled, is a whole other kettle of fish.
 
It’s always been “100% your responsibility” to not possess child pornography. This is not a change, other than increasing the likelihood you will be caught. And if you do possess such images, I want you to be caught.
The scenario I described is one where someone is trying to victimize you, not one where you're legitimately guilty of anything. It seems that it is now easier for that to happen.
 
Your post sets up the tragic punchline: this system scans for things on YOUR phone, not just in iCloud, thereby refuting the whole “Apple will get in trouble if these images are on their servers” thing. We already knew iCloud photos and documents were scanned, but doing it on-device, even when iCloud is disabled, is a whole other kettle of fish.

The problem here, though, is that what Apple is doing also scans your photos in iCloud, so iCloud is included in this.

BL.
 
Let's also not forget this is the same company that would not help the FBI access the phones of dead, known/proven terrorists to try to prevent other attacks and get information.

But they will scan your cloud data for these types of images, despite those same privacy rights.

It's a bit hypocritical which criminals they will and won't assist in catching.

And I agree, this all begins with framing it in a way that no one could say no to: "You all want to prevent child exploitation, or are you a monster?"
Apple sucks now. No different from Google.
Tim Cook even lied to Congress, saying China wasn't stealing from them.
 
Thank god the EU does not allow face scanning (as far as I know, or at least not by default).
I don't mind facial recognition for personal use, as it really helps with photo organizing. I do have issues with someone scanning my photos looking for things to report. And there are other ways for this to be exploited. I understand what they are doing, but I do not like it. As others have said, this is a slippery slope.
 