We were so busy celebrating Apple protecting people from Big Brother that we failed to notice it was morphing into the biggest brother.

Nobody likes the idea of child exploitation, but Apple's initiative is creepy in a different, huge, slippery-slope kind of way.

To my mind this initiative feels like a negotiated deal to keep the DOJ from continually dragging Apple into federal court trying to force creation of a golden decryption key.

If this is what it is, Apple will soon learn that such appeasement only delays the demand for said key.
 
Nope. You're not accurately portraying what the technology does. Making an MD5 hash of an image produces a single hash; changing one pixel in that image produces an entirely different hash. If Apple's hashing technology used a simple MD5 hash (for example), then the accuracy of the tool would be 100%, because it would only ever match exact, pixel-perfect copies of images that have already been identified. Apple's technology doesn't work this way.

"Visually similar images" cannot be accounted for via a simple hashing strategy. If it's using a neural network as per the PR nonsense then it's not actually checking a binary hash (either it is the exact image or it's not), it's using some neural network system that provides a GUESS as to whether or not the image in question is a match. The problem with neural networks and guessing is that tricking a neural network into making an inaccurate guess is easier than you might think. There are many papers on how to do so.

Unless there is transparency about what that neural network system is and how it works, you cannot state that this system wouldn't flag either 1) images that were explicitly, maliciously designed to trigger false positives, or 2) home photos that might be mistaken for abuse images. Apple is misleading you by talking about image hashes, because the expectation behind most hashing systems is that "visually similar images" do NOT produce the same hash at all. The fact that Apple's system supposedly produces the same hash for "visually similar images" means this is not straightforward hash checking.
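
To make that concrete, here's a toy Python sketch of the difference (my own illustration using hashlib and Pillow, with a crude average hash standing in for whatever Apple's NeuralHash actually does; "photo.jpg" is just a placeholder):

```python
# Toy comparison: exact (cryptographic) hashing vs. a crude perceptual hash.
# This is NOT Apple's NeuralHash; it just shows why "visually similar"
# matching cannot be a plain MD5 comparison.
import hashlib
from PIL import Image

def exact_md5(img: Image.Image) -> str:
    """MD5 of the raw pixel bytes: any change flips the hash entirely."""
    return hashlib.md5(img.tobytes()).hexdigest()

def average_hash(img: Image.Image, size: int = 8) -> int:
    """Crude perceptual hash: downscale, grayscale, threshold against the mean.
    Small edits (one pixel, mild recompression) usually leave it unchanged."""
    small = img.convert("L").resize((size, size))
    pixels = list(small.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

original = Image.open("photo.jpg").convert("RGB")  # placeholder filename
tweaked = original.copy()
tweaked.putpixel((0, 0), (255, 0, 0))  # change a single pixel

print(exact_md5(original) == exact_md5(tweaked))        # False: completely different hash
print(average_hash(original) == average_hash(tweaked))  # usually True: "visually similar"
```

The open question is how far Apple's version stretches "visually similar", which is exactly why transparency matters.
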
"Visually similar" meaning the hash can't be defeated by cropping, changing to black and white, etc. It doesn't mean "both pictures have people in them". Here's the paper: https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf
 
That's not exactly true; the first part of the announcement clearly states they are adding technology to scan all iMessage photos. The presumption right now is that it's a feature that has to be enabled, presumably by a parent, in relation to a minor child's associated Apple ID.

But that's a separate issue. Those images aren't being reported, but blurred or blocked. It's only known child abuse images being uploaded to iCloud that are being flagged and potentially reported outside of Apple.
 
The only people who should be opposed to these specific features are people who are committing crimes, and people who don’t care about the safety of their children.
Oh, so you are definitely part of the group of people that believe anything companies and the government tell them. Like nothing can go wrong or has ever gone wrong, as history teaches us. OK, keep believing everything you read and hear. Guess you're also one of those people who want to jail those who don't want to take the experimental, not properly tested, unapproved vaccine.
 
But the CSAM paper says photos are scanned on device for hashes. They're not scanned in the cloud; they're scanned on device. I honestly don't see the problem here.
Do I have images of this nature on my phone? No. Would I want my kids to be notified of the dangers behind images like this? Yes.

But most importantly, stepping out of this whole situation for a moment - which I don't think some people can - is it tangibly going to affect my life or usage of my iPhone? Absolutely not.

It's for the greater good, and I think people need to accept that.

Last point: I honestly don't think Apple, who are under so much privacy scrutiny at the moment, would announce a feature like this unless it's absolutely watertight. And from reading the technical paper regarding this, I do believe it is.
Hmmm, you know, I was born in Poland in the 70s and I know a lot about the 'greater good'. So excuse me if I smell the same BS as in all greater-good causes. It's just an invasion of privacy, and I will oppose it. Step by step, all of the companies are trying to strip you of it for profit. Apple is no better than all the other big tech. And yes, it is going to affect my use of the phone: I will not take pictures.
 
So, apparently just saving the wrong picture in your photo library is enough for your account to be flagged.

This cannot be right. I see so many ways this could go wrong and people could easily get framed for a crime they didn't commit.

As an example: what if somebody sent you a nude of a girl who is supposed to be 18, but it turns out she is 15 and that image has been flagged as child pornography?
Just by saving such a picture to your photo library, your account is going to be reported.

Also keep in mind that there are other things that are considered equally illegal, such as storing copyrighted material on your computer without having a license to keep a private copy. How long until Apple scans your whole hard drive and reports you because you have a copy of an old movie sitting somewhere in your filesystem?
 
I'm no law expert, but none of that makes any sense. You are given a chance to read the terms - Apple doesn't need to lock out the "Agree" button for 10 minutes or whatever to make it legally binding. And if you let someone else set up an account in your name without supervision and review, that's on you. Maybe things like this will make people start being a bit more vigilant about things like that.
The VZW stores near me don't give you the option to set up your own phone. I've even said, "Don't open the box, I'll activate it myself," and they tell me they can't sell it without activating it.
 
The main problem I see with this is that it demonstrates an access point or backdoor to people's phones that Apple, supposedly, didn't have before. On a massive scale, too. As per the article, this doesn't scan iCloud; it scans the user's phone before anything goes to iCloud, then reports the phone to Apple if something is found.

Which means they can potentially scan users' phones against any database. Once you get past the apps, it's all a hierarchical filesystem underneath.

What's to stop a dictatorial country from demanding Apple scan against its own databases of known illegal images, then report its citizens? Perhaps databases of adult pornography or gay imagery, even political or religious imagery.
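
Strip away the branding and an on-device matcher is just generic code where the database is an input. A hypothetical sketch (not Apple's implementation, which per its paper uses NeuralHash and private set intersection; the hash strings here are made-up placeholders):

```python
# Hypothetical sketch of the concern: the matching code doesn't care what the
# database contains; whoever curates the hash list decides what gets flagged.
from typing import Iterable, List, Set

def scan_before_upload(photo_hashes: Iterable[str], database: Set[str]) -> List[str]:
    """Return the hashes queued for iCloud upload that match the supplied database."""
    return [h for h in photo_hashes if h in database]

# The exact same function works against whatever list the phone is handed:
csam_db = {"hash_of_known_abuse_image_1", "hash_of_known_abuse_image_2"}
political_db = {"hash_of_banned_protest_photo", "hash_of_outlawed_book_cover"}

queued = ["hash_of_banned_protest_photo", "hash_of_holiday_snap"]
print(scan_before_upload(queued, csam_db))       # []: nothing flagged
print(scan_before_upload(queued, political_db))  # ['hash_of_banned_protest_photo']
```

The only thing keeping that from happening is policy, not anything inherent in the code.
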

It's also scanning text messages for types of images, not just those found in databases. Which potentially means any type of image may be flagged by this system, even if it doesn't exist in a database, as all they'd need to do is change the criteria.

If they have the ability to scan images on users' devices, text would be far simpler: flagging keywords in messages or outlawed books.

Apple knows full well how badly this could be twisted. That's why they introduced it with the tried-and-true method of "Won't somebody think of the children."

It's impossible to argue against the objectives of what they are doing in this specific case, which is why it's also an excellent place to start with on-device scanning on behalf of law enforcement.

It should be very easy to extend this to meet the demands of different governments around the world. As long as there is a source database to run scans against to identify targets, I expect it will be relatively trivial for laws to be passed to force Apple to extend the capabilities of the technology to meet local laws in different jurisdictions around the world. By "extending the capabilities", I mean doing content matching against other databases out there - and not just the one they have chosen in this use case.

It will be fascinating to see where this goes in the future.

I can understand why some people have concerns; equally, I understand why other people think the top priority is tackling the illegal images issue now... and we just have to hope Apple can successfully limit its expansion into other, more problematic areas.

It must have been a very difficult decision for Apple to do this, which is why it's taken them so long to actually do it. They will be acutely aware of the risks on the downside.
 
The VZW stores near me don't give you the option to set up your own phone. I've even said, "Don't open the box, I'll activate it myself," and they tell me they can't sell it without activating it.

But you can still read and agree to the user agreement yourself there in the store.
 
You don't own the device, nor the OS, and that is the layer these searches are going to run in. You own a license to use the hardware and the OS. While your data is your own, they don't need your permission to run anything against it.

Android is the same way.

---

Honestly, as a father of four kids, when I saw this I was relieved, and thankful that Apple is taking steps to weed out people who have the horrible practice of preying on our children.

I was appalled to see so many ignorant responses on the subject, and granted, I feel that on a topic as delicate as this one it should have fallen to the journalist to do due diligence and explain in more depth how all of this works from a granular point of view, to keep panic at bay.

Nobody at Apple Park is going to sit down flipping through your photos or your nudes. It's all code. Only if the hashes match what is in the CSAM DB will the content be copied somewhere to be reviewed by other algorithms; if those algorithms determine that the content in question is a positive, then further action is taken.
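
For what it's worth, here's roughly the flow the technical summary describes, boiled down to a toy Python sketch (my own simplification: the function names and threshold value are made up, and the real system uses encrypted safety vouchers and threshold secret sharing rather than a plain counter):

```python
# Toy sketch of the flow in Apple's CSAM Detection technical summary.
# Names and the threshold value are illustrative, not Apple's actual code.
from typing import List

MATCH_THRESHOLD = 10  # placeholder value for illustration only

def human_reviewer_confirms(matched_images: List[str]) -> bool:
    """Stand-in for the human review step the summary describes."""
    return bool(matched_images)  # placeholder logic

def review_account(match_count: int, matched_images: List[str]) -> str:
    if match_count < MATCH_THRESHOLD:
        # Below the threshold, the safety vouchers stay unreadable to Apple.
        return "no action; Apple learns nothing about the account"
    # Only past the threshold can the matched content be decrypted and reviewed.
    if human_reviewer_confirms(matched_images):
        return "account disabled; report sent to NCMEC"
    return "false positive; nothing reported"

print(review_account(2, ["img_a"]))   # below threshold: no action
print(review_account(12, ["img_a"]))  # past threshold: escalates to review
```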

And honestly, if you have any photos whose hash is in that DB, excluding a false positive, then I don't know what to tell you... go hire a good lawyer... move to the moon.
Thin end of the wedge.

You’re right in your intention of course, but the very real fear is that the righteous endeavour is but a Trojan Horse, dressed up in noble finery.

As with anything in life, don't be guilty, sure. But this isn't the only way, and it isn't the right way.

It's all about precedents, and the "ignorant responses" you refer to are from people who also care about the children. I'm a father of 3 young, digitally savvy children. I'm aware of their steps into an expanding digital landscape, and I'm also very aware that their privacy and their rights to freedom of speech and expression should be protected at all costs, just as yours should.

And don't presume that I mean those costs come at their expense. This is the wrong approach from Apple because it's on device, and it opens a back door that won't necessarily be limited to morally acceptable access in the future, because who decides on censorship?

Publicly available social media is another thing. If pictures are being shared on Facebook, Twitter, Instagram, whatever, then go ahead and track them down. All energy in that direction.

But if you open that back door, it is like I said: the thin end of the wedge. It ends badly.
 
I can understand why some people have concerns; equally, I understand why other people think the top priority is tackling the illegal images issue now...
Help me understand. Using a known-image database does what exactly? Does it stop people from engaging in the behavior altogether? Does it discourage reuse, thus promoting the creation of new photos? Does it target those who feel they need to see the pictures or those looking to profit from them?

It sounds like this removes a little privacy for everyone while encouraging those who profit from this content to make and sell more of it, and those who feel they need it to acquire more.
 
But you can still read and agree to the user agreement yourself there in the store.
No, because they agree to it for you. They don't ask or even tell you about it. And since it's the carrier doing it, and the phone is technically theirs until they hand it off, they can get away with it.
 
So help me understand. From now on, or after the update, all my photos in the cloud are being scanned? My vacation pictures? My wife's surgery and when she was sick? My child bathing with other kids when they were little? So that they can try and find a match? Is that how I should be interpreting this?
 
So help me understand. From now on, or after the update, all my photos in the cloud are being scanned? My vacation pictures? My wife's surgery and when she was sick? My child bathing with other kids when they were little? So that they can try and find a match? Is that how I should be interpreting this?
Pretty much, yes.

But not to worry, it's only on device. And by an algorithm. Unless there's a match, in which case a human is going to check… (/s)

Apple confirmed the feature is only active when using iCloud Photos, though. I wonder how long that will last.
 
Oh, so you are definitely part of the group of people that believe anything companies and the government tell them. Like nothing can go wrong or has ever gone wrong, as history teaches us. OK, keep believing everything you read and hear. Guess you're also one of those people who want to jail those who don't want to take the experimental, not properly tested, unapproved vaccine.
Yes, you should definitely go to jail for that. (P.S.: it has been approved, it works, and the data set proving it works is > 300 million people)
 
I'm not worried about this algorithm finding out something about me relating to pictures of children, because I have no personal interest in that; no guilt, nothing to hide. This on-device, attack-on-privacy approach from Apple worries me because it proves that my data can and will be accessed and 'mined' for any innocuous content that someone decides can be manipulated to be offensive or politically motivated or whatever else is conjured up, driven by an oppressive urge for power or profit.
 
So, apparently just saving the wrong picture in your photo library is enough for your account to be flagged.

This cannot be right. I see so many ways this could go wrong and people could easily get framed for a crime they didn't commit.

As an example: what if somebody sent you a nude of a girl who is supposed to be 18, but it turns out she is 15 and that image has been flagged as child pornography?
Just by saving such a picture to your photo library, your account is going to be reported.

Also keep in mind that there are other things that are considered equally illegal, such as storing copyrighted material on your computer without having a license to keep a private copy. How long until Apple scans your whole hard drive and reports you because you have a copy of an old movie sitting somewhere in your filesystem?

I suppose it's theoretically possible to be tricked into saving an image that is in the database they are scanning against. You are then flagged by a matching on-device scan, and days/weeks/months later you hear law enforcement breaking down your door before they seize all your computer equipment for forensic examination.

Do I think it's likely? No.

Apple has every motivation to build in all the safeguards necessary to ensure no horror stories such as the ones raised by you ever turn into reality. It would be an absolute disaster for their reputation if things started going horribly wrong with this.
 
LOL at the people mad at Apple when Google does much worse.

I suspect many people simply weren't aware that Google was already doing such a thing, or have become numb to it because you lose all expectation of privacy with a company like Google. As with many things, it only becomes a problem when Apple does it, but I suppose this is a happy problem of sorts. Hopefully, it will spur more discussion in this area, not just for Apple but for the entire industry.

You can’t condemn only Apple while giving every other tech company a free pass in this area.
 
Ironically, I am now having to question the ethics of setting my children up with iPhones if they are to be exposed to a harmful and massive invasion of privacy from Apple (or the highest bidder) themselves, now and going into their future digital lives.
 
Help me understand. Using a known-image database does what exactly? Does it stop people from engaging in the behavior altogether? Does it discourage reuse, thus promoting the creation of new photos? Does it target those who feel they need to see the pictures or those looking to profit from them?

It sounds like this removes a little privacy for everyone while encouraging those who profit from this content to make and sell more of it, and those who feel they need it to acquire more.

It may change some behaviour if people recognise that the chances of being caught in possession have increased now that Apple is implementing this technology. Any reduction in the trade of this material is better than none.

But as I said in the sections you snipped out, I equally recognise the dangers of how this technology may be adapted and extended in future to meet the demands of governments around the world.

I doubt it was an easy judgement call for Apple to go ahead and do this. I'm sure there were many heated meetings in which people on both sides debated ferociously where this may lead.
 