On-device software is the problem. The child porn is a red herring.

This is a local hash analysis engine.
It can be updated to become the "white supremacy" finder. So who cares? That's bad too!
How about the "racist word" hash finder?
How about the "I voted for Trump/Biden" hash finder?

Do you get it yet?

It's a hash of specific images, not arbitrary subjects.
 
But most importantly, stepping out of this whole situation for a moment - which I don't think some people can - is it tangibly going to affect my life or usage of my iPhone? Absolutely not.

It's for the greater good, and I think people need to accept that.
If it doesn’t tangibly affect you, or so you believe, then it’s fine? Interesting moral framework.

Parting you out and giving away your two kidneys, liver, heart, corneas, etc. to save multiple people at the expense of just one would be for the greater good. I think you need to accept that.
 
It's a hash of specific images, not arbitrary subjects.
Problem with that is… who is in control of this set of images?

The federal government, in concert with the National Center for Missing & Exploited Children (NCMEC).

So the government becomes the ultimate arbiter of what is “acceptable” or not… the child abuse aspect needs to be removed from this, as it poisons the debate around the legitimate concerns being raised here.
 
Hashed, encrypted, anonymized or any other abstraction technique is not a defense either. They are still running an algorithm on my device with my data, without my consent, and then sending the resulting data externally. And that is where the argument should end.
You don't own the device or the OS, and that is what these searches are going to run on. You own a license to use the hardware and the OS. While your data is your own, they don't need your permission to run anything against it.

Android is the same way.

---

Honestly, as a father of four kids, when I saw this, I was relieved and thankful that Apple is taking steps to weed out people who have the horrible practice of preying on our children.

I was appalled to see so many ignorant responses on the subject, and granted, I feel that on a topic as delicate as this one it should have fallen to the journalist to do due diligence and explain in more depth how all of this works from a granular point of view, to keep panic at bay.

Nobody at Apple Park is going to sit down flipping through your photos or your nudes. It's all code. Only if the hashes match what is in the CSAM DB will the content be copied somewhere to be reviewed by other algorithms; if those algorithms determine that the content in question is a positive, then further action is taken.

And honestly, if you have any photos whose hash is in that DB, excluding a false positive, then I don't know what to tell you... go hire a good lawyer... move to the moon.
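For what it's worth, here is a minimal sketch of the flow being described, in Python. It is my own simplification, not Apple's implementation: the real system uses a perceptual "NeuralHash" plus cryptographic matching protocols, and the database contents and function names below are made up for illustration.

```python
import hashlib

# Hypothetical, heavily simplified illustration of the matching flow -- NOT
# Apple's code. The real system fingerprints what an image looks like
# (NeuralHash) and wraps matching in cryptographic protocols; a plain SHA-256
# digest and a placeholder database are used here only to show the idea.

KNOWN_CSAM_HASHES = {
    "<known-image-hash-1>",
    "<known-image-hash-2>",
}


def image_fingerprint(image_bytes: bytes) -> str:
    """Stand-in fingerprint for a photo; real systems hash visual features."""
    return hashlib.sha256(image_bytes).hexdigest()


def photos_needing_review(photo_library: list[bytes]) -> list[int]:
    """Indices of photos whose fingerprint appears in the known database."""
    return [
        i for i, photo in enumerate(photo_library)
        if image_fingerprint(photo) in KNOWN_CSAM_HASHES
    ]
```

Only the matches go anywhere for further review; everything else is never looked at by a person, which is the point about nobody flipping through your library.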
 
Problem with that is… who is in control of this set of images?

The federal government, in concert with the National Center for Missing & Exploited Children (NCMEC).

So the government becomes the ultimate arbiter of what is “acceptable” or not… the child abuse aspect needs to be removed from this, as it poisons the debate around the legitimate concerns being raised here.
They didn't just pick random photos off the internet; these are images used by sex traffickers and child pornography rings that are KNOWN because those people were caught and the images confiscated.

No one is scouring the internet and arbitrarily deciding which images are being used here.
 
And honestly, if you have any photos whose hash is in that DB, excluding a false positive, then I don't know what to tell you... go hire a good lawyer... move to the moon.
Even apart from Apple scanning your on-device photos, that’s kind of where the wheels get a little wobbly, isn’t it?
 
Countries can already force Apple to scan the photos on their servers. The way this works is that your own device does a scan and reports matches only if you have iCloud syncing turned on. If you don't like it, turn syncing off; you were already reducing your privacy by having syncing enabled, after all.

Why did Apple move from a cloud solution to an on-device solution if this will only ever be used for iCloud photos (which should never be accessible/scannable by Apple in the first place BTW)?
 
You don't own the device or the OS, and that is what these searches are going to run on. You own a license to use the hardware and the OS. While your data is your own, they don't need your permission to run anything against it.

Android is the same way.

---


Regarding the hardware - that statement is fundamentally incorrect. It’s a very common misconception. It is absolutely your property.
 
I already answered this above, but why would anyone be worried about this unless they have known child porn images on their phone or in their iCloud account?

It's not scanning your iCloud photos and reporting you because you took pics of your kids in the bathtub or running around naked like so many parents do.

These are KNOWN shared images used in child pornography rings around the world... if a person has them in their iCloud account, they should be questioned about it.

The problem you are missing here is the erosion of what are perceived to be your rights, regardless of what data you may or may not have on any cloud service. The problem with your statement is the slippery-slope effect:

  • If you aren't a terrorist, then you have nothing to worry about! - said shortly after 9/11, and with the Patriot Act.
  • If you don't use PGP, then you have nothing to worry about! - said shortly after cryptography was classified as a munition and barred from export, making PGP's author out to be a criminal even though the software was exported anyway, and not by him.
  • If you don't have a yellow star, you don't have anything to worry about! - said during WW2.

You see where this argument is going.

I don't have a problem with this being used to root out child pornography; in fact, on that end I support it. The issue is that if they were to come after you - or anyone, in fact - over data that seems compromising, they wouldn't go directly to you for further investigation, where you would be within your rights (at least in the USA) to require them to get a warrant (your 4A right). They wouldn't have to do that at all: since Apple is in possession of that data, they can simply subpoena Apple for it - your data, without your consent - and get it. You have no say in that.

That is the problem, and what you are not comprehending while hiding behind attacks on others (the stupidity comment) without fully understanding the situation yourself. That needs to be re-examined before going off on others, because the process is where your rights come in, and that is a HUGE problem.

You're worried about the end game: the results. You are missing the problem that is the process, and the process is an even bigger legal problem than the results. The results are fixed; the process is the slippery slope that can affect generations. You need to enlighten yourself to that.

BL.
 
It's a hash of specific images, not arbitrary subjects.

Nope. You're not accurately portraying what the technology does. Making an MD5 hash of an image produces a single hash. Changing one pixel in that image produces an entirely different hash. If Apple's hashing technology utilized a simple MD5 hash (for example), then the accuracy of the tool would be 100%, because it would only ever match exact, pixel-perfect copies of images that have already been identified. Apple's technology doesn't work this way:

"The main purpose of the hash is to ensure that identical and visually similar images result in the same hash, while images that are different from one another result in different hashes," said Apple in a new "Expanded Protections for Children" white paper. "For example, an image that has been slightly cropped, resized or converted from color to black and white is treated identical to its original, and has the same hash."

"Visually similar images" cannot be accounted for via a simple hashing strategy. If it's using a neural network as per the PR nonsense then it's not actually checking a binary hash (either it is the exact image or it's not), it's using some neural network system that provides a GUESS as to whether or not the image in question is a match. The problem with neural networks and guessing is that tricking a neural network into making an inaccurate guess is easier than you might think. There are many papers on how to do so.

Unless there is transparency about what that neural network system is and how it works, you cannot state that this system wouldn't flag either 1) images that were explicitly, maliciously designed to trigger false positives or 2) home photos that might be mistaken for abuse images. Apple is misleading you by talking about image hashes, because the expectation behind most hashing systems is that "visually similar images" do NOT produce the same hash at all. The fact that Apple's system supposedly produces the same hash for "visually similar images" means this is not straightforward hash checking.
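To make the "visually similar images get the same hash" idea concrete, the classic non-neural version is a perceptual hash such as an average hash. The toy sketch below is my own illustration, not NeuralHash; it shows why matching turns into a distance comparison with a tunable cutoff rather than a yes/no equality check.

```python
# Toy average-hash: downscale an image to an 8x8 grid of grayscale values,
# threshold each cell against the mean, and pack the 64 bits into an integer.
# Visually similar pictures land a small Hamming distance apart, so "match"
# means "distance below some cutoff", an approximation rather than an exact test.

def average_hash(pixels_8x8: list[int]) -> int:
    """pixels_8x8: 64 grayscale values (0-255) from a downscaled image."""
    mean = sum(pixels_8x8) / len(pixels_8x8)
    bits = 0
    for value in pixels_8x8:
        bits = (bits << 1) | (1 if value > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    return bin(a ^ b).count("1")


def is_match(a: int, b: int, cutoff: int = 5) -> bool:
    # The cutoff is exactly where false positives and false negatives trade off.
    return hamming_distance(a, b) <= cutoff
```

A neural hash replaces the hand-crafted averaging with a learned embedding, but the matching step is still a nearness judgment, which is precisely the property that adversarially crafted images can exploit.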
 
They didn't just pick random photos off the internet; these are images used by sex traffickers and child pornography rings that are KNOWN because those people were caught and the images confiscated.

No one is scouring the internet and arbitrarily deciding which images are being used here.
Don't waste your time here with facts. People just want to be mad at Apple when they know they don't really have a rational reason to be.
 
That's an idiotic argument. It's perfectly reasonable and healthy for people to be concerned about privacy. What you're essentially arguing is the whole "if you have nothing to hide" mantra: then you shouldn't mind us searching your home, car, or person.

You don't own the device or the OS, and that is what these searches are going to run on. You own a license to use the hardware and the OS. While your data is your own, they don't need your permission to run anything against it.

Android is the same way.

---

Honestly, as a father of four kids, when I saw this, I was relieved and thankful that Apple is taking steps to weed out people who have the horrible practice of preying on our children.

I was appalled to see so many ignorant responses on the subject, and granted, I feel that on a topic as delicate as this one it should have fallen to the journalist to do due diligence and explain in more depth how all of this works from a granular point of view, to keep panic at bay.

Nobody at Apple Park is going to sit down flipping through your photos or your nudes. It's all code. Only if the hashes match what is in the CSAM DB will the content be copied somewhere to be reviewed by other algorithms; if those algorithms determine that the content in question is a positive, then further action is taken.

And honestly, if you have any photos whose hash is in that DB, excluding a false positive, then I don't know what to tell you... go hire a good lawyer... move to the moon.

I'm sorry, but takes like yours are a slippery slope with lots of precedent. You're willingly forfeiting your privacy for an extra inch of perceived safety. First of all, you do own your phone. The right-to-repair argument is not settled, and it's definitely tilting towards the consumer side as legal challenges mount in many wealthy countries. You own your data, too.

I'm not worried about Apple employees flipping through my photos. I'm worried about a $2.5T company with immeasurable influence nudging its customer base towards questionable privacy practices under the guise of child protection. As I mentioned in my prior post, you can't just give up and say "well, I have nothing to worry about, here's my extra bit of personal freedom".

I think Apple is otherwise doing an exemplary job with its privacy practices. But this development is a step backwards.
 