This is an invasion of privacy. I can understand Apple et al. wanting to protect the integrity of their servers, etc. However, being allowed to invade individuals' phones is the not-so-thin end of a very big wedge.

There will be mistaken allegations, people's reputations destroyed, etc. Think of baby baths, or of cultures such as Japan, where family baths, indeed communal public baths, are common.....

And of course, every authoritarian and/or dictatorial government will want the technology for its own legal purposes. Think China, Burma, Russia, Saudi Arabia, etc. As has been well reported, tech companies are experts at using the defense of "we must follow the laws of the country".....Does anyone seriously think Apple will walk away from the China market when asked to use the technology to find demonstrators?

Finally, how long before this technology is everywhere in the Apple universe: iPads, desktops, MacBook Air, MacBook Pro.....?

My understanding is that this is like the Google Pixel's "Now Playing" feature: a downloaded local database of hashes of known pornographic photos of children, which is compared against the photos you are uploading to iCloud Photos. That's it.

If you take photos of your kid in the bath, you're fine, unless you've already distributed those photos, they've ended up on child porn sites, and they have been found and hashed by Apple/the government.

Don't distribute kiddie porn and you'll be fine.
As for Google, it already scans and indexes all the photos you upload to Google Photos, so you shouldn't be any more concerned about this news of Apple trying to stop kiddie porn.

If anything, Apple is actually behind the curve on this one. Microsoft, Google, and Dropbox have been scanning shared images on their cloud servers for years now, with varying degrees of success.

I believe Apple is doing the right thing, that a lot of this initial uproar is because people don't understand what is actually going on (which says a lot, because it's actually nothing new), and that, in time, it will gain acceptance.
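For anyone curious how such a check could work mechanically, here is a minimal sketch of the matching idea. All names are hypothetical; Apple's actual system uses a perceptual "NeuralHash" and a private set intersection protocol, so treat this as a conceptual illustration only, not Apple's implementation:

```python
# Minimal sketch of the matching idea described above. All names are
# hypothetical: Apple's actual system uses a perceptual "NeuralHash" and a
# private set intersection protocol, not a plain set lookup like this.

import hashlib
from pathlib import Path

# Pretend this set was downloaded to the device: hashes of *known* images.
KNOWN_HASHES: set[str] = {
    "placeholder-hash-1",  # real entries would come from NCMEC's database
}

def image_hash(path: Path) -> str:
    """Stand-in for a perceptual hash; here just SHA-256 of the file bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flag_uploads(photos_to_upload: list[Path]) -> list[Path]:
    """Return only the photos whose hash appears in the known database.
    Everything else passes through untouched and unexamined."""
    return [p for p in photos_to_upload if image_hash(p) in KNOWN_HASHES]
```

The key design point the post is making: nothing is "looked at" in the human sense; your photos only ever interact with a fixed list of hashes of already-known material.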
 
So the Police will have to get a court order to search your home.

Meanwhile, Apple will just investigate your "property" even though there is zero evidence that you are a criminal, and without a court order. This is wrong.
 
The moment Apple reads what's on my device and reports it is the moment I say au revoir to Apple. You're no different from Google and Facebook. Note this is just what we know; God knows what else they are scanning that we don't know about. What happened to "what happens on your iPhone stays on your iPhone"?

Hate to say it, but Richard Stallman was right.

Time for the FOSS community to up its game on smartphone OSes and apps for Linux. I miss you, Steve.
 
So the Police will have to get a court order to search your home.

Meanwhile, Apple will just investigate your "property" even though there is zero evidence that you are a criminal, and without a court order. This is wrong.
Of course, the old argument is "you accepted the terms of service"... which technically you did, and this is why FOSS (open source) software is better. You just use it, you do not give up any rights whatsoever, and the source code is freely available to be audited for any misuse by anyone who understands the language.
 
Anyone looking for a human-respecting alternative to iCloud, which can be viewed by Apple employees: I suggest Proton Drive, which is end-to-end encrypted. This means only the person with the password, which is you, can see what is in it.
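For anyone unfamiliar with what end-to-end encryption means in practice, here is a minimal illustrative sketch using Python's `cryptography` library. This is not Proton's actual protocol; it just shows the principle that the file is encrypted on the client before upload, so the server only ever stores ciphertext:

```python
# Illustrative end-to-end encryption, NOT Proton Drive's actual protocol.
# The point: the file is encrypted on the client before upload, so the
# server only ever stores ciphertext it cannot read.

from cryptography.fernet import Fernet  # pip install cryptography

key = Fernet.generate_key()  # stays with the user; never sent to the server
cipher = Fernet(key)

plaintext = b"my private photo bytes"
ciphertext = cipher.encrypt(plaintext)  # this is all the server ever sees

# Only someone holding `key` can recover the plaintext:
assert cipher.decrypt(ciphertext) == plaintext
```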
 
My understanding is that this is like the Google Pixel's "Now Playing" feature: a downloaded local database of hashes of known pornographic photos of children, which is compared against the photos you are uploading to iCloud Photos. That's it.

If you take photos of your kid in the bath, you're fine, unless you've already distributed those photos, they've ended up on child porn sites, and they have been found and hashed by Apple/the government.

Don't distribute kiddie porn and you'll be fine.
As for Google, it already scans and indexes all the photos you upload to Google Photos, so you shouldn't be any more concerned about this news of Apple trying to stop kiddie porn.

If anything, Apple is actually behind the curve on this one. Microsoft, Google, and Dropbox have been scanning shared images on their cloud servers for years now, with varying degrees of success.

I believe Apple is doing the right thing, that a lot of this initial uproar is because people don't understand what is actually going on (which says a lot, because it's actually nothing new), and that, in time, it will gain acceptance.
A defense using whataboutism isn't great.

The whole point of Apple is that they don't do these things. If they are just copying what MS, Google, and Dropbox are doing, they are no better than them.

Privacy my ass!
 
My understanding is that this is like the Google Pixel's "Now Playing" feature: a downloaded local database of hashes of known pornographic photos of children, which is compared against the photos you are uploading to iCloud Photos. That's it.

If you take photos of your kid in the bath, you're fine, unless you've already distributed those photos, they've ended up on child porn sites, and they have been found and hashed by Apple/the government.

Don't distribute kiddie porn and you'll be fine.
As for Google, it already scans and indexes all the photos you upload to Google Photos, so you shouldn't be any more concerned about this news of Apple trying to stop kiddie porn.

If anything, Apple is actually behind the curve on this one. Microsoft, Google, and Dropbox have been scanning shared images on their cloud servers for years now, with varying degrees of success.

I believe Apple is doing the right thing, that a lot of this initial uproar is because people don't understand what is actually going on (which says a lot, because it's actually nothing new), and that, in time, it will gain acceptance.
My understanding is that this is NOT scanning the cloud services; Apple, too, has been doing that for years. This is technology that scans all the images on your phone, on the phone itself. To be clear, uploading to the cloud is not necessary.

In my mind, this is akin to Apple, or anyone else for that matter, without suspicion or cause, simply entering your house without authorization to rummage through your belongings....
 
Getting insane flashbacks to the Apple/Google COVID exposure-notification API… read the f—ing spec.
 
My understanding is that this is NOT scanning the cloud services; Apple, too, has been doing that for years. This is technology that scans all the images on your phone, on the phone itself. To be clear, uploading to the cloud is not necessary.

Apple has been scanning the photos on your device for years! Objects, faces... why is nobody talking about that? Just search for water, rainbow, bikini, bra, and so on. And Apple determines what can be found and what I can search for. Can I turn this off? No. The technology has been around for a long time, and of course it only takes one little switch being flipped to open the door to abuse. As always.

Is there potential for abuse here, to single out opposition figures and government critics? Maybe. Are there already much better and easier ways to do that? Absolutely.
 
This is a new discussion that has nothing to do with what we were talking about.

Actually, it has everything to do with it, as anyone can see, but whatever. I'm done discussing this with you; I've already made my point abundantly clear.
 
Manually reviewed? So you are okay with your own child's photos being pulled up as some false match, and all the questions from God knows what kind of officer you'd be talking with? We will face the same discrimination issues (race/religion/gender, what not) with this kind of reviewing/policing.

Huh? In the REMOTE chance there is a false match, it will never end up in the hands of law enforcement precisely BECAUSE Apple manually reviews it. What is so hard to understand about this?
 
I think you are right in saying that the chances are slim. But then you could also justify any other exploit that puts user privacy at risk in the name of doing something good.

Also, remember that when somebody is sharing content over the internet, you cannot tell where it actually comes from.

Let's say you get a nude of a girl who looks 18-21 years old from your friend "Joe" and you save it to your photo library.

Joe got it from Alan, who in turn got it from James. James found it on a website. The guy who uploaded the picture to the website is somebody who tricked this girl into posing nude.

After a while, this girl goes to the police to report the abuse/revenge porn, and the authorities flag her nudes because she is only 15 years old (even if in the picture she looked older, or at least not underage).

Now you have a picture in your photo library that has been flagged, and your account is reported to the authorities.

Can you imagine what kind of consequences you would face as a private citizen or even more as a public figure if only people suspected that you owned child pornography because of an investigation triggered by an automated system like this?
There's an easy solution for this:

Don't download pics of girls you don't know for your spank bank.
 
Huh? In the REMOTE chance there is a false match, it will never end up in the hands of law enforcement precisely BECAUSE Apple manually reviews it. What is so hard to understand about this?
You probably need to give people an example of what trillion-to-one odds actually mean. I was correct earlier when I called people on here "stupid." 🤣
 
You probably need to give people an example of what trillion-to-one odds actually mean. I was correct earlier when I called people on here "stupid." 🤣

Well, I WILL say it's quite obvious some people here are unbalanced and are thinking with their emotions instead of reason. It's like they are incapable of recognizing the good parts of this technology while still disapproving of the bad (or at least potentially bad) aspects. Black-and-white thinking.
 
I'll say it one more time for the slow people in the back:

* The iMessage option is for family iCloud accounts only (according to the article) and allows parents to protect their kids from receiving unwanted (or wanted) sexually explicit images in iMessage. Yes, underage kids can still send naked pics to each other over WhatsApp, Snapchat, Messenger, etc.

* The iCloud scanning for child pornography is done on the device and checks against a set database of KNOWN child pornography images. No Apple employee is checking your innocent kid pics from the bathtub, or of your kid running around without a diaper on. There is a ONE IN A TRILLION chance that one of your innocent photos is so perfectly taken that it MAY match the details of one of the child pornography pics (try to even imagine those odds). That is the only case in which some poor person at Apple will verify that the images are indeed NOT the same... and nothing happens.

* If you receive a pic (from any source) that is a known child pornography pic (even if you don't know that... even if you think the person looks 18 or over) and you save it to your camera roll/iCloud, then yes, you are technically trafficking child pornography and could be prosecuted. DON'T DOWNLOAD PICS OF POSSIBLE CHILD PORN!
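Since someone asked for an example of what trillion-to-one odds actually mean, here is a rough back-of-the-envelope calculation. The 10^-12 per-account rate is Apple's claim; the one-billion account count is an assumed round number purely for illustration:

```python
# What "one in a trillion per account per year" would imply, using assumed
# round numbers: the 1e-12 rate is Apple's claim; the account count is a
# guess purely for illustration.

accounts = 1_000_000_000        # assumed: one billion iCloud accounts
p_false_flag = 1e-12            # claimed yearly false-flag rate per account

expected_false_flags = accounts * p_false_flag
print(expected_false_flags)     # 0.001 -> about one falsely flagged
                                # account every thousand years
```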
 
I don't even understand why MacRumors is so worked up about it. I mean, MacRumors comments have been telling you for YEARS.

It is Apple's iPhone, Apple's iOS, Apple's App Store, and Apple's everything. They are not yours. If you don't like it, go somewhere else.[…]
Agree. People may end up going somewhere else.
 
Surely this won't have undesired consequences. What could go wrong?
 
OK. I think this opens up a seriously dangerous precedent. I have a few pictures of my kids' private parts on my device; I have to send them to their pediatrician… and I would like to keep those. This feels really invasive… especially if I am not notified of a possible abuse of THEIR terms.

I live in a country that is way more used to natural nudity than Anglo-Saxons are… so where are the limits? This could be culture-based. I, for myself, would not want the photo lab guy (in the old days) to be poking through my photos.

Feels like a HUGE step backwards in privacy.
 
I'll say it one more time for the slow people in the back:

* The iMessage option is for family iCloud accounts only (according to the article) and allows parents to protect their kids from receiving unwanted (or wanted) sexually explicit images in iMessage. Yes, underage kids can still send naked pics to each other over WhatsApp, Snapchat, Messenger, etc.

* The iCloud scanning for child pornography is done on the device and checks against a set database of KNOWN child pornography images. No Apple employee is checking your innocent kid pics from the bathtub, or of your kid running around without a diaper on. There is a ONE IN A TRILLION chance that one of your innocent photos is so perfectly taken that it MAY match the details of one of the child pornography pics (try to even imagine those odds). That is the only case in which some poor person at Apple will verify that the images are indeed NOT the same... and nothing happens.

* If you receive a pic (from any source) that is a known child pornography pic (even if you don't know that... even if you think the person looks 18 or over) and you save it to your camera roll/iCloud, then yes, you are technically trafficking child pornography and could be prosecuted. DON'T DOWNLOAD PICS OF POSSIBLE CHILD PORN!

It has been stated that this feature will also recognize an image even if it's been cropped, rotated, color-adjusted, had pixels changed, or been transformed: basically any edit. So there is some estimation and leeway. How is it guaranteed that an adult/legal-age consenting photo that happens to have a similar "pose" or "angles" or "lighting" will not get flagged?

Nobody has explained this. They keep saying "it can't happen" or keep citing the "one in a trillion" metric, but if the feature can track any photo manipulation, there is a higher risk of legit photos being flagged.
 
It has been stated that this feature will also recognize an image even if it's been cropped, rotated, color-adjusted, had pixels changed, or been transformed: basically any edit. So there is some estimation and leeway. How is it guaranteed that an adult/legal-age consenting photo that happens to have a similar "pose" or "angles" or "lighting" will not get flagged?

Nobody has explained this. They keep saying "it can't happen" or keep citing the "one in a trillion" metric, but if the feature can track any photo manipulation, there is a higher risk of legit photos being flagged.
I don't think you understand the tech involved in identifying the photos… I don't either, on a technical level, but they've explained it enough that I have no reason to doubt it.

If the "one in a trillion" statement applies to the photo manipulation, as they clearly stated, you can disagree with it, I guess, but that doesn't mean your statement is valid. They clearly link the white paper explaining why your innocent photos will not be mistaken for known child pornography.

Again, you don't have to believe it, but that doesn't make it a false claim based on the data.
 
I don't think you understand the tech involved in identifying the photos… I don't either, on a technical level, but they've explained it enough that I have no reason to doubt it.

If the "one in a trillion" statement applies to the photo manipulation, as they clearly stated, you can disagree with it, I guess, but that doesn't mean your statement is valid. They clearly link the white paper explaining why your innocent photos will not be mistaken for known child pornography.

Again, you don't have to believe it, but that doesn't make it a false claim based on the data.

I do understand. It's not a 1:1 hash comparison; otherwise those photo manipulations would be an easy way to avoid detection. So how does a similar picture with an adult subject avoid detection when they can track all those manipulations to the picture? I don't trust the one in a trillion when they also say any manipulation is still a match.
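For context on the "not a 1:1 hash comparison" point, here is a toy perceptual hash ("average hash") in Python. It is not Apple's NeuralHash (which is a neural network and far more robust), but it illustrates why small edits barely change the hash while a genuinely different photo, even with a similar pose or lighting, lands far away:

```python
# Toy "average hash" to show why this isn't a 1:1 byte comparison. Small
# edits (crop, recolour, re-encode) barely change the hash, while a
# genuinely different photo lands far away in Hamming distance. Apple's
# NeuralHash is a neural network, far more robust than this sketch.

from PIL import Image  # pip install Pillow

def average_hash(path: str) -> int:
    """64-bit perceptual hash: 1 bit per pixel of an 8x8 grayscale thumbnail."""
    img = Image.open(path).convert("L").resize((8, 8))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for px in pixels:
        bits = (bits << 1) | (1 if px > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Count of differing bits; a small distance means 'probably the same image'."""
    return bin(a ^ b).count("1")

# Hypothetical usage with made-up filenames:
#   hamming(average_hash("original.jpg"), average_hash("original_cropped.jpg"))
#     -> small (the same underlying image survives edits)
#   hamming(average_hash("original.jpg"), average_hash("similar_pose.jpg"))
#     -> large (a merely similar pose/lighting does not collide)
```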
 
This is an invasion of privacy. I can understand Apple et al. wanting to protect the integrity of their servers, etc. However, being allowed to invade individuals' phones is the not-so-thin end of a very big wedge.

There will be mistaken allegations, people's reputations destroyed, etc. Think of baby baths, or of cultures such as Japan, where family baths, indeed communal public baths, are common.....

And of course, every authoritarian and/or dictatorial government will want the technology for its own legal purposes. Think China, Burma, Russia, Saudi Arabia, etc. As has been well reported, tech companies are experts at using the defense of "we must follow the laws of the country".....Does anyone seriously think Apple will walk away from the China market when asked to use the technology to find demonstrators?

Finally, how long before this technology is everywhere in the Apple universe: iPads, desktops, MacBook Air, MacBook Pro.....?

I've certainly raised this issue a few pages back. Apple must adhere to the laws of the countries it operates in. There is nothing stopping this tech from being abused in other dictatorships. People tend to think very narrowly, only about the US or other democracies.

What about Myanmar, Belarus, China, etc., all with gross human rights abuses in recent history? They can simply say: OK, now flag and turn over anything we have named "illegal" in our country; it is as illegal to us under our laws as CSAM is. Apple is not exactly known for standing up to countries, after all; ESPECIALLY China.

It's a VERY slippery slope to be abused, maybe not where you all live (right now, at least), but in certain parts of the world. People need to think broader and less about themselves here. This is a policy that affects all Apple users everywhere.

And let's be real: the pervs will just turn off iCloud next. Most are not doing this accidentally; they are professionals who have been at it a long time, with a high level of tech skill. So we will all give up our privacy so that Apple can catch a tiny fraction of people. All in the name of "safety." Gee, we haven't heard that argument before.

Remember, we are still taking our shoes off at airports over one basically failed non-incident 20 years ago, in the name of safety. Of course, taking your shoes off is a low burden versus invading your private photos, and a very different analysis. But the point is that pushing the safety argument to where you cannot possibly disagree with it is not a new way to invade privacy.
 
It'll be interesting to see what the iOS 15 adoption rate is. Normally they get to 50% in a matter of weeks, but I have a feeling that people who really value their privacy won't update.
 
I've certainly raised this issue a few pages back. Apple must adhere to the laws of the countries it operates in. There is nothing stopping this tech from being abused in other dictatorships. People tend to think very narrowly, only about the US or other democracies.

What about Myanmar, Belarus, China, etc., all with gross human rights abuses in recent history? They can simply say: OK, now flag and turn over anything we have named "illegal" in our country; it is as illegal to us under our laws as CSAM is. Apple is not exactly known for standing up to countries, after all; ESPECIALLY China.

It's a VERY slippery slope to be abused, maybe not where you all live (right now, at least), but in certain parts of the world. People need to think broader and less about themselves here. This is a policy that affects all Apple users everywhere.

And let's be real: the pervs will just turn off iCloud next. Most are not doing this accidentally; they are professionals who have been at it a long time, with a high level of tech skill. So we will all give up our privacy so that Apple can catch a tiny fraction of people. All in the name of "safety." Gee, we haven't heard that argument before.

Remember, we are still taking our shoes off at airports over one basically failed non-incident 20 years ago, in the name of safety. Of course, taking your shoes off is a low burden versus invading your private photos, and a very different analysis. But the point is that pushing the safety argument to where you cannot possibly disagree with it is not a new way to invade privacy.

Following on from the inevitable thin end of the wedge: for those who see no issue, what will be said when Apple deems it necessary to use similar technology to review all banking and financial transactions on iPhones, iPads, etc.? After all, money laundering, foreign corrupt practices, etc. must all be monitored in order to protect the population . . . .
 
It'll be interesting to see what the iOS 15 adoption rate is. Normally they get to 50% in a matter of weeks, but I have a feeling that people who really value their privacy won't update.
Writing here in the forum is kind of useless, I guess, as Apple neither reads it nor cares about it, but your idea should be spread. I have "nothing to hide," but I am not willing to give Apple the opportunity to look at my private pictures if they are ever flagged erroneously. It is MY OWN private stuff. Anyone controlling it violates my privacy in an unacceptable manner.

Not to mention the hassle if my account gets "erroneously" blocked, as I would then depend on the generosity of that company to restore the rights taken away by an algorithm "thinking" I am doing something unlawful.

We should elaborate on the idea of just not installing iOS 15...
 