As was literally pointed out earlier in this thread, other companies have already been doing this for over a decade!
Do you think Google Photos users are worried about people storming into the house and killing them over false positives? Of course not.
Completely ridiculous argument. Apple's not losing billions over anything; they're not the first huge company to implement this.
Scanning files in a directory is not this.
 
99.9% of you seriously have a reading comprehension issue, are just plain stupid, or are simply bored and want something to complain about.

The very first picture at the top of the article:
* ON-DEVICE SOFTWARE... not cloud-based
* APPLE NEVER SEES YOUR PHOTOS
* ZERO REPORTING DONE BY APPLE (for iMessage; yes, they may report known child pornography)! Only the receiver of the pics is warned and may then report if necessary/appropriate.

This is an OPTIONAL feature that parents can use to protect kids. And if some woman (more often than men, typically) is sick of receiving unwanted d**k pics, I guess she could turn it on as well.

* Edited to clarify what Apple may report on.
 
Your problem here is that, given how the Fourth Amendment and subpoenas work, anywhere you go for cloud storage will have the same problem. You won't physically own your data on those servers; the people who own the servers will. This is the same problem everyone has with Facebook: when you upload photos or make posts there, they own those posts and pictures, as they are now in possession of them, not you.

That concept carries over to any cloud service, so user beware.

BL.

Importantly, this is not a legal requirement; the companies choose to structure their legal T&Cs like this. They are not required to own content you upload to their cloud.

Another way to get around this is to ACTUALLY end-to-end encrypt the data on cloud servers such that Apple getting access is impossible (the Lavabit strategy). That way the legal conversation goes: "Sure, you can have access to our servers, but the data is completely useless, as it's literally impossible for anyone except the owner to read it."
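For what it's worth, here is a minimal sketch of that idea in Swift with CryptoKit: the key is generated and kept on the device, so the server only ever stores ciphertext it has no key for. The function names and the upload flow are my own illustration, not any real Apple API.

```swift
import CryptoKit
import Foundation

// Sketch: encrypt a photo on-device before upload, so the cloud
// provider stores only ciphertext it cannot decrypt.
// The upload step itself is omitted; nothing here is a real Apple API.

// 256-bit symmetric key generated and held on the device
// (in practice it would live in the keychain / Secure Enclave).
let deviceKey = SymmetricKey(size: .bits256)

func encryptForUpload(_ photoData: Data, using key: SymmetricKey) throws -> Data {
    // AES-GCM provides confidentiality and integrity; `combined`
    // packs nonce + ciphertext + auth tag into one storable blob.
    try AES.GCM.seal(photoData, using: key).combined!
}

func decryptAfterDownload(_ blob: Data, using key: SymmetricKey) throws -> Data {
    let box = try AES.GCM.SealedBox(combined: blob)
    return try AES.GCM.open(box, using: key)
}

// The server only ever sees `blob`; without `deviceKey` it is noise.
let photo = Data("pretend this is a JPEG".utf8)
let blob = try encryptForUpload(photo, using: deviceKey)
let roundTrip = try decryptAfterDownload(blob, using: deviceKey)
assert(roundTrip == photo)
```

This is essentially what Lavabit did for email: the provider can hand over its servers under subpoena, but the plaintext was never theirs to give.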
 
This is great. The iMessage feature is a smart way to stop unwanted pics from circulating.
I have 3 kids. They all use iMessage to communicate with my wife and me. They don't use iMessage to communicate with their friends, though, so I don't know how much of an impact this will have if other messaging apps don't get on board and do the same thing.
 
so Apple wants to do police work

by mass surveillance

on their customers' phones

within customers' private photos

on the basis of a third party database

which can contain anything

all this to maybe catch a few

😳

maybe I’ll do what the perverts did a long time ago

and switch to a dumb phone
 
Yes, but that means Apple will be scanning your photos. And trust me, soon there's going to be a completely different justification for the invasion. That is the reason I don't use any cloud service, Apple or any other.
But the CSAM paper says photos are scanned on-device against hashes. They're not scanned in the cloud; they're scanned on the device. I honestly don't see the problem here.
Do I have images of this nature on my phone? No. Would I want my kids to be notified of the dangers behind images like this? Yes.

But most importantly, stepping out of this whole situation for a moment - which I don't think some people can - is it tangibly going to affect my life or usage of my iPhone? Absolutely not.

It's for the greater good, and I think people need to accept that.

Last point: I honestly don't think Apple, who are under so much privacy scrutiny at the moment, would announce a feature like this unless it were absolutely watertight. And from reading the technical paper, I do believe it is.
 
99.9% of you seriously have a reading comprehension issue, are just plain stupid, or are simply bored and want something to complain about.

The very first picture at the top of the article:
* ON-DEVICE SOFTWARE... not cloud-based
* APPLE NEVER SEES YOUR PHOTOS
* ZERO REPORTING DONE BY APPLE! Only the receiver of the pics is warned and may then report if necessary/appropriate.

This is an OPTIONAL feature that parents can use to protect kids. And if some woman (more often than men, typically) is sick of receiving unwanted d**k pics, I guess she could turn it on as well.

I guess you're in that 99.9%, because literally every bullet point you just wrote down does nothing to dissuade the concerns people have. It's not an "OPTIONAL" feature either; that's just the nude-scanning part of iMessage. I guess you would know that if you weren't in the 99.9% of people who have reading comprehension issues, according to you.
 
99.9% of you seriously have a reading comprehension issue, are just plain stupid, or are simply bored and want something to complain about.

The very first picture at the top of the article:
* ON-DEVICE SOFTWARE... not cloud-based
* APPLE NEVER SEES YOUR PHOTOS
* ZERO REPORTING DONE BY APPLE! Only the receiver of the pics is warned and may then report if necessary/appropriate.

This is an OPTIONAL feature that parents can use to protect kids. And if some woman (more often than men, typically) is sick of receiving unwanted d**k pics, I guess she could turn it on as well.
On-device software is the problem. The child porn is a red herring.

This is a local hash-analysis engine.
It can be updated to be the "white supremacy" finder. So who cares? That's bad too!
How about the "racist word" hash finder?
How about the "I voted for Trump/Biden" hash finder?

Do you get it yet?
 
Importantly, this is not a legal requirement; the companies choose to structure their legal T&Cs like this. They are not required to own content you upload to their cloud.

Another way to get around this is to ACTUALLY end-to-end encrypt the data on cloud servers such that Apple getting access is impossible (the Lavabit strategy). That way the legal conversation goes: "Sure, you can have access to our servers, but the data is completely useless, as it's literally impossible for anyone except the owner to read it."

This would also depend on the state of the data as it rests on the server. If the data sits unencrypted on the server, then the "end-to-end" encryption is effectively useless: it only covered the data in transit, not at rest. If the data at rest is unencrypted, you still have the problem.

And from the way the article describes Apple creating and comparing hashes, it implies that the data could be sitting unencrypted on Apple's servers.

BL.
 
99.9% of you seriously have a reading comprehension issue, are just plain stupid, or are simply bored and want something to complain about.

The very first picture at the top of the article:
* ON-DEVICE SOFTWARE... not cloud-based
* APPLE NEVER SEES YOUR PHOTOS
* ZERO REPORTING DONE BY APPLE! Only the receiver of the pics is warned and may then report if necessary/appropriate.

This is an OPTIONAL feature that parents can use to protect kids. And if some woman (more often than men, typically) is sick of receiving unwanted d**k pics, I guess she could turn it on as well.

You may want to include yourself in that reading comprehension issue, because straight from the OP:

Second, starting this year with iOS 15 and iPadOS 15, Apple will be able to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, enabling Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies.

This is cloud-based, as it also pertains to iCloud Photos. So it may help to practice what you preach: instead of carrying on about the perceived stupidity of other users, RTFM yourself so you don't get caught up in your own rants.

BL.
 
But most importantly, stepping out of this whole situation for a moment - which I don't think some people can - is it tangibly going to affect my life or usage of my iPhone? Absolutely not.

^ And that, ladies and gentlemen, is why authoritarian powers will continue their rampage of power grabs and abuse: so long as their actions don't "tangibly affect my life or usage of my iPhone" today, they're of no concern to the average citizen.

I'm going to stop caring about climate change, racism, mass surveillance, rampant capitalism, and most other urgent issues, because as of right now I can still 1) shop at Whole Foods and 2) open Instagram in the morning. My life isn't inconvenienced by those things today, so I shouldn't concern myself with them.
 
Importantly, this is not a legal requirement; the companies choose to structure their legal T&Cs like this. They are not required to own content you upload to their cloud.

Another way to get around this is to ACTUALLY end-to-end encrypt the data on cloud servers such that Apple getting access is impossible (the Lavabit strategy). That way the legal conversation goes: "Sure, you can have access to our servers, but the data is completely useless, as it's literally impossible for anyone except the owner to read it."
To clear things up… uploading to the cloud is not handing over your ownership.

“License from You. Except for material we may license to you, Apple does not claim ownership of the materials and/or Content you submit or make available on the Service. However, by submitting or posting such Content on areas of the Service that are accessible by the public or other users with whom you consent to share such Content, you grant Apple a worldwide, royalty-free, non-exclusive license to use, distribute, reproduce, modify, adapt, publish, translate, publicly perform and publicly display such Content on the Service solely for the purpose for which such Content was submitted or made available, without any compensation or obligation to you. You agree that any Content submitted or posted by you shall be your sole responsibility, shall not infringe or violate the rights of any other party or violate any laws, contribute to or encourage infringing or otherwise unlawful conduct, or otherwise be obscene, objectionable, or in poor taste. By submitting or posting such Content on areas of the Service that are accessible by the public or other users, you are representing that you are the owner of such material and/or have all necessary rights, licenses, and authorization to distribute it.”

Law enforcement requires an order from a court (a subpoena) because the files are still technically your property that law enforcement is seizing.

Note the difference in wording between the “publicly available” areas you upload to and the non-public ones, and, more explicitly, that Apple states it does not claim ownership.

No cloud service (of the kind we think of in normal use) would claim ownership of user uploads; that's a legal nightmare. That's why they make you agree to a contract (by hitting “I agree”) that lets them use the photos however they want (at least in Facebook's case), but they do not own the actual photos.
 
So a few takeaways here for me:
1. The “scanning hash” comes from NN/ML image identification, and that will always carry a chance of false positives; there's no telling whether the “human reviewers” will stay in line most of the time. Apple can ease people's minds by citing a ridiculously low false-positive number, but I highly doubt that holds in practice. (A toy illustration of why such hashes can collide follows this list.)
2. “Nothing to worry about because nothing to hide” is the catalyst for full-on 1984. Don't worry, we are on our way there, faster than ever.
3. You disagree with those EULAs? Then there is literally no device you can use. Live like a caveman? Not an option now, and it will NOT be an option in the near future.
4. On-device scanning: this time it's for CSAM. What's next? Political content? Vaguely “offensive” material as deemed by countries and law enforcement? The possibilities are endless. But yeah, don't worry, there will definitely be a nicely disguised name for scanning all sorts of content, not just images.
5. Tom Scott has an amazing short video imagining a world where people of different classes literally see different versions of the same world. I have a feeling it will come true much faster than people might think.
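To make point 1 concrete, here is a toy "difference hash" in Swift. This is my own illustration, not Apple's NeuralHash, but it shows the underlying trade-off: because the hash captures coarse image structure rather than exact bytes, two different images can hash identically, which is exactly where false positives come from.

```swift
// Toy "difference hash" (dHash), a simple perceptual hash.
// Each bit records whether a pixel is brighter than its right
// neighbour on an 8x9 grayscale grid, giving a 64-bit hash.
func dHash(_ gray: [[Int]]) -> UInt64 {
    var hash: UInt64 = 0
    for row in 0..<8 {
        for col in 0..<8 {
            hash <<= 1
            if gray[row][col] > gray[row][col + 1] { hash |= 1 }
        }
    }
    return hash
}

// An 8x9 "image" and a uniformly brightened copy: every pixel value
// differs between the two buffers...
let original = (0..<8).map { r in (0..<9).map { c in (r * 37 + c * 53) % 101 } }
let brightened = original.map { $0.map { $0 + 50 } }

// ...yet the hashes are identical, because the *relative* structure
// is unchanged. Distinct photos that happen to share coarse structure
// can collide the same way: that's a false positive.
print(dHash(original) == dHash(brightened)) // true
```

That robustness to pixel changes is the whole point of a perceptual hash (a cryptographic hash would miss every recompressed copy), but it is also why a match can never be treated as proof on its own.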
 
So a few takeaways here for me:
1. The “scanning hash” comes from NN/ML image identification, and that will always carry a chance of false positives; there's no telling whether the “human reviewers” will stay in line most of the time. Apple can ease people's minds by citing a ridiculously low false-positive number, but I highly doubt that holds in practice.
2. “Nothing to worry about because nothing to hide” is the catalyst for full-on 1984. Don't worry, we are on our way there, faster than ever.
3. You disagree with those EULAs? Then there is literally no device you can use. Live like a caveman? Not an option now, and it will NOT be an option in the near future.
4. On-device scanning: this time it's for CSAM. What's next? Political content? Vaguely “offensive” material as deemed by countries and law enforcement? The possibilities are endless. But yeah, don't worry, there will definitely be a nicely disguised name for scanning all sorts of content, not just images.
5. Tom Scott has an amazing short video imagining a world where people of different classes literally see different versions of the same world. I have a feeling it will come true much faster than people might think.

Your point number 3 is important, because there are a few people in this thread who seriously think "If you don't like it, then don't upgrade your OS" is a legitimate response, in the same way they probably think "If you don't like this country, then leave" is a legitimate response.
 
I guess you're in that 99.9%, because literally every bullet point you just wrote down does nothing to dissuade the concerns people have. It's not an "OPTIONAL" feature either; that's just the nude-scanning part of iMessage. I guess you would know that if you weren't in the 99.9% of people who have reading comprehension issues, according to you.
Read it again: not only is it optional for iMessage, it also only applies to users with a family account set up. Individual users most likely won't even be able to activate it. Parents protecting their kids... always a good option to have.

As for the iCloud portion of the article, this compares against a set list of images in a database. I'm not worried about it... I don't have KNOWN (or unknown) child porn pics on my phone and shared to iCloud for Apple to compare against.

So, for the folks out there creating your own child porn and saving it to iCloud... don't worry, you'll be fine as long as you don't share it across the internet and get it added to the database of images. :rolleyes:

This is a major problem in the world and I'm glad it is being addressed at least on some level.
 
The main problem I see is that this demonstrates an access point, or backdoor, into people's phones that Apple supposedly didn't have before, and on a massive scale. As per the article, this doesn't scan iCloud; it scans the user's phone before anything goes to iCloud, then reports the phone to Apple if something is found.

Which means they can potentially scan users' phones against any database. Once you get past looking at the apps, it's all a hierarchical filesystem underneath.

What's to stop a dictatorial country from demanding Apple scan against its own database of known illegal images, then report its citizens? Perhaps databases of adult pornography or gay imagery, or even political or religious imagery.

It's also scanning text messages for types of images, not just those found in databases, which potentially means any type of image may be flagged by this system even if it doesn't exist in a database; all they'd need to do is change the criteria.

If they have the ability to scan images on users' devices, text would be far simpler: flagging keywords in messages or outlawed books.

Apple knows full well how badly this may be twisted. That's why they introduced it with the tried-and-true method of "Won't somebody think of the children."

Countries can already force Apple to scan the photos on its servers. The way this works is that your own device does the scan and reports matches only if you have iCloud syncing turned on. If you don't like it, turn syncing off; you were already reducing your privacy by having syncing enabled, after all.
 
HUGE PROBLEM WITH THIS. It doesn't matter to me that it will only flag "known bad images". In order to find the known bad images, ALL of my images will be processed through whose servers? Or will my own phone scan and report back that I'm "clean"? And how exactly are these scanned? By counting pixel colors, comparing to a database, and then keeping tabs on that photo to see if it's a problem? Or literal matching against the "known bad images" database?

This literally nukes Apple's whole privacy marketing angle. If you are scanning photos in any way, even at the metadata level, then privacy on the phone is 💀.
 
On-device software is the problem. The child porn is a red herring.

This is a local hash-analysis engine.
It can be updated to be the "white supremacy" finder. So who cares? That's bad too!
How about the "racist word" hash finder?
How about the "I voted for Trump/Biden" hash finder?

Do you get it yet?

No, reporting all those things sounds like a great idea to me.
 
You may want to include yourself in that reading comprehension issue, because straight from the OP:

Second, starting this year with iOS 15 and iPadOS 15, Apple will be able to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, enabling Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies.

This is cloud-based, as it also pertains to iCloud Photos. So it may help to practice what you preach: instead of carrying on about the perceived stupidity of other users, RTFM yourself so you don't get caught up in your own rants.

BL.
I already answered this above, but why would anyone be worried about this unless they have known child porn images on their phone or in their iCloud account?

It's not scanning your iCloud photos and reporting you because you took pics of your kids in the bathtub or running around naked, like so many parents do.

These are KNOWN images shared for use in child pornography rings around the world... if a person has these in their iCloud account, they should be questioned about it.
 
HUGE PROBLEM WITH THIS. It doesn't matter to me that it will only flag "known bad images". In order to find the known bad images, ALL of my images will be processed through whose servers? Or will my own phone scan and report back that I'm "clean"? And how exactly are these scanned? By counting pixel colors, comparing to a database, and then keeping tabs on that photo to see if it's a problem? Or literal matching against the "known bad images" database?

This literally nukes Apple's whole privacy marketing angle. If you are scanning photos in any way, even at the metadata level, then privacy on the phone is 💀.

The scanning is done on your own device: the photo is hashed with Apple's NeuralHash algorithm and compared against a set of known hashes through a cryptographic matching protocol. If you have iCloud syncing turned off, nothing happens.
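As a rough sketch only, the gate described above might look like the following in Swift, where `neuralHash(of:)` and `attachSafetyVoucher(to:)` are hypothetical stand-ins I've invented. The real system uses private set intersection, so the device itself never learns whether a photo matched.

```swift
import Foundation

// Toy sketch of the on-device gate. `neuralHash(of:)` and
// `attachSafetyVoucher(to:)` are invented stand-ins; the real design
// hides the match result from the device via private set intersection,
// and human review only happens past a threshold of matches.

struct Photo {
    let id: UUID
    let pixels: Data
}

// Pretend database of known-CSAM hashes, shipped as opaque blobs.
let knownHashes: Set<Data> = []

func neuralHash(of photo: Photo) -> Data {
    // Placeholder: a real perceptual hash is derived from image
    // content, so near-duplicate images map to the same value.
    photo.pixels.prefix(16)
}

func scanBeforeUpload(_ photo: Photo, iCloudPhotosEnabled: Bool) {
    // The gate: with iCloud Photos off, no scan happens at all.
    guard iCloudPhotosEnabled else { return }

    if knownHashes.contains(neuralHash(of: photo)) {
        attachSafetyVoucher(to: photo)
    }
}

func attachSafetyVoucher(to photo: Photo) {
    // In the real design, a cryptographic "safety voucher" rides
    // along with the upload; here we just log.
    print("match recorded for photo \(photo.id)")
}
```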
 