Apple is love. Apple is life. When Apple speaks, we must listen. We must heed the call of Apple. We must never question Apple. The word of Apple is true wisdom. We must trust in Apple. Apple knows best, and I trust Apple. I will always trust Apple, with every fiber of my being, now and into the future. I am one with Apple. Apple is.

All hail Apple.
 
They already have access to every iCloud account's unencrypted data, and their normal operations must require that anyway. They could scan the photos server-side without much of a change. Also, there's no way they only just now figured out how to do on-device scanning.

On-device scanning, or CSS (client-side scanning), has been around for a while. There are privacy, legal, security, and other issues that have "warned" off anyone from trying it on the general public, and researchers have pretty much all said it's fraught with risk even if laws were passed to force it. Then there are the legal and constitutional challenges.

Knowing all this (I am assuming Apple has done their due diligence), Apple is apparently trying it anyway. It is concerning when a company that touts privacy looks to launch something that affects millions of users, its justification is "think of the children," AND every privacy and security group is against them.
 
The case of the San Bernardino shooter proves the point. Doesn't get more real than that. They gladly handed over iCloud data (in conjunction with legal due process, of course), but firmly rejected allowing access to local storage, even in light of very heavy pressure and bad press. Enough said. If you don't see it, you simply don't want to.



Again, the case of the San Bernardino shooter proves they don't want access to your local storage.
The argument was substantively different back then. They didn't possess the ability to provide local storage access. This time the ability will be there.
 
  • Like
Reactions: zakarhino and dk001
The argument was substantively different back then. They didn't possess the ability to provide local storage access. This time the ability will be there.

No, that's not correct. The whole controversy was that Apple COULD have created a back door for the FBI to access the shooter's phone, but they refused to. Obviously the FBI wouldn't have been angry at Apple for not doing something they couldn't do.
 
  • Haha
Reactions: dk001
Uh, I would have no problem with that, assuming it's a file that's without question illegal for an individual to possess. I'm puzzled why you think that would elicit a totally different reaction. If you're in possession of such files, I'd suggest you don't "share" them with Apple or any other company by uploading them to their servers. Pretty simple.
Heh, all of my music is either ripped from CDs (not illegal), purchased from iTunes, or downloaded for free from various websites that were giving away the music (also not illegal).

Most of the artists I listen to willingly give away their music anyway, as long as you follow them on SoundCloud/Facebook/Twitter or something.

As for movies, I prefer owning physical Blu-rays.
 
The argument was substantively different back then. They didn't possess the ability to provide local storage access. This time the ability will be there.
That's not really how this system works. The files on the device are secure until uploaded to iCloud. No iCloud means no access to those files.
 
On-device scanning, or CSS (client-side scanning), has been around for a while. There are privacy, legal, security, and other issues that have "warned" off anyone from trying it on the general public, and researchers have pretty much all said it's fraught with risk even if laws were passed to force it. Then there are the legal and constitutional challenges.

Knowing all this (I am assuming Apple has done their due diligence), Apple is apparently trying it anyway. It is concerning when a company that touts privacy looks to launch something that affects millions of users, its justification is "think of the children," AND every privacy and security group is against them.
If one has illegal CSAM images on their phone, they shouldn't have privacy. For those who do not have CSAM images, their privacy is not affected.
 
So many people don't realize that your phone is completely blind to the matching. There's no way for your phone to know whether a match has been found or not; it's only once the file is uploaded to the server that a match can be identified. This makes the system even more private. Apple DOESN'T want to know what your photos look like, they ONLY want to know whether CSAM is being copied to their servers or not.
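For anyone who wants that idea in code form, here's a deliberately simplified sketch (all names made up, and NOT Apple's actual NeuralHash/private-set-intersection design) of why the device itself can't know whether anything matched: it never holds the list it would need to compare against, so only the server, after upload, can identify a match.

```python
# Toy model of "the phone is blind to the matching" - illustration only.
# Apple's real system was described as using NeuralHash, blinded hash tables
# and threshold cryptography; none of that is reproduced here.
import hashlib

def device_make_voucher(photo_bytes: bytes) -> str:
    """Runs on the device: fingerprint the photo and attach the result to the
    upload. The device holds no database to compare against, so it learns
    nothing about whether this fingerprint matches anything."""
    return hashlib.sha256(photo_bytes).hexdigest()

class CloudServer:
    """Runs server-side: holds the known-bad fingerprints and does the matching."""
    def __init__(self, known_fingerprints: set):
        self._known = known_fingerprints

    def receive_upload(self, voucher: str) -> bool:
        # Only here, after upload, is a match ever identified.
        return voucher in self._known

if __name__ == "__main__":
    known = {hashlib.sha256(b"previously catalogued image").hexdigest()}
    server = CloudServer(known)

    voucher = device_make_voucher(b"family vacation photo")
    print(server.receive_upload(voucher))  # False - and the device never knew either way
```

The only point of the sketch is the asymmetry: the comparison data lives server-side, so if nothing is uploaded to iCloud, no comparison ever happens.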
 
  • Like
Reactions: usagora
So many people don't realize that your phone is completely blind to the matching. There's no way for your phone to know whether a match has been found or not; it's only once the file is uploaded to the server that a match can be identified. This makes the system even more private. Apple DOESN'T want to know what your photos look like, they ONLY want to know whether CSAM is being copied to their servers or not.

Exactly. It's unbelievable how many don't understand this (or claim they do but then talk as if it's not true).
 
Don't think a right that you never had can be taken away.

To be fair, Apple claims it is:
[attached screenshot]


HOWEVER, by that they have never meant that everything you do within the Apple ecosystem is 100% private. This is what so many people on this forum (not you) can't seem to understand. They act like it's black and white. When it comes to iCloud, they are up front about the fact that your information is subject to review in order to enforce their own TOS and to cooperate with law enforcement.

[attached screenshot]



But if someone with child porn or anything else on their phone wants to ensure that remains private, they have a choice NOT to use any iCloud-related services.
 
  • Like
Reactions: Jayson A
So many people don't realize that your phone is completely blind to the matching. There's no way for your phone to know whether a match has been found or not; it's only once the file is uploaded to the server that a match can be identified. This makes the system even more private. Apple DOESN'T want to know what your photos look like, they ONLY want to know whether CSAM is being copied to their servers or not.

While many may not, the majority (judging from posts, articles, videos, and letters) are objecting to scanning for this on-device and to the limits Apple has imposed on it (e.g., on transparency and auditing). CSAM identification is just the carrot to entice buy-in.
 
  • Like
Reactions: Mendota
While many may not, the majority (judging from posts, articles, videos, and letters) are objecting to scanning for this on-device and to the limits Apple has imposed on it (e.g., on transparency and auditing). CSAM identification is just the carrot to entice buy-in.
If what you're saying is true, then any company could be scanning for anything and you'd have no idea. Whether it's done solely on the server or via the hybrid solution Apple proposed, they can all be abused. How is this any different? How is this particular method more susceptible to being abused? The entire thing DEPENDS on iCloud to even function. If you can't see that, then I can't help you.
 
  • Like
Reactions: dk001 and usagora
I have photos of me washing my kids in the bath from birth.
Will that be flagged as "CSAM"?
What's the definition of CSAM? Can't find it anywhere.
 
  • Like
Reactions: Euronimus Sanchez
I have photos of me washing my kids in the bath from birth.
Will that be flagged as "CSAM"?
What's the definition of CSAM? Can't find it anywhere.

CSAM stands for "Child Sexual Abuse Material," which means images or video depicting children being exploited for sexual gratification. If you had typed CSAM into Google, you would have gotten tons of results, so I'm not sure how it's possible you didn't find any definition of it anywhere. You couldn't have looked very hard.

And, no, obviously photos of you washing your kids in the bath are not CSAM. Please. Besides, they are only matching file hashes of known CSAM images.
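To illustrate that last point under the poster's own "file hashes" framing (a toy example, not Apple's published approach, which described a perceptual NeuralHash rather than a plain file hash): a hash lookup can only ever flag files that are already in the known database, so a brand-new photo of your own kids has nothing in that database to match against.

```python
# Tiny illustration of "matching known hashes only" - a toy, not Apple's system.
import hashlib

known_db = {hashlib.sha256(b"previously catalogued image").hexdigest()}

# A brand-new personal photo was never catalogued, so it cannot match:
bath_photo = b"my kid in the bath, taken yesterday"
print(hashlib.sha256(bath_photo).hexdigest() in known_db)   # False

# Even a one-byte change to a catalogued file yields a completely different digest:
modified = b"previously catalogued image!"
print(hashlib.sha256(modified).hexdigest() in known_db)     # False
```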
 
If what you're saying is true, then any company could be scanning for anything and you'd have no idea. Whether it's done solely on the server or via the hybrid solution Apple proposed, they can all be abused. How is this any different? How is this particular method more susceptible to being abused? The entire thing DEPENDS on iCloud to even function. If you can't see that, then I can't help you.

That is one of the concerns.
This tool could be potentially used in a number of ways. Apple’s claim of “we won’t allow it” only works so far. Hence the call for additional detail along with transparency and audits.

“We wouldn’t know” and “only if iCloud” are not compatible claims.
To your point, Apple could make the iCloud setting immaterial and you would not know.

There is far too much we don’t know or fully understand about this and Apple’s silence isn’t helping.
Hence the concern that CSS is just the start.
 
  • Like
Reactions: Mendota
Sorry, online person. I am one of many making that claim, including some very professional folks/groups.
Ok, then provide the evidence and data that support your claim.
If big bad Apple took you to court, could you prove that this is what they intend?

Of course not. This whole thing is getting so far off it's unbelievable.
I wish you luck in your quest to find privacy in the modern online world. I guess the FBI, with the help of Apple, could be tracking you down now... lol :)
 
  • Angry
Reactions: Euronimus Sanchez
I have photos of me washing my kids in the bath from birth.
Will that be flagged as "CSAM"?
What's the definition of CSAM? Can't find it anywhere.

This should help.

If you want to know more:
 
So many people don't realize that your phone is completely blind to the matching. There's no way for your phone to know whether a match has been found or not; it's only once the file is uploaded to the server that a match can be identified. This makes the system even more private. Apple DOESN'T want to know what your photos look like, they ONLY want to know whether CSAM is being copied to their servers or not.

Then let them scan their servers. Why do they need to have scanning software on a phone? Why have they waited until now if this was such a concern? It's literally not a concern they have ever mentioned before. So again, why now?

Here’s the bottom line and the way I see it from a business perspective:

Look through these forums. Go back years, all the way back to the early days of iCloud, Macs, iPhones, and iPads, and see the feature requests people were asking and hoping for in the next version of the OS. You won't find anyone saying "gee, I hope the next version of the OS has CSAM scanning capabilities, that's what I'm looking for!" So I am of the mindset that if customers aren't asking for something, then you as a business shouldn't be forcing it onto their phones in any way either. Instead, you should be focused on what they actually ARE asking for, whether it be camera upgrades, more stable software, more storage, better keyboards in laptops... whatever.

Apple is now treating every single one of its iCloud users as guilty until proven innocent and acting as a proxy for law enforcement. I don't think that's their role, and I don't like the nanny-state thing they are doing.
 