So you're against reporting child porn to the police? Did I hear that correctly? You're ok with them finding it and blocking it, but not reporting it?

I'm sorry, but if you attempt to upload illegal material to someone else's computer (which is basically what's happening, since servers are just computers), there's no excuse NOT to report it to the authorities.
Apple isn't the police. If they want to be the police, I'm not using their stuff. CSAM will not be the last reportable offense.

You think there's no excuse not to report it? Then why did Apple refuse to unlock that mass shooter's phone for the FBI? Tons of evidence in there probably, maybe something that could've saved lives.
 
Apple isn't the police. If they want to be the police, I'm not using their stuff. CSAM will not be the last reportable offense.

Um, no, Apple isn't the police - I guess that's why they're, you know, REPORTING IT TO THE POLICE 🤦‍♂️ 🤷‍♂️ Many years ago I remember seeing a sign in a computer repair shop that stated that if they discovered child porn on your computer in the process of working on it, they would report it to the police. This is nothing new. Why on earth would you cover up or ignore a crime you're aware is actively taking place? That's crazy.

You think there's no excuse not to report it? Then why did Apple refuse to unlock that mass shooter's phone for the FBI? Tons of evidence in there probably, maybe something that could've saved lives.

There's no comparison here. You're talking about someone's LOCAL iPhone storage, not files they're uploading to iCloud. Apple DID hand over to the FBI all the shooter's iCloud info. But how can Apple report something that resides on storage they have no access to? There's an entire world of difference. Apple has always stated in no uncertain terms that they have no interest in what you keep ON your iPhone.
 
  • Like
Reactions: QCassidy352
But how can Apple report something that resides on storage they have no access to? There's an entire world of difference. Apple has always stated in no uncertain terms that they have no interest in what you keep ON your iPhone.
Because that scanning is only done for images that are about to be uploaded to iCloud, right before they're uploaded. In terms of outcome, there is virtually no difference between this and scanning your images after they are uploaded.
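To make that comparison concrete, here is a rough sketch in Swift (purely illustrative; none of the names below come from Apple, and a plain hash lookup stands in for the real NeuralHash and private-set-intersection machinery). The point is that the same lookup can run on the device just before upload or on the server just after, against the same database, and flag the same photos.

```swift
// Illustrative only; not Apple's code.
func matchesKnownDatabase(_ photoHash: String, database: Set<String>) -> Bool {
    database.contains(photoHash)
}

// On-device, just before upload:
//     if matchesKnownDatabase(hash, database: onDeviceBlindedDB) { attach voucher }
// Server-side, just after upload:
//     if matchesKnownDatabase(hash, database: serverSideDB) { flag the stored photo }
// Same predicate, same inputs, same result; only where it runs differs.
```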
 
  • Sad
  • Like
Reactions: cyanite and dk001
Um, no, Apple isn't the police - I guess that's why they're, you know, REPORTING IT TO THE POLICE 🤦‍♂️ 🤷‍♂️ Many years ago I remember seeing a sign in a computer repair shop that stated that if they discovered child porn on your computer in the process of working on it, they would report it to the police. This is nothing new. Why on earth would you cover up or ignore a crime you're aware is actively taking place? That's crazy.
They don't have to be aware. They don't have to scan for it. Is what they've been doing all this time crazy? All this stuff they're planning to do... they could just not.

Btw, I'd also be wary of the computer repair place if they're stating that. Means they're possibly looking through my files unnecessarily. I've got tax forms and stuff on here.
There's no comparison here. You're talking about someone's LOCAL iPhone storage, not files they're uploading to iCloud. Apple DID hand over to the FBI all the shooter's iCloud info. But how can Apple report something that resides on storage they have no access to? There's an entire world of difference. Apple has always stated in no uncertain terms that they have no interest in what you keep ON your iPhone.
They had access to the physical device during the investigation, and they knew a crime had been committed.
Why is the device sacred but not the cloud storage when they're holding both in their hands?
 
Because that scanning is only done for images that would be uploaded to iCloud, right before it's uploaded. In terms of outcome, there is virtually no difference between this, and scanning your images after they are uploaded.

My question was rhetorical. I know how the CSAM detection process works. There's a world of difference between Apple sticking their nose into your iPhone and snooping around out of curiosity (this is what they've vowed never to do), and checking to be sure that files you are voluntarily uploading OFF your iPhone ONTO their servers are not illegal.
 
They don't have to be aware.

Correct. You can simply not use iCloud to store your photos, and Apple will be none the wiser. Your choice. Cloud services have never been 100% private. That's the compromise you make when using someone else's computers to store your files.

They had access to the physical device during the investigation, and they knew a crime had been committed.

Again, you're talking about someone's private device's LOCAL STORAGE. Again, Apple DID give the FBI what the guy had stored in iCloud. This has always been the way they operate. Nothing is changing. Their principle is to never have access to content stored solely on someone's device. That's what all those "privacy" ads people keep posting pictures of in these threads are all about: what's on your IPHONE stays on your iPhone. iCloud ≠ iPhone.

Also, in the shooter case, you're talking about Apple looking for EVIDENCE surrounding a crime that has already taken place. With CSAM detection, they're simply enforcing their TOS for iCloud, and when that violation of the TOS also happens to be against federal law, they rightfully report those crimes to federal authorities.
 
Correct. You can simply not use iCloud to store your photos, and Apple will be none the wiser. Your choice. Cloud services have never been 100% private. That's the compromise you make when using someone else's computers to store your files.



Again, you're talking about someone's private device's LOCAL STORAGE. Again, Apple DID give the FBI what the guy had stored in iCloud. This has always been the way they operate. Nothing is changing. Their principle is to never have access to content stored solely on someone's device. That's what all those "privacy" ads people keep posting pictures of in these threads are all about: what's on your IPHONE stays on your iPhone. iCloud ≠ iPhone.
The ads say "what happens on your iPhone stays on your iPhone." Everything that's on iCloud happened on your iPhone, so really the ads were never correct. Lines get blurry anyway given how many parts of the OS phone home.

As it stands, I already don't use iCloud Photos. Too expensive. But given that the scans occur on-device, I don't trust that they're not going to enable this for local photos later. And as for this principle of never having access to local storage that you're mentioning, Apple has never promised or even claimed it.
 
Could you imagine the insanity here if it wasn’t CSAM Apple was targeting, but any illegal pirated files whatsoever, like mp3s and shows you can’t otherwise buy, but are still illegal to have? Hey, illegal is illegal, you better stop it. That’s a good Apple user.

The tone would be very different for many of the “think of the children” mindset.

 
My question was rhetorical. I know how the CSAM detection process works. There's a world of difference between Apple sticking their nose into your iPhone and snooping around out of curiosity (this is what they've vowed never to do), and checking to be sure that files you are voluntarily uploading OFF your iPhone ONTO their servers are not illegal.
Well, to me, the distinction is overly fine. Kinda like asking whether you want to pay at the start of your meal or at the end; it doesn't really impact the dining experience either way.
 
  • Like
Reactions: cyanite
So you're against reporting child porn to the police? Did I hear that correctly? You're ok with them finding it and blocking it, but not reporting it?

I'm sorry, but if you attempt to upload illegal material to someone else's computer (which is basically what's happening, since servers are just computers), there's no excuse NOT to report it to the authorities.
Are you okay with someone coming into your house and searching for things to possibly report you on?
 
The ads say "what happens on your iPhone stays on your iPhone." Everything that's on iCloud happened on your iPhone, so really the ads were never correct.

Baloney. When you enable iCloud, you enter a legally binding agreement to abide by their terms of service and acknowledge that Apple can take whatever steps necessary and within the law to enforce those terms. You are voluntarily moving files OFF your iPhone.


And Apple has never claimed this principle of not having access to local storage.

Uh, that's precisely what all those "billboard" ads were claiming.

As it stands, I already don't use iCloud Photos. Too expensive. But given that the scans occur on-device, I don't trust that they're not going to enable this for local photos later.

Why would they care about your local-only photos? They're concerned about what you're putting on their servers, not on your phone.
 
Well, to me, the distinction is overly fine. Kinda like asking whether you want to pay at the start of your meal or at the end; it doesn't really impact the dining experience either way.

Huh? It's like the difference between someone storing illegal drugs in their own home and someone taking their illegal drugs to someone else's home and asking them to keep them safe for them. Hardly an "overly fine" distinction.
 
Huh? It's like the difference between someone storing illegal drugs in their own home and someone taking their illegal drugs to someone else's home and asking them to keep them safe for them. Hardly an "overly fine" distinction.
No, it's nothing like that at all. Not to me, at least.
 
Could you imagine the insanity here if it wasn’t CSAM Apple was targeting, but any illegal pirated files whatsoever, like mp3s and shows you can’t otherwise buy, but are still illegal to have? Hey, illegal is illegal, you better stop it. That’s a good Apple user.

Uh, I would have no problem with that, assuming it's a file that's without question illegal for an individual to possess. I'm puzzled why you think that would elicit a totally different reaction. If you're in possession of such files, I'd suggest you don't "share" them with Apple or any other company by uploading them to their servers. Pretty simple.
 
Why would they care about your local-only photos? They're concerned about what you're putting on their servers, not on your phone.
You don’t get it. They are putting software on phones that scan locally FIRST. I don’t like that. For the reasons I outlined above earlier today.

No, of course not. What does that have to do with this? We're talking about photos that you are uploading to someone else's server.
See above. And why weren’t they concerned about their servers before now?
 
You don’t get it. They are putting software on phones that scan locally FIRST. I don’t like that. For the reasons I outlined above earlier today.

No, YOU don't get it. As Apple has so clearly explained (had you bothered to do your due diligence and read the original sources that explain the process), Apple has no access to that scan data. It's only once a threshold is reached (30 detected illegal images) that Apple is able to decrypt that scan data, and then only the data concerning the photos in question. They have effectively locked themselves out of everything else.
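For anyone who wants the gist of that threshold behavior in code form, here is a minimal sketch in Swift. It is not Apple's implementation: the real system uses NeuralHash perceptual hashes, a blinded on-device database, and threshold secret sharing so even the match count stays hidden from Apple. A plain counter and SHA-256 digests simply stand in to illustrate the stated flow: only photos queued for iCloud upload are checked, and nothing becomes reviewable until roughly 30 matches accumulate.

```swift
import Foundation
import CryptoKit

// Conceptual sketch only, NOT Apple's implementation. SHA-256 and a plain
// counter stand in for NeuralHash, the blinded hash database, private set
// intersection, and threshold secret sharing.

let knownHashes: Set<String> = []   // hypothetical stand-in for the hash database
let reportThreshold = 30            // Apple's stated match threshold

// Hex digest of an image's bytes (stand-in for a perceptual hash).
func digest(of imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// Runs only over photos queued for upload to iCloud Photos.
func countMatches(in photosQueuedForUpload: [Data]) -> Int {
    photosQueuedForUpload.filter { knownHashes.contains(digest(of: $0)) }.count
}

let matches = countMatches(in: [])  // empty queue, purely for illustration
if matches >= reportThreshold {
    print("Threshold reached (\(matches) matches): vouchers become reviewable")
} else {
    print("Below threshold (\(matches)/\(reportThreshold)): nothing is revealed")
}
```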

See above. And why weren’t they concerned about their servers before now?

That's the whole irony of this. They hadn't scanned before because they hadn't figured out a way to do it in a manner they deemed privacy-conscious enough. Now that they've developed this method which is far more private, people are freaking out, claiming it's an invasion of privacy. It's truly astounding.
 
Baloney. When you enable iCloud, you enter a legally binding agreement to abide by their terms of service and acknowledge that Apple can take whatever steps necessary and within the law to enforce those terms. You are voluntarily moving files OFF your iPhone.




Uh, that's precisely what all those "billboard" ads were claiming.



Why would they care about your local-only photos? They're concerned about what you're putting on their servers, not on your phone.
So, the default settings take files off your iPhone onto iCloud. Being generous, I could say it was still staying on your phone, in a way, if the private keys stayed on your phone (like with iMessage), but in this case they don't. Or even without iCloud enabled, the device phones home. That's all fine, but the billboard was never accurate in any way. Also, idk how you're drawing from a short marketing phrase that Apple treats local storage differently from cloud, and it's weak in the first place to go based on an ad rather than something more serious like the ToS or an exec statement.

Why would they care about local-only photos? Because someone's local photos might be criminal. If not in the US, maybe somewhere else. This is the same company that will presumably "just obey local laws" in a country tracking adolescent video gaming hours... for the same reason too, "think of the children."
 
That's the whole irony of this. They hadn't scanned before because they hadn't figured out a way to do it in a manner they deemed privacy-conscious enough. Now that they've developed this method which is far more private, people are freaking out, claiming it's an invasion of privacy. It's truly astounding.
How do you know they hadn't figured out a way before? This tech isn't very new, and iPhones have been equipped with neural processing units for some time now for exactly this kind of task.

Even so, they've been doing fine without the scans all this time.
 
  • Like
Reactions: MadeTheSwitch
I noticed you made no effort to explain how it's not. Do you own your iPhone? Yes. Do you own Apple's servers? No.
I already did in my post above.

The photos reside on my iPhone, and are scanned only as a precursor to being uploaded to iCloud. If iCloud is turned off, or if I upload them to another service like Dropbox or Google Drive, then Apple doesn't scan them (that's not to say that these other companies don't do any sort of vetting on their end).

From my perspective, it makes no difference to me that I own my iPhone or that Apple owns their iCloud servers, or who really owns what at the end of the day, or at what step the photos get scanned by Apple. It's all the same outcome to me, and I don't really care whether it's being done via hardware or software.

I get that some people are unnerved that Apple could "change the deal" down the road, and go from scanning for CSAM to scanning for other content like Tiananmen Square images for the Chinese government, or that some bad actor could potentially frame someone else by uploading CSAM imagery to their devices. Suffice to say, I am not particularly concerned about this because I feel that Apple has already done a very good job of detailing the numerous safeguards they have in place, and there are likely far easier ways for a government to "fix" its people if they really wanted to without having to go through Apple.

My intention is not to change your mind, but these are my thoughts on the matter and well, that's that.
 
  • Like
Reactions: hot-gril
So, the default settings take files off your iPhone onto iCloud. Being generous, I could say it was still staying on your phone, in a way, if the private keys stayed on the phone, but they don't. Or even without iCloud enabled, the device phones home. That's all fine, but the billboard was never accurate in any way. Also, idk how you're drawing from that short phrase that Apple treats local storage differently from cloud, and it's weak in the first place to go based on an ad rather than something more serious like the ToS or an exec statement.

The case of the San Bernardino shooter proves the point. Doesn't get more real than that. They gladly handed over iCloud data (in conjunction with legal due process, of course), but firmly rejected allowing access to local storage, even in light of very heavy pressure and bad press. Enough said. If you don't see it, you simply don't want to.

Why would they care about my local-only photos? Because someone's local photos might be criminal.

Again, the case of the San Bernardino shooter proves they don't want access to your local storage.
 
  • Like
Reactions: cyanite
The case of the San Bernardino shooter proves the point. Doesn't get more real than that. They gladly handed over iCloud data (in conjunction with legal due process, of course), but firmly rejected allowing access to local storage, even in light of very heavy pressure and bad press. Enough said. If you don't see it, you simply don't want to.



Again, the case of the San Bernardino shooter proves they don't want access to your local storage.
Nor did they want to scan iCloud photos until recently. The San Bernardino shooting occurred ~6 years ago.
 