Probably will not matter.

This doesn’t bother me, I’ve long assumed it already happened.
 
There are already alternatives to iCloud, and they scan for CSAM too. If you don't want your photos scanned for CSAM, store them only on your own devices, such as a NAS on your home network.

If you want full digital privacy, don't use email. Only communicate over end-to-end encrypted messaging services. Delete any Google, Facebook, or other online accounts; in fact, use no Google products or services at all. Only connect to the internet via a VPN from a laptop running a secure Linux distro. Definitely don't use a smartphone.
I'm not bothered about server-side scanning for CSAM, but I don't like the idea that my own device would be used.
 
  • Like
Reactions: Miat and Pummers
It's used only when uploading to iCloud Photos (in particular). If you don't use iCloud Photos, nothing happens.

For now, yes. What about next year? Or the year after? The thing is, prior to these changes Apple had neither the policy nor the technical capability to check private, non-shared pictures. Now, they are not changing the policy, but they are introducing the capability. This means that the policy change is just a political question: the technical basis will already be there.
 
For now, yes. What about next year? Or the year after? The thing is, prior to these changes Apple had neither the policy nor the technical capability to check private, non-shared pictures. Now, they are not changing the policy, but they are introducing the capability. This means that the policy change is just a political question: the technical basis will already be there.
You can say that about anything.
 
For now, yes. What about next year? Or the year after? The thing is, prior to these changes Apple had neither the policy nor the technical capability to check private, non-shared pictures.

... that you (we) know of ...

Right or wrong, at least they're giving an appearance of openness about it.

Though truth be told, unless you're gonna go full Edward Lyle or Erik Heller, any sense of digital privacy today is an illusion.
 
  • Like
Reactions: Maconplasma
[tinfoil-hat]
... and just exactly how do you know they haven't had this capability all along?
[/tinfoil-hat]

:D
I don't, but it's confirmed now, no ifs, ands, or buts about it. So I actually trust my iPhone less than my Android phone (for now; I'm sure Samsung will do the same thing, since Apple led the way).

At least I still trust my Apple Watch. :)
 
For now, yes. What about next year? Or the year after? The thing is, prior to these changes Apple had neither the policy nor the technical capability to check private, non-shared pictures. Now, they are not changing the policy, but they are introducing the capability. This means that the policy change is just a political question: the technical basis will already be there.
And Apple is currently in business. But what about next year, or the year after that? They could go out of business for all we know. I'm not sure why you're creating a future that doesn't exist. You can only cross bridges when you get to them. I will say this: you are generally the voice of reason here and one of the more respectable posters, and for you to be annoyed at Apple like this says a lot. Perhaps it's time for you to move on to an Apple competitor, because saying "What about next year, etc..." suggests that you're done with Apple, and perhaps Samsung or something in the Android world for phones and the Windows world for computers is your forthcoming destiny.
 
No plans to abandon Apple. All these companies are doing the same thing: Google, Microsoft, Amazon, etc.

Even Adobe does it in their software. And they do the same for images/scans of US currency; maybe even non-US currency. Just try to manipulate a high-res scanned image of a US $10 bill in Photoshop... if your scanner will even complete the scan in the first place.

Printer makers have been embedding steganographic microdots on images that resemble money for decades now.

I'm not a lawyer, but I'm pretty sure convictions based on tips from Google, et al., have held up in court. So I don't think they're breaking any laws [no, I don't have citations, so feel free to prove me wrong if you know otherwise].

It's been happening for probably more than 10 years already, and I'm not aware of anyone being unjustly hauled off to jail for kiddieporn based solely on what Google said [again, prove me wrong if you're aware of case law]. And remember that just being in possession of such images is illegal; storing them in your iCloud is just as illegal (not to mention stupid).

At least with Apple, they're being up front about it, and most of the processing happens on-device, minimizing server-side surveillance. Apple is fully within their rights to take any means necessary to stop you from storing kiddieporn on their servers or transferring it across their networks.
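
For the curious, here is a minimal sketch of what "matching on-device before upload" can look like. Apple's actual NeuralHash is a proprietary neural network and the real database is blinded, so everything below (the simple 8x8 average hash, the example hash value, the function names) is invented purely for illustration:

Code:
from PIL import Image

def average_hash(path: str) -> int:
    """Downscale to 8x8 grayscale, then set one bit per pixel above the mean."""
    img = Image.open(path).convert("L").resize((8, 8))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

# Illustrative stand-in for the (blinded) database of known-bad hashes.
KNOWN_BAD_HASHES = {0x0123456789ABCDEF}

def matches_known_database(path: str) -> bool:
    # The check runs locally; in Apple's design only uploads to iCloud
    # Photos carry the resulting "safety voucher".
    return average_hash(path) in KNOWN_BAD_HASHES

A real perceptual-hash comparison would tolerate small Hamming distances rather than require exact set membership, but the shape of the idea is the same: hash locally, compare against a fixed list, and nothing happens unless the photo is being uploaded.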
 
In the end, you'll have some middle-aged guys rummaging through pictures of boobs that a horny 15-year-old sends to her teen boyfriend/girlfriend because it has triggered an ML child-porn filter, possibly involving the police and basically creating a lot of discomfort for everyone involved (except probably for the middle-aged guy reviewing the photo). What benefit can be gained from that, frankly, I have no idea.
It's a very good thing that Apple isn't doing that and has stated repeatedly in public that they have no plans to do so and will resist governmental efforts to force them.

Either you trust Apple with your privacy or you don't. The CSAM matching of hashes doesn't change that. Apple is not going to report people to the police, even those over the threshold of matches. That is someone else's job.
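
To make the threshold part concrete, here is a toy version of the counting logic, assuming some perceptual-hash function like the one sketched earlier. The real system enforces the threshold cryptographically (threshold secret sharing), so the server learns nothing at all below it; the plain counter here is purely illustrative, and 30 is the review-threshold figure Apple has cited publicly:

Code:
MATCH_THRESHOLD = 30  # roughly the review threshold Apple has cited

def matched_count(photo_hashes: list[int], known_bad: set[int]) -> int:
    """Count photos whose perceptual hash appears in the known-CSAM hash set."""
    return sum(1 for h in photo_hashes if h in known_bad)

def eligible_for_human_review(photo_hashes: list[int], known_bad: set[int]) -> bool:
    # In the real design nothing is even decryptable server-side below the
    # threshold; this trusted counter mirrors the policy, not the crypto.
    return matched_count(photo_hashes, known_bad) >= MATCH_THRESHOLD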
 
For now, yes. What about next year? Or the year after? The thing is, prior to these changes Apple had neither the policy nor the technical capability to check private, non-shared pictures. Now, they are not changing the policy, but they are introducing the capability. This means that the policy change is just a political question: the technical basis will already be there.
Of course Apple had the technical ability. What do you think is happening when Photos scans your photo library to tag content and match faces to names? If that isn't the capability to check private, non-shared pictures, then nothing is. It is far more sophisticated than matching hashes against a CSAM database.
 
Last edited:
  • Like
Reactions: artfossil
And Apple is currently in business. But what about next year, or the year after that? They could go out of business for all we know. I'm not sure why you're creating a future that doesn't exist. You can only cross bridges when you get to them. I will say this: you are generally the voice of reason here and one of the more respectable posters, and for you to be annoyed at Apple like this says a lot. Perhaps it's time for you to move on to an Apple competitor, because saying "What about next year, etc..." suggests that you're done with Apple, and perhaps Samsung or something in the Android world for phones and the Windows world for computers is your forthcoming destiny.


Either you trust Apple with your privacy or you don't. The CSAM matching of hashes doesn't change that. Apple is not going to report people to the police, even those over the threshold of matches. That is someone else's job.

Of course Apple had the technical ability. What do you think is happening when Photos scans your photo library to tag content and match faces to names? If that isn't the capability to check private, non-shared pictures, then nothing is. It is far more sophisticated than matching hashes against a CSAM database.

Ah, but I think you might be missing one important point from my arguments. The thing is, I do trust Apple. I trust that their OS is not spying on me, and I trust them when they say that on-device detection will only be used to check iCloud uploads. I also don't think that Apple is a nefarious company; they are usually fairly honest and open in their communication (of course, one still needs to analyse their announcements with scrutiny). So I just don't buy arguments like "you should move on" or "the device can already read your pictures", because if we go that way, we are just succumbing to general, all-encompassing paranoia.

Also, Apple is not stupid: if they did content monitoring behind users' backs, the backlash could literally crush their business. Past outrages over relatively minor things (like certificate validation or throttling the phone to prevent crashing) are clear examples of how users react to surprises. No, if Apple does something, they will communicate it, and they will try to be very clear about it. Frankly, I trust Apple in this regard more than I trust a "democratically elected government", because IMO a government generally has less accountability than a scrutinized corporation like Apple. All this "Apple is a CORPORATION, of COURSE they are EVIL" is naive, childish nonsense; let's not stoop to that level.

My point is simply that on-device CSAM detection, no matter how technically sophisticated and reasonable it sounds, legitimizes on-device monitoring, making it a new "normal". I do not distrust Apple's policy; I distrust the technology itself, because it makes surveillance way too convenient. I actually think that Apple's intentions are good: their approach does protect data privacy, and it is more elegant (not to mention more secure) than doing on-server decryption and scanning. But the state of the world is such that governments want to push as much surveillance as possible, and Apple's technology is a huge gift to them. They can now use arguments such as "look, Apple showed that minimally invasive content scanning is possible, so let's just make it a new law, and hey, let's also add terrorists, drugs, dissidents, and whatever else to it". Apple is basically handing advanced weaponry to folks you really don't want to see armed.


I particularly like the article I am linking below. I think the author offers a sound, no-nonsense take on the matter, without going to extremes.



It's a very good thing that Apple isn't doing that and has stated repeatedly in public that they have no plans to do so and will resist governmental efforts to force them.

Sorry, I was being unclear. I had the new proposed EU regulations in mind. If passed, Apple will have to comply. They won't leave the EU market.
 
  • Like
Reactions: liberti and ader42
Actually, Federighi said something interesting in his interview:

Because it's on the [phone], security researchers are constantly able to introspect what’s happening in Apple’s [phone] software. So if any changes were made that were to expand the scope of this in some way —in a way that we had committed to not doing—there's verifiability, they can spot that that's happening.

This is indeed true, and that's something one can't have if the scanning is done in the cloud directly.

I still don't like this technology; too much relies on trust and promises. I'd prefer it if we had laws mandating zero-knowledge encryption, but good luck getting that through the governments of the world.
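
As a rough illustration of what zero-knowledge storage would mean, here's a sketch using the Python cryptography package's Fernet recipe (the filename and key handling are invented for the example). The key never leaves the device, so the provider stores only ciphertext it cannot scan, hash, or hand over in readable form:

Code:
from cryptography.fernet import Fernet

# Generated and kept on-device (e.g., in the keychain); the server never sees it.
key = Fernet.generate_key()
box = Fernet(key)

with open("photo.jpg", "rb") as f:       # hypothetical local photo
    ciphertext = box.encrypt(f.read())

# Only `ciphertext` would be uploaded; without `key` the provider can match nothing.
# Any device holding the key can still recover the photo:
plaintext = box.decrypt(ciphertext)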
 
Actually, that's not true: if it's flagged by their algorithm, Apple has access to it, and so does whoever they turn it over to (for now, and it sets a dangerous precedent).
And hackers will learn (if they don't already do this, as they tried with one American diplomat) to plant photos and corrupted files to set people up.
Cyber war.
Are we having fun yet?
 
And hackers will learn (if they don't already do this, as they tried with one American diplomat) to plant photos and corrupted files to set people up.
Cyber war.
Are we having fun yet?
Yeah, that's a problem. You can put photos into iCloud from any browser. I would like Apple to be able to show where a given photo came from, but I don't know if they can.
 
And hackers will learn (if they don't already do this, as they tried with one American diplomat) to plant photos and corrupted files to set people up.
Cyber war.
Are we having fun yet?
And if anything has been shown, it's that the United States is ill-equipped for cyber warfare.

I’ll be a happy man if politicians are the ones getting swatted though. They deserve it.
 
I didn't realize they were putting this into macOS as well. Holy smokes! They've lost their ever-lovin' minds.
 
I guess my pictures of my kid that I took in the bathtub would qualify? So much for iCloud. Actually, he is 35 now, so those pictures aren't in the cloud. But that actually says more about Apple's iCloud being vulnerable to hacking than anything. Nothing is safe.
My photos and videos are getting taken down, and so much for all the fun.
No, that is not what the CSAM scanning is about. It is only about matching against a database of known and verified child pornography that is maintained by NCMEC. Your bathtub pictures are presumably not in that database and would not trigger anything.

There is a separate feature where parents can enable a check on their children's iMessage traffic to watch for photos with too much skin. If such a photo comes in, the child is warned, asked if they want to view the image anyway, and told that their parents will be notified if they do. This one is more open-ended and could flag legitimate images, but it is restricted to children and their parents.
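
For clarity, the flow just described can be sketched like this; every name here is invented, and the actual check is Apple's on-device ML classifier, stubbed out below:

Code:
def looks_explicit(photo: bytes) -> bool:
    return False  # stand-in for the on-device image classifier

def child_confirms_view() -> bool:
    return False  # stand-in for the warning dialog shown to the child

def notify_parents() -> None:
    print("Parents notified")  # stand-in for the parental notification

def handle_incoming_photo(photo: bytes, feature_enabled: bool) -> str:
    """Return what the Messages client would do with the photo."""
    if not (feature_enabled and looks_explicit(photo)):
        return "show"
    # The child is warned first; choosing to view anyway is what
    # triggers the parent notification.
    if child_confirms_view():
        notify_parents()
        return "show"
    return "blur"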
 
You can't leave your Mac unattended anymore. One minute, and your co-worker, ex-wife, or "friend" can put some images on your Mac, and the police will come visit you a few weeks later.

Every macOS user becomes an easy target.

And even if you find and delete the pics, it's too late. You are flagged, and the police are coming anyway.
 
  • Like
Reactions: crymimefireworks
You can't leave your Mac unattended anymore. One minute, and your co-worker, ex-wife, or "friend" can put some images on your Mac, and the police will come visit you a few weeks later.

Every macOS user becomes an easy target.

And even if you find and delete the pics, it's too late. You are flagged, and the police are coming anyway.
Why are you leaving your computer unattended around a hostile ex-wife/husband regardless of what Apple is doing?
 
  • Sad
Reactions: crymimefireworks
Why are you leaving your computer unattended around a hostile ex-wife/husband regardless of what Apple is doing?
Because I have a hidden camera filming her while she puts images on the computer, so I can show the police. They prosecute her for possession of the images, and she goes to jail for 10 years.

5D chess.

But seriously, you may not know in advance that an ex-wife is hostile to the point of framing you. Or are they all that hostile? I don't have an ex-wife so I can't test.
 