I'm happy to use illegal technology then and there's nothing anyone can do about it.
Taking the words out of my mouth. Mentioned initiatives should be resisted, though.
This doesn’t bother me, I’ve long assumed it already happened. Probably will not matter.
Messaging and Chat Control
The End of the Privacy of Digital Correspondence: The EU decided to let providers search all private chats, messages, and emails automatically for suspicious content - generally and indiscriminately. The stated aim: to prosecute child pornography. The result: mass surveillance by means of fu…
www.patrick-breyer.de
There are already alternatives to iCloud; they scan for CSAM too. If you don't want your photos scanned for CSAM, store them only on your own devices, such as a NAS on your home network.
Not bothered about server-side scanning for CSAM, don’t like the idea that my own device would be used.
If you want full digital privacy, don't use email. Only communicate over end-to-end encrypted messaging services. Delete any Google, Facebook or other online accounts. In fact, use no Google products or services. Only connect to the internet via a VPN from a laptop running a secure Linux distro. Definitely don't use a smartphone.
Not bothered about server-side scanning for CSAM, don’t like the idea that my own device would be used.
It's used only when uploading to iCloud Photos (in particular). If you don't use iCloud Photos, nothing happens.
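Roughly, the gating works like the toy Swift sketch below (my illustration, not Apple's code; PhotoAsset, isICloudPhotosEnabled and knownHashes are made-up names, and the real design additionally blinds the hash database and hides the match result from the device):
[CODE]
import Foundation

// Hypothetical stand-in types; the real pipeline uses a NeuralHash-style
// perceptual hash and a blinded on-device database.
struct PhotoAsset {
    let id: UUID
    let perceptualHash: UInt64
}

struct UploadPipeline {
    var isICloudPhotosEnabled: Bool
    let knownHashes: Set<UInt64>

    // The check is gated on the upload path: no iCloud Photos upload,
    // no comparison at all.
    func shouldAttachSafetyVoucher(for asset: PhotoAsset) -> Bool {
        guard isICloudPhotosEnabled else { return false }
        return knownHashes.contains(asset.perceptualHash)
    }
}

let pipeline = UploadPipeline(isICloudPhotosEnabled: false,
                              knownHashes: [0xDEADBEEF])
let asset = PhotoAsset(id: UUID(), perceptualHash: 0xDEADBEEF)
print(pipeline.shouldAttachSafetyVoucher(for: asset))  // false: nothing happens
[/CODE]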
For now, yes. What about next year? Or the year after? The thing is, prior to these changes Apple had neither the policy nor the technical capability of checking private, non-shared pictures. Now, they are not changing the policy, but they are introducing the capability. This means that the policy change is just a political question - the technical basis will already be there.
You can say that about anything.
I don’t see how this statement makes sense. You can’t implement a policy if you don’t have the capability.
[tinfoil-hat] ... and just exactly how do you know they haven't already had this capability? [/tinfoil-hat]
I don't, but it's confirmed now, no ifs ands or buts about it. So I actually trust my iPhone less than my Android phone. (For now -- I'm sure Samsung will do the same thing since Apple led the way.)
For now, yes. What about next year? Or the year after? The thing is, prior to these changes Apple had neither the policy nor the technical capability of checking private, non-shared pictures. Now, they are not changing the policy, but they are introducing the capability. This means that the policy change is just a political question - the technical basis will already be there.
And Apple is currently in business. But what about next year, or the year after that? They could go out of business for all we know. I'm not sure why you're creating a future that doesn't exist. You can only cross bridges when you get to them. I will say this: you are generally the voice of reason here and one of the more respectable posters, and for you to be annoyed at Apple like this says a lot. Perhaps it's time for you to move on to an Apple competitor, because your "What about next year, etc." suggests that you're done with Apple, and perhaps Samsung or something in the Android world for phones and the Windows world for computers is your forthcoming destiny.
Though truth be told, unless you're gonna go full Edward Lyle or Erik Heller, any sense of digital privacy today is an illusion.
Which I think we had forgotten until now.
In the end, you’ll have some middle-aged guys rummaging through pictures of boobs that a horny 15-year-old sends to her teen boyfriend/girlfriend because it has triggered an ML child-porn filter, possibly involving the police and basically creating a lot of discomfort for everyone involved (except probably for the middle-aged guy reviewing the photo). What kind of benefit can be gained from that, frankly I have no idea.
It's a very good thing that Apple isn't doing that and has stated repeatedly in public that they have no plans to do so and will resist governmental efforts to force them.
For now, yes. What about next year? Or the year after? The thing is, prior to these changes Apple had neither the policy nor the technical capability of checking private, non-shared pictures. Now, they are not changing the policy, but they are introducing the capability. This means that the policy change is just a political question - the technical basis will already be there.
Of course Apple had the technical ability. What do you think is happening when Photos scans your photo library for tagging purposes and matching faces to names? If that isn't the capability to check private, non-shared pictures, then nothing is. It is far more sophisticated than matching hashes against a CSAM database.
Either you trust Apple with your privacy or you don't. The CSAM matching of hashes doesn't change that. Even for people over the threshold of matches, Apple is not going to report them to the police. That is someone else's job.
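Just to illustrate the threshold part, here is a toy Swift model (mine, not Apple's mechanism, which reportedly uses cryptographic threshold secret sharing rather than a visible counter): nothing is even surfaced for review until the number of matches passes a preset threshold.
[CODE]
// Toy model of "nothing is actionable below a threshold of matches".
// The threshold value here is illustrative only.
struct MatchAccumulator {
    let threshold: Int
    var matches = 0

    mutating func record(isMatch: Bool) {
        if isMatch { matches += 1 }
    }

    // Only past the threshold would anything be surfaced for human review.
    var exceedsThreshold: Bool { matches > threshold }
}

var acc = MatchAccumulator(threshold: 30)
for _ in 1...5 { acc.record(isMatch: true) }
print(acc.exceedsThreshold)  // false: a handful of matches alone triggers nothing
[/CODE]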
Because it's on the [phone], security researchers are constantly able to introspect what’s happening in Apple’s [phone] software. So if any changes were made that were to expand the scope of this in some way — in a way that we had committed to not doing — there's verifiability, they can spot that that's happening.
Actually, that's not true -- if it's flagged with their algorithm, Apple has access to it, and whoever they turn it in to. (for now -- and it sets a dangerous precedent.)
And hackers will learn (if they don't already do this, as they tried with one American diplomat) to insert photos and corrupted files to set people up.
And hackers will learn (if they don't already do this, as they tried with one American diplomat) to insert photos and corrupted files to set people up.
Yeah, that's a problem. You can put photos in iCloud from any browser. I would like Apple to be able to show where that came from, but I don't know if they can.
Cyber war.
Are we having fun yet?
And hackers will learn (if they don't already do this, as they tried with one American diplomat) to insert photos and corrupted files to set people up.
And if anything has been shown, it’s that the United States is ill-equipped for cyber warfare.
Cyber war.
Are we having fun yet?
I guess my pictures of my kid I took in the bathtub would qualify? So much for iCloud. Actually, he is 35 now, so those pictures aren't on the cloud. But that actually says more about Apple's iCloud being vulnerable to hacking than anything. Nothing is safe.
No, that is not what the CSAM scanning is about. It is only about matching against a database of known and verified child pornography that is maintained by NCMEC. Your bathtub pictures are presumably not in that database and would not trigger anything.
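To make the "known and verified images only" point concrete, here is a toy Swift sketch (mine, not Apple's; the real NeuralHash comparison is perceptual and wrapped in a privacy-preserving protocol, and every hash value below is invented): the check is a membership test against a fixed set of hashes of already-identified images, so a photo that isn't essentially a copy of one of those images can't match, whatever it shows.
[CODE]
// Toy lookup against a fixed set of hashes of known, verified images.
// hashesOfKnownImages and all hash values are invented for illustration.
let hashesOfKnownImages: Set<UInt64> = [0x1F3A_90B2_44C7_11AA,
                                        0x7722_0D5E_93F1_8C04]

func matchesKnownImage(_ photoHash: UInt64) -> Bool {
    // Membership test only: the content of the photo is never classified,
    // so an ordinary family photo simply is not in the set.
    return hashesOfKnownImages.contains(photoHash)
}

let bathtubPhotoHash: UInt64 = 0x0BAD_CAFE_0000_0001   // arbitrary example
print(matchesKnownImage(bathtubPhotoHash))             // false
[/CODE]
That's also the difference from the ML-style filter discussed above: a classifier asks what is in a picture, while this only asks whether the picture is one of a specific, pre-verified set.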
My photos and videos are getting taken down, and so much for all the fun.
You can't leave your Mac unattended anymore. 1 minute and your co-worker, ex-wife or "friend" can put some images on your Mac and the police will come visit you a few weeks later;
Every macOS user becomes an easy target;
And even if you find and delete the pics, it's too late. You are flagged and the police are coming anyway;
Why are you leaving your computer unattended around a hostile ex-wife/husband regardless of what Apple is doing?
Why are you leaving your computer unattended around a hostile ex-wife/husband regardless of what Apple is doing?
Because I have a hidden camera filming her while she puts images on the computer, so I can show the police. They prosecute her for possession of the images and she goes to jail for 10 years.