
exogenous

macrumors newbie
Nov 26, 2021
14
3
This would never be used by a cancel-culture enthusiast to dragnet people who wear orange shoes or prefer chocolate sprinkles on ice cream, selectively enforcing irrational edicts. No, there is no gaping black hole of potential abuse lurking in the very next moment of political time, like the 1984 we find ourselves in today.

Treating everyone as a criminal to catch a criminal is backwards thinking. It does not work with firearms. It does not work with drugs. The TSA is professionalized sexual assault, and it accomplishes nothing good. Google is already treacherous with your information. Why would anyone rational think it would work digitally?

Let's instead make pedophilia a capital crime to swiftly stop 100% of recidivism, and resume harsh social correction of perversity.

Does iOS 15 promote jailbreaking? Or abandoning iCloud? Or is the future of my Mac devices an air gap?

I will be testing GrapheneOS and its compatriots.
 

dugbug

macrumors 68000
Aug 23, 2008
1,929
2,147
Somewhere in Florida
Well, CSAM scanning is kind of on hold, and I am not leaving OS X; I'd rather just block all Apple connections in my firewall if it comes to it.

I am actually not sure whether CSAM scanning would even be legal in Europe under the data-protection rules 🤔

I'm not sure where all the CSAM paranoia is rooted, but the actual scanning feature is tied to iCloud Photos upload. No other part of the OS does this, so if you don't use "iCloud Photos" there is no scanning. Even if you use Photos and "iCloud Backup" there is no scanning.

There is some sort of government requirement for CSAM scanning which has to be done for cloud-based photo services (Google etc. all provide CSAM scan data). In order to enable full end-to-end encryption, I think this is the only thing they need to move to full on-device encryption. Seems strange that privacy hawks don't consider what they are losing.
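The "tied to iCloud Photos upload" behavior described above can be sketched as client-side matching that runs only on content queued for upload. This is a toy illustration: `BLOCKLIST`, `scan_before_upload`, and the SHA-256 digests are all made up here, and the real system uses a perceptual hash (NeuralHash) plus cryptographic safety vouchers, not plain digest lookup.

```python
import hashlib

# Hypothetical blocklist of known-bad image digests (illustrative only;
# the real system matches NeuralHash values, not SHA-256 digests).
BLOCKLIST = {hashlib.sha256(b"known-bad-bytes").hexdigest()}

def scan_before_upload(photo_bytes: bytes) -> bool:
    """Return True if the photo matches the blocklist.

    In this sketch, the check runs only on photos queued for cloud
    upload -- content that never enters the queue is never scanned.
    """
    return hashlib.sha256(photo_bytes).hexdigest() in BLOCKLIST

upload_queue = [b"vacation.jpg bytes", b"known-bad-bytes"]
flagged = [p for p in upload_queue if scan_before_upload(p)]
```

The point of the sketch is the control flow, not the hash function: if nothing is queued for upload, `scan_before_upload` is never called.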

-d
 

leman

macrumors Core
Oct 14, 2008
19,521
19,678
There is some sort of government requirement for CSAM scanning which has to be done for cloud-based photo services (Google etc. all provide CSAM scan data). In order to enable full end-to-end encryption, I think this is the only thing they need to move to full on-device encryption.

As far as I understand, only shared content needs to be verified (although I am still not quite sure what the legal basis for that is). The obvious solution would be end-to-end encryption for private as well as family-shared data, and on-server verification for any other shared content.

Seems strange that privacy hawks don't consider what they are losing.

It's a really difficult topic, and there are obviously no clear-cut answers. Still, as a "privacy hawk" I know very well what I prefer: I'd rather have my cloud photos accessible to Apple (fully aware of this when I decide to share my data on iCloud) than have a backdoor on my device that creates a mechanism for policing the data I decided not to share. As pointed out before, this is the difference between capability and policy. Policy can be changed easily; capability cannot. In these matters it's better if the capability never exists.
 

MajorFubar

macrumors 68020
Oct 27, 2021
2,174
3,826
Lancashire UK
I can't believe that someone of decidedly average intelligence like me can see the obvious flaws in this while people far cleverer than me have not. All it will do is generate false positives and have innocent people labelled as p*dos for life, while those genuinely creating this filth will continue to use any one of the countless millions of digital cameras that do not have this functionality.
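The false-positive worry has a concrete basis: perceptual hashes deliberately map similar-looking inputs to the same value, so distinct images can collide. A toy sketch, assuming a 4-pixel "average hash" (`average_hash`, `img_a`, and `img_b` are made up for illustration; Apple's NeuralHash is far more sophisticated, but researchers showed it too can be forced to collide):

```python
def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if the pixel is
    brighter than the image mean. Real perceptual hashes are more
    complex, but share the property that different images can map
    to the same hash value."""
    mean = sum(pixels) / len(pixels)
    return tuple(int(p > mean) for p in pixels)

img_a = [10, 200, 10, 200]   # hypothetical grayscale thumbnail A
img_b = [50, 250, 90, 240]   # different pixel values entirely

# Both hash to (0, 1, 0, 1): a collision, i.e. a false positive.
collision = average_hash(img_a) == average_hash(img_b)
```

This is exactly the trade-off: the fuzziness that lets a perceptual hash survive resizing and recompression is what opens the door to collisions between unrelated images.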
 

eltoslightfoot

macrumors 68030
Feb 25, 2011
2,547
3,101
I can't believe that someone of decidedly average intelligence like me can see the obvious flaws in this while people far cleverer than me have not. All it will do is generate false positives and have innocent people labelled as p*dos for life, while those genuinely creating this filth will continue to use any one of the countless millions of digital cameras that do not have this functionality.
Well, just look at the articles put out by the tech-security community, and you will see that many people cleverer than both of us do see the issues.

https://www.eff.org/deeplinks/2021/...asking-apple-ceo-tim-cook-stop-phone-scanning

https://www.macrumors.com/2021/08/20/university-researchers-csam-dangerous/

https://www.bleepingcomputer.com/ne...t-apple-s-csam-scanning-can-be-fooled-easily/

There are more...
 

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
There is some sort of government requirement for CSAM scanning which has to be done for cloud-based photo services (Google etc. all provide CSAM scan data).
Not true. The others scan their servers, not your personal device (as they should: if it's on their server, they're responsible, and if it's found to contain CSAM, they're legally liable).

In order to enable full end-to-end encryption, I think this is the only thing they need to move to full on-device encryption.
Where do you get the idea that that will ever happen? It sounds like wishful thinking. If you want end-to-end encryption, you'll have to do it yourself.

Anyway, if there's a backdoor so Apple can scan your data, it's not end-to-end encryption and shouldn't be thought of as such.
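The point about end-to-end encryption can be illustrated: if only the client holds the key, a server-side scan sees nothing but ciphertext, so hash matching against a plaintext blocklist finds nothing. A toy sketch (XOR stands in for real encryption here purely for illustration; `toy_encrypt` and the key are made up, and XOR is not real cryptography):

```python
import hashlib

def toy_encrypt(data: bytes, key: bytes) -> bytes:
    """Toy XOR 'encryption' -- illustration only, NOT real crypto.
    Applying it twice with the same key recovers the plaintext,
    which is all this sketch needs."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Server-side blocklist of plaintext digests.
blocklist = {hashlib.sha256(b"known-bad").hexdigest()}

# Under E2E, the server stores only ciphertext; the key never leaves
# the client.
ciphertext = toy_encrypt(b"known-bad", key=b"client-only-secret")

# A server-side scan over the ciphertext matches nothing.
server_match = hashlib.sha256(ciphertext).hexdigest() in blocklist
```

That is why server-side scanning and true end-to-end encryption are mutually exclusive: any scheme that lets the provider match your content against a list means the provider can read (or at least fingerprint) that content.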

Seems strange that privacy hawks don't consider what they are losing.
You lose everything once they scan on your device.
 