
ian87w

macrumors G3
Feb 22, 2020
8,704
12,638
Indonesia
Why would China bother, when they can just walk into the Chinese data center Apple has to use and demand everything in anybody’s account? Wouldn’t that be a lot easier?
China is already sophisticated enough to control their own citizens without any help from the West. But this will be useful against those not within the Chinese system, e.g. foreign journalists, people from Hong Kong, etc.

Other countries that are not as advanced as China in terms of technology and capital can also benefit from this feature. It's one thing if Apple just did this on the US server side, but they pushed it to all iOS devices, and then announced that they can tailor the system as needed on a per-country basis.
 

leman

macrumors Core
Oct 14, 2008
19,522
19,679
It's one thing if Apple just did this on the US server side, but they pushed it to all iOS devices, and then announced that they can tailor the system as needed on a per-country basis.

It has been speculated that one strategic reason Apple developed this tech is so that they have a bargaining chip with totalitarian regimes in exchange for market access. Personally, I am not sure that I would presume current Apple management to be this Machiavellian, but it is a real danger for the future if the next board/CEO are more "flexible".

At the same time, I am fairly sure that one intended use of this tech is to put pressure on the US government and the courts in the current discussion about the App Store's alleged monopoly. It makes quite a strong argument: "look, we can implement this effective and yet still private system for child abuse prevention, and this would not be possible if we didn't have exclusive access to the system!".
 

Shirasaki

macrumors P6
May 16, 2015
16,263
11,765
And let's not even start with the philosophical and moral part of the issue, as this kind of technology comes very close to violating the presumption of innocence.
In fact, the Wikipedia page introducing the technology Apple says they are using notes that one assumption the protocol designers make is treating participants as suspect before they are proven innocent. It does not apply to all parties involved in the exchange, but it's also not "innocent until proven guilty".

But, ultimately, these are tools created by humans. Tools themselves do not have a purpose or agenda; humans do. The root problem is stopping the humans who use the tools the "wrong" way, which this CSAM detection mechanism itself is not capable of doing.
 

TopToffee

macrumors 65816
Jul 9, 2008
1,070
992
No. Apple does NOT analyse photos in your library. It will not try to determine if your kid is naked or not.
It will compute a hash of the picture and compare that with a database of known CSAM content. If the pic is not known in the database, it will not be flagged.
Exactly
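To make the distinction concrete, here is a toy sketch of hash-based matching. This is NOT Apple's NeuralHash (their perceptual hash is a proprietary neural network); it uses a simple "average hash" as a stand-in, and the database contents are invented, purely to illustrate that only images whose fingerprint is already in a known-bad database get flagged — novel photos produce no match.

```python
# Toy illustration of hash-based matching (not Apple's actual algorithm):
# an image is reduced to a short fingerprint, which is then checked
# against a set of fingerprints of known material.

def average_hash(pixels):
    """Compute a 64-bit fingerprint from an 8x8 grayscale image.

    Each bit records whether a pixel is brighter than the image mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# Hypothetical database of fingerprints of known illegal images.
known_bad_hashes = {0xFFFFFFFF00000000}

def is_flagged(pixels):
    """Flag only on a database hit, never on image content per se."""
    return average_hash(pixels) in known_bad_hashes

# An image whose top half is bright and bottom half dark happens to hash
# to the database entry above, so it matches; a uniform image does not.
assert is_flagged([[255] * 8] * 4 + [[0] * 8] * 4)
assert not is_flagged([[128] * 8] * 8)
```

The point of the sketch: `is_flagged` never inspects what the picture depicts, only whether its fingerprint is already in the list.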
 

TopToffee

macrumors 65816
Jul 9, 2008
1,070
992
you have no control of what gets uploaded to a database or not once it's in the open.

How can you say it does not qualify?
So how do you imagine that a picture of your child in the bath would get onto the CSAM database?
 

leman

macrumors Core
Oct 14, 2008
19,522
19,679
But, ultimately, these are tools created by humans. Tools themselves do not have a purpose or agenda; humans do. The root problem is stopping the humans who use the tools the "wrong" way, which this CSAM detection mechanism itself is not capable of doing.

Exactly my point. The tools are actually quite good and well thought out. In fact, I would even support this tool if I had hard guarantees that Apple will never use them for anything else but CSAM iCloud photo scanning. But it's not the "today" I am afraid of, it's what this technology could trivially enable tomorrow. And Apple's new FAQ, as detailed as it is, does nothing to alleviate those fears.
 

Mac4Mat

Suspended
May 12, 2021
168
466
But license plate numbers are obvious and fixed, and are a matter of public record. If someone enters a wrong license plate, it can easily be proven otherwise.

The database used here is a black box; nobody knows what algorithm and data were used to train it. If there are errors, nobody would know.
No, it's not equivalent. When ANPR is used, it flags against a real list of offences, and the presumption of innocence still applies: ANPR matches number plates against KNOWN discrepancies (insurance, drug dealing, licensing, etc.). It is not ad hoc, whereas Apple's system is. It scans photos, which may be a problem.
 

Natrium

macrumors regular
Aug 7, 2021
125
246
I associate this with the license plate readers police use on their cars. If you are not driving a stolen car or have outstanding warrants, then there would not be a match to any database.
This is a completely false analogy. A license plate is not the same as your private photos. Its purpose is to ID the owner of the car, like your ID card, and it's checked by the police. If they want to search your car they need probable cause or a warrant. Only then do they search for illegal content as defined by the law.

So, a correct analogy is your car manufacturer (Apple) opening the trunk of your car (iPhone) searching for what they deem “illegal” content (CSAM now, could be anything in the future), without probable cause and without a warrant.
 

leman

macrumors Core
Oct 14, 2008
19,522
19,679
No, it's not equivalent. When ANPR is used, it flags against a real list of offences, and the presumption of innocence still applies: ANPR matches number plates against KNOWN discrepancies (insurance, drug dealing, licensing, etc.). It is not ad hoc, whereas Apple's system is. It scans photos, which may be a problem.

Apple's system works the same way, actually: it scans pictures against a list of known illegal material. The problem, as I see it, is less the scanning itself than the fact that it is applied to private content. The big difference is that number plates are publicly visible, and the mere fact of viewing or recording them does not constitute a breach of privacy or trust. In Apple's case, they are checking pictures that were never meant for public sharing.

This is somewhat alleviated by the fact that they only check pictures uploaded to the cloud, and an argument can be made that those pictures are subject to Apple's cloud storage agreement. It is also understandable that they would not want to host offensive, illegal material on their servers, even in encrypted form. Still, by doing the scanning on-device they have created a system that can easily be extended to do other supervision as well. The only assurance we have that this system won't check our private (non-cloud-backed) pictures or other data is Apple's promise. And that's just not enough...
 

Mac4Mat

Suspended
May 12, 2021
168
466
Apple's system works the same way, actually: it scans pictures against a list of known illegal material. The problem, as I see it, is less the scanning itself than the fact that it is applied to private content. The big difference is that number plates are publicly visible, and the mere fact of viewing or recording them does not constitute a breach of privacy or trust. In Apple's case, they are checking pictures that were never meant for public sharing.

This is somewhat alleviated by the fact that they only check pictures uploaded to the cloud, and an argument can be made that those pictures are subject to Apple's cloud storage agreement. It is also understandable that they would not want to host offensive, illegal material on their servers, even in encrypted form. Still, by doing the scanning on-device they have created a system that can easily be extended to do other supervision as well. The only assurance we have that this system won't check our private (non-cloud-backed) pictures or other data is Apple's promise. And that's just not enough...
Not possible, as Apple does not have access to all the data about "illegal material", at least not in the UK. It can therefore only assess probability/potential. I don't know if the U.S. publishes such material; if it does, that would seem to be a problem in itself. For me it's not about the photographs or the admirable intention, it's about opening the Pandora's box of privacy which Apple has always espoused to protect.
 

Silvestru Hosszu

macrumors 6502
Oct 2, 2016
357
234
Europe
Cliff, come on, you are a lawyer. You know that these things are fluid.

I mean, I totally trust Apple when they say that they are only using it to check iCloud uploads (and frankly, it is indeed a more privacy-oriented solution than server-side checking), and when they say that they use NCMEC databases. However, why should I trust it to remain this way? Client-side scanning sets a dangerous precedent: once the system is in place, it is trivial to extend it to other purposes. What if Tim has an accident in a year or so and a new Apple CEO is more open to making a deal with totalitarian regimes in exchange for market access and tax benefits? Or if they decide to extend the scanning to your private pictures that are not even stored in the cloud (which would be a trivial thing once the framework is in place)? Or, even more, scan all the image data that goes through the APIs? Or what if a new governing body comes to power that redefines what "child endangerment" means (there was already the example of a Hungarian law that makes "gay propaganda" a criminal offense)? These are the real issues. It's not what we have now, it's what becomes easily possible in the future.

For now, I am not affected by these changes. I live in Europe, and even though there is a lot of pressure to make surveillance more prevalent, European politicians are much more privacy-oriented and generally saner than their colleagues in the USA. But they are also massively incompetent, and I worry that these developments will nudge our legislation in the wrong direction.

And let's not even start with the philosophical and moral part of the issue, as this kind of technology comes very close to violating the presumption of innocence.
I am a lawyer myself and also Hungarian (although I am not living in Hungary).
I think that your point using the Hungarian analogy has plenty of merit, and I am myself surprised by Cliff's position on this.
Let's not forget that Apple operates in Hungary, and Hungary does have a law prohibiting the dissemination of any adult material other than straight content to certain audiences.
So, how can Apple refuse to cooperate with the Hungarian authorities if asked to? And let's not forget that in the EU the member states have broad, almost exclusive, powers on criminal matters, and GDPR limitations do not really apply.
 

leman

macrumors Core
Oct 14, 2008
19,522
19,679
For me it's not about the photographs or the admirable intention, it's about opening the Pandora's box of privacy which Apple has always espoused to protect.

Exactly. The thing is, I don't know what US legislation requires. It is very possible that Apple is required by law to scan images in cloud storage for child pornography. If so, then their technology is indeed advanced and privacy-oriented, and it could be a good compromise between privacy and fulfilling a legal obligation, in the US. Regardless, the potential consequences of this technology are very worrisome.
 

Shirasaki

macrumors P6
May 16, 2015
16,263
11,765
Exactly my point. The tools are actually quite good and well thought out. In fact, I would even support this tool if I had hard guarantees that Apple will never use them for anything else but CSAM iCloud photo scanning. But it's not the "today" I am afraid of, it's what this technology could trivially enable tomorrow. And Apple's new FAQ, as detailed as it is, does nothing to alleviate those fears.
The unfortunate part is that the majority of people DO NOT CARE that the tool could be used in a nefarious way, either dismissing the argument or countering the worry with "by that logic, should we also ban cars, the internet, knives" etc. This is just nonsense, because they are not even attempting to acknowledge the fundamental issue: PEOPLE. Remove the people who use tools nefariously instead of just destroying the tools.

Another unfortunate part is that every coin has two sides, and you cannot eliminate one side without also eliminating the other. These people will exist no matter what, and they form the so-called "speed bump" whenever the human race makes progress in a certain field.

Pretty philosophical at this point, but I now lean more toward solving the people problem than criticising the tool on its own.
 

crymimefireworks

macrumors 6502
Dec 19, 2014
314
369
So, here's a question I never thought I'd ask: I'm beyond the return period for my M1 MacBook Air. I'm seriously thinking about selling it in light of what Apple's doing with iOS and their iPhone devices.

Has anyone heard anything about whether they are going to (or deliberately not going to) do this in macOS? Because, at this point, I really don't feel like I can trust them.

And I really, really hate this because, hardware-wise, it's a great laptop.
Agree with your thinking. Long term I'm making big changes because of this.

This week:
- Disabled iCloud

Over the next month:
- I didn't make the leap to M1 yet, which means Linux is an option.
- Looking at self-hosting cloud services with NextCloud and Umbrel. Looks like a mess, but with demand and customers, solutions get built. Storing a few notes and photos in the cloud isn't rocket science.

Coming years:
- Support open source hardware options. As bad as they are, they won't improve without customers.
 
Last edited by a moderator:

GrumpyCoder

macrumors 68020
Nov 15, 2016
2,127
2,707
As I understand it, this is a purpose built search tool, designed to search for one type of materials only, and nothing else. It cannot be used as a general purpose query tool by Apple. It can only change it's behaviour with an OS update.
No, that is not correct. It is a simple tool that works on any type of data/content, and its behavior can be changed at any time by modifying configuration files. You don't hardcode these things; that's an '80s thing.
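A minimal sketch of what config-driven behavior means here. This is an assumption about how such a system *could* be built, not a description of Apple's implementation; the field names and defaults are invented. The point is that which content categories are scanned, and how aggressively, can live in a config file the vendor controls, so behavior changes require no OS update.

```python
# Sketch of config-driven scanner behavior (hypothetical, not Apple's code):
# scanning scope and sensitivity are read from a config file rather than
# hardcoded, so a config push alone can change what the tool does.
import json

# Invented defaults for illustration.
DEFAULT_CONFIG = {"categories": ["csam"], "match_threshold": 30}

def load_scanner_config(text):
    """Merge a JSON config fragment over the built-in defaults."""
    cfg = dict(DEFAULT_CONFIG)
    cfg.update(json.loads(text))
    return cfg

# The initial deployment scans a single category...
cfg = load_scanner_config('{}')
assert cfg["categories"] == ["csam"]

# ...but a later config update could silently widen the scope and
# lower the threshold, with no new binary shipped.
cfg = load_scanner_config(
    '{"categories": ["csam", "political"], "match_threshold": 1}')
assert "political" in cfg["categories"]
```

This is exactly why "it can only change behaviour with an OS update" is not a safe assumption: nothing in such a design forces the policy to be baked into the binary.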
I agree with @cmaier reasoning that this is likely the first step in E2EE for iCloud, as it makes sense to me.
But then the whole approach doesn't work anymore.
Without E2EE they could easily run this in the cloud; there's no need to run it on device. With E2EE they have to run it on device, but then they can't manually review matches, and they explicitly state they're doing that. The only way to review manually with E2EE would be a backdoor to the device. See how this doesn't work?


There's much more going on here. Apple is using the CP story to implement new technology into the core of the OS. For now it's checking images for CP, with the option to disable it by deactivating iCloud Photos. The question is what comes next, once the technology is established. And it's always harder (more resistance) to establish such a technology than to change how it works.

Looking at what Apple has done so far makes it kind of obvious. They've implemented a way for network communication to bypass the firewall, which means you can't block connections with tools like Little Snitch. They're using this for Apple services, while everything else still goes through the firewall. So we can't block communication from/to Apple servers (except by disconnecting from the internet). The next thing that happened was the fight against Facebook/Google and their "tracking", essentially destroying a part of their business model, all under the cover of "Privacy". And now they're implementing technology that allows them to track and check our behavior on device: not just on websites, but everything we do on our own devices.

Do the math: tracking and using user information is the next big thing for Apple, which we won't be able to block (firewall bypass), all while they kill the competition. Apple knows very well that information is the most profitable market they could invest in, especially when it comes to AI. While none of this might be relevant now, let's see where we are with Apple in five years. What we see happening is the foundation of things to come.

Apple's history of reporting fewer than 300 CP cases per year shows how little they care about it (Facebook files over 20 million reports per year). But the whole thing is an excellent excuse to establish new technology and embed it in devices.
 

leman

macrumors Core
Oct 14, 2008
19,522
19,679
They've implemented a way so network communication is possible and bypassing the firewall. That means you can't block connections with tools like Little Snitch. They're using this for Apple services, while the others are still going through the firewall.

They did try to make their services exempt from firewall rules but quickly backtracked on it. Third-party firewalls fully work on Big Sur and Monterey.

One should always look for the reason behind the reason, especially where big money is involved, but Apple is far from being as nefarious as people like to portray them, most of the time at least.
 

crymimefireworks

macrumors 6502
Dec 19, 2014
314
369
This was just pointed out to me: https://www.patrick-breyer.de/en/posts/message-screening/?lang=en

Welp, and here I thought that things are better on this side. In contrast, Apple is almost a paragon of privacy…
Yep seems like end-to-end encryption or bust. Since larger corporations can't handle it, decentralized systems are the only option.

For a long time I thought Apple was an exception -- but why would there be an exception? It is now our job to build solutions that work despite the rules.
 
Last edited by a moderator:

GrumpyCoder

macrumors 68020
Nov 15, 2016
2,127
2,707
Agreed - so what's next for you given this scandal?
I wouldn't call it a scandal. It's the way Apple does business these days. They've decided that embracing the things they once fought against is a good thing if it fits their business plan. So I have to re-evaluate whether Apple remains the ecosystem of my choice. I've always used Linux in addition to macOS, and even Windows when I had to.

Next I will wait and see what happens. If they're only implementing this in the US, as the EU will have a word with them (it's illegal in the EU), then the problem is solved. I'll just use an OS version from Spain, France, Germany, Austria, whatever.

I could also check whether I could disable it myself, or someone else could. Whether that is a long-term solution remains to be seen.

Or I simply switch to Linux for everything. I already have to use Linux for some things, macOS has become a reading/writing/websurfing/email tool for me with some video cutting and photo work here and there (hobby).
 

crymimefireworks

macrumors 6502
Dec 19, 2014
314
369
I wouldn't call it a scandal. It's the way Apple does business these days. They've decided that embracing the things they once fought against is a good thing if it fits their business plan. So I have to re-evaluate whether Apple remains the ecosystem of my choice. I've always used Linux in addition to macOS, and even Windows when I had to.

Next I will wait and see what happens. If they're only implementing this in the US, as the EU will have a word with them (it's illegal in the EU), then the problem is solved. I'll just use an OS version from Spain, France, Germany, Austria, whatever.

I could also check whether I could disable it myself, or someone else could. Whether that is a long-term solution remains to be seen.

Or I simply switch to Linux for everything. I already have to use Linux for some things, macOS has become a reading/writing/websurfing/email tool for me with some video cutting and photo work here and there (hobby).
Insightful, thank you! I'm curious about Linux as well.

What are your thoughts on replacing your iCloud usage? Afaik Linux doesn't do that part.
 

GrumpyCoder

macrumors 68020
Nov 15, 2016
2,127
2,707
They did try to make their services exempt from firewall rules but quickly backtracked on it. Third-party firewalls fully work on Big Sur and Monterey.
That functionality is still there. They backpedaled on the whole App authentication. But the core functionality still exists and they can use it anytime. Tools like LuLu explicitly mention this on their website.
 