
bobcomer

macrumors 601
May 18, 2015
4,949
3,699
You wouldn't get one red cent. If Apple flags one of your pics the most that will happen is they will email you stating "One or several of your pictures were flagged as inappropriate and did not get uploaded." That's not grounds for a lawsuit. Good Lord.
Where does it say that's all that would happen? I'd certainly like to see that.
 

Maconplasma

Cancelled
Sep 15, 2020
2,489
2,215
How is this distinction even relevant? The iPhone is a deeply personal device that has access to the most intimate moments of your life. We are talking about Apple installing deeply integrated spyware on this device. Right now, this spyware will only become active if you use certain cloud features, and it only uses data from certain sources (which are not necessarily trustworthy, by the way). We only have Apple’s word to trust that the spyware will not be used for other purposes. I find it very strange that there are people who don’t find this situation unsettling.
What I said is very relevant because people never read the TOS when setting up macOS and iOS. For that other poster to use an example of Apple being inside of people's homes with cameras makes no sense in regards to this argument.

Here's an example for you. A person renting an apartment is RENTING. Every apartment complex has to do an annual or bi-annual inspection of their tenants' apartments to make sure the apartments are up to code. They have a right to do it because the tenant is leasing the apartment. The iPhone is the owner's property, but iOS isn't, so Apple has the right to scan for sex trafficking before they allow the upload to iCloud. Not sure why you're arguing with me about this? Go to Apple. They are the ones that made this announcement. I don't have to agree with anyone here against this. Is MR now supposed to be "us" against the evil giant?
 

turbineseaplane

macrumors P6
Mar 19, 2008
17,412
40,223
The only difference is that now, sometime in a future point release of iOS/iPadOS 15 and Monterey, Apple will be comparing that hash to known hashes of CSAM in a database.

We get it - we object to that.

"The database comparison" part is important.

Whose database?
What's it looking for?
What entity wants that and why?

This is user data - not something Apple should be building tools to rifle through in order to "compare it all against databases" with no cause or warrant - no matter what the end goals might be.

The objections aren't about CSAM - they're about the entire concept of violating user data this way.
 

Apple_Robert

Contributor
Sep 21, 2012
35,671
52,503
In a van down by the river
It only scans photos that are on your phone. But it only does the scanning if you have iCloud photo sync enabled. That’s just the way it works, to preserve your privacy. If you have already given away your privacy in a much more severe way (by enabling Apple to have access to all your photos), then Apple figures you have nothing to complain about. If you have iCloud photo sync enabled, after all, there are copies of photos (at least some of them) on your devices, and these are what get scanned. At the end of the scanning, a “voucher” is sent to Apple for each photo indicating whether it is likely child porn. Only if it receives more than a certain number of positive vouchers can Apple decode the information associated with those vouchers (the identity of the photos and the low-res versions).
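A minimal sketch of that threshold gating, in Swift. All names and the threshold value here are made up for illustration; the real design reportedly uses threshold secret sharing, so below-threshold vouchers are cryptographically unreadable rather than merely uncounted:

```swift
import Foundation

// Hypothetical model of the voucher threshold: nothing is reviewable
// until enough positive matches accumulate. All names and the
// threshold value are illustrative, not Apple's implementation.
struct SafetyVoucher {
    let photoID: String
    let matchedKnownHash: Bool   // computed on-device at upload time
}

struct VoucherStore {
    private(set) var vouchers: [SafetyVoucher] = []
    let threshold = 30           // assumed value for illustration

    mutating func receive(_ voucher: SafetyVoucher) {
        vouchers.append(voucher)
    }

    // Review only unlocks once positive vouchers reach the threshold.
    var reviewUnlocked: Bool {
        vouchers.filter { $0.matchedKnownHash }.count >= threshold
    }
}

var store = VoucherStore()
store.receive(SafetyVoucher(photoID: "IMG_0001", matchedKnownHash: false))
print(store.reviewUnlocked)  // false: a few (or zero) matches reveal nothing
```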

As for “China, Russia, etc.” this system only works in the United States.

And, yes, if you’ve done nothing wrong, then mathematically you have less than a one-in-a-trillion chance of even having APPLE look at your photos. And if they look at your photos, what happens then? Straight to the gulag? Nope. In that incredibly unlikely event, the worst that happens is an Apple employee has looked at low-res representations of the photos that have been incorrectly flagged. Is that something you really fear? Really? Then I assume you don’t use iCloud Photo Sync, because using iCloud Photo Sync allows Apple to look at ALL your photos in full detail. Sure, they say they don’t, right? But you believe them when they say that, but not when they say there is a 1 in a trillion chance of them ever looking at low-res versions of a few of your photos?
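For the arithmetic behind a claim like that, a rough sketch: with a per-photo false-match probability p and a reporting threshold t, the chance of an innocent library crossing the threshold is a binomial tail. The numbers below (library size, threshold, per-photo rate) are assumptions for illustration, not Apple's published figures:

```swift
import Foundation

// Binomial tail: P(at least t false matches among n innocent photos),
// given a per-photo false-match rate p. All three numbers are assumed.
func logChoose(_ n: Int, _ k: Int) -> Double {
    lgamma(Double(n) + 1) - lgamma(Double(k) + 1) - lgamma(Double(n - k) + 1)
}

func pFalselyFlagged(n: Int, t: Int, p: Double) -> Double {
    (t...n).reduce(0.0) { acc, k in
        acc + exp(logChoose(n, k) + Double(k) * log(p)
                  + Double(n - k) * log1p(-p))
    }
}

// e.g. 10,000 photos, threshold 30, one-in-a-million per-photo error rate:
print(pFalselyFlagged(n: 10_000, t: 30, p: 1e-6))  // vanishingly small
```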

Again, anyone with an excuse to worry about this either doesn’t understand math, is a shill for an Apple competitor (all of which probably already do full cloud scanning for this stuff), or is worried about getting caught.
Apple can accomplish their goal of being proactive in keeping filth off their servers by scanning pics uploaded to their servers. There is no need to do on-device scanning to accomplish said goal. Apple could also accomplish end-to-end encryption without on-device scanning if they wanted to. Device scanning appears to be part of a new policy that will cover iPads and Macs that use FaceTime and iMessage as well.

I don't need my backups to be end-to-end encrypted in iCloud because I don't use iCloud for backups, although I store some of my backups in the cloud, which are truly encrypted, and I don't have to worry about Apple or anyone else getting access to them.

Apple is placating government entities with this on-device scanning. Opening a backdoor for government entities under the auspices of greater privacy and security is nothing more than meaningless rhetoric, in my opinion.

There is no proof that Apple is going to do end-to-end encryption. That is just a guess on someone else's part.
 
Last edited:

Shadow Demon

macrumors member
Oct 24, 2018
92
236
We get it - we object to that.

"The database comparison" part is important.

Whose database?
What's it looking for?
What entity wants that and why?

This is user data - not something Apple should be building tools to rifle through in order to "compare it all against databases" with no cause or warrant - no matter what the end goals might be.

The objections aren't about CSAM - they're about the entire concept of violating user data this way.
Except it is not user data being compared; it is a checksum of an individual user data file. A checksum Apple is continuously computing and comparing every time a file is changed and saved. If you consider this a violation then your only option is to stop using iCloud.
 

hagar

macrumors 68020
Jan 19, 2008
2,128
5,417
No, they would not qualify, unless the pictures of your kid are in an already-existing database of KNOWN child pornography images.

This is the fundamental misunderstanding that people seem to have about how this system works.
No. Apple does NOT analyse photos in your library. It will not try to determine if your kid is naked or not.
It will compute a hash of the picture and compare that with a database of known CSAM content. If the pic is not known in the database, it will not be flagged.
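A minimal sketch of that lookup in Swift. One important hedge: the real system reportedly uses a perceptual hash (NeuralHash) so that visually identical images still match; SHA-256 below is just a stand-in that only matches byte-identical files, and the database contents are placeholders:

```swift
import Foundation
import CryptoKit

// Sketch of the matching step: flag an image only if its hash already
// exists in the known database. SHA-256 is a stand-in for the
// perceptual hash; the digests below are placeholder values.
let knownCSAMHashes: Set<String> = ["3a7bd3e2...", "9f86d081..."]

func isKnownImage(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownCSAMHashes.contains(hex)
}

// A brand-new photo of your own kid hashes to a value that is in no
// database, so it is never flagged:
print(isKnownImage(Data("family vacation photo".utf8)))  // false
```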
 

GrumpyCoder

macrumors 68020
Nov 15, 2016
2,127
2,707
Well, that requires new hardware and things. How about we start with something simpler that Apple could turn on with the flip of a switch...
Yes, ignore the hardware/software difference. The point is, they're doing this in our homes, on our devices, vs. in public places. There is absolutely no reason they could not do this in the cloud. But they want to put this on devices, which allows them to do more of the things they're after in the end. If we get a match, that info is sent to Apple. So where do we stop? Right now it's photos; next is search results on the internet, specific websites. Or illegal content such as music, movies and so on. Not just having that media on device, but also browsing for it on the (dark) net.

Apple is trying to implement technology that would allow them to track everything we do on device. For some reason I'm picturing a whiteboard at Apple HQ with the milestones "prevent Google/Facebook tracking", "implement our own tracking/scanning software on device", "destroy Google's/Facebook's business model of selling data", "do the same thing that Google/Facebook did but embedded in our devices", "world domination" with an evil Tim Cook next to it.
 

turbineseaplane

macrumors P6
Mar 19, 2008
17,412
40,223
Except it is not user data being compared; it is a checksum of an individual user data file. A checksum Apple is continuously computing and comparing every time a file is changed and saved. If you consider this a violation then your only option is to stop using iCloud.

Regardless of the source of comparison or what is being looked for...

They have no business going through user data, on the user's device, doing "content hunts" with no reasonable cause or suspicion and no warrant.
 

Maconplasma

Cancelled
Sep 15, 2020
2,489
2,215
So you have nothing.
I wasn't challenging any of your posts in the first place for you to come at me. So if you have any alternative point then post it, otherwise....
Also, Apple is not going to turn anyone in to the authorities. So if they flag pics on your account, how else would you know unless they informed you? Some logic on your part would greatly help.
 
Last edited:

Shadow Demon

macrumors member
Oct 24, 2018
92
236
The link below is an excellent analysis. If you trust Apple to do exactly what they said and no more, then all is good. Until it can be proven otherwise, the rest is conspiracy theory.

 

Shadow Demon

macrumors member
Oct 24, 2018
92
236
And also trust they won’t do anything government agencies or state departments covertly want.
You have a lot more faith in them than I do, it sounds like.

Apple shouldn’t even be in the business of offering such a service on a users device.
I do; essentially all of my eggs are in Apple’s basket. We are inextricably linked: they fall, I fall. However, I also locally encrypt all files vital to privacy.
 

GrumpyCoder

macrumors 68020
Nov 15, 2016
2,127
2,707
The link below is an excellent analysis. If you trust Apple to do exactly what they said and no more, then all is good. Until it can be proven otherwise, the rest is conspiracy theory.
That's not what it is about. It's about implementing on-device technology that is not needed at all if the only purpose is what they say they're doing. They could do the same thing in the cloud. It's about unnecessary technology that could at any time be turned into something "evil". If this were required in order for the system to work, that would be a different story, but it isn't. There is zero reason for this to run on the device.

It's funny how the tides are turning. When it's Huawei, Facebook, Google and evil Russia/China, it's a risk to national security and all hell breaks loose. When it's Apple doing this on device, with no reason for it happening there, then they're the good guys and everything is fine.

The problem lies much deeper: when using things like DEVONthink, which allows for data to be fully end-to-end encrypted, Apple now has the option to scan it, which is none of their business. All these secure options we have won't be secure anymore at the hands of Apple. Whether they use it or not is a completely different story.

I'm all for innocent until proven guilty, but when I see a bunch of heavily armed masked guys walking into a bank, I can only wonder... sure, they could just be making a withdrawal. Again, it is not necessary to run this on device to achieve the same results, not at all.
 

turbineseaplane

macrumors P6
Mar 19, 2008
17,412
40,223
From @tolmasky on Twitter - all credit to Francisco

This is such a good thread: https://threader.app/thread/1424078607383678976

We’re past the point where giving Apple the benefit of the doubt can be interpreted as anything other than willful ignorance from a place of Western privilege. These aren’t hypotheticals, we already have examples of Apple's policies failing people in other countries. 1/🧵

Case in point, while we argue whether sideloading would ruin our "experience" on the iPhone, the bottleneck of the @appStore was already wielded against Hong Kong protestors when China forced Apple to remove http://HKmap.live , an app they used to avoid police violence. 2/🧵

If your takeaway is that this is merely a "troubling situation" in the "complicated relationship with China," then you aren't only demonstrating how you feel about people in other countries, but also living under a comfortable delusion that this couldn’t happen here too. 3/🧵

Similarly, if you staunchly defend the @appStore's laughable and disingenuous "security model" while finding the stinginess of Apple's Bug Bounty program a boring financial topic, it just shows that you're lucky enough to not actually be under a true security threat. 4/🧵

China's surveillance of Uyghur Muslims and the recent NSO spyware targeting journalists leave no doubt that iOS is under real threat. The fact that the targeting of a religious community didn't result in Apple pursuing the most ambitious Bug Bounty Program is shameful. 5/🧵

And finally, Apple's acquiescence to host Chinese iCloud accounts on Chinese servers should have made them want to steer completely clear of building the necessary infrastructure to employ mass surveillance at the click of a button. 6/🧵

Apple's conceptual explanations around this system being tied to iCloud thus give me zero assurance that these tools won't be used against people in countries like China, given Apple's existing, demonstrated, and repeated history there. 7/🧵

This is what I mean by Western privilege: whether it's weighing app censorship vs. UX, or arguing that Apple "has a lot to lose" if they bungle this scanning thing here, it ignores that there are people elsewhere who don't have the luxury of these "protective incentives". 8/🧵

The fundamental issue is that Apple has not grown up. They have either not realized, or simply choose to refuse the responsibility of, their position. They aren't "like a game console." They are one of the primary platforms that people *trust their lives to*. 9/🧵

For many people the iPhone *is* their computer. In the last year we've even entrusted the tracing of a pandemic to it. Leadership that understood the magnitude of this position would be proud of this accomplishment, but also humbled by the responsibility it brings with it. 10/🧵

Part of that responsibility means foregoing features that, in the wrong hands, could be used to cause harm. Apple isn't an immutable entity. Even if you trust today's Apple completely, we don't know who could be in control tomorrow, or what their values could be. 11/🧵


 

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
Regardless of the source of comparison or what is being looked for...

They have no business going through user data, on the user's device, doing "content hunts" with no reasonable cause or suspicion and no warrant.
They don’t!

They only scan the file if you are about to send the file to Apple! (Via iCloud sync)

You want they should just accept kiddie porn on their servers?
 

Shadow Demon

macrumors member
Oct 24, 2018
92
236
That's not what it is about. It's about implementing on-device technology that is not needed at all if the only purpose is what they say they're doing. They could do the same thing in the cloud. It's about unnecessary technology that could at any time be turned into something "evil". If this were required in order for the system to work, that would be a different story, but it isn't. There is zero reason for this to run on the device.

It's funny how the tides are turning. When it's Huawei, Facebook, Google and evil Russia/China, it's a risk to national security and all hell breaks loose. When it's Apple doing this on device, with no reason for it happening there, then they're the good guys and everything is fine.

The problem lies much deeper: when using things like DEVONthink, which allows for data to be fully end-to-end encrypted, Apple now has the option to scan it, which is none of their business. All these secure options we have won't be secure anymore at the hands of Apple. Whether they use it or not is a completely different story.

I'm all for innocent until proven guilty, but when I see a bunch of heavily armed masked guys walking into a bank, I can only wonder... sure, they could just be making a withdrawal. Again, it is not necessary to run this on device to achieve the same results, not at all.
Here is the thing: Apple says it is necessary, and I believe them. When end-to-end encryption is implemented in iCloud, it will be impossible to detect CSAM on the server. Therefore, it will be necessary to make the comparison at the device level, before the photo is encrypted and uploaded to iCloud.

Since you don’t, it looks like you are going down the open-source Linux rabbit hole.
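A sketch of the ordering that argument implies, reusing the same stand-in hash as above: the match is computed while the photo is still plaintext, then the photo is encrypted so the server never sees content, only the match result. All names and the key handling here are hypothetical, not Apple's pipeline:

```swift
import Foundation
import CryptoKit

// Illustrative only: compare-then-encrypt, so an end-to-end encrypted
// upload can still carry a match result. The real system reportedly
// uses a perceptual hash and blinded, encrypted vouchers instead.
struct OutgoingPhoto {
    let ciphertext: Data   // what iCloud stores: opaque to the server
    let voucher: Bool      // match result computed pre-encryption
}

func prepareForUpload(_ photo: Data, key: SymmetricKey,
                      knownHashes: Set<String>) throws -> OutgoingPhoto {
    // 1. Hash and compare against the known database while still plaintext.
    let hex = SHA256.hash(data: photo).map { String(format: "%02x", $0) }.joined()
    let matched = knownHashes.contains(hex)
    // 2. Encrypt; past this point the server cannot inspect the content.
    let sealed = try AES.GCM.seal(photo, using: key)
    // .combined is non-nil with the default 96-bit nonce.
    return OutgoingPhoto(ciphertext: sealed.combined!, voucher: matched)
}
```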
 

turbineseaplane

macrumors P6
Mar 19, 2008
17,412
40,223
When end-to-end encryption is implemented in iCloud, it will be impossible to detect CSAM on the server. Therefore, it will be necessary to make the comparison at the device level, before the photo is encrypted and uploaded to iCloud.

Which means… All the bidding the different state departments and government agencies want them to covertly do can now be very conveniently done, pre-encryption, at the device level since the infrastructure is set up for it.

Should make it even easier and faster to "locate the problem people or content"***

Welcome to hell


***subject to jurisdiction and local whims of governments, dictators or corporate influence
 

Shadow Demon

macrumors member
Oct 24, 2018
92
236
Which means… All the bidding the different state departments and government agencies want them to covertly do can now be very conveniently done at the device level since the infrastructure is set up for it.
Possibly. Russia might also launch a nuclear weapon at the US, but I am not losing sleep over either one. This technology has always been marching forward; dystopia may be the future, but I will be long dead before it becomes my problem.
 

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
Which means… All the bidding the different state departments and government agencies want them to covertly do can now be very conveniently done, pre-encryption, at the device level since the infrastructure is set up for it.

Should make it even easier and faster to "locate the problem people or content"***

Welcome to hell


***subject to jurisdiction and local whims of governments, dictators or corporate influence

Right now it’s done, in those places, at the cloud level. Is that any better for you? At least if it’s done at the device level you can disconnect to prevent it.
 

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
I wasn't challenging any of your posts in the first place for you to come at me. So if you have any alternative point then post it, otherwise....
Also, Apple is not going to turn anyone in to the authorities. So if they flag pics on your account, how else would you know unless they informed you? Some logic on your part would greatly help.
So? I asked you because I wanted to know if what you said was true, as that would make a difference in what I think.

So I'll ask again: do you have any proof that they will never turn anyone in to the authorities? (Or even anything that says they won't!) I have no proof they will, but that's certainly the direction they are going with this -- I know that from how our government works. The people in charge of this state (SC) just love to legislate and prosecute morality faults that they perceive.
 