
Will you leave the Apple ecosystem because of CSAM?


So here is an interesting read (sorry... cross-posted... but this is IMPORTANT)

Many people have stated that Apple is required to do this, as a service provider.

Someone also linked to the actual law: 18 U.S.C. § 2258A.

Here's an interesting part of it: § 2258A, subsection (f).

(f) Protection of Privacy.-Nothing in this section shall be construed to require a provider to-

(1) monitor any user, subscriber, or customer of that provider;

(2) monitor the content of any communication of any person described in paragraph (1); or

(3) affirmatively search, screen, or scan for facts or circumstances described in sections (a) and (b).

Now... read that again carefully. NOTHING in this section shall be construed to *require* a provider to...
... MONITOR ANY USER, SUBSCRIBER, OR CUSTOMER
... MONITOR THE CONTENT OF ANY COMMUNICATION...
... AFFIRMATIVELY SEARCH, SCREEN OR SCAN FOR FACTS OR CIRCUMSTANCES.


So this is a CHOICE by Apple... and NOT A REQUIREMENT. In fact, the law specifically says that they are NOT REQUIRED to scan, monitor, or search for CSAM; they are only required to report it if it is discovered.

I have been saying this since the announcement. People are still under the impression that scanning must take place regardless; they think the choice is between on-device and in-cloud scanning, even though the correct answer is no scanning at all, with zero-access end-to-end encryption (E2EE).

I think that speaks to the public's default view of policing in general: they believe it must exist in its current pervasive, authoritarian, and militaristic form because they simply can't imagine a world without it. The notion that people are entitled not to be treated like criminals, with spyware pre-installed on their own property, is apparently a completely alien concept to them.
 
By the way, AI photo scanning and categorizing has been done locally on your device for years now, and that was cool. How do you know Apple hasn't been using any of that data they're gathering about your photos? You don't. The only thing we can do is trust them.

The fact that they're being so open about this process is telling. It tells me they've thought it through carefully and don't intend to use it maliciously.

But then again, I carry around a tiny beacon with me everywhere I go that knows my location at all times, so if they wanted to abuse my privacy, either they've been doing it all along and I haven't noticed, or they could start at any time. Either way, it's nothing they haven't done or couldn't have done in the past.
It was, because there was no reason to suspect abuse, whereas this system is literally designed to put people in jail. The faces that get grouped together in my pictures are just anonymous faces until I give the phone info on who each person is (which I never did; I know who they are, and my phone doesn't need to know).

Apple's intent isn't my worry; the intent of those providing the database is. Apple's execution is inherently incompatible with US societal values and is ethically highly questionable (eroding everyone's Fourth Amendment protections to convict a few wrongdoers). We're supposed to have the protections of probable cause and warrants, which, conveniently, private businesses don't have to honor. It's not up to Apple to do the job of law enforcement (obviously they should report illegal content if they happen across it, but they shouldn't actively search our devices for it).

This is one of the times I want Apple to follow one of the industry-standard practices below:
1. Scan the server, and only the server, and turn over data only when shown a warrant supported by probable cause.
2. Change iCloud so that Apple is only a secure tunnel between my iPhone, iPad, and Mac and doesn't store any data on cloud servers.
3. Scrap the whole NeuralHash thing AND end-to-end encrypt iCloud (see the sketch after this list for what that could look like).
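For what it's worth, the cryptography in option 3 is the easy part. Here is a minimal sketch of client-side encryption before upload using Apple's CryptoKit; the function names are illustrative, not Apple's actual iCloud protocol, and a real deployment would also need a key-recovery design:

```swift
import CryptoKit
import Foundation

// Hypothetical client-side encryption before upload (illustrative names,
// not Apple's real iCloud protocol). The server only ever stores
// ciphertext, so there is nothing meaningful for it to scan.
func encryptForUpload(_ photoData: Data, using key: SymmetricKey) throws -> Data {
    // AES-GCM gives confidentiality plus integrity in one pass.
    let sealed = try AES.GCM.seal(photoData, using: key)
    // combined = nonce || ciphertext || authentication tag
    return sealed.combined!
}

func decryptAfterDownload(_ blob: Data, using key: SymmetricKey) throws -> Data {
    let sealed = try AES.GCM.SealedBox(combined: blob)
    return try AES.GCM.open(sealed, using: key)
}

// The key is generated on, and never leaves, the user's own devices;
// that is what would make the storage "zero access" for Apple.
let key = SymmetricKey(size: .bits256)
```

The hard part isn't this code; it's recovering the key when a user loses all their devices, which is presumably one reason Apple hasn't shipped E2EE iCloud photo storage.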
 
Let me ask a hypothetical.

If the government wanted to install a camera in your home, to make sure that your child wasn't being abused in any way, with the promise that nobody would ever look at your spouse walking around in their underwear, would you be OK with it?

That's what Apple is doing...
 
It was, because there was no reason to suspect abuse, whereas this system is literally designed to put people in jail. The faces that get grouped together in my pictures are just anonymous faces until I give the phone info on who each person is (which I never did; I know who they are, and my phone doesn't need to know).

Apple's intent isn't my worry; the intent of those providing the database is. Apple's execution is inherently incompatible with US societal values and is ethically highly questionable (eroding everyone's Fourth Amendment protections to convict a few wrongdoers). We're supposed to have the protections of probable cause and warrants, which, conveniently, private businesses don't have to honor. It's not up to Apple to do the job of law enforcement (obviously they should report illegal content if they happen across it, but they shouldn't actively search our devices for it).

This is one of the times I want Apple to follow one of the industry-standard practices below:
1. Scan the server, and only the server, and turn over data only when shown a warrant supported by probable cause.
2. Change iCloud so that Apple is only a secure tunnel between my iPhone, iPad, and Mac and doesn't store any data on cloud servers.
3. Scrap the whole NeuralHash thing AND end-to-end encrypt iCloud.
I guess you didn't know there is machine learning running over your photos all the time, finding pictures of cars and cats, but I guess that could never be used for any other type of image, eh?
 
Apple must be getting weary of all this non-stop dancing :D

Never fear, though: The ITA contingent will be along shortly to explain to all us scare-mongering Neanderthals how everything is just fine and nobody need worry.

Meanwhile, my exodus from Apple began today in earnest: Just got back home with the Garmin watch that will likely replace my Apple Watch.
 
Let me ask a hypothetical.

If the government wanted to install a camera in your home, to make sure that your child wasn't being abused in any way, with the promise that nobody would ever look at your spouse walking around in their underwear, would you be OK with it?

That's what Apple is doing...
Except that camera wouldn't report anything unless the footage exactly matched footage they already have on file, which wouldn't be possible unless you actually had that exact footage and played it in front of the camera. Nice try, though.
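One correction worth making here: Apple's technical summary describes NeuralHash as a perceptual hash, so it is designed to also match re-encoded, resized, or recompressed copies, not only bit-exact files. Conceptually, the on-device check reduces to a set-membership test on hashes. A toy sketch, where `perceptualHash` is a hypothetical stand-in for the actual model:

```swift
import Foundation

// Toy sketch of the matching step only. `perceptualHash` is a hypothetical
// stand-in for Apple's NeuralHash model; in the real system the device holds
// a *blinded* database and learns nothing locally from a match or non-match.
func matchesKnownDatabase(
    _ imageData: Data,
    perceptualHash: (Data) -> Data,
    knownHashes: Set<Data>
) -> Bool {
    // Only fixed-length hashes are compared, never the photo content itself,
    // and visually similar copies hash to the same value by design.
    return knownHashes.contains(perceptualHash(imageData))
}
```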
 
Apple must be getting weary of all this non-stop dancing :D

Never fear, though: The ITA contingent will be along shortly to explain to all us scare-mongering Neanderthals how everything is just fine and nobody need worry.
That white paper was posted last week, dude.
 
I guess you didn't know there is machine learning running over your photos all the time, finding pictures of cars and cats, but I guess that could never be used for any other type of image, eh?
Precisely why I don't have any of *those* pictures. No technology with a radio can be trusted to that level.
 
Except that camera wouldn't report anything unless the footage exactly matched footage they already have on file, which wouldn't be possible unless you actually had that exact footage and played it in front of the camera. Nice try, though.
So you'd be cool with that. Trusting that the government would "only look" if it, say, heard a child screaming.
 
Except that camera wouldn't report anything unless the footage exactly matched footage they already have on file, which wouldn't be possible unless you actually had that exact footage and played it in front of the camera. Nice try, though.
Oh my. Up until now I thought you were merely way too trusting of Apple, but now you're saying you'd be OK with the government installing monitoring in your home, to detect the possibility of child abuse?!?! :oops:

The gulf between our views is even wider than I imagined. We got nothing to talk about anymore.
 
Are you talking about this article that they posted last week with a last modified date of 8-13? https://www.apple.com/child-safety/...del_Review_of_Apple_Child_Safety_Features.pdf

Ah yes, that paper mentions the existence of a secondary scanning system in a single sentence with absolutely no other information about it. Thanks for linking, my fears have now been alleviated because Apple only published this new information after 8 days of public backlash instead of 13 days. /s

Still doesn't explain how the second hashing system works. Still doesn't explain why they took over a week to reveal the secondary system. Still doesn't explain why the initial peer review papers didn't mention it.

And most importantly: Still doesn't explain why they need to be scanning my content in the first place.
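For what it's worth, the one sentence in that 8-13 paper describes the secondary system as a second, independent perceptual hash run on Apple's servers, intended to catch adversarial NeuralHash collisions before anything reaches human review. Mechanically, that amounts to a second, different hash that must also match. A sketch with illustrative names, not Apple's actual code:

```swift
import Foundation

// Sketch of a two-stage check with illustrative names: an on-device match
// gets re-verified server-side by a second, independent perceptual hash,
// so a crafted collision against one hash alone shouldn't survive.
func passesBothChecks(
    _ imageData: Data,
    deviceHash: (Data) -> Data,   // stand-in for on-device NeuralHash
    serverHash: (Data) -> Data,   // stand-in for the second, server-side hash
    deviceDB: Set<Data>,
    serverDB: Set<Data>
) -> Bool {
    // Both independent hashes must match before any human review.
    return deviceDB.contains(deviceHash(imageData)) &&
        serverDB.contains(serverHash(imageData))
}
```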
 
Except that camera wouldn't report anything unless the footage exactly matched footage they already have on file, which wouldn't be possible unless you actually had that exact footage and played it in front of the camera. Nice try, though.

"Except that camera wouldn't report anything [because government entity known for lying and violating citizens' rights pinky swears it won't do anything else]. Checkmate friend. Nice try though."
 
All the whining is about hypothetical fears. You're complaining about something that already happens with every photo and video on cloud storage services. Google, Microsoft, and Apple have been doing this for years. I got no problem with this.
I would not really call it "whining". Apple is integrating a dangerous backdoor to control the content of your devices. If they do that check on their servers, I do not care, but I definitely do not want a counter on my iDevice that may trigger a review by some clerk at Apple who "approves" my content. Things on my devices do not have to be "approved".
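For context on that counter: Apple's threat-model paper (the 8-13 PDF linked earlier) states an initial threshold of 30 matches before anything becomes reviewable. Stripped of the cryptography (threshold secret sharing in the real design), the server-side bookkeeping amounts to something like this sketch; all names are illustrative:

```swift
// Sketch of the threshold idea from Apple's threat-model paper; names are
// illustrative. In the real design, crossing the threshold is what makes the
// matched "safety vouchers" cryptographically openable, not a mere counter.
struct AccountMatchState {
    static let reviewThreshold = 30  // Apple's stated initial threshold
    private(set) var matchedVouchers = 0

    // Returns true once human review becomes possible.
    mutating func recordMatch() -> Bool {
        matchedVouchers += 1
        return matchedVouchers >= Self.reviewThreshold
    }
}

var state = AccountMatchState()
let reviewable = state.recordMatch()  // false until the 30th match
```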
 
I’m always a little skeptical of a totally new forum member chiming in so strongly on this topic.

(particularly if it supports Apple)
Meh. I understand your skepticism, but people hear about a thing, search on a term, and get a hit on a forum. They think, "Hey, I'd like to participate in that conversation," and join up. It happens. I've done it.

In fact: That's how I ended up here. I'd actually forgotten I'd already signed up here quite a while ago to ask about older Mac Minis. Luckily, I used the same email address for my new sign-up and the system told me it was already in use. Since I run my own mail server, I knew it had to be me :p
 
Oh my. Up until now I thought you were merely way too trusting of Apple, but now you're saying you'd be OK with the government installing monitoring in your home, to detect the possibility of child abuse?!?! :oops:

The gulf between our views is even wider than I imagined. We got nothing to talk about anymore.
Oh, I thought he was using it as an analogy to what's happening on my device.
 
I’m always a little skeptical of a totally new forum member chiming in so strongly on this topic.

(particularly if it supports Apple)

I saw another account without a profile pic fervently defending Apple in all of the MR news threads about this since the unveil. They were posting so much I checked out their profile and saw they were inactive on MR for years before coming back to exclusively defend Apple's position. Very weird. Tim Cook's alt?
 
Meh. I understand your skepticism, but people hear about a thing, search on a term, and get a hit on a forum. They think, "Hey, I'd like to participate in that conversation," and join up. It happens. I've done it.

Yeah... fair... I just wish those kinds of accounts were rarer than they are.

It's nice to have people who participate in discussions on a wide range of topics, not just firing up the anger machine on one specific, very controversial topic.

Obviously anybody can participate… I just weigh those commenters much lower than others.

It just feels really weird to come totally out of the blue in a strong defense of Apple on a topic like this one.
 
 
By the way, AI photo scanning and categorizing has been done locally on your device for years now, and that was cool. How do you know Apple hasn't been using any of that data they're gathering about your photos? You don't. The only thing we can do is trust them.

The fact that they're being so open about this process is telling. It tells me they've thought it through carefully and don't intend to use it maliciously.

But then again, I carry around a tiny beacon with me everywhere I go that knows my location at all times, so if they wanted to abuse my privacy, either they've been doing it all along and I haven't noticed, or they could start at any time. Either way, it's nothing they haven't done or couldn't have done in the past.
The AI photo scanning and categorizing is a feature of the software itself. All photo software, including Adobe's, does this. It is a feature whose results stay on the device unless you choose to upload them to a cloud service. Trying to claim it is the same thing is laughable.
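For the record, the kind of on-device categorization being described here is a public API anyone can call. A minimal sketch using Apple's Vision framework; everything runs locally, and the labels never leave the process unless the app sends them somewhere:

```swift
import Vision
import CoreGraphics

// Minimal sketch of on-device photo categorization using Apple's Vision
// framework (the same kind of local ML that finds "cat" and "car" photos).
// Nothing here touches the network; results live and die in-process.
func classifyLocally(_ image: CGImage) throws -> [String] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
    // Keep only reasonably confident labels, e.g. "cat", "car".
    return (request.results ?? [])
        .filter { $0.confidence > 0.5 }
        .map { $0.identifier }
}
```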
 