
Poll: Will you leave the Apple ecosystem because of CSAM? (1,040 voters)

Status: Not open for further replies.
Kindly explain how scanning for a few old photos would have prevented anything?
This shows your lack of knowledge on this topic.
Here is a screenshot of some of the threads in this forum that can help educate you on the topic; they contain many great links and show all sides of the debate. Some really good stuff is in there.
View attachment 1839539
Did... you not understand at all what I am saying? I said having these photos would not prevent the actual act. People performing the act would just use another camera or a different phone, or simply turn off iCloud Photos. Aren't you saying the same thing I am: that having the photos would not prevent anything?

Edit: Ugh, I missed the three most critical letters I possibly could in my post. I edited the original post; I meant to write "would not have prevented" but missed the "not". Sorry about that. I'm pretty much saying the same thing you are; the rest of my post leads in the direction you're saying.
 
Instead of playing “but Apple ….”, why not try doing some basic research of your own or just read a couple of the threads here on the topic. Easy way to catch up. The relevant Federal laws have been quoted and linked more than a few times.

… you can lead a horse to water, but you can’t make it drink.
Uh, I have done my research. All the big hosting providers actively scan for this sort of thing... except Apple. Apple's numbers show 265 reports to NCMEC vs. 70,000+ from others and even 20 million from Facebook. Apple is considered a safe haven at this point. So Apple is trying to come up with a solution. And in my opinion, this is the most privacy-focused approach they could have implemented.

Why is everyone just focused on Apple? If NOBODY is required to scan for this stuff, why don't we all break down Microsoft's, Dropbox's, Google's, Facebook's, and many other companies' doors to stop them from doing the same thing Apple is trying to implement? The moment Apple does it, it's all OMG NO NO NO, yet you continue to use Google, Facebook, Dropbox, OneDrive?! It's insane how much crap Apple gets just because it's Apple.
 
Did... you not understand at all what I am saying? I said having these photos would not prevent the actual act. People performing the act would just use another camera or a different phone, or simply turn off iCloud Photos. Aren't you saying the same thing I am: that having the photos would not prevent anything?

Edit: Ugh, I missed the three most critical letters I possibly could in my post. I edited the original post; I meant to write "would not have prevented" but missed the "not". Sorry about that. I'm pretty much saying the same thing you are; the rest of my post leads in the direction you're saying.

Chuckle … that little edit changed the whole context.
 
That's how they get you. ;) They just insert it back in after everyone has upgraded.
Exactly. It wouldn't surprise me at all if they did.

If by "spyware" you are referring to the CSAM scanning software, no, it's not part of iOS 15.
...yet.

I know this is a huge thread, and I haven't read every post -- but, I just need to post my two cents. The CHILD PORN scanner is absolutely no big deal. It's NOT going to flag the photo of your kid in their first bath. It's NOT going to flag the photo your kid took of themselves nekkid. It's NOT going to flag literally any "normal" photo parents take of their kids -- and it won't even flag photos 17 year olds take of themselves to share.

It *will* flag photos that are KNOWN TO BE CHILD PORN and are traded on the dark underbelly of the internet. If you have CHILD PORN ON YOUR PHONE you need help; and hopefully the Apple scanner will find you and get you the help you need.
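The distinction this post draws (matching against a fixed database of already-known images vs. judging what a new photo depicts) can be sketched in a few lines. This is a toy illustration only: Apple's actual system uses a perceptual neural hash called NeuralHash, not the plain cryptographic hash used here, and the database contents and function names below are made up for the sketch.

```python
import hashlib

# Toy sketch of known-image matching. The real system uses NeuralHash, a
# perceptual hash that still matches resized/re-encoded copies of an image;
# SHA-256, used here only as a stand-in, matches byte-identical files only.

def fingerprint(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# Database of fingerprints of already-known illegal images
# (supplied by NCMEC in the real system; dummy bytes here).
known_db = {fingerprint(b"known-illegal-image-1"),
            fingerprint(b"known-illegal-image-2")}

def is_flagged(image_bytes: bytes) -> bool:
    # A photo is flagged only if its fingerprint is already in the
    # database; the content of a brand-new photo is never analyzed.
    return fingerprint(image_bytes) in known_db

print(is_flagged(b"known-illegal-image-1"))  # True: it is in the database
print(is_flagged(b"babys-first-bath.jpg"))   # False: never seen before
```

This is why, under this design, an original photo of your own kid can never match: matching requires the image to already exist in the curated database.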

If this tech existed years ago, Josh Duggar wouldn't have molested his sisters for anywhere near as long as he did.

Bottom line -- if you're WORRIED about Apple finding CHILD PORN on your phone, maybe you shouldn't have ILLEGAL CHILD PORN on your phone. In other news, water is wet, and illegal **** is illegal.
I don't care. I just don't want them to hijack my phone to work for them. The scanning does nothing for me. It is extra work performed by my hardware for the benefit of somebody else. Apple sees me as guilty until proven innocent, and my phone has to do extra work to prove I'm innocent. This is wrong.

It's as if a store security guard asked all the customers to show him the contents of their pockets upon leaving the store, just to make sure nobody has stolen anything. And then people like you would say: "well, it's no big deal, you'll be fine if you don't steal anything, only shoplifters should be worried". No, I won't be fine with that. If the store believes I might be a shoplifter, then it's up to them to prove I'm guilty, not up to me to prove I'm not. I will not accept that.
 
It's as if a store security guard asked all the customers to show him the contents of their pockets upon leaving the store, just to make sure nobody has stolen anything. And then people like you would say: "well, it's no big deal, you'll be fine if you don't steal anything, only shoplifters should be worried". No, I won't be fine with that. If the store believes I might be a shoplifter, then it's up to them to prove I'm guilty, not up to me to prove I'm not. I will not accept that.
Sadly, that actually happens at Kroger and Walmart here. If the stupid door sensors that detect shoplifters go off (even on the many false positives; my old cell phone tends to trip 'em!), they stop you right there (security guard and all) and verify your receipt against each and every item in your cart. It's embarrassing, and people just stare at you.
 
It's even worse when you try using Walmart Pay. There is no printed receipt. Once, the app actually crashed when I tried to pull up the 'digital' receipt after the sensor went off, and I spent 30 minutes in their 'security room' while they called the department I bought the item from (just a stupid Linksys router!) to verify I had actually paid for it. Makes one really start to question the point of those 'tap and pay' methods now, doesn't it?
 
They never do that around here. If they did, a big scandal would ensue and people would just stop buying there. Competition among supermarkets is quite fierce, and they're all very careful when it comes to their public image.
That's not to say they don't fight shoplifters. They certainly do. But they never do anything drastic unless they're absolutely sure, 100%, that the person has indeed stolen something (e.g. they've seen him/her on a security camera, or something like that). In case of doubt they prefer not to upset the customer. An annoyed customer who's falsely accused, makes a big scandal, and then floods social media with his/her story can prove much more expensive in the long run than a bottle of shampoo he/she might have been stealing.

What usually happens when a sensor beeps is one of two things. Either the security guard just waves his hand at you, meaning that you can go, it's OK (in case that sensor is known to give false positives), or he takes a quick look at the receipt, a quick general glance over the shopping cart, then he thanks you and lets you go. They're always very polite.
 
Nope, 😂

here's an example of my last few hundred photos. My dad is in his 80s and has quite literally taken gigabytes of nothing, as he keeps accidentally triggering the camera on his iPad...

13CDD5A8-A631-4093-AAD3-06AABF03A553.jpeg
 
Well, that is what doctors very much dispute. I don't think that is really true.
Well, 10,000 doctors in the Royal College of Physicians are in agreement. To the point that in the UK, there are vape shops IN HOSPITALS to help people quit smoking. But... that's a different topic. I'd be glad to give you a LOT of links that prove Public Health and FDA corruption, as well as why the MSA made smoking profitable for states, and vaping threatened that. Feel free to send me an IM. :)
 
I know this is a huge thread, and I haven't read every post -- but, I just need to post my two cents. The CHILD PORN scanner is absolutely no big deal. It's NOT going to flag the photo of your kid in their first bath. It's NOT going to flag the photo your kid took of themselves nekkid. It's NOT going to flag literally any "normal" photo parents take of their kids -- and it won't even flag photos 17 year olds take of themselves to share.

It *will* flag photos that are KNOWN TO BE CHILD PORN and are traded on the dark underbelly of the internet. If you have CHILD PORN ON YOUR PHONE you need help; and hopefully the Apple scanner will find you and get you the help you need.

If this tech existed years ago, Josh Duggar wouldn't have molested his sisters for anywhere near as long as he did.

Bottom line -- if you're WORRIED about Apple finding CHILD PORN on your phone, maybe you shouldn't have ILLEGAL CHILD PORN on your phone. In other news, water is wet, and illegal **** is illegal.
I'm not worried about Apple finding child porn on my phone. I don't have any. That's not the point. I'm worried about the precedent it sets by allowing Apple to scan things ON MY DEVICE and report them to the authorities. That's a big difference from them scanning their own servers for stuff you upload. A person on this board made a comparison to driving a car that you purchased, but every time you speed it notifies the police.

When Apple is touting "What happens on your phone stays on your phone", this is a huge change from that stance. It doesn't matter that they're scanning for CSAM. It doesn't matter WHAT they're scanning for. The very fact that they are scanning AT ALL on your own device should worry anybody who values their privacy.

The bottom line is, we're paying for a device that could be used against us. And people's response is "Then don't do anything illegal". Did you know that in SOME countries it's illegal to have a picture of Winnie the Pooh? Think about that. Apple could just as easily be swayed by that government to scan for Winnie the Pooh pictures and report it, instead of child porn.

Now do you see the issue?
 
Uh, I have done my research. All the big hosting providers actively scan for this sort of thing... except Apple. Apple's numbers show 265 reports to NCMEC vs. 70,000+ from others and even 20 million from Facebook. Apple is considered a safe haven at this point. So Apple is trying to come up with a solution. And in my opinion, this is the most privacy-focused approach they could have implemented.

Why is everyone just focused on Apple? If NOBODY is required to scan for this stuff, why don't we all break down Microsoft's, Dropbox's, Google's, Facebook's, and many other companies' doors to stop them from doing the same thing Apple is trying to implement? The moment Apple does it, it's all OMG NO NO NO, yet you continue to use Google, Facebook, Dropbox, OneDrive?! It's insane how much crap Apple gets just because it's Apple.
The difference, again, is that Google, Facebook, et al are scanning once you upload to THEIR servers. Apple is doing the scans on our own devices. It's about privacy, and the potential for abuse.
 
Sadly, that actually happens at Kroger and Walmart here. If the stupid door sensors that detect shoplifters go off (even on the many false positives; my old cell phone tends to trip 'em!), they stop you right there (security guard and all) and verify your receipt against each and every item in your cart. It's embarrassing, and people just stare at you.
Ahh... but that is because the door sensors made a DISCOVERY.

The difference, as pointed out, is that Apple is asking you to empty your pockets every single time, whether or not the door sensor went off, "just in case".
 
I believe that, as a hosting provider for files, Apple is legally required to scan for this sort of thing, which is why Dropbox, OneDrive, Google Drive and others do the same. So Apple needs to do something; whether it's on-device or somewhere else is up to them. But they need to do SOMETHING. Apple chose this option because, honestly, it really is the most privacy-focused approach. The alternative would be to not encrypt data, or to have a backdoor for the encrypted files on their servers in order for the scans to run.
Nuuuuuu.... Apple is not legally required to scan for anything.
In fact, the law specifically states that providers are not required to go out of their way to scan.


Here's an interesting part of this: 18 U.S.C. § 2258A, subsection (f):

(f) Protection of Privacy.-Nothing in this section shall be construed to require a provider to-
(1) monitor any user, subscriber, or customer of that provider;
(2) monitor the content of any communication of any person described in paragraph (1); or
(3) affirmatively search, screen, or scan for facts or circumstances described in sections (a) and (b).

Now... read that again carefully. NOTHING in this section shall be construed to *require* a provider to...
... MONITOR ANY USER, SUBSCRIBER, OR CUSTOMER
... MONITOR THE CONTENT OF ANY COMMUNICATION...
... AFFIRMATIVELY SEARCH, SCREEN OR SCAN FOR FACTS OR CIRCUMSTANCES.


That being said, this is a CHOICE by Apple... and NOT A REQUIREMENT. In fact, the law specifically says that they are NOT REQUIRED to scan, monitor or search for CSAM. Just to report it if it is discovered.
 
I'm not worried about Apple finding child porn on my phone. I don't have any. That's not the point. I'm worried about the precedent it sets by allowing Apple to scan things ON MY DEVICE and report them to the authorities. That's a big difference from them scanning their own servers for stuff you upload. A person on this board made a comparison to driving a car that you purchased, but every time you speed it notifies the police.

When Apple is touting "What happens on your phone stays on your phone", this is a huge change from that stance. It doesn't matter that they're scanning for CSAM. It doesn't matter WHAT they're scanning for. The very fact that they are scanning AT ALL on your own device should worry anybody who values their privacy.

The bottom line is, we're paying for a device that could be used against us. And people's response is "Then don't do anything illegal". Did you know that in SOME countries it's illegal to have a picture of Winnie the Pooh? Think about that. Apple could just as easily be swayed by that government to scan for Winnie the Pooh pictures and report it, instead of child porn.

Now do you see the issue?
Many don't understand the difference between looking for someone who committed a crime and looking for a crime someone committed.
 
It's even worse when you try using Walmart Pay. There is no printed receipt. Once, the app actually crashed when I tried to pull up the 'digital' receipt after the sensor went off, and I spent 30 minutes in their 'security room' while they called the department I bought the item from (just a stupid Linksys router!) to verify I had actually paid for it. Makes one really start to question the point of those 'tap and pay' methods now, doesn't it?

Hmm, I use Walmart Pay all the time and the Walmart app has never crashed (let alone crashed and refused to reopen). Sounds like you just had some bizarre rotten luck that day. Unfortunately, Walmart doesn't know you from Adam, and there are a ton of dirtbags out there who would steal items like that, so I can't blame them for being cautious.

My only complaint about the Walmart app is that when it updates, you have to sign in again to use Walmart Pay, but the app won't let you paste a password in. So I have to manually type my random, long, strong password each time that happens (and since my apps update automatically, I don't find out until I'm trying to check out). It won't let me copy/paste from my password manager app.
 
I don't care. I just don't want them to hijack my phone to work for them. The scanning does nothing for me. It is extra work performed by my hardware for the benefit of somebody else. Apple sees me as guilty until proven innocent, and my phone has to do extra work to prove I'm innocent. This is wrong.

That's simply not true. If they thought you were guilty, they wouldn't need to scan anything - they'd simply report you to the authorities. It's just like walking through metal detectors at the airport and sending your carryon luggage through x-ray machines. Do you holler about that? I hope not! Simply checking for something illegal/forbidden is not an accusation of guilt.

Apple doesn't know you personally, so don't take it personally. You also have a choice not to use their cloud service for storing your photos.
 
That's simply not true. If they thought you were guilty, they wouldn't need to scan anything - they'd simply report you to the authorities. It's just like walking through metal detectors at the airport and sending your carryon luggage through x-ray machines. Do you holler about that? I hope not! Simply checking for something illegal/forbidden is not an accusation of guilt.

Apple doesn't know you personally, so don't take it personally. You also have a choice not to use their cloud service for storing your photos.
It's not tied to iCloud. Some blogger claimed that, but it is independent of it.

Technically it makes no sense anyway, because each file is already scanned on the fly for viruses, and of course hashed. So there is no reason to do this on users' phones.

It is what it is (don't mention the war): Apple is introducing mass surveillance and erasing all privacy for future generations. That's what it's about. Only that.
 
That's simply not true. If they thought you were guilty, they wouldn't need to scan anything - they'd simply report you to the authorities. It's just like walking through metal detectors at the airport and sending your carryon luggage through x-ray machines. Do you holler about that? I hope not! Simply checking for something illegal/forbidden is not an accusation of guilt.

Apple doesn't know you personally, so don't take it personally. You also have a choice not to use their cloud service for storing your photos.
It simply is true. In your analogy, it would be like the TSA going into your home because they know you are going to go on your way later that day, then letting you go on your way, and at some random point in the future arresting you based on something from way in the past. (After a search warrant for your whole home is executed, and you have no say in any of this.)

By the way, nothing in my scenario is personal either. The TSA doesn't know me personally.
 
It simply is true. In your analogy, it would be like the TSA going into your home because they know you are going to go on your way later that day, then letting you go on your way, and at some random point in the future arresting you based on something from way in the past. (After a search warrant for your whole home is executed, and you have no say in any of this.)

Wow, please try to follow the bouncing ball. Apple is only "searching" items (photos) that you have specifically chosen to upload to their iCloud service. They aren't searching the entire contents of your phone. And of course, it's not really Apple searching (as in a human agent employed by Apple), but software. They learn absolutely nothing from the scan results except about illegal images.

And where are you getting this "way in the past" nonsense? If you attempt to upload 30+ illegal images, Apple will immediately be notified and begin their investigation. And the "you have no say" is nonsense as well. Don't want your photos to be scanned? Don't use iCloud for photos. Simple as that.
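The 30-image figure mentioned above can be sketched as a simple gate. This is a heavily simplified illustration: in Apple's published design, each matched upload produces an encrypted "safety voucher", and threshold secret sharing means Apple can decrypt nothing at all until roughly 30 vouchers exist; the function name and plain count below are stand-ins for that cryptographic mechanism.

```python
# Simplified sketch of the reporting threshold described above. The real
# design uses threshold secret sharing over encrypted "safety vouchers";
# here that is reduced to a plain count of matched uploads.

THRESHOLD = 30  # approximate figure cited in the thread

def review_triggered(match_flags: list[bool], threshold: int = THRESHOLD) -> bool:
    # Below the threshold no human review happens at all, so a handful
    # of isolated false positives on an account reveals nothing.
    return sum(match_flags) >= threshold

print(review_triggered([True] * 29))                # False: one short of 30
print(review_triggered([True] * 30 + [False] * 5))  # True: threshold met
```

The design intent behind the threshold is that the expected false-positive rate per image, compounded over the threshold, makes an accidental account flag astronomically unlikely.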
 
It's not tied to iCloud. Some blogger claimed that, but it is independent of it.

So Erik Neuenschwander, Apple's Head of Privacy, is "some blogger"?

. . . if a user does not like this kind of functionality, they can choose not to use iCloud Photos and if iCloud Photos is not enabled, no part of the system is functional.

 