
Will you leave the Apple ecosystem because of CSAM?


Status
Not open for further replies.
Wow, please try to follow the bouncing ball. Apple is only "searching" items (photos) that you have specifically chosen to upload to their iCloud service. They aren't searching the entire contents of your phone. And of course, it's not really Apple searching (as in a human agent employed by Apple) but software. Apple learns nothing from the scan results unless known illegal images are matched.

And where are you getting this "way in the past" nonsense? If you attempt to upload 30+ illegal images, Apple will immediately be notified and begin their investigation. And the "you have no say" is nonsense as well. Don't want your photos to be scanned? Don't use iCloud for photos. Simple as that.
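For what it's worth, the mechanism being argued about here is a hash comparison against a database of known images, with a per-account threshold that must be crossed before anything is flagged for human review. Here is a minimal sketch of that threshold logic only; the function name, hash representation, and exact threshold are illustrative assumptions, not Apple's actual NeuralHash/private-set-intersection implementation:

```python
# Illustrative sketch of threshold-based hash matching. This is NOT
# Apple's actual protocol (which uses perceptual hashes and a
# cryptographic private set intersection); it only shows the
# "nothing is flagged until N matches" idea discussed in the thread.

MATCH_THRESHOLD = 30  # Apple publicly cited roughly 30 matches


def should_flag_for_review(upload_hashes, known_bad_hashes,
                           threshold=MATCH_THRESHOLD):
    """Count how many photos queued for cloud upload match the known
    database; flag for human review only at or above the threshold."""
    matches = sum(1 for h in upload_hashes if h in known_bad_hashes)
    return matches >= threshold


# Example: 29 matches stays under the threshold, 30 crosses it.
bad = {f"hash{i}" for i in range(100)}
assert not should_flag_for_review([f"hash{i}" for i in range(29)], bad)
assert should_flag_for_review([f"hash{i}" for i in range(30)], bad)
```

Note the design choice the posters are debating: the comparison runs on-device, but only over photos queued for iCloud upload, and no single match reveals anything until the threshold is reached.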
You completely missed it, didn't you?
 
No, I'm not leaving over this. Leaving Apple would mean going back to Android which has okay options for phones and horrible options for tablets. And more importantly, getting the cash to buy those things. I understand people don't like it, I'm not a big fan of it either, but the system seems designed to only find CSAM creators and traders. I can't for the life of me think of any other people this would reasonably work for.
 
You completely missed it, didn't you?

Were you looking in the mirror as you typed that? If that was directed at me, this would be the part where you explain exactly what I missed instead of leaving it as a cop-out rhetorical question reply.
 
Were you looking in the mirror as you typed that? If that was directed at me, this would be the part where you explain exactly what I missed instead of leaving it as a cop-out rhetorical question reply.
Right, I was too busy missing the bouncing ball. Got it.
 
Right, I was too busy missing the bouncing ball. Got it.

Apparently you still are. You have absolutely no rebuttal to what I said. Hard to argue with facts, huh?

Notice I followed my remark about you missing the bouncing ball with an explanation of how that was so. You simply tell me I "completely missed it" yet offer no rational explanation after that.
 
Wow, please try to follow the bouncing ball. Apple is only "searching" items (photos) that you have specifically chosen to upload to their iCloud service. They aren't searching the entire contents of your phone. And of course, it's not really Apple searching (as in a human agent employed by Apple) but software. Apple learns nothing from the scan results unless known illegal images are matched.

And where are you getting this "way in the past" nonsense? If you attempt to upload 30+ illegal images, Apple will immediately be notified and begin their investigation. And the "you have no say" is nonsense as well. Don't want your photos to be scanned? Don't use iCloud for photos. Simple as that.

Reports to NCMEC do not initiate an immediate investigation. It can take weeks, even years, before it reaches the level of an actual LEO knocking on your door.

NCMEC gets thousands of reports. Tens of thousands.
I am unsure how many go directly to law enforcement. Cloud providers that scan, report their findings to NCMEC, not the DOJ or FBI.
 
No, I'm not leaving over this. Leaving Apple would mean going back to Android which has okay options for phones and horrible options for tablets. And more importantly, getting the cash to buy those things. I understand people don't like it, I'm not a big fan of it either, but the system seems designed to only find CSAM creators and traders. I can't for the life of me think of any other people this would reasonably work for.
All due respect, but if you cannot think of any other people this (Apple scanning on-device) would work against, you should think a little more about it. It only takes a popular flag, say 'Don't Tread on Me', being declared a 'threat' to the government, and then a law designating that flag a marker of terrorist activity (keep in mind DHS literally just came out and all but labeled people who don't like this Admin potential terrorists, and Obama didn't like those with 'Bibles and guns'), and the government can then force Apple (gotta 'follow the law') to scan for those images.

I am not debating the designation/political activities I mentioned.. it could just as easily be images or flags associated with other political affiliations. The point is that once scanning is allowed for designated material.. the precedent has been set.

It's always 'for the children'.. but that is BS. Scanning devices, without a warrant or an actual crime that is being investigated is a blatant breach of our rights. Not liking certain groups and therefore the government going after them doesn't make it right.
 
Reports to NCMEC do not initiate an immediate investigation. It can take weeks, even years, before it reaches the level of an actual LEO knocking on your door.

Source?

Also, I was referring to Apple's investigation (to confirm it's CSAM before submitting a report to NCMEC).

In any case, I'm not sure why you or the other guy think it matters how long after the crime was committed the police investigation takes place. Unless there's a statute of limitations for prosecuting a crime, then I don't see an issue.
 
All due respect, but if you cannot think of any other people this (Apple scanning on-device) would work against, you should think a little more about it. It only takes a popular flag, say 'Don't Tread on Me', being declared a 'threat' to the government, and then a law designating that flag a marker of terrorist activity (keep in mind DHS literally just came out and all but labeled people who don't like this Admin potential terrorists, and Obama didn't like those with 'Bibles and guns'), and the government can then force Apple (gotta 'follow the law') to scan for those images.

I am not debating the designation/political activities I mentioned.. it could just as easily be images or flags associated with other political affiliations. The point is that once scanning is allowed for designated material.. the precedent has been set.

It's always 'for the children'.. but that is BS. Scanning devices, without a warrant or an actual crime that is being investigated is a blatant breach of our rights. Not liking certain groups and therefore the government going after them doesn't make it right.
We're just going to have to disagree then, because as much as I dislike groups like the KKK or the Proud Boys, I can admit that the majority of them are most likely not stockpiling political images on their devices on the level that pedophiles stockpile CSAM. History has shown us time and time again that CSAM creators and traders often amass massive collections and are often linked to communities that work hard to hide from detection.

If the US government uses Apple to catch right-wing people, that'd be funny, because they're out and in the open right now on every social media platform. At least in the way Apple designed this, this will only catch predators.
 
By the way, Apple can remote control your iPhone. They’ve done it with mine. What kind of back door is this? Why aren’t people worried about that?
 
By the way, Apple can remote control your iPhone. They’ve done it with mine. What kind of back door is this? Why aren’t people worried about that?
I've made that argument before, haven't gotten a decent response to that yet.
 
Simply checking for something illegal/forbidden is not an accusation of guilt.
It is, if it takes place on my property without my permission. Your analogy with the metal detectors in airports doesn't really stand. This is more like the police searching my house for illegal stuff without a warrant.
I don't mind if they scan the photos in their cloud. But my phone is mine, it belongs exclusively to me, and whatever I store on it is nobody else's business. It's only when it gets into Apple's cloud that it becomes Apple's business. Once it's stored in the cloud they may do with it what they wish, as long as they do it on their own servers, using their own resources. They have no right to hijack my phone to do their work.
 
It is, if it takes place on my property without my permission.

1. You WOULD be giving permission if you agreed to the TOS for both iOS and iCloud (if and when CSAM detection is implemented).
2. Even if you didn't give permission, merely searching for something illegal is not the same as accusing somebody of doing something illegal. Your problem in that case is with the legality of the search itself - that's a separate topic. The search can't imply guilt when you withhold permission and then magically stop implying guilt the moment you grant it.

I don't mind if they scan the photos in their cloud. But my phone is mine, it belongs exclusively to me, and whatever I store on it is nobody else's business.

Cool - then don't choose to store your photos on iCloud, and your photos will not be scanned. I fail to see the issue.
 
I've made that argument before, haven't gotten a decent response to that yet.
I mean all they'd have to do is change some lines of code so they don't have to ask permission and boom, they have free rein over your phone. That's what people sound like when they say Apple is going to abuse this CSAM detection and use it for other stuff, or completely ignore iCloud (which is a crucial part of this entire setup, so without it, it just doesn't work at all) and scan all of your local content (not even just photos).

I don't see that happening, just like I didn't see them remote logging into our devices whenever they want (which they definitely can do right now).
 
Cool - then don't choose to store your photos on iCloud, and your photos will not be scanned.

It's not that simple. They've lured me into spending a lot of money on their phones and tablets, and the iCloud and overall ecosystem was a very strong point in their marketing, and one of the major reasons why I went with Apple. Now that they've got me tied to their ecosystem, they start imposing unbearable conditions for continuing to use it, on the grounds that "you're free to leave if you don't like it". That's unfair to say the least.

Anyway, as I said, I don't object to Apple scanning what I send to the cloud. I only object to my own device, my processing power, my battery, being used for that. My own device is doing extra work for them. That's not right.
 
I mean all they'd have to do is change some lines of code so they don't have to ask permission and boom, they have free rein over your phone. That's what people sound like when they say Apple is going to abuse this CSAM detection and use it for other stuff, or completely ignore iCloud (which is a crucial part of this entire setup, so without it, it just doesn't work at all) and scan all of your local content (not even just photos).

I don't see that happening, just like I didn't see them remote logging into our devices whenever they want (which they definitely can do right now).
Same, people can and will list off a number of doomsday scenarios that either do not happen or if they do happen it's nowhere as horrible as it's made out to be.
 
I'm not worried about Apple finding child porn on my phone. I don't have any. That's not the point. I'm worried about the precedent that it sets, by allowing Apple to scan things ON MY DEVICE and report it to the authorities. That's a big difference from them scanning their own servers for stuff you upload. A person on this board made a comparison to driving a car that you purchased, but that notifies the police every time you speed.

When Apple is touting "What happens on your phone stays on your phone", this is a huge change from that stance. It doesn't matter that they're scanning for CSAM. It doesn't matter WHAT they're scanning for. The very fact that they are scanning AT ALL on your own device should worry anybody who values their privacy.

The bottom line is, we're paying for a device that could be used against us. And people's response is "Then don't do anything illegal". Did you know that in SOME countries it's illegal to have a picture of Winnie the Pooh? Think about that. Apple could just as easily be swayed by that government to scan for Winnie the Pooh pictures and report it, instead of child porn.

Now do you see the issue?
They're scanning on your phone *only* when you upload to iCloud. Because iCloud is encrypted. Don't want scans? Don't use their photo hosting service. Not really an issue IMHO.
 
Don't want scans? Don't use their photo hosting service. Not really an issue IMHO.
It is an issue when you've been touting your magnificent cloud for over a decade, luring me into spending thousands of dollars on your exquisitely connected devices. Now that you've got my money, you tell me "just don't use it if you don't like it". That's not really fair, is it?
I do want a photo hosting service, and I've chosen Apple because it was marketed as honest and privacy-oriented.
 
It is an issue when you've been touting your magnificent cloud for over a decade, luring me into spending thousands of dollars on your exquisitely connected devices. Now that you've got my money, you tell me "just don't use it if you don't like it". That's not really fair, is it?
Uh, don't put child porn on their servers, and you can continue to use their cloud based photo storage.

This isn't complicated.
 
It's not that simple. They've lured me into spending a lot of money on their phones and tablets, and the iCloud and overall ecosystem was a very strong point in their marketing, and one of the major reasons why I went with Apple. Now that they've got me tied to their ecosystem, they start imposing unbearable conditions for continuing to use it, on the grounds that "you're free to leave if you don't like it". That's unfair to say the least.

Anyway, as I said, I don't object to Apple scanning what I send to the cloud. I only object to my own device, my processing power, my battery, being used for that. My own device is doing extra work for them. That's not right.
Your device is ALREADY doing work for them. It's decrypting the encrypted apps/music and confirming you paid them the right amounts of money to use those files. It's calling home to see if you paid enough money, in the right way, to have your SIM unlocked. It performs diagnostics, reports errors, etc. All of that is work Apple wants done - and it's happening on your device!

Wait'll you hear about what Android phones do without your knowledge!
 
We're just going to have to disagree then, because as much as I dislike groups like the KKK or the Proud Boys, I can admit that the majority of them are most likely not stockpiling political images on their devices on the level that pedophiles stockpile CSAM. History has shown us time and time again that CSAM creators and traders often amass massive collections and are often linked to communities that work hard to hide from detection.

If the US government uses Apple to catch right-wing people, that'd be funny, because they're out and in the open right now on every social media platform. At least in the way Apple designed this, this will only catch predators.
I am guessing that the sick humans that traffic what Apple is going after are much more complex in how they store/trade/distribute than just being on their phone. Much more than QMAGA folks IMO.
 
Your device is ALREADY doing work for them. It's decrypting the encrypted apps/music and confirming you paid them the right amounts of money to use those files. It's calling home to see if you paid enough money, in the right way, to have your SIM unlocked. It performs diagnostics, reports errors, etc. All of that is work Apple wants done - and it's happening on your device!
Actually, as much as I'd like to challenge this, I must confess I can't. It's quite a good point.
 
We're just going to have to disagree then, because as much as I dislike groups like the KKK or the Proud Boys, I can admit that the majority of them are most likely not stockpiling political images on their devices on the level that pedophiles stockpile CSAM. History has shown us time and time again that CSAM creators and traders often amass massive collections and are often linked to communities that work hard to hide from detection.

If the US government uses Apple to catch right-wing people, that'd be funny, because they're out and in the open right now on every social media platform. At least in the way Apple designed this, this will only catch predators.
It wasn't long ago that a certain Administration (before orangemanbad) was targeting people, including members of the media, putting them on No Fly Lists and having the IRS investigate them. I put nothing past our government, no matter who is in office. This functionality would be a very dangerous tool in their hands.

I still remember sitting next to a man on a flight who saw me reading an article by Newt Gingrich.. he said, "I am probably the most liberal/progressive person you will meet.. professor at a liberal university in NJ.. but what Obama did in the dark of night on New Year's Eve (2011) scares the isssh out of me. I may be a fan of Obama and his policies now.. but what happens when a political 'opponent' of mine is in office? Could I be detained?"

This was in reference to Obama signing the NDAA (.. allowing the indefinite detention of Americans) into Law while everyone was partying on NYE 2011.

The last thing we need to do is give ANY government a path to choose what they want to search ON OUR phones.
 
Source?

Also, I was referring to Apple's investigation (to confirm it's CSAM before submitting a report to NCMEC).

In any case, I'm not sure why you or the other guy think it matters how long after the crime was committed the police investigation takes place. Unless there's a statute of limitations for prosecuting a crime, then I don't see an issue.

They have it right on their website.
Here is FB's: https://www.facebook.com/safety/onlinechildprotection
These generally go in via the NCMEC tipline: https://www.missingkids.org/cybertipline

Also watched this on a TWIT live stream on YouTube where they had an NCMEC board member, a former NCMEC board member, and a CBS investigative journalist. Leo (the host) was shaking his head over some of the stuff discussed.
 