
Will you leave the Apple ecosystem because of CSAM?


Let me ask a hypothetical.

If the government wanted to install a camera in your home, to make sure that your child wasn't being abused in any way, with the promise that nobody would ever look at your spouse walking around in their underwear, would you be OK with it?

That's what Apple is doing...

Actually, that camera would be mobile and scanning through ”X” in your home. This tech from Apple does nothing to prevent CSAM; it only flags the continued circulation of specific, already-known images.

Still, definitely get your point.
 
I guess you didn't know there is machine learning AI running in your photos all the time finding pictures of cars and cats, but I guess that could never be used for any other type of image eh?

Can you find another “example”? The two are not the same at all.
Unless you have CSAM on your device, pop it up and ask for a comparison search….

btw - as poor as that AI tech example is, it makes me even more concerned about how well developed this system really is.
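
For reference, the on-device tagging mentioned above is roughly what Apple's Vision framework already exposes to developers: images are classified locally and the labels stay on the device. A minimal sketch (the file path is made up, and this is generic Vision usage only, not the CSAM system being debated):

```swift
import Foundation
import Vision

// Hypothetical local photo; the path is made up for illustration.
let imageURL = URL(fileURLWithPath: "/Users/me/Pictures/vacation-photo.jpg")

// VNClassifyImageRequest runs Apple's built-in classifier entirely on-device.
let handler = VNImageRequestHandler(url: imageURL, options: [:])
let request = VNClassifyImageRequest()

do {
    try handler.perform([request])
    // Print the top few labels, e.g. "cat" or "car", with confidence scores.
    for observation in (request.results ?? []).prefix(5) {
        print(observation.identifier, observation.confidence)
    }
} catch {
    print("Classification failed: \(error)")
}
```

As noted later in this thread, those labels stay in the on-device Photos index; nothing in this flow reports anything off the device.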
 
This is your blind spot! You don't know what they are doing... You only know what they are telling you and have no idea what they are concealing. So, they can do anything they want, and you don't know that they aren't.

Exactly, but this has been true for the Mac since 1984, the iPhone since 2007 and the iPad since 2010.
You can't know for sure what Apple is doing or not.

However many ways there may be for Apple to do the right thing, there are certainly vastly more ways for Apple to do the wrong thing, or rather to not do the right thing.

You have to trust them if you are going to use their products and services.
 
Because now Apple is setting a precedent, by asserting the right to search your personal device for incriminating evidence. This can’t be compared to simple image analysis to help organize your vacation photos. This is a search for criminal behavior, and the search is occurring on your private property.

Most people in modern western society acknowledge that a warrant would be needed for that type of thing, and that innocent citizens should not be subjected to warrantless search and seizure, especially when they haven’t been accused of or suspected of committing a crime.

Apple is a corporation led by unelected individuals with no public oversight. If they assert this right without any pushback, then Pandora’s box will be open.

I would also ask: “Open and honest”?
Apple isn’t really saying why they are doing this (a lot of effort for little gain), why they are doing it client-side, and they are only giving us bits and pieces of the process.

Open and honest? Apparently not.
 
No-one made any such argument, those are your words.

A company proposes a solution to an existing problem, and everyone runs for the hills screaming about loss of privacy. None of you (the people against it), as adults, can do everything or have the power to protect your kids yourselves, yet you're all outraged when a company offers a solution that requires services you already have enabled. Ironically, you'd then probably blame (or, in the USA, sue) the company because you hadn't turned iCloud on to protect your kids...

You've already submitted to accepting iCloud; it needs to be on to use Apple Pay, to use AirDrop, to save passwords to iCloud Keychain, etc.

In the '80s, TV had Charlie the cat child-safety awareness programmes, such as 'Charlie says, always tell your mummy before you go off somewhere'. Do you think parents stopped their kids watching because they thought the big baddie TV companies were trying to control them???
lol settle down
If Apple's solution didn't put every user under the microscope and actually fought the root problem, we'd likely not have any issues. If someone is stupid enough to share known CSAM through iCloud, they'd be caught with just server-side scanning. What we object to is having the scanning system ON OUR DEVICES. That crosses borders that even Google wouldn't (since base Android is open-source, that can be verified). The only sure opt-out is to stop using Apple devices. I foresee the detection software becoming fully standalone on-device whether you use iCloud or not in a future update (because software never develops into something more powerful, right?)

No, you don't need iCloud for AirDrop. It uses peer-to-peer Bluetooth/WiFi.

Charlie the cat didn't then proceed to follow the kid around and blow the whistle every time the kid wanted to do something. If the kid wanted to simply walk down the street to the candy store without telling anybody, Charlie the cat was none the wiser that anything happened. Apple would know and rat on the kid. And if he bought candy that he wasn't allowed to have, Apple would have him arrested. Charlie couldn't. Also in the '80s, there were payphones everywhere, so even though the kid didn't tote a cell phone, he could still call home in many cases if he got into a jam.
 
I voted not leaving, but not comfortable with it. I’m not going to turn off iCloud photos because I’m not worried about the feature as it currently stands, but I’m definitely concerned over what it could develop into in the future. When they start detecting other types of content on my phone is when I’ll turn it off.
 
I wasn’t aware iOS and iPadOS had selective photo backup.
It doesn't have selective backup, but it isn't scanning anything right now either, so whatever can be done to our photo libraries could have been done all along in the cloud, and what happens in the cloud, and whatever software they're using to scan our stuff there, is something we have no control over. However, with the software on-device, it can be picked apart and audited by an independent security group or groups to make sure it's doing exactly what they say it's doing. At least that's how I understood it. All of this information is in Apple's white papers.
 
Not per se. But they've implied the code wouldn't be in there until the next major release. It still isn't--well, not that we know of, but part of it is. Bad optics, if nothing else.


Srsly? Have you been living under a rock? It's been one of Apple's big selling points over the competition. Heck, they're even trying to claim they're doing CSAM scanning on the devices to preserve privacy. (Which is about as tortured logic as I've ever seen.)

1. They have said the CSAM detection system would not be there until iOS 15 which is true.

2. Again, if it's plentiful, please provide a statement with a source where Apple said it was the most important thing. A statement from Apple saying "We believe privacy is a fundamental human right" is not enough evidence.
 
I voted not leaving, but not comfortable with it. I’m not going to turn off iCloud photos because I’m not worried about the feature as it currently stands, but I’m definitely concerned over what it could develop into in the future. When they start detecting other types of content on my phone is when I’ll turn it off.
I think that's the best approach. No sense in losing sleep over it until there's an issue. Apple has always taken our privacy seriously.
 
It doesn't have selective backup, but it isn't scanning anything right now either, so whatever can be done to our photo libraries could have been done all along in the cloud, and what happens in the cloud, and whatever software they're using to scan our stuff there, is something we have no control over. However, with the software on-device, it can be picked apart and audited by an independent security group or groups to make sure it's doing exactly what they say it's doing. At least that's how I understood it. All of this information is in Apple's white papers.
So if I have Photo Backup on, by default all of my photos get scanned by this proposed solution.
 
If Apple's solution didn't put every user under the microscope and actually fought the root problem, we'd likely not have any issues. If someone is stupid enough to share known CSAM through iCloud, they'd be caught with just server-side scanning. What we object to is having the scanning system ON OUR DEVICES. That crosses borders that even Google wouldn't (since base Android is open-source, that can be verified). The only sure opt-out is to stop using Apple devices. I foresee the detection software becoming fully standalone on-device whether you use iCloud or not in a future update (because software never develops into something more powerful, right?)
Yeah, but isn't the whole point that Apple is going through all this trouble precisely to set themselves apart from Facebook and Google by not scanning your photos server-side?

Currently, iOS scans your photos on-device to tag them, and this information does not leave your device.

At the same time, Apple doesn't scan the contents of your iCloud library, and evidently has no intention of doing so anytime soon.

Instead, they go out of their way to design a convoluted system that effectively scans your iCloud photos for child porn without actually scanning them in the same invasive manner that other companies currently do. And for what?

The only reason I can think of is that Apple is laying the ground for offering fully encrypted iCloud storage one day. They reportedly backed off on this initiative in 2018 at the behest of the FBI, and I am guessing that from Apple's POV, this proposed CSAM detection method offers users the best of both worlds. Nobody can access your iCloud photos that are fully encrypted (if and when Apple does offer such a feature), the data scanned from your photos never leaves your iPhone, but Apple probably still needs a way to convince law enforcement that they are not acting as a safe haven for child pornography so as to get them off their backs.

To me, while the scanning technically will take place on my device itself, it applies only to photos about to be uploaded to iCloud, so in terms of outcome there really is no difference whether it happens on my iPhone or on Apple's servers. So while the technology may have the potential to be far more invasive, I believe the reality is that it will still end up being a lot less invasive than what Facebook and Google are currently doing to weed out child pornography.
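
To make that concrete: conceptually, the on-device step just compares a perceptual hash of each photo queued for iCloud upload against a list of known hashes, and only a threshold of matches triggers anything at all. The toy sketch below is my own illustration in plain Swift with made-up values; Apple's actual design uses NeuralHash, a blinded hash database, private set intersection and threshold secret sharing, none of which is modeled here:

```swift
// Toy illustration only: hash matching with a reporting threshold.

// Hypothetical 64-bit perceptual hashes of photos about to be uploaded.
let photoHashes: [UInt64] = [0x9F3A_22C1_07D4_8E55, 0x1234_5678_9ABC_DEF0]

// Hypothetical "known" hash list (on a real device this would be blinded/opaque).
let knownHashes: Set<UInt64> = [0x9F3A_22C1_07D4_8E55]

// Hamming distance: how many bits differ between two hashes.
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}

let maxDistance = 4       // hashes this close count as a match (made-up tolerance)
let reportThreshold = 30  // matches required before anything at all is reported

let matchCount = photoHashes.filter { photo in
    knownHashes.contains { hammingDistance($0, photo) <= maxDistance }
}.count

if matchCount >= reportThreshold {
    print("Threshold crossed: safety vouchers become decryptable for human review")
} else {
    print("Below threshold: nothing is learned about any individual photo")
}
```

The point, at least as described in Apple's white papers, is that a single match reveals nothing; only crossing the threshold makes the safety vouchers decryptable for human review.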

I acknowledge that nobody really knows how such technology could evolve or be abused in the future, but is this not the very definition of a slippery slope argument? The key talking points aren’t so much about CSAM detection as it is situated today (primarily because there isn’t much controversy found there) but rather the slippery slope argument about how by launching CSAM detection now, Apple will find itself in compromising situations in the future. The fear is that countries will force Apple to find non-CSAM material. This feels a bit like a straw man argument to me, not least because there is no mechanism in place for countries to have Apple use CSAM detection to start searching for non-CSAM, and these articles just talk around all of the safeguards Apple has built into CSAM detection like they don't exist.

Even if they wanted to, there are probably way easier ways to "fix" a problematic individual than to frame them via tainting the US CSAM database (which entails so many steps that even a mission impossible movie based on this would be deemed too implausible).

Of course, there is always the possibility that I am totally and completely wrong, and 2-3 years later, Apple still hasn't offered encrypted iCloud storage, and I am left wondering "So what was the point of it all really?"
 
1. They have said the CSAM detection system would not be there until iOS 15 which is true.

2. Again, if it's plentiful, please provide a statement with a source where Apple said it was the most important thing. A statement from Apple saying "We believe privacy is a fundamental human right" is not enough evidence.
1. But the underlying mechanism has reportedly been discovered in iOS as far back as 14.3. If true, then it has already been put on our phones with NO notice (I read all the release notes, and there was no mention). It won't be active until iOS 15, but the fact it was part of an update without being mentioned isn't cool.

2. They mention fundamental human rights a lot, but:
 
More screeching voices of concern.



 
Jesus, you people are scared.

I have formed a better opinion since my previous comment - I think that I might actually have become too dependent on Apple's ecosystem.
I haven't got much against iOS or macOS or their services.. there's just something gone awry in my choices during my computing years.
Movies and music bought within Apple's services are hard to get out because of DRM. Movies are the worst. Music is doable, but I don't have 500 million CDs to burn my music library to.

Scanning for Child Pornography images that are in the CSAM registry is the least of my worries. I don't mind.. it's the half-naked Britney Spears images I'm worried about that could get me in the hands of the CIA; stepping out of the plane in the US; orange jumpsuit and chains around my legs and torso.

I wish Britney Spears could see me now. Using my flesh spear.

And yes, call me what you wish. I will stay using Apple's iCloud photos and I will suck on Apple's breasts until someone slices off my lips and cut out my tongue.

Bottomfeeders... hah.. believe in aliens too? UFOs? You're wack - all of you(meaning those who believe in aliens and ufos - not all people everywhere and not everyone on this forum)

It's sometimes like hearing tears through a waterfall on this forum.. so much crying from scared and often hostile babies that have a habit of inducing fear in the common people.
It's not the half-naked Britney Spears you have to worry about... it's that vintage Traci Lords stuff....
 
I'm not going to read 23 pages, so this has likely been said already, but...

You do realize that you do not own iOS? You are merely granted a license. You are at the mercy of Apple. Good or bad, that's how licensing rolls.

Don't like it? Here is a roll of wire and a pile of sand.. build your own
You're right. Maybe it's time (since i *DO* own the hardware) to find an open-source phone software that will run on an iPhone.
 
1. They have said the CSAM detection system would not be there until iOS 15 which is true.
Only partially true. See: [P] AppleNeuralHash2ONNX: Reverse-Engineered Apple NeuralHash, in ONNX and Python

Apple has already begun putting the code into iOS as of iOS 14.3.

2. Again, if it's plentiful, please provide a statement with a source where Apple said it was the most important thing. A statement from Apple saying "We believe privacy is a fundamental human right" is not enough evidence.
I guess you missed Apple CEO Tim Cook's open letter on Apple's (alleged) dedication to privacy. Here ya go: https://appleinsider.com/articles/1...-privacy-policies-in-open-letter-to-customers

I guess you likewise missed this, too:

(Attached image: On_Your_iPhone.jpeg)
 