
Poll: Will you leave the Apple ecosystem because of CSAM? (1,040 voters)

Apple has not done this to date.
Why now, and why use this method?
I think a user pointed out… that perhaps they have a mandate to do so. And maybe this method has privacy in mind… I can’t say. Listening to Rene speak about it… it seems Apple is trying to keep their hands as clean as possible.
Gotta love the implication in this question. If you're opposed to Apple compromising your privacy and security, you must be in favor of child abuse :rolleyes: (I have a response to that implication, but it would probably get me banned.)
I mean, I get it. You and the people who oppose this CSAM method are fully against child abuse, but you’re also totally against Apple scanning personal files/photos. It’s a tough position for Apple to be in… I highly doubt they want to be in the business of scanning users’ personal files/photos, given their track record on privacy.
 
No, and the people who will leave are maybe 0.5% of Apple’s current customers. Most of them will come back or buy another iPhone anyway. Right now it’s all just whining.
Watching the Taliban sweep across Afghanistan, I am reminded of just how carefully one has to guard liberty and how quickly it can evaporate through complacency. I don't think this issue is going to go away, and I do think it will hurt Apple's sales. I'm not upgrading any time soon, which is a shame since I have been saving up for a refresh of my technology. If there were a viable alternative to Apple, I would definitely pursue it.

EDIT: Actually, I think there should be another option for the poll: 'No, I won't be leaving Apple because of this, but I will be voting for candidates who promise to make this kind of surveillance illegal.'
 

"The first protection against mis-inclusion is technical: Apple generates the on-device perceptual CSAM hash database through an intersection of hashes provided by at least two child safety organizations operating in separate sovereign jurisdictions – that is, not under the control of the same government. Any perceptual hashes appearing in only one participating child safety organization’s database, or only in databases from multiple agencies in a single sovereign jurisdiction, are discarded by this process, and not included in the encrypted CSAM database that Apple includes in the operating system. This mechanism meets our source image correctness requirement"
 
Yeah yeah, but WHAT IF CSAM isn't CSAM? What then? You don't know what they're putting on your phone! It could be anything! You will never know! They could be looking for pictures of door knobs and then BOOM you're arrested! Crazy man, just crazy
I totally agree with you and I understand why there is cause for concern. Personally, though, I won't worry about it until I need to worry about it. I have been using iCloud Photos since iOS 8/9. If they really cared about my photos, they would have had access to them for years already.
 
I ask you... would you rather Apple not fight against child abuse when they have the technology to do so? I'm curious... what alternative method do you think Apple should use?
Let Law Enforcement handle it. It’s part of what they do. Why should everyone give up “a little” privacy every time a tech firm decides they want to fight “[X] Abuse” or “[Y] Harassment” (or whatever other cause they want to support for whatever reason)? How many X or Y causes are there?

And that’s the argument I see a lot of times: “It’s a small price to pay to help end X or Y.” As if the loss of privacy for all would even make a difference.
 
For this reason, no. If Apple does actually use this tech for something nefarious, then yes. Until then, my cat photos will be in iCloud.
 

"The first protection against mis-inclusion is technical: Apple generates the on-device perceptual CSAM hash database through an intersection of hashes provided by at least two child safety organizations operating in separate sovereign jurisdictions – that is, not under the control of the same government. Any perceptual hashes appearing in only one participating child safety organization’s database, or only in databases from multiple agencies in a single sovereign jurisdiction, are discarded by this process, and not included in the encrypted CSAM database that Apple includes in the operating system. This mechanism meets our source image correctness requirement"
This issue is not how Apple is using this now, but how this scheme, or something like it, could be used in the future. Basically Apple just laid the groundwork for mobile devices preprocessing data (any kind of data) for server-based AI to scan for virtually anything. Mark my words, this is the onset of serious, widespread, distributed AI surveillance.

(image: HAL 9000 panel)
 
We obviously have widely disparate philosophies relating to personal freedom, privacy, and responsibility. So far apart I suspect we're not even talking the same language.
Like I said, you do what you need to for your own sake, but I don't see this as my privacy being compromised. It's not like someone is searching through my stuff.
 
This issue is not how Apple is using this now, but how this scheme, or something like it, could be used in the future. Basically Apple just laid the groundwork for mobile devices preprocessing data (any kind of data) for server-based AI to scan for virtually anything. Mark my words, this is the onset of serious, widespread, distributed AI surveillance.

(image: HAL 9000 panel)
Apple has had on-device scanning since long before this. How do you know it hasn't been reporting home? Did you just trust them? Also, since on-device scanning already exists, couldn't they abuse that as well?

Also, before iOS 15, all Siri queries were handled on the server; now they're handled on your device.
 
Oh my... Yeah, we have no common ground upon which to discuss this productively.
So you're saying a person is going through my personal files and looking at them? Hmmm, I can't find that anywhere in the articles and documentation.
 
I think a user pointed out… that perhaps they have a mandate to do so. And maybe this method has privacy in mind… I can’t say. Listening to Rene speak about it… it seems Apple is trying to keep their hands as clean as possible.

I mean, I get it. You and the people who oppose this CSAM method are fully against child abuse, but you’re also totally against Apple scanning personal files/photos. It’s a tough position for Apple to be in… I highly doubt they want to be in the business of scanning users’ personal files/photos, given their track record on privacy.

There is a lot of misinformation on this. If Apple were really scanning outside of warrants and subpoenas, the number of found issues would have far exceeded the 265 they reported last year. Apple’s docs say they can, not that they would. There is no law today that says they “have to” scan.
 
But it should not happen on my device. I didn’t buy my phone so it could spy on me.

There are only three places to check.

1. On your device before Apple receives them
2. On a third party server in between
3. On Apple's servers after they receive the images

If Apple wants to stop this material from coming onto their servers at all (which I don't think is the case), they can't use option 3. And if you won't have it happen on 1, that only leaves option 2.

Would you like to get a third party involved?
 
Why do folks keep insisting that the choice is simply iOS vs Android? It isn’t.
It is iOS vs Google vs Samsung vs OnePlus vs Sony vs Huawei vs ….
There are a lot of flavors of Android, and they are quite different.
Correct. It's all Android in the backend.

Huawei is a really bad example to convince me about privacy concerns by the way.
 
There are only three places to check.

1. On your device before Apple receives them
2. On a third party server in between
3. On Apple's servers after they receive the images

If Apple wants to stop this material from coming onto their servers at all (which I don't think is the case), they can't use option 3. And if you won't have it happen on 1, that only leaves option 2.

Would you like to get a third party involved?
Why not go back to Apple's servers simply providing a tunnel between your iPhone and Mac? Nothing stored, just relaying encrypted traffic between your own devices. Then they don't have to worry about nefarious content being stored on their hardware.
 
Why not go back to Apple's servers simply providing a tunnel between your iPhone and Mac? Nothing stored, just relaying encrypted traffic between your own devices. Then they don't have to worry about nefarious content being stored on their hardware.
I would LOVE to use my Mac as my "cloud" and have my photos update wirelessly with end to end encryption. That would be sick! However, I'm not opposed to using iCloud either. Just sucks to have to pay for more storage.
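For what it's worth, here is a rough sketch of what that relay-only idea could look like: the photo gets encrypted end to end with a key only your own devices hold, so the server in the middle just shuttles ciphertext. It uses Apple's CryptoKit, but the key handling and names are my own assumptions, not an actual Apple design:

```swift
import CryptoKit
import Foundation

// Assumed to be shared between the iPhone and the Mac out of band
// (e.g. via keychain sync); the relay server never sees this key.
let sharedKey = SymmetricKey(size: .bits256)
let photo = Data("raw photo bytes".utf8)

do {
    // Sender (iPhone): encrypt before handing anything to the relay.
    let sealed = try ChaChaPoly.seal(photo, using: sharedKey)
    let wireBlob = sealed.combined        // nonce + ciphertext + tag; safe to relay

    // Receiver (Mac): decrypt once the relay delivers the blob.
    let box = try ChaChaPoly.SealedBox(combined: wireBlob)
    let restored = try ChaChaPoly.open(box, using: sharedKey)
    assert(restored == photo)             // the relay only ever handled ciphertext
} catch {
    print("Round trip failed: \(error)")
}
```

With something like this, the server genuinely can't look at the relayed content, which is exactly why it would also sidestep the "nefarious content stored on their hardware" worry.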
 
There's a difference. Other vendors have the scanning software running on their server and that software can only scan what YOU feed it by uploading your images.

Apple's CSAM software runs on your device. It can basically scan all the images on your phone, but Apple promises that it only scans your images before uploading them to the cloud.

But do we believe that promise? The software is already present on your device...

Spotlight also scans most of my files. iCloud Backup scans most of my files. Photos scans every photo in my library. Apple could just make small changes to those technologies and they would be superb for catching a lot more.

When a device has an Internet connection and the operating system is optimised to be installed, updated, and configured from servers belonging to the same company that makes the OS, you really have no choice but to trust them. Their capability is almost unlimited when it comes to changing the software.

The CSAM detection tool just shows how much Apple controls these devices, but this isn't new. It was new back in 2008.

So for me, nothing has really changed. Apple is still almost all-powerful when it comes to my Apple devices and that's fine. I trust them.
 
No, I’m not going to leave, because what’s the alternative? Google? Privacy is a myth.

I’m totally against CSAM so let’s just get this out of the way.

What I don’t like is that Apple has the capability to scan your photos, messages and (?) without your knowledge. To clarify what I mean by without your knowledge: yes, they’ve announced they’re scanning your iPhone, but you can’t see the scan happening or what they’re scanning.

If they were just scanning iCloud I wouldn’t have a problem with that, because it’s their server so they set the standards. My issue is they can remotely and secretly scan your device. Apple has a policy that it will comply with the local laws of whatever country they do business in. You can bet that if China tells Apple they want them to scan iMessages on certain iPhones, they will do it.
 
Then what are you doing spending large sums of money with a company you don't trust? They've explained how the system will be used. If you don't believe them, then you have no reason to believe anything they've ever said about privacy.
It’s called marketing. Apple doesn’t give a crap about privacy and they never have. Apple once tried advertising but sucked at it, so they came up with a new marketing strategy. I’m a big Apple fan and love their products, so this isn’t going to change anything for me. It’s about being realistic. Apple is a corporation, not a charity. They’re not in the business of helping the world, so you have to remember that everything they do has one purpose: to sell a product and make a profit. Even their quasi-charitable actions are just to attract people to buy their products.

I’m not an Apple hater. I have an iMac, iPad, iPhone and will probably buy the next MacBook Pro. I’m just a realist. If you don’t worship these companies like a religion, you won’t be disappointed when they do something you’re not happy with. If you like the products, buy them.
 