
Will you leave the Apple ecosystem because of CSAM?

  • Total voters: 1,040
If you're happy to be guilty until Apple proves you innocent, that's fine for you.

I'm innocent, and I don't need Apple invading my privacy to prove it.

Would you allow anyone into your home to search it without any reason?

Remember who said this...

If you don’t use iCloud photos, there is no scanning.

I don't use, and never have used, cloud services: iCloud or otherwise.
 
I trusted Apple with all the other technologies on my devices and I trust them with new ones.
That is, of course, your prerogative, just as it's my prerogative not to trust them any longer. And I do not.

Get over it.
Oh, bless your heart :) I am over it, where "it" is defined as "Apple." As our devices require replacement or when Apple forces their spyware upon our existing Apple devices, they'll be replaced with non-Apple devices. Unfortunate, but necessary.

Nothing's gonna change and the sky isn't falling.
Don't look now, but it has changed. Probably irrevocably. As somebody noted in another thread: Trust: Hard to win, easy to lose, impossible to recover.

No, the sky isn't falling, but I no longer trust Apple and will no longer grace them with the benefit of my custom.
 
Meanwhile, the gubment is already trying to force Apple into installing a backdoor. Note: the bill didn't get any votes, but as we know, all they have to do is roll it into another bill (an omnibus bill) and they'll push it through.

 
And so the slippery slope begins.... on a "per country" basis... which means not necessarily just CSAM images... but any "illegal" images.
To be fair: They don't admit, or even suggest, "expansion" means "to other types of images or data." They claim:
Apple confirmed to 9to5Mac today that any expansion of the CSAM detection feature outside of the United States will take place on a country-by-country basis depending on local laws and regulations. The company did not provide a specific timetable on when, or if, it will expand CSAM detection to additional countries.
[emphasis added]
 
  • Like
Reactions: Pummers
Curious how no one even bothers to mention that you can easily opt out by not using iCloud.

Hmm... iCloud Photos is opt-in, actually.
 
  • Haha
Reactions: Pummers
To be fair: They don't admit, or even suggest, "expansion" means "to other types of images or data." They only claim that any expansion will take place on a country-by-country basis, depending on local laws and regulations.
Apple is already filtering words and content in China. Are you really so naïve as to think that this will NOT be abused?

Remember, the CSAM database is a US thing. China will have their own database.
 
  • Like
Reactions: Pummers
Judging by your stance, you will discount anything shown to you until the LEO shows up at your door.
If Apple launches this during the next year, we will likely find out.
Nobody will show up at my door. Even if that 1 in a trillion happens, there will be no case against me because I don't have anything illegal in my possession, so what are they gonna do?
 
  • Haha
Reactions: Pummers
Holy smokes! My wife just surprised the livin' daylights outta me!

I mentioned the "Apple will expand CSAM to other countries..." article. She just shook her head. "I realize how much you love your new iPad," I said. "I won't force you to get rid of it." "Yeah, but I don't want it with that stuff on it," she replied.

That's really sad. Brand new iPad Pro we just bought her for her birthday not a year ago. She loved that iPad :(

Now, Apple, I am angry
 
Apple is already filtering words and content in China. Are you really so naïve as to think that this will NOT be abused?

Remember, the CSAM database is a US thing. China will have their own database.
Apple already said that all iPhones will have the same database. It's not going to vary by country.
 
  • Haha
Reactions: dk001
Curious how you've managed to miss the numerous mentions of that in this and the other threads.
I can scroll to the top of this page and there's no mention of it, just how the LEO is going to show up at your door and such.

Again, this whole issue is moot if you just DON'T USE iCloud. It's very simple.
 
  • Like
Reactions: Jayson A
Holy smokes! My wife just surprised the livin' daylights outta me!

I mentioned the "Apple will expand CSAM to other countries..." article. She just shook her head. "I realize how much you love your new iPad," I said. "I won't force you to get rid of it." "Yeah, but I don't want it with that stuff on it," she replied.

That's really sad. Brand new iPad Pro we just bought her for her birthday not a year ago. She loved that iPad :(

Now, Apple, I am angry
What "stuff"?
 
Nobody will show up at my door. Even if that 1 in a trillion happens, there will be no case against me because I don't have anything illegal in my possession, so what are they gonna do?
As I've asked before... when can we install the camera in your house? We promise we'll just scan for child abuse, and never EVER look at your wife in her underwear.
 
I can scroll to the top of this page and there's no mention of it, just how the LEO is going to show up at your door and such.
This thread is now 45 pages long. It's been mentioned over and over and over and over and over and over and over and over and... again.

Again, this whole issue is moot if you just DON'T USE iCloud. It's very simple.
Now it's just been mentioned again.

This is pretty easy: Some, perhaps many, people don't. want. file. scanners. on. "their". devices. Clear, now?
 
I can scroll to the top of this page and there's no mention of it, just how the LEO is going to show up at your door and such.

Again, this whole issue is moot if you just DON'T USE iCloud. It's very simple.
The whole issue is moot even if you do use iCloud. Nobody is going to show up at your door for any reason if you're not collecting child porn. End of story. How people are twisting this into their own little scary world, I have no idea. No, your privacy has not been invaded, and nobody is showing up at your door.
 
Nobody will show up at my door. Even if that 1 in a trillion happens, there will be no case against me because I don't have anything illegal in my possession, so what are they gonna do?
You're missing the point, Jayson. It's not about whether or not you personally have anything illegal in your possession; it's a question of whether you're comfortable with being constantly "checked" to make sure you don't.

I mean, if you got stopped EVERY SINGLE TIME by the TSA just to "make sure" you weren't a terrorist or carrying a weapon, EVERY SINGLE TIME, you'd start to feel a bit persecuted, would you not?

How about if every single time you drove to work, the same cop stopped you and searched your car for pot? Doesn't matter if you've never had pot in your life, it's "just to protect the children", because you pass a school on your drive, and we don't want any pot dealers selling to kids.

Do you see how it's an invasion of privacy, illegal search, and even harassment? THAT is what people are fighting against.
 
What "stuff"?
Obviously the "stuff" is the ability for a remote actor to run scans on it.

The fact that so many people are okay with this makes me utterly lose faith in the security of our future. Clearly people do NOT care until the blatant negative effects that CAN happen ARE happening. That's just pure irresponsibility.
 
  • Like
Reactions: eltoslightfoot
I can scroll to the top of this page and there's no mention of it, just how the LEO is going to show up at your door and such.

Again, this whole issue is moot if you just DON'T USE iCloud. It's very simple.
That doesn't stop the client-side code and database from still landing on our devices. The only way to avoid that is to avoid iOS 15. We all read that this applies to iCloud Photos, but we object to having the software check photos ON-DEVICE. This is one of the few times where having something on-device puts us at higher risk than on-server. I'm about 3 features away from turning iCloud off entirely.
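To make the objection concrete, here is a minimal conceptual sketch, in Swift, of what "the check runs on-device, gated only by the iCloud Photos setting" means. This is NOT Apple's actual implementation (the real system uses NeuralHash, blinded hash tables, and private set intersection), and every name below is hypothetical.

```swift
import Foundation

// Conceptual sketch only -- NOT Apple's implementation. The real system uses
// NeuralHash, blinded hash tables, and private set intersection; every name
// here, and the plain Set lookup, is a hypothetical simplification.

// Hypothetical loader for an on-device database of known-CSAM perceptual
// hashes, shipped with the OS regardless of user settings.
func loadBundledHashDatabase() -> Set<String> {
    return []  // placeholder
}

// Hypothetical perceptual hash. A real perceptual hash is robust to resizing
// and recompression, unlike a cryptographic digest.
func perceptualHash(of photoData: Data) -> String {
    return String(photoData.base64EncodedString().prefix(32))
}

let knownHashes = loadBundledHashDatabase()

// The matching step runs on the device itself; the only user-visible switch
// is whether iCloud Photos upload is enabled.
func matchesKnownDatabase(_ photoData: Data, iCloudPhotosEnabled: Bool) -> Bool {
    guard iCloudPhotosEnabled else { return false }  // no upload, no matching
    return knownHashes.contains(perceptualHash(of: photoData))
}
```

The point of contention isn't the lookup itself; it's that the database and the matching code ship inside the OS, so the only switch left to the user is the iCloud Photos toggle.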
 
  • Like
Reactions: eltoslightfoot
You're missing the point, Jayson. It's not about whether or not you personally have anything illegal in your possession; it's a question of whether you're comfortable with being constantly "checked" to make sure you don't.

I mean, if you got stopped EVERY SINGLE TIME by the TSA just to "make sure" you weren't a terrorist or carrying a weapon, EVERY SINGLE TIME, you'd start to feel a bit persecuted, would you not?

How about if every single time you drove to work, the same cop stopped you and searched your car for pot? Doesn't matter if you've never had pot in your life, it's "just to protect the children", because you pass a school on your drive, and we don't want any pot dealers selling to kids.

Do you see how it's an invasion of privacy, illegal search, and even harassment? THAT is what people are fighting against.
I'm not afraid of my photos being checked, no, not one bit.

Also, about your analogies: they're stupid. No human is physically grabbing my phone and looking through it.

MY photos will be 100% private, so yeah, nothing to worry about. I'll keep using iCloud and I'll let you know when people show up at my door to arrest me.
 
  • Haha
Reactions: Pummers
That doesn't stop the client-side code and database from still landing on our devices. The only way to avoid that is to avoid iOS 15.
And everything else you "turn off" on your phone. Guess what... it's STILL THERE.
 