
Will you update to iOS 15 and let Apple’s operating system search your personal devices for CSAM?

  • Yes, I will update to iOS 15: 117 votes (68.0%)
  • No, I will stay on iOS 14.7 or earlier: 55 votes (32.0%)
  • Total voters: 172
Mass surveillance? Seriously? NOTHING stored directly on your phone is viewable by Apple or anyone else except yourself. However, if you have multiple CSAM images on your phone AND try to send those to iCloud, then you voluntarily give up all rights to privacy.



Ya think? It's like me refusing to go outside ever again because I might get struck by a meteorite. Oh, or maybe I'll be struck by lightning or gunned down in a drive-by shooting. I mean, what criminal walks down the street thinking, "Hey, let me knock this random guy out, unlock his iPhone and ruin his life by texting him some child porn, downloading it to his phone, and uploading it to his iCloud!" LOL!

In any case, I think the logic that "We can't do X because there is an extremely remote chance it may be abused or used against somebody" is flawed. ANY technology can be abused. But do we just eliminate all of it? Of course not!
I went ahead and updated to iOS 14 because all the scenarios for abuse of the contact tracing system that I ran through were very narrow in how they could be carried out. That said, I still have my old phone running iOS 13.4, without the contact tracing code, in case there were issues. This hashing of images could so easily be abused without us having any warning. Any government could threaten Apple and twist its arm into searching for other known images. I’d like to know why Apple is even launching this in the first place. I’d rather they introduce a whitelist for incoming SMS/iMessage, as that would be a much better solution for protecting children from pictures being sent to them, not to mention it would save the rest of us from relentless political messages in election years (seriously, it was so bad last year I almost cancelled my cell service).

The chances are higher than you think. There has been a shooting nearly every night for the past week in one neighborhood near me. Nobody knows why, other than bad parenting combined with social media and kids thinking GTA is real life. One of the shooters was 15 and didn’t live anywhere near the bars he shot up.

Try using an iPhone without iCloud nowadays. You have to keep iCloud turned off to prevent your phone from automatically uploading call logs and other background data, and many other features, like Apple Home, barely work or don’t work at all unless you use iCloud. You don’t have a choice: give us your data so you can turn that light on with your phone. Just like you can’t even download free apps without an Apple ID. When you ask why, you get moderated, because questioning Apple’s motives is against the TOS of the Apple Support Community.

All this does is punish those who store and receive indecent material, and it has a chilling effect on people who don’t have anything to hide, like me. It does nothing about those actually causing harm to children. The algorithms have no way to identify fresh material, only material that has already circulated and crossed paths with law enforcement in the past.

The whole “if it saves one” argument is bogus. Put millions of people and entire societies at risk to maybe catch a handful of people with dirty secrets. That’s how we ended up with the Patriot Act and let the government wiretap and collect all our communications, something we’re still dealing with 20 years later.

Maybe eliminating some technology would do the human race good. I should’ve known better than to trust a technology company. Time to dig out the DSLR.
 
Maybe someone can clear this up for me. This is something that already takes place in iCloud Photos, isn’t it? I didn’t really think the whole deal about scanning photos was “new”.

So basically the only change here is where it’s being done, not what is being done.
 
Your device already scans photos and performs object recognition to enable search and to group photos by the people in them, but there was no framework for reporting anything to the authorities, or to Apple. This update adds a whistleblower to your phone: if some of your image hashes match a secret list, you become a big red dot on Apple’s radar.
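For anyone wondering what “matching against a secret list” looks like mechanically, here is a minimal, hypothetical sketch. It is not Apple’s implementation: the real system reportedly uses a perceptual NeuralHash with blinded, encrypted matching, none of which is public; the SHA-256 stand-in, the local set, and the placeholder values below are made up purely for illustration.

```swift
import Foundation
import CryptoKit

// Toy illustration only: a plain cryptographic hash checked against a local set.
// Apple's real design uses a perceptual hash (NeuralHash) and an encrypted,
// blinded database, so neither the device nor its owner can read the list.
let knownHashes: Set<String> = [
    "3f2a...",  // hypothetical placeholder entry; the real list is opaque
]

func hexDigest(of data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

func matchesKnownList(_ imageData: Data) -> Bool {
    knownHashes.contains(hexDigest(of: imageData))
}
```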
 
I went ahead and updated to iOS 14 because all the scenarios for abuse of the contact tracing system that I ran through were very narrow in how they could be carried out. That said, I still have my old phone running iOS 13.4, without the contact tracing code, in case there were issues. This hashing of images could so easily be abused without us having any warning. Any government could threaten Apple and twist its arm into searching for other known images. I’d like to know why Apple is even launching this in the first place. I’d rather they introduce a whitelist for incoming SMS/iMessage, as that would be a much better solution for protecting children from pictures being sent to them, not to mention it would save the rest of us from relentless political messages in election years (seriously, it was so bad last year I almost cancelled my cell service).

The chances are higher than you think. There has been a shooting nearly every night for the past week in one neighborhood near me. Nobody knows why, other than bad parenting combined with social media and kids thinking GTA is real life. One of the shooters was 15 and didn’t live anywhere near the bars he shot up.

Try using an iPhone without iCloud nowadays. You have to keep iCloud turned off to prevent your phone from automatically uploading call logs and other background data, and many other features, like Apple Home, barely work or don’t work at all unless you use iCloud. You don’t have a choice: give us your data so you can turn that light on with your phone. Just like you can’t even download free apps without an Apple ID. When you ask why, you get moderated, because questioning Apple’s motives is against the TOS of the Apple Support Community.

All this does is punish those who store and receive indecent material, and it has a chilling effect on people who don’t have anything to hide, like me. It does nothing about those actually causing harm to children. The algorithms have no way to identify fresh material, only material that has already circulated and crossed paths with law enforcement in the past.

The whole “if it saves one” argument is bogus. Put millions of people and entire societies at risk to maybe catch a handful of people with dirty secrets. That’s how we ended up with the Patriot Act and let the government wiretap and collect all our communications, something we’re still dealing with 20 years later.

Maybe eliminating some technology would do the human race good. I should’ve known better than to trust a technology company. Time to dig out the DSLR.

But I bet you still go outside, in spite of the shootings. No point in living your life in fear or paranoia.

I think you're drastically underestimating how widespread CSAM is. Obviously Apple wouldn't be going through all this trouble to catch just "a few" people. And it seems you're also propagating the myth that people who download and distribute CSAM aren't part of the "actual" problem ("All this does is punish those who store and receive indecent material . . . It does nothing for those actually causing harm to children."). But they're the ones creating the demand that fuels the supply.

Look, let's cut the melodrama about "millions of people and entire societies" being at risk now. That is patently absurd. ALL that's happening is anonymous on-device CSAM detection that NEVER leaves your phone unless you upload multiple CSAM images to iCloud. So that on-device scanning is not putting you at any more risk than anything else you store on your phone, because Apple can't see or interpret any of it. You want to argue that perhaps some government could "twist" Apple's arm into abusing their technology. I have every confidence that Apple will not bow down to such a thing out of principle, and even if they did, they would inform users in that country of what they're doing, so those users could choose whether to jump ship or not.
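As a rough sketch of the "multiple images" point above: Apple has publicly described a threshold scheme in which nothing becomes reviewable until an account exceeds some number of matches. The counter-style code below is a simplification with a hypothetical threshold value; the actual design reportedly uses threshold secret sharing, so the server learns nothing at all below the threshold.

```swift
// Simplified, hypothetical illustration of the threshold idea described above.
// The real system does not keep a plain counter; it uses threshold secret sharing
// so match details stay cryptographically hidden until the threshold is crossed.
let reviewThreshold = 30  // hypothetical value, not a confirmed Apple parameter

func accountFlaggedForReview(matchingUploads: Int) -> Bool {
    return matchingUploads >= reviewThreshold
}
```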
 
Mass surveillance? Seriously? NOTHING stored directly on your phone is viewable by Apple or anyone else except yourself. However, if you have multiple CSAM images on your phone AND try to send those to iCloud, then you voluntarily give up all rights to privacy.
Right now the vast majority of people agree with the goal of fighting child abuse, but what about when China requires Apple to scan phones for photos or articles about Tiananmen Square, or the Uighur genocide, and report "offenders" to the government?

Or when Russia requires Apple to report users who appear to be in a same-sex relationship, as determined by scanning their messages?

Is that not mass surveillance under the cover of "we're required to comply with local law"?
 

You're talking about what-ifs, not current reality. We'll cross that bridge when we come to it. No point in speculating. But I will say that although Apple is obviously not perfect, I am PRETTY sure they will not start violating human rights just because a government asks them to. I don't think they're quite THAT devoid of all principles, and I think they would fight tooth and nail to resist those corrupt requests if they are made.
 

I don’t know about that. Apple does upload iCloud backups of Chinese customers to a company with ties to the CCP after all.

 

Which is exactly why I don’t link to NYT.

I was able to get it to come up on Google after a few tries.

This is the key part of the article.

In China, Apple has ceded legal ownership of its customers’ data to Guizhou-Cloud Big Data, or GCBD, a company owned by the government of Guizhou Province, whose capital is Guiyang. Apple recently required its Chinese customers to accept new iCloud terms and conditions that list GCBD as the service provider and Apple as “an additional party.” Apple told customers the change was to “improve iCloud services in China mainland and comply with Chinese regulations.”

The terms and conditions included a new provision that does not appear in other countries: “Apple and GCBD will have access to all data that you store on this service” and can share that data “between each other under applicable law.”
 
I'll be upgrading. The only photos in my Photo Library are pet photos, photos with friends, photos of cars and electronics, and memes.

I'm only concerned about this database if it's used for malicious purposes in other countries. Replace CSAM with something to identify protestors that a hostile government deems illegal, or whatever. Tim "Apple" Cook said that Apple complies with all laws in the countries they operate in...
 
You're talking about what-ifs, not current reality. We'll cross that bridge when we come to it. No point in speculating. But I will say that although Apple is obviously not perfect, I am PRETTY sure they will not start violating human rights just because a government asks them to. I don't think they're quite THAT devoid of all principles, and I think they would fight tooth and nail to resist those corrupt requests if they are made.
You’re right. Society will cross that bridge because “well, they’re doing it for a good thing.” Many what-ifs from the recent past are now reality: COVID passports, and fully functional contact tracing on our phones when it was originally “only an API.” Central Bank Digital Currencies (read: a cashless society) were a what-if and are now a reality in some parts of the world.

You see why I’m concerned? We’re so focused on each little step that we don’t realize how far down the staircase we’ve already fallen. Despite all the advancements in security and encryption, our data is less safe now than it was 10 years ago. It’s all stored in random data centers, and who knows who has access to them, or what backdoors are built into the servers themselves or the software they run. In the past, stealing your data took someone actually breaking into your house and taking an external hard drive or a computer. Now they send you an authentic-looking email and trick you into signing into a service through the link, and those who fall for it have their accounts breached.

Apple already reportedly abuses human rights with slave labor in their supply chains.
 
Which is exactly why I don’t link to NYT.

I was able to get it to come up on Google after a few tries.

This is the key part of the article.

OK, so iCloud users are agreeing to that. If they don’t like it, they can choose not to use iCloud. It’s not like Apple is being sneaky about it.
 

You’re missing the point entirely. It’s proof that Apple will bend to the will of governments, which means it isn’t much of a stretch to believe this new on-device scanning functionality can and will be adapted to look for other content governments disagree with down the line.

I mean, imagine this… You post a photo of yourself protesting to social media, the government takes that photo and sends its hash to Apple to be added to the list, Apple’s on-device scanning flags it for review, and now the government compels Apple to hand over your info. It’s really not hard to see how this technology could be abused with some arm-twisting by world governments.
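To make that worry concrete, here is a hypothetical sketch of why the list itself is the pressure point: the matching step has no notion of why a hash is on the list, so whoever controls the list controls what gets flagged. Everything below (names, data, the SHA-256 stand-in for a perceptual hash) is invented for illustration and is not Apple’s code.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch: the same matching machinery flags whatever the list contains.
func hexDigest(of data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// The list nominally holds CSAM hashes...
var flaggedHashes: Set<String> = []

// ...but nothing in the matching code stops a supplied update from adding the
// hash of, say, a widely shared protest photo.
let protestPhoto = Data("hypothetical protest photo bytes".utf8)
flaggedHashes.insert(hexDigest(of: protestPhoto))

// The exact same check that would flag CSAM now flags that photo too.
let photoOnYourDevice = protestPhoto
print(flaggedHashes.contains(hexDigest(of: photoOnYourDevice)))  // prints "true"
```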
 
It's true that iCloud in general has always been a buggy service though.
It's slowly getting more stable, but the experience doesn't feel Apple-like, more Microsoft-like.
Maybe I’ll start using it in 10 years time. 🤔😂
 
You’re missing the point entirely. It’s proof that Apple will bend to the will of governments, which means it isn’t much of a stretch to believe this new on-device scanning functionality can and will be adapted to look for other content governments disagree with down the line.

I mean, imagine this… You post a photo of yourself protesting to social media, the government takes that photo and sends its hash to Apple to be added to the list, Apple’s on-device scanning flags it for review, and now the government compels Apple to hand over your info. It’s really not hard to see how this technology could be abused with some arm-twisting by world governments.
Just because it could in theory happen doesn't mean that it will. Not saying it won't happen, but that's the whole issue with slippery slope arguments.
 
Since I buy a new iPhone every year, it's already baked-in. Will update my iPad for new features.
 
Meant to reply to this earlier. What we’re saying is that it’s not optional IF you are using iCloud. It’s all or nothing.

Turning off the Photos toggle in iCloud settings should be enough to stop it.

Apple has also confirmed that it cannot detect known CSAM images in iCloud Backups if ‌iCloud Photos‌ is disabled on a user's device.
 