
Will you update to iOS 15 and let Apple’s operating system search your personal devices for CSAM?

  • Yes, I will update to iOS 15
    Votes: 117 (68.0%)
  • No, I will stay on iOS 14.7 or earlier
    Votes: 55 (32.0%)
  • Total voters: 172

cwosigns

macrumors 68020
Original poster
Jul 8, 2008
2,267
2,746
Columbus, OH
In light of the recent announcement that Apple will be scanning devices client-side for CSAM, will you still update to iOS 15 or will you stay on iOS 14?
 
Yes, on day 1.

I believe that in time, sentiment will come round to support and accept what Apple is doing. Right now, there is so little actual information going around, and more of general hysteria and knee-jerk reactions, so I don't expect the uproar we are seeing now to be representative of the overall mood months down the road.
 
Yes.

I already use iCloud Photo Library, so Apple has already been scanning my photos. I take the time to investigate privacy policies, and I’m comfortable trading some of my personal privacy for the convenience of Apple’s products and services.

Content I don’t want even Apple to access I just keep on an encrypted local drive or just physical paper copies (like my birth certificate or a few medical records).
 
Yes, day 1 for me too. Apple is already scanning your photos for facial recognition, animals, other objects, and now even text. Where was the uproar then? In my mind, this really isn’t anything entirely new. They are not looking at your photos like a human would, but generating a hash from them to check against a database. This is all done on your device, just like the other types of recognition mentioned above.

Maybe I’m way off base here, but I think of this as being similar to antivirus/malware detection. Those tools are also looking at your files to check against a database of known bad files. Apparently people don’t seem to mind these “looking” at their files, but now they do with photos? 🤷‍♂️
 
I’m on the beta so I’ll say that puts me in the “yes” automatically.

Do I wonder what this could be used to do in the future? Sure I do. At the same time though, it would be an absolute reputation killer for Apple if this goes wrong. They are where they are partly because of that reputation. They don’t want to blow it.

I am the father of a daughter, so I actually like the sound of the parental notification thing. She’s not at the age to have a phone yet though, so that’s not a feature I’d be utilizing anytime soon.
 
Well, I was going to. I was really looking forward to the new Safari features to help reduce tab clutter, but that’s a big, fat HELL NO now. What was the idea behind this? So now I can get in trouble when some random joker sends me a picture over SMS? What happens when an account is breached and a bunch of “unapproved” photos get uploaded to iCloud and then synced to your devices?
 
Well, I was going to. I was really looking forward to the new Safari features to help reduce tab clutter, but that’s a big, fat HELL NO now. What was the idea behind this? So now I can get in trouble when some random joker sends me a picture over SMS? What happens when an account is breached and a bunch of “unapproved” photos get uploaded to iCloud and then synced to your devices?
Yes for me, most certainly day 1. But the privacy implications have got me thinking, that’s for sure.

You can’t get in trouble if someone sends you a pic over SMS. Only if you then save it to your iCloud Photo Library does it get hashed, and that hash is compared against a database of known photos. That’s why the term “scanning photos” isn’t quite right.
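To make the hash-comparison idea concrete, here is a toy sketch. Apple’s actual system uses a learned perceptual hash (NeuralHash) matched through a cryptographic protocol on the server side, so everything below — the 64-bit hashes, the database entry, and the distance threshold — is invented for illustration, not Apple’s implementation:

```python
# Toy illustration of matching a photo's perceptual hash against a database of
# known hashes. All values are made up; this is NOT Apple's NeuralHash pipeline,
# just the general idea of "compare a fingerprint, don't look at the photo".

KNOWN_HASHES = {0xDEADBEEFCAFEF00D}  # hypothetical 64-bit database entries
THRESHOLD = 4  # max differing bits still counted as a match (invented number)

def hamming_distance(a: int, b: int) -> int:
    """Count the bits where two 64-bit hashes differ."""
    return bin(a ^ b).count("1")

def matches_database(photo_hash: int) -> bool:
    """True if the hash is within THRESHOLD bits of any known hash."""
    return any(hamming_distance(photo_hash, h) <= THRESHOLD for h in KNOWN_HASHES)
```

An exact copy matches at distance 0, and a slightly altered image (a few flipped bits in the hash) can still match — which is why perceptual hashes, rather than plain cryptographic digests, are used for image matching.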
 
wlll o
Yes for me, most certainly day 1. But the privacy implications have got me thinking, that’s for sure.

You can’t get in trouble if someone sends you a pic over SMS. Only if you then save it to your iCloud Photo Library does it get hashed, and that hash is compared against a database of known photos. That’s why the term “scanning photos” isn’t quite right.
right, but several messaging apps DO automatically save pics you receive to your camera roll. Unwanted spam messages are prevalent. What’s to prevent someone sending me 30 spammed CSAM pics while my phone is on DND? They get saved and uploaded to iCloud and next thing you know, I get traded for cigarettes in the prison yard.
 
Well, I was going to. I was really looking forward to the new Safari features to help reduce tab clutter, but that’s a big, Fat HELL NO now. What was the idea behind this? So now I can get in trouble when some random joker sends me a picture over SMS? What happens when an account is breached and a bunch of “unapproved” photos get uploaded to iCloud and then synced to your devices?

Well, it's a shame that you are never going to be able to update your iPhone again because you misunderstood what Apple is doing. The child safety features in Messages have NOTHING to do with the CSAM reporting with iCloud. All Messages will do is warn children before they view or send sexually explicit material, and their parents will be notified only IF they choose to view or send that material.

Also, don't you use a strong password and 2FA for your iCloud account? You should! In fact, 2FA is mandatory. So in the extremely remote chance that your iCloud account is breached by some expert hacker who also has one of your Apple devices in their possession (to get the 2FA code) AND your device passcode AND happens to know and dislike you and wants to ruin your life by uploading CSAM to your iCloud account, then I guess you have something to be worried about. If you think there's a good chance of that happening, then I'd submit that either you're running with the wrong crowd or you're an extremely paranoid person.
 
Last edited:
I am already running the betas. I haven’t decided if I will keep uploading photos to iCloud because I don’t like the privacy invasion and the potential for false positives but I’m not going to hold my OS back when I don’t have anything to worry about with my screenshots of video games and pictures of tech.

I also am hoping that iOS 15 GM finally fixes the problem with 5G on a number of T-Mobile MVNOs. Plus I used to be in the Android camp flashing my device all the time so I get excited about updates.
 
Well, it's a shame that you are never going to be able to update your iPhone again because you misunderstood what Apple is doing. The child safety features in Messages have NOTHING to do with the CSAM reporting with iCloud. All Messages will do is warn children before they view or send sexually explicit material, and their parents will be notified only IF they choose to view or send that material.

Also, don't you use a strong password and 2FA for your iCloud account? You should! In fact, 2FA is mandatory. So in the extremely remote chance that your iCloud account is breached by some expert hacker who also has one of your Apple devices in their possession (to get the 2FA code) AND your device passcode AND happens to know and dislike you and wants to ruin your life by uploading CSAM to your iCloud account, then I guess you have something to be worried about. If you think there's a good chance of that happening, then I'd submit that either you're running with the wrong crowd or you're an extremely paranoid person who needs to seek professional help!
Wow, OK. Well, all 3 pieces of this come with iOS 15. Otherwise, I was really looking forward to iOS 15, but not if the groundwork for mass surveillance on-device is going to be installed. I invested in Apple products because they didn’t cross that line, but I see that was a marketing play.

As for my account security, I have a strong password, 2FA, everything. That still doesn’t stop someone from knocking me out, holding my phone in front of my face to unlock it, and then doing some serious damage to my account. Chances of that happening are, granted, pretty slim, but let’s face it: any one of us has a far higher chance of being in a car accident than of coming down with COVID, yet guess which problem shut down the entire globe. I drive an electric car in a fossil-fuel-heavy area. Just my car saying EV on the back is enough to induce road rage.

And no, I don’t need professional help, I need Silicon Valley to stop acting like parents and let me use the tools I spent good money on the way I want to use them. Imagine having a SWAT team show up because a hammer reported to the authorities that it touched glass. Maybe you were breaking into someone’s house, or maybe you just set it down on a glass table.
 
Wow, OK. Well, all 3 pieces of this come with iOS 15. Otherwise, I was really looking forward to iOS 15, but not if the groundwork for mass surveillance on-device is going to be installed. I invested in Apple products because they didn’t cross that line, but I see that was a marketing play.

Mass surveillance? Seriously? NOTHING stored directly on your phone is viewable by Apple or anyone else except yourself. However, if you have multiple CSAM images on your phone AND try to send those to iCloud, then you voluntarily give up all rights to privacy.

As for my account security, I have a strong password, 2FA, everything. That still doesn’t stop someone from knocking me out, holding my phone in front of my face to unlock it, and then doing some serious damage to my account. Chances of that happening are, granted, pretty slim . . .

Ya think? It's like me refusing to go outside ever again because I might get struck by a meteorite. Oh, or maybe I'll be struck by lightning or gunned down in a drive-by shooting. I mean, what criminal walks down the street thinking, "Hey, let me knock this random guy out, unlock his iPhone and ruin his life by texting him some child porn, downloading it to his phone, and uploading it to his iCloud!" LOL!

In any case, I think the logic that "We can't do X because there is an extremely remote chance it may be abused or used against somebody" is flawed. ANY technology can be abused. But do we just eliminate all of it? Of course not!
 
Last edited:
I am advising my family not to upgrade immediately because Safari sucks. I hate the layout and such. I will, however.
 
This topic is posted every year, and every year the same people go “I will NEVER update,” but they’re the first to hammer the OTA servers come launch day or GM release.

It’s a myth that people stay back on older releases to prove a point no one cares about.
 
CSAM is opt-in only.
I'm already on iOS 15 and don't see why one wouldn't want to upgrade.
 
This topic is posted every year, and every year the same people go “I will NEVER update,” but they’re the first to hammer the OTA servers come launch day or GM release.

It’s a myth that people stay back on older releases to prove a point no one cares about.

LOL! The usual progression:

1. Apple releases a new device/update
2. "It's awful! WTF were they thinking?! This is the end of Apple as we know it!"
3. "Imma go around the forum telling everyone how trash this new device/update is!"
4. cool down period
5. buys the device or installs the update
6. "I never thought I'd say this, but it's starting to grow on me."
7. "This is a great device/update."

rinse and repeat

🤣
 
I don’t use iCloud, since it’s always buggy for me.
It's true that iCloud in general has always been a buggy service though.
It's slowly getting more stable, but the experience doesn't feel Apple-like, more Microsoft-like.
 
Source? What would be the point of making this optional? Why would anyone opt in, guilty or not?
MacRumors.
There's been a whole article this week JUST to say this.

Basically, it's like a Parental Control thing. Has always and will always be opt-in.
 
MacRumors.
There's been a whole article this week JUST to say this.

Basically, it's like a Parental Control thing. Has always and will always be opt-in.

No, you're referring to what Apple calls "Communication safety in Messages", not CSAM Detection. See here:

 