
Do you still believe in Apple's privacy values?

  • Yes, CSAM doesn't really affect/bother me

    Votes: 33 37.1%
  • Yes, CSAM is bad but Apple still cares about privacy more than the competition

    Votes: 15 16.9%
  • No, "privacy" is only a marketing term now

    Votes: 41 46.1%

  • Total voters
    89

The Game 161

macrumors Nehalem
Dec 15, 2010
30,965
20,163
UK
Of course they do, and likely more than anybody, but this is an ongoing thing that I don’t see stopping anybody from adding it.
 

xxray

macrumors 68040
Jul 27, 2013
3,115
9,412
I think Apple is still more privacy-focused than Google or Microsoft. Apple is the one that innovated and led the way with privacy-focused features, some of which have now been brought to Android.

However, the CSAM scanning is a big stain. I don’t think Apple did it because their whole privacy value is fraudulent; I think the whole thing happened out of arrogance. Apple, being Apple, thought they could engineer a way to save privacy while getting rid of child porn. They were so proud of it that they announced it as a feature. Seems they got their heads too far stuck up their own asses and didn’t realize how difficult a problem this is to solve.

Now they’re stuck with an announcement that ruined their reputation. If they get rid of it, they still can’t get rid of child porn on their servers without invading privacy, trust has already been broken with privacy advocates, and they would have wasted months and money building something that was never used. If they keep it, they damage their reputation further. No easy option imo.

Personally, I want them to get rid of it. But I don’t believe that Apple is suddenly evil and trying to find ways to get rid of privacy while putting up a front. Not yet, at least.
 

1rottenapple

macrumors 601
Apr 21, 2004
4,753
2,774
The attitude of their own privacy head already showed what they think by implying "well, don't do illegal things," which is highly offensive considering that what is considered illegal differs between countries. In some countries, homosexuality is illegal. The implication of an on-device mass scanning system is chilling. Of all tech companies, Apple should have known better.

A publicly traded company can change hands/management in a blink. What assurance do Apple users have that the next management team will keep the pinky promise? Look at Google, and how they deleted their own "don't be evil" motto.
Exactly, and let’s be real: the largest market can dictate the terms if Apple wants access, like they already do by hosting data centers in those countries. So what’s the difference? What’s OK in one country will not be OK in another. The high-and-mighty attitude wrapped in a do-gooder smokescreen offends me. If you want money over principle, say it! I respect that.
 

lkalliance

macrumors 65816
Jul 17, 2015
1,415
4,533
That Apple is willing to cross the line by baking such a system into iOS itself, and then tell people to just not do illegal stuff, is mind-boggling.
I agree that the proof that Apple will hold the line is in China, and it will be very telling what happens there, but this bit above I disagree with. Apple so far has defended your ability to "do illegal stuff" on their phones by fighting the FBI; they are not willing to sacrifice your privacy, and if that means that some bad stuff happens that they won't uncover then so be it.

Now this comes along and, at the moment at least (and hopefully for longer), Apple is putting into place this system to hamper a user's ability to do this one specific thing: traffic in child pornography. I don't think it's off-putting at all for Apple to say "just don't do it." They're not saying "just don't do anything illegal;" they are specifically saying "just don't use us to traffic in child pornography, because now there is a framework that might identify that you're doing that." I don't find that disagreeable at all. They've been specific (child pornography specifically, not illegal activity generally), and if there is someone willing to stand up and fight for their right to do this specific thing, to traffic in child pornography without fear of being caught...well, then this time I agree that this is something where your privacy should come second.
 

ehanneken

macrumors newbie
Sep 9, 2021
2
3
[T]his time I agree that this is something where your privacy should come second.
Next time it may be too late to do anything about it. That Apple is checking for CSAM (and nudity sent to minors) is purely a matter of policy. The mechanism that they intend to install on their customers' devices doesn't know what CSAM or porn is. All that's necessary to make it check for something else (dissent, homosexual images, etc.), or to check when the device isn't about to send something to iCloud, or to notify someone other than parents or Apple, is a configuration change and a different database.
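That point about mechanism versus policy can be made concrete in a few lines. The sketch below is purely illustrative (the hash strings, database names, and `flag_matches` function are invented, not Apple's code); it shows only that a hash matcher is content-agnostic, so the database alone defines the policy:

```python
# Illustrative only: the matcher has no idea what its database represents.
CSAM_DB = {"a3f1", "9bc0"}     # hypothetical "known CSAM" hash list
DISSENT_DB = {"77e2", "a3f1"}  # the same mechanism with a swapped database

def flag_matches(photo_hashes, database):
    """Return the hashes that match the supplied database.

    The code is identical no matter what the database contains;
    only whoever supplies the database knows what is being sought.
    """
    return [h for h in photo_hashes if h in database]

uploads = ["a3f1", "0000"]
assert flag_matches(uploads, CSAM_DB) == ["a3f1"]     # today's stated policy
assert flag_matches(uploads, DISSENT_DB) == ["a3f1"]  # same code, new policy
```

Swapping `CSAM_DB` for `DISSENT_DB` changes what gets reported without touching a single line of the matching code, which is exactly the configuration-change concern raised above.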
 
  • Like
Reactions: mainemini

QCassidy352

macrumors G5
Mar 20, 2003
12,065
6,106
Bay Area
It's the equivalent of the cops showing up once a week at your house to check if you have something to hide. You're guilty until proven innocent.
You get that this scanning happens on device, right? It’s a process that’s invisible to the user, it cannot identify any data other than known hash values, and its results are not shared with anyone unless there are multiple hash matches to known CSAM. It’s not in any way whatsoever the equivalent of cops showing up at your house.
Why not submit your own personal data voluntarily to Apple? Put all your data into a shared folder on iCloud and share it with Tim Cook. You got nothing to hide, no?
Surely you see the false equivalence between on-device scanning for known hash values within a single app (photos) and “sharing all my data with Tim Cook on iCloud”?
 
  • Like
Reactions: lkalliance

zakarhino

Contributor
Sep 13, 2014
2,607
6,958
The current Apple already allowed things...

Exactly, which is why the "now" in the "No" option of the poll is a mistake. Apple have not suddenly switched from being privacy focused to being privacy-marketing focused; that's what they have always been! For years they have refused to implement the laundry list of things they should do to maximize user privacy. Apple's privacy model has always been "privacy means trusting us with your unencrypted data," when in reality it should be "privacy means we're literally incapable of reading practically any of your data, thanks to zero-access encryption."
 

zakarhino

Contributor
Sep 13, 2014
2,607
6,958
For example, if Apple were coerced into reporting every user with the Chinese Tank Man photo, they couldn't do it with the CSAM system without re-engineering the entire thing and rebuilding every user's safety vouchers

That's literally not how it works lol. If they were "coerced into reporting every user" (and by the way, they were coerced in private by senators into implementing this system in the first place; so much for the "Apple will resist government demands" theory), they could simply use a different hash database for a given region. Swapping out a database remotely doesn't constitute "re-engineering." It's not outside the realm of possibility that China (for example) would demand Apple use a government-provided database instead of Apple's. What do you think Apple would say then? No?
 

zakarhino

Contributor
Sep 13, 2014
2,607
6,958
A publicly traded company can change hands/management in a blink. What assurance that Apple users have that the next management team will keep the pinky promise? Look at Google, and how they deleted their own "do no evil" motto.

This is what geniuses in all these threads keep missing when they talk about how they trust Apple. Today's "trust" of Apple is completely irrelevant because they have all of your unencrypted data; their entire privacy model only has the "trust" part of "trust but verify." The problem is that they are capable of abusing your data at any time; the fact that they're not doing so right now is meaningless so long as they have the capability. This is the same reason why we shouldn't pass laws that grant administrations sweeping powers to fix things however they want just because we trust that specific administration today; tomorrow that administration could be completely different.

Similarly, Apple have a s*** ton of my data unencrypted right now. They also have me held hostage because of ecosystem buy-in. The fact that I have even a modicum of trust in today's Apple is irrelevant if tomorrow they could switch lanes and use my unencrypted data for all sorts of things, such as increased targeted advertising. The sad part is that even if Apple started to use people's data for invasive targeted ads tomorrow, people on this forum would be defending it because "Apple is a privacy company! I totally trust them!", all because of some inevitable marketing document and white paper about how 'super private' the ad system is... when in reality they would be using your data the same way Google does.

This is why customers need to have power and autonomy over their smart devices. At least for the longest time on a mac I could install whatever software I want and use the device without it phoning home to Apple every 5 minutes with analytics data or sending hashes of files and apps I'm opening because "muh malware and evil software." Nowadays even that isn't possible, and has never been possible on iOS.
 

Ries

macrumors 68020
Apr 21, 2007
2,330
2,918
You get that this scanning happens on device, right? It’s a process that’s invisible to the user, it cannot identify any data other than known hash values, and its results are not shared with anyone unless there are multiple hash matches to known CSAM. It’s not in any way whatsoever the equivalent of cops showing up at your house.

Surely you see the false equivalence between on-device scanning for known hash values within a single app (photos) and “sharing all my data with Tim Cook on iCloud”?

You do get that the device is the most private thing you've got, and they are scanning it? You might as well install a camera in your bedroom, but don't worry, it'll only send videos if it thinks something illegal happened....

Yes, it's the same as the cops showing up; the device is looking for wrongdoing in one of your most private spaces. It assumes any picture you take might be CSAM; why else would it scan it? Currently it doesn't care about you speeding or downloading a torrent, but are you sure it'll stop at CSAM pictures, or is this just another step towards Orwellian surveillance?
 
  • Like
Reactions: airbusking

mw360

macrumors 68020
Aug 15, 2010
2,067
2,476
That's literally not how it works lol. If they were "coerced into reporting every user" (which by the way they were coerced in private by senators into implementing this system in the first place, so much for the "Apple will resist government demands" theory) they simply use a different hash database for a given region. Swapping out a database remotely doesn't constitute a "re-engineering." It's not outside the realm of possibility that China (for example) would request Apple use their government provided database instead of Apple's. What do you think Apple would say then? No?
I don't think you have read much about how it works. According to Apple, the hash of the database itself would be published on Apple's website so owners and researchers can verify that the same database is used on every phone throughout the world. I didn't say swapping a database needed re-engineering; I said changing the threshold from 30 to 1 would, because the nature of the threshold is baked into the encryption of the safety vouchers themselves when they are created on the phone.
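The "baked in" claim is at least technically plausible: in threshold cryptography, a secret can be split so that fewer than t shares reveal nothing, and t is fixed at the moment the shares are generated on the device. Below is a minimal sketch using Shamir secret sharing over a prime field; it illustrates the principle only and is not Apple's actual construction (which additionally involves private set intersection and blinded hashes):

```python
import random

P = 2**61 - 1   # prime modulus for the finite-field arithmetic
random.seed(0)  # deterministic demo

def make_shares(secret, threshold, n):
    # Random polynomial of degree threshold-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(P) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term (the secret),
    # but only when at least `threshold` genuine shares are supplied.
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

# One share could be attached to each matching photo's voucher.
shares = make_shares(secret=42, threshold=30, n=100)
assert reconstruct(shares[:30]) == 42  # 30 matches: the key is recovered exactly
# With 29 or fewer shares the interpolation yields an unrelated field element;
# lowering the threshold after the fact would mean regenerating every share.
```

Under a scheme like this, changing the threshold from 30 to 1 means regenerating the polynomial and reissuing every share, which is why a server-side configuration flip alone would not suffice.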
 

mw360

macrumors 68020
Aug 15, 2010
2,067
2,476
You do get that the device is the most private thing you've got, and they are scanning it? You might as well install a camera in your bedroom, but don't worry, it'll only send videos if it thinks something illegal happened....

Yes, it's the same as the cops showing up; the device is looking for wrongdoing in one of your most private spaces. It assumes any picture you take might be CSAM; why else would it scan it? Currently it doesn't care about you speeding or downloading a torrent, but are you sure it'll stop at CSAM pictures, or is this just another step towards Orwellian surveillance?
They aren't scanning the device. They are scanning the images you are sending to iCloud. There's a huge difference and you know it.
 

QCassidy352

macrumors G5
Mar 20, 2003
12,065
6,106
Bay Area
You do get that the device is the most private thing you've got, and they are scanning it? You might as well install a camera in your bedroom, but don't worry, it'll only send videos if it thinks something illegal happened....

Yes, it's the same as the cops showing up; the device is looking for wrongdoing in one of your most private spaces. It assumes any picture you take might be CSAM; why else would it scan it? Currently it doesn't care about you speeding or downloading a torrent, but are you sure it'll stop at CSAM pictures, or is this just another step towards Orwellian surveillance?
It’s not “assuming” your pictures are CSAM. It’s checking the hash values of pictures you upload to iCloud against a list of known hash values for CSAM. The check is done on your device, and if you don’t have known CSAM, literally nothing gets sent to Apple. I’m really, really struggling to see the issue here.

And if the argument is “well, they could abuse this in xyz way in the future,” I’d say: yeah, they could, and they also *could* look at the contents of your iCloud backups, or read your email, or track your location with GPS. But they don’t, and I’m not going to live my life freaking out about what wrongful action a company *could* choose to take. The fact that they’ve implemented this on-device and with lots of transparency is a good indication to me that they’re still taking privacy seriously.
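For what it's worth, the flow being debated here can be reduced to a toy model. Every name below is hypothetical, and the real system (per Apple's description) uses blinded hashes and encrypted safety vouchers rather than a plain set lookup; the sketch only shows the shape of the claim that the comparison runs locally and a non-match produces nothing readable for the server:

```python
KNOWN_HASHES = {"deadbeef", "cafef00d"}  # hypothetical hash list shipped to the device

def on_device_check(photo_hash):
    """Toy on-device check: emit a match record only for a database hit."""
    if photo_hash in KNOWN_HASHES:
        return {"match": photo_hash}  # would count toward the review threshold
    return None  # nothing about this photo is revealed

assert on_device_check("deadbeef") == {"match": "deadbeef"}
assert on_device_check("12345678") is None  # ordinary photos yield nothing
```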
 
  • Love
Reactions: lkalliance

Jimmy James

macrumors 603
Oct 26, 2008
5,489
4,067
Magicland
You should have added an "I couldn't care less" option. I honestly couldn't care less about any of this. I don't do anything in my life or with my phone that would have me worrying much. There are far too many other areas of our day-to-day life that are tracked and monitored in ways that go way beyond our phones. I don't think people truly realize just how little privacy most of us actually have.
 

  • Haha
Reactions: MikeAK

MikeAK

macrumors regular
Oct 26, 2011
218
241
Jimmy James, that has to be the biggest load of horse **** I've read all year. Thanks for the laugh.
 

Abazigal

Contributor
Jul 18, 2011
20,382
23,857
Singapore
Very interested to see if people still believe that Apple is a privacy-oriented company. Yes, in this day and age, it is impossible to have 100% privacy. But when you think about the competition, the other companies, do you still believe that Apple cares about privacy?
I believe Apple does care about privacy, but I also feel the position needs more nuance.

To put it another way, Apple sees a huge market for more privacy-oriented alternatives to many products and services currently being offered in the market today.

For starters, Apple defines privacy as informing users and letting them know what is going on with their data. This is precisely what has been happening with the whole CSAM saga. Apple has been extremely upfront about their intention to implement a new means of identifying child pornography on Apple devices in as non-invasive a manner as possible. There was (understandably) a lot of backlash, and Apple is taking that feedback to heart.

Another example is how Apple is running ads in the App Store using only first-party data they collect themselves. The difference between this and what Google / Facebook are doing is that Apple doesn't send this data to third parties, nor do they buy data from third parties. Third parties are not able to discern more granular information about individuals in any manner. And I welcome this, because it's increasingly clear that companies like Facebook are never going to change their data collection and privacy policies, and if legislation can't stop them, then Apple is probably the next best-positioned to offer ads themselves in a way that does not go against its privacy culture (i.e. no personalized profiles being created with the intent of delivering ads or changing behavior).

Such a product promises to be a win for both advertisers (who can get in front of users) and users (who receive relevant app suggestions).

We also see this in numerous other initiatives and products (eg: Apple Card keeping your credit card information private, Siri and maps being allowed to access less data than the competition). Apple isn't saying that they will never use any of your information, but they are (rightfully) pointing out that our data is being vacuumed up by the industry in an extremely aggressive and unrepentant manner, and there probably is room for some sort of middle ground where services can still be provided for users without having to scoop up my data in the quantities that they are now.

DDG will probably never be as good as Google search, but it's good enough to surface relevant search results for me. Apple Maps isn't as detailed as Google Maps (and downright useless in certain countries, like the more rural parts of Malaysia), but it's good enough to get me around reliably in Singapore. And I think this will be a potential thorn in the side of a company like Google: when competing services become "good enough" that users no longer mind using them over a superior alternative like Google's, that makes them more open to other intangible perks like a better privacy focus.

So yes, I still believe Apple is a more privacy-focused company compared to the rest of the industry, but it's also a matter of being clear about what is being done with my data, and Apple has been a lot more upfront about this relative to the competition.
 

humpbacktwale

macrumors regular
Dec 20, 2019
204
33
Yes, because from a PR point of view, letting all of this happen on the user's device, as opposed to doing it in the cloud, does sound more privacy-focused. People just aren't comfortable with it, either because they misunderstand it or because they apply slippery-slope reasoning.
 

Feyl

Cancelled
Aug 24, 2013
964
1,951
Apple maybe cared about privacy before they decided to make it part of their marketing and build their image on it. After that it went downhill. They started appeasing totalitarian countries, and to this day they've made numerous blunders that contradict their supposed privacy efforts, including the on-device scanning of your stuff. So no, Apple doesn't care about privacy.
 

lkalliance

macrumors 65816
Jul 17, 2015
1,415
4,533
Next time it may be too late to do anything about it. That Apple is checking for CSAM (and nudity sent to minors) is purely a matter of policy. The mechanism that they intend to install on their customers' devices doesn't know what CSAM or porn is. All that's necessary to make it check for something else (dissent, homosexual images, etc.), or to check when the device isn't about to send something to iCloud, or to notify someone other than parents or Apple, is a configuration change and a different database.
I think that trust--explicit or implicit, purposeful or subconscious--is part of our very decision to be online at all. I don't doubt there are a bazillion ways Apple could be searching our iCloud-uploaded photos for anything they like, without telling us, without having to release a new feature on our phones. But I trust that they're not doing that. Perhaps they could have done the same with this feature, just doing it on the server instead of building something on the phone, except for the publicity opportunity or perhaps they've learned not to try to do it clandestinely. Or perhaps they actually do care about privacy.

Though I was dismayed when I heard about all of this, and I remain open to the arguments that you and others are making, I'm satisfied that Apple is doing its best on this, just like it did its best (along with Google, credit where it's due) on the contact tracing framework. I don't think this is a new thing, as if Apple didn't have the ability to do privacy-awful things before this. The real test will be how it stands up against governments around the world, and I'll be watching that nervously, but that worry exists with or without this implementation.

(On a completely different and off-topic note, I appreciate your grammatical choice in the slight alteration to the text you quoted. It's refreshing to see someone in any online communication with a commitment to the fine details of presentation like that.)
 
  • Like
Reactions: MacCheetah3

MacCheetah3

macrumors 68020
Nov 14, 2003
2,270
1,208
Central MN
Yes, but their woke agenda trumps it. Now CSAM isn't exactly woke, but it shows they're willing to play cops on your device if they think they have the moral high ground. CSAM or not, I'm going to upgrade / buy a new iPhone, but I'll never look at Apple the same way again, and Android is suddenly an alternative I never would have considered before. It's the equivalent of the cops showing up once a week at your house to check if you have something to hide. You're guilty until proven innocent.
Not yet anyway.

Taking the system at face value, only scanning photos headed to the cloud, a better analogy would be monitoring your (snail) mail or freight-type shipments for illegal material — which does already happen.

I think that trust--explicit or implicit, purposeful or subconscious--is part of our very decision to be online at all. I don't doubt there are a bazillion ways Apple could be searching our iCloud-uploaded photos for anything they like, without telling us, without having to release a new feature on our phones. But I trust that they're not doing that. Perhaps they could have done the same with this feature, just doing it on the server instead of building something on the phone, except for the publicity opportunity or perhaps they've learned not to try to do it clandestinely. Or perhaps they actually do care about privacy.

Though I was dismayed when I heard about all of this, and I remain open to the arguments that you and others are making, I'm satisfied that Apple is doing its best on this, just like it did its best (along with Google, credit where it's due) on the contact tracing framework. I don't think this is a new thing, as if Apple didn't have the ability to do privacy-awful things before this. The real test will be how it stands up against governments around the world, and I'll be watching that nervously, but that worry exists with or without this implementation.
I think this is also a key pivot point in the argument. Whether accepting of the current, immediately upcoming implementation or not, most of us agree that Apple needs to be extra careful to keep the concept in check.
 

Jayson A

macrumors 68030
Sep 16, 2014
2,671
1,935
Imo the question should've been "Do you mind having a black-box on-device mass scanning system on your phone that can communicate with the cloud?"
The issue is not about the CSAM itself; it's about the system baked into iOS.
This is the first time your Apple devices have ever communicated with Apple’s servers.
 

Jimmy James

macrumors 603
Oct 26, 2008
5,489
4,067
Magicland
Jimmy James, That has to be the biggest load of horse **** I've read all year. Thanks for the laugh.
And still, the “nothing to hide, nothing to fear” approach was championed by Joseph Goebbels, “Nazi Minister of Propaganda.” You’re aligned.
 