
Will you leave the Apple ecosystem because of CSAM?


  • Total voters
    1,040
Status
Not open for further replies.
In light of the backlash over the recent scandal, will you consider moving out of the Apple ecosystem?

Edit: For those who voted no: a new option has been added and you might want to revise your vote.

New: I will continue to phase Apple out. This will not change my timetable unless misuse shows up.

A version of your New option.
 
One thing I haven't seen discussed (though, honestly, I haven't read all the posts) is the impact this might have on future sales. Not just due to the 22% (by the latest poll numbers in this thread) that are or will be leaving the Apple ecosystem, but the relatives, friends, and acquaintances of theirs they may influence.

E.g.: I've been after one of my best friends to switch to Apple for its better privacy for years. When last we discussed it, a few months back, he conceded "Yeah, I should probably do that." I told him about this a few days ago. He was appalled. Needless to say: He won't be making the switch now. I regularly participate in "What phone/tablet/computer should I buy" threads on other forums. I have been recommending Apple. That won't be happening anymore, either.
 
I think a minute number of people might leave, but for what device? Every other manufacturer is going to follow suit. If someone is that worried about it, they can turn off iCloud Photos.

The reality is that the majority of the population isn’t even going to be aware of this.

That is a very big concern.
What we don’t know, because Apple won’t say, is why they chose this route. It is not risk averse by any means.
 
My post wasn't about speeding. It was about being presumed guilty until proven innocent.

Speeding is very easy to accidentally do whereas collecting 30 CSAM files is not. I would definitely not mind if a cop was in my car looking to see if I had CSAM images, but the catch would be, he must not look at the content of any of my photos. He actually must hold each image up to a robot without looking until 30 matches are found, then he may arrest me and I would deserve it.


Yeah.. but how do you define 'CSAM images'? What is allowed and what isn't? You don't know exactly what they're looking for. All you know is that a private organisation is supplying a database of hashes. What kind of people run this organisation? Are they tied to a political party, for instance? You don't know that..
 
No, I'm all for CSAM detection, and I believe Apple's implementation is still better than Google's, Facebook's, Microsoft's, etc., in that it maintains the privacy of photos stored on their servers while supporting the fight for child safety using the CSAM database. We can't have it both ways, shielding abusers behind the guise of privacy while claiming to fight for child safety. I believe Apple should have done a better job of educating the public before the media spun this into confusion and apprehension.

Still would love to know why Apple is doing this. Scanning for CSAM is not a requirement in the US. Now the EU … I don’t think it is yet.
 
OSes other than Android and iOS represent less than 1% of the market. So let's be realistic for a second: nearly everyone who wants to quit iOS because of CSAM pretty much has to opt for an Android phone, because that's the market. Your other options are good (a flip phone, no phone, or an iPod nano), but once again, you are clearly in the minority if you do that.

If you want a smartphone, it really comes down to one thing: you have to pick your poison.

A company that makes privacy a #1 priority and dares to say it out loud when it crosses the line?
Or another company that's been profiling you nearly every single ***** day since the '90s, without ever asking you?

Why do folks keep insisting that the choice is iOS vs Android? It isn’t.
It is iOS vs Google vs Samsung vs OnePlus vs Sony vs Huawei vs ….
There are a lot of flavors of Android and they are quite different.
 
Still would love to know why Apple is doing this. Scanning for CSAM is not a requirement in the US. Now the EU … I don’t think it is yet.

My guess is that they want to track your on-device activity for advertising, and this is the way to get people to accept on-device tracking.
 
Why do folks keep insisting that the choice is iOS vs Android? It isn’t.
It is iOS vs Google vs Samsung vs OnePlus vs Sony vs Huawei vs ….
There are a lot of flavors of Android and they are quite different.

And why do folks say they didn't ask? They did ask. We just clicked 'agree' without reading it.
 
One thing I haven't seen discussed (though, honestly, I haven't read all the posts) is the impact this might have on future sales. Not just due to the 22% (by the latest poll numbers in this thread) that are or will be leaving the Apple ecosystem, but the relatives, friends, and acquaintances of theirs they may influence.

E.g.: I've been after one of my best friends to switch to Apple for its better privacy for years. When last we discussed it, a few months back, he conceded "Yeah, I should probably do that." I told him about this a few days ago. He was appalled. Needless to say: He won't be making the switch now. I regularly participate in "What phone/tablet/computer should I buy" threads on other forums. I have been recommending Apple. That won't be happening anymore, either.

I suspect, in the scheme of things, not enough people care about this to make that much difference. This is stuff that only geeks tend to care about, and the majority of Apple's customers are not geeks.
 
My post wasn't about speeding. It was about being presumed guilty until proven innocent.

Speeding is very easy to accidentally do whereas collecting 30 CSAM files is not. I would definitely not mind if a cop was in my car looking to see if I had CSAM images, but the catch would be, he must not look at the content of any of my photos. He actually must hold each image up to a robot without looking until 30 matches are found, then he may arrest me and I would deserve it.
So the point is that you trust them to use only a robot? Or that they won't make mistakes, or that it won't be used as a huge back door for other reasons? There's a lot more to unpack here than catching CSAM collectors, and collectors are all it would catch: a producer's new photos won't trip the scan, since they can't be in the hash database.
I still don't know why they're doing this. They already scan iCloud; now they're scanning your phone, in case you upload to iCloud? Without trying to sound like a conspiracy theorist, it seems like an excuse for other reasons.

Apple has notoriously refused to hand smartphone-unlocking abilities over to the government, because the government is comparatively incompetent; that's the sad truth. The NSA literally created a virus that some hacker got hold of and released; that's as idiotic as it gets. So IMO this is Apple's version of a back door they can access and hand over to governments if they have to.

Personally, I think we as a country have lost the plot when it comes to privacy. The FBI got clearance to spy on the American people after Waco; after 9/11, spying on people became normal, to the point where no one batted an eye when it came out that the NSA was spying on everyone, and the government gained the ability to hold any civilian for any length of time simply by calling them a possible terrorist, with no proof needed. In this slide into a surveillance state, at what point do we realize we've traded freedom for safety? Any cursory glance at history shows that the worst of the worst are the first targets of authoritarian governments: the first victims of the concentration camps weren't Jews, they were violent and perverted criminals, followed by communist leaders, followed by the elderly and infirm, then Jews, gays, regular citizens, POWs, etc.

It's a fine balance, liberty and protection, and I don't think it's reactionary to worry about this just because they're using a universally reviled criminal class as the excuse.
 
That's actually incorrect. It doesn't scan anything until it's time to upload the photo. During the upload process, it creates a hash and sends a safety voucher along with the photo. Apple can only see the safety vouchers, which don't tell them anything about the file. Once a threshold of 30 CSAM vouchers comes through, Apple can unlock those 30 photos only and view them.
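The threshold idea described above can be sketched as a toy model. To be clear, this is NOT Apple's actual cryptography (the real system uses private set intersection and threshold secret sharing, so the server never sees a plain "matched" flag); the function names, the stand-in hash set, and the 30-match threshold here are purely illustrative:

```python
THRESHOLD = 30  # matches required before any voucher can be opened

# Stand-in for the CSAM hash database (illustrative values only)
known_hashes = {"hashA", "hashB"}

def make_voucher(image_hash):
    """Client side: every upload gets a voucher. In the real design a
    single voucher reveals nothing; here the flag is visible only to
    keep the toy model short."""
    return {"hash": image_hash, "matched": image_hash in known_hashes}

def review_account(vouchers):
    """Server side: nothing is revealed until the number of matching
    vouchers crosses the threshold; then only the matching ones open."""
    matches = [v for v in vouchers if v["matched"]]
    if len(matches) < THRESHOLD:
        return None  # below threshold: all vouchers stay sealed
    return matches   # at/above threshold: only matching vouchers open
```

The point of the structure is the all-or-nothing step: 29 matches reveal exactly as much as zero matches.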

Where did you get "30" from? 🧐
 
My guess is that they want to track your on-device activity for advertising, and this is the way to get people to accept on-device tracking.
Eh? Apple already does this. Have you never actually read their ToS and privacy policies? Apple has its own in-house data-collection system that feeds advertisers for ad-supported apps. They do it in a way that preserves privacy, but they are doing it.

Did you know about iBeacon? You may be in for a shock, if not.

Apple's privacy was a lot better than Google's, but it wasn't 100%.
 
Apple is not fighting against child abuse. They are only trying to avoid getting caught with child porn images on their servers, which I believe they have a mandate to do.

Actually there is no mandate. In the US the laws here state they have to report it if they find it but the same laws state they are not required to search for it.
 
Yeah.. but how do you define 'CSAM images'? What is allowed and what isn't? You don't know exactly what they're looking for. All you know is that a private organisation is supplying a database of hashes. What kind of people run this organisation? Are they tied to a political party, for instance? You don't know that..
Whatever they're looking for, I guarantee I don't have it. Also, Apple said they're going to use two sources for their CSAM hashes and only use the hashes that appear in both sources.
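The "two sources" rule above amounts to a set intersection: a hash is only actionable if both databases contain it, so no single organisation can unilaterally add a target. A minimal sketch (the source contents are made-up placeholders):

```python
# Hypothetical hash databases from two independent child-safety
# organisations (values are placeholders, not real hashes).
source_a = {"h1", "h2", "h3"}
source_b = {"h2", "h3", "h4"}

# Only hashes present in BOTH sources are used for matching;
# a hash appearing in just one database ("h1", "h4") is ignored.
usable_hashes = source_a & source_b
```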
 
I ask you... would you rather for Apple not fight against Child Abuse when they have the technology to do so?
Gotta love the implication in this question. If you're opposed to Apple compromising your privacy and security, you must be in favor of child abuse :rolleyes: (I have a response to that implication, but it would probably get me banned.)
 
Gotta love the implication in this question. If you're opposed to Apple compromising your privacy and security, you must be in favor of child abuse :rolleyes:
Pretty good compromise in my opinion. If it catches the sickos with this stuff, it's okay with me. I'll take my 1-in-a-trillion chance of potentially having 30 innocent images looked at by Apple.
 
There's a difference. Other vendors have the scanning software running on their server and that software can only scan what YOU feed it by uploading your images.

Apple's CSAM software runs on your device. It basically can scan all the images on your phone, but Apple promises that it only scans your images before uploading them to the cloud.

But do we believe that promise? The software is already present on your device...
True but realistically, if one doesn't have questionable content, one should not need to worry. ;)
 
True but realistically, if one doesn't have questionable content, one should not need to worry. ;)
Yeah yeah, but WHAT IF CSAM isn't CSAM? What then? You don't know what they're putting on your phone! It could be anything! You will never know! They could be looking for pictures of door knobs and then BOOM you're arrested! Crazy man, just crazy
 
Pretty good compromise in my opinion.
I don't share your opinion, obviously.

If it catches the sickos with this stuff, it's okay with. me.
I subscribe to a different philosophy, expressed by "It is better that ten guilty persons escape than that one innocent suffer." -- English jurist William Blackstone, in his seminal Commentaries on the Laws of England (1760s), and "Those who would give up essential liberty, to purchase a little temporary safety, deserve neither liberty nor safety." -- Benjamin Franklin (1706-1790)

We move ever-deeper into a nanny state. I'm glad I was born when I was and probably won't live to see its culmination.
 
I don't share your opinion, obviously.


I subscribe to a different philosophy, expressed by "It is better that ten guilty persons escape than that one innocent suffer." -- English jurist William Blackstone, in his seminal Commentaries on the Laws of England (1760s), and "Those who would give up essential liberty, to purchase a little temporary safety, deserve neither liberty nor safety." -- Benjamin Franklin (1706-1790)

We move ever-deeper into a nanny state. I'm glad I was born when I was and probably won't live to see its culmination.
So because 1 in a trillion will suffer from Apple seeing 30 of their "innocent photos," we shouldn't care about the thousands of CSAM files being shared by sickos?

I respect you for moving away from a platform you don't like, but I don't think anyone is going to "suffer" with odds like that.
 