
Poll: Will you leave the Apple ecosystem because of CSAM? (1,040 voters)
They are not. They are required to report it, but are not required to find it, even in the cloud. All the cloud providers in question seem to be choosing to look for it as an altruistic gesture. Again, unless you have something that says otherwise, they need do nothing.
So why is Apple the only one with very, very, very low CSAM numbers? I am sure they are getting pushback for the miserably low number of reports.
 
They are not.

Correct. Otherwise, one would have to conclude Apple was breaking the law all this time (and still is, since on-device scanning is not active yet).

Plus, they do not encrypt our data in iCloud in any meaningful way now, so that is a moot point. And they have given that data up many times when the government requested it with a warrant.

The encryption is to protect your information from hackers, not from Apple or law enforcement (obviously Apple holds the encryption key, and the TOS states they will comply with law enforcement). So the encryption is most definitely meaningful, unless you think it's impossible for iCloud servers to be hacked.
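
To make that concrete: here is a toy sketch of provider-held-key encryption, which is the model being described. It is not Apple's implementation, just the general pattern of "encrypted at rest, but the provider keeps the key"; all names and data below are made up.

import Foundation
import CryptoKit

// Toy sketch only: encryption at rest where the *provider* generates and keeps the key.
// Someone who steals just the stored ciphertext can't read it, but the key holder
// (the provider) can decrypt whenever it chooses to, or is compelled to by a warrant.

let providerKey = SymmetricKey(size: .bits256)          // held server-side, not by the user

let photo = Data("pretend these are photo bytes".utf8)
let storedBlob = try AES.GCM.seal(photo, using: providerKey)     // what sits on the server

// ...later, the provider (or anyone it hands the key to) can recover the plaintext:
let recovered = try AES.GCM.open(storedBlob, using: providerKey)
print(recovered == photo)   // true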

So there isn't a technical reason to scan on the device, except that, I think, they thought it was a cool way to do it?

The alternative is for Apple to decrypt and read billions of users' photos on their servers. On-device scanning is FAR more private than that, obviously, since that way Apple never gets access to the photos themselves.
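
Very roughly, the on-device approach amounts to "compare each photo against a list of hashes of already-known images before it ever leaves the phone." Below is a bare-bones sketch of that idea only. It is NOT Apple's algorithm (the real design uses a perceptual NeuralHash, a blinded database, and private set intersection rather than a plain lookup), and every name and hash value here is invented.

import Foundation
import CryptoKit

// Bare-bones illustration of "match against a list of known hashes, on the device".
// SHA-256 is used only because it is simple; a cryptographic hash would not match
// re-encoded copies of an image the way a perceptual hash does.

let knownHashes: Set<String> = [
    "hypothetical-hash-aaaa",      // placeholder entries, not real values
    "hypothetical-hash-bbbb",
]

func hexHash(of photo: Data) -> String {
    SHA256.hash(data: photo).map { String(format: "%02x", $0) }.joined()
}

func matchesKnownList(_ photo: Data) -> Bool {
    // Only membership in the known list is checked; the photo's content is never
    // interpreted, and a brand-new photo cannot appear on a list of previously known images.
    knownHashes.contains(hexHash(of: photo))
}

print(matchesKnownList(Data("a freshly taken photo".utf8)))   // false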

P.S. @dk001: 🤣 right back at you
 
Correct. Otherwise, one would have to conclude Apple was breaking the law all this time (and still is, since on-device scanning is not active yet).
Yet I am sure Apple is seen as a "safe haven" for this kind of thing with their pitiful reported numbers. So they must be trying to resolve it the best they can.
 
A project came up where heavy photo editing and HTML5 slideshows were needed for a firm I had worked for on and off since 2007.
The PC laptop setup (Dell, UPS, and Affinity) ruled, but it was not as fluid and sharp as Photoshop.
So the MacBook Air 2010 was plugged in and money was earned.
What’s wrong or different about Photoshop in Windows?
 
Smartphones and the Internet used to be fun. Nowadays it's just surveillance and manipulating users (news you may like, not news you should have heard, which is usually the news you don't like to read!), shaming people (your BMI is high, you ate the wrong things and drank too little, you didn't exercise enough... says who?), and scaring people (your heartbeat is suspicious, your sleep patterns are concerning... how come none of the experts who say this look healthy or live longer than anybody else?).

I must say, after twenty years of using the internet, I'm about to leave it behind. I will buy books and CDs again, go outside, or actually do something when I'm bored rather than distract myself with passive consumption.

I'll still use the internet for movies and music, but everything else has become chains.
 
I know this is a huge thread, and I haven't read every post -- but, I just need to post my two cents. The CHILD PORN scanner is absolutely no big deal. It's NOT going to flag the photo of your kid in their first bath. It's NOT going to flag the photo your kid took of themselves nekkid. It's NOT going to flag literally any "normal" photo parents take of their kids -- and it won't even flag photos 17 year olds take of themselves to share.

It *will* flag photos that are KNOWN TO BE CHILD PORN and are traded on the dark underbelly of the internet. If you have CHILD PORN ON YOUR PHONE you need help; and hopefully the Apple scanner will find you and get you the help you need.

If this tech had existed years ago, Josh Duggar wouldn't have molested his sisters for anywhere near as long as he did.

Bottom line -- if you're WORRIED about Apple finding CHILD PORN on your phone, maybe you shouldn't have ILLEGAL CHILD PORN on your phone. In other news, water is wet, and illegal **** is illegal.
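
Worth adding: per Apple's own technical summary, nothing is even reviewable until an account crosses a match threshold (a figure on the order of 30 known images was cited), so a single stray match does nothing by itself. A tiny sketch of that counting idea, with made-up names and the actual cryptography (threshold secret sharing of the vouchers) left out:

// Illustrative counter only; in the real design the match vouchers stay undecryptable
// until the threshold is crossed, which this sketch does not attempt to show.
struct AccountMatchState {
    private(set) var matchCount = 0
    let reviewThreshold = 30                 // ballpark figure Apple cited

    mutating func recordMatch() -> Bool {
        matchCount += 1
        return matchCount >= reviewThreshold // below this, nothing is surfaced at all
    }
}

var state = AccountMatchState()
print(state.recordMatch())   // false: one match alone triggers nothing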
 
I believe that, as a hosting provider for files, Apple is legally required to scan for this sort of thing, which is why Dropbox, OneDrive, Google Drive and others do the same. So Apple needs to do something; whether it's on-device or somewhere else is up to them. But they need to do SOMETHING. Apple chose this option because, honestly, it really is the most privacy-focused approach. The alternative would be to not encrypt data, or to have a backdoor for the encrypted files on their servers in order for the scans to run.
Apple is not legally required to scan for CSAM in the USA. Under current law, Apple cannot be legally required to scan for CSAM in the USA.

An understanding of why this is and what it means is essential to understanding the implications of these announcements. It's been discussed elsewhere in this thread so you can start there if of interest.

edit: @eltoslightfoot covered it
 
Who’s 180’d and preordered everything in sight?
One thing I've wondered about...will sales really be impacted that much by the people opposed to this technology? I suppose it was always a case of "we'll see what happens." But I've had the feeling that the number of people who'd say I'm done with Apple--and make it stick--wouldn't be enough to make a meaningful difference to Apple long term. A good percentage of users will never switch. Maybe they don't care about anything, except having the Apple logo. Maybe they have concerns, but see Apple as the lesser evil. Etc.
 
I know this is a huge thread, and I haven't read every post -- but, I just need to post my two cents. The CHILD PORN scanner is absolutely no big deal. It's NOT going to flag the photo of your kid in their first bath. It's NOT going to flag the photo your kid took of themselves nekkid. It's NOT going to flag literally any "normal" photo parents take of their kids -- and it won't even flag photos 17 year olds take of themselves to share.

It *will* flag photos that are KNOWN TO BE CHILD PORN and are traded on the dark underbelly of the internet. If you have CHILD PORN ON YOUR PHONE you need help; and hopefully the Apple scanner will find you and get you the help you need.

If this tech had existed years ago, Josh Duggar wouldn't have molested his sisters for anywhere near as long as he did.

Bottom line -- if you're WORRIED about Apple finding CHILD PORN on your phone, maybe you shouldn't have ILLEGAL CHILD PORN on your phone. In other news, water is wet, and illegal **** is illegal.

One last time, for those of us against this: it isn't the subject, it is the tool.
Additionally, this tool will do nothing about existing CSAM already in iCloud, nor about alternate methods of loading photos into iCloud.
 
One thing I've wondered about...will sales really be impacted that much by the people opposed to this technology? I suppose it was always a case of "we'll see what happens." But I've had the feeling that the number of people who'd say I'm done with Apple--and make it stick--wouldn't be enough to make a meaningful difference to Apple long term. A good percentage of users will never switch. Maybe they don't care about anything, except having the Apple logo. Maybe they have concerns, but see Apple as the lesser evil. Etc.

There is a reason Apple postponed this.
Perhaps not that significant an impact on this year's devices, but rather a hit to longer-term sales, and a hit to their image / bottom line via investigations, lawsuits, and government attempts to suborn the tool.
 
Apple is not legally required to scan for CSAM in the USA. Under current law, Apple cannot be legally required to scan for CSAM in the USA.

An understanding of why this is and what it means is essential to understanding the implications of these announcements. It's been discussed elsewhere in this thread so you can start there if of interest.

edit: @eltoslightfoot covered it
So every hosting provider is just doing it out of the goodness of their heart? I doubt that. Apple has 200 reported cases while the other big ones have thousands or millions.

I don't really know why some people on this site are so against this. Apple figured out the best privacy-focused way to handle CSAM scanning. While the picture is on your device, it is not encrypted. So it makes sense for Apple to scan on-device.
 
I know this is a huge thread, and I haven't read every post -- but, I just need to post my two cents. The CHILD PORN scanner is absolutely no big deal. It's NOT going to flag the photo of your kid in their first bath. It's NOT going to flag the photo your kid took of themselves nekkid. It's NOT going to flag literally any "normal" photo parents take of their kids -- and it won't even flag photos 17 year olds take of themselves to share.

It *will* flag photos that are KNOWN TO BE CHILD PORN and are traded on the dark underbelly of the internet. If you have CHILD PORN ON YOUR PHONE you need help; and hopefully the Apple scanner will find you and get you the help you need.

If this tech had existed years ago, Josh Duggar wouldn't have molested his sisters for anywhere near as long as he did.

Bottom line -- if you're WORRIED about Apple finding CHILD PORN on your phone, maybe you shouldn't have ILLEGAL CHILD PORN on your phone. In other news, water is wet, and illegal **** is illegal.
Let's keep things in perspective here. Would having this in place have prevented someone from committing the act, instead of, say, just using a different phone or camera? Or maybe not taking pics at all and just abusing?

This does not prevent abuse problems. People will either disable iCloud Photos, use a different standalone camera, or just not take the pictures but still abuse children.
 
Let's keep things in perspective here. Would having this in place have prevented someone from committing the act, instead of, say, just using a different phone or camera? Or maybe not taking pics at all and just abusing?

This does not prevent abuse problems. People will either disable iCloud Photos, use a different standalone camera, or just not take the pictures but still abuse children.
OK, so by your logic we shouldn't do anything to prevent child porn because someone might be inconvenienced and need to use a different camera, or disable iCloud? I'm not seeing your point here...

I mean, unless you're trying to defend people who abuse children?!?
 
OK, so by your logic we shouldn't do anything to prevent child porn because someone might be inconvenienced and need to use a different camera, or disable iCloud? I'm not seeing your point here...

I mean, unless you're trying to defend people who abuse children?!?
No, you are right, we should give up all our rights to everything and anything. Let the FBI in everywhere. Screw the Constitution. It's for the children!
 
No, you are right, we should give up all our rights to everything and anything. Let the FBI in everywhere. Screw the Constitution. It's for the children!
Did you suddenly have the right to have child porn on your phone? I mean, is your phone some “child porn safe-zone”?
 
So every hosting provider is just doing it out of the goodness of their heart? I doubt that. Apple has 200 reported cases while the other big ones have thousands or millions.

I don't really know why some people on this site are so against this. Apple figured out the best privacy-focused way to handle CSAM scanning. While the picture is on your device, it is not encrypted. So it makes sense for Apple to scan on-device.

Instead of playing "but Apple ….", why not try doing some basic research of your own, or just read a couple of the threads here on the topic? It's an easy way to catch up. The relevant Federal laws have been quoted and linked more than a few times.

… you can lead a horse to water, but you can’t make it drink.
 
Let's keep things in perspective here. Would having this in place have prevented someone from committing the act, instead of, say, just using a different phone or camera? Or maybe not taking pics at all and just abusing?

This does not prevent abuse problems. People will either disable iCloud Photos, use a different standalone camera, or just not take the pictures but still abuse children.

Kindly explain how scanning for a few old photos would have prevented anything?
This shows your lack of knowledge on this topic.
Here is a screenshot of some of the threads in this forum that can help educate you on the topic; they contain many great links and show all sides of the debate. Some really good stuff is in there.
[screenshot: list of related forum threads]
 
Kindly explain how scanning for a few old photos would have prevented anything?
This shows your lack of knowledge on this topic.
Here is a screenshot of some of the threads in this forum that can help educate you on the topic; they contain many great links and show all sides of the debate. Some really good stuff is in there.
[screenshot: list of related forum threads]
Dude, you rock. Now will the kind poster take advantage? Outlook dubious.
 