
keeper

macrumors 6502a
Apr 23, 2008
520
303
It only seems worse to you because you want to defend and justify what Apple is doing. If I go to the airport to get on a plane, I accept the security procedures. It is a public place; however, I do not accept someone coming into my home to look for contraband. This is fundamental to personal rights, and it is why we place such a premium on the sanctity of our homes and possessions. It is why we require our police to have warrants to come into our homes. I am not interested in giving Apple a pass that is not even afforded to the police and national security.
Sorry, where did I defend what Apple is doing?

My data has been moved to a personal NAS, I have nothing in iCloud.

If you are using iCloud, you will be agreeing to the terms of service that say this is going to happen.
It’s the same for all services: you have to agree to the terms of service. Yet people seem happy for private data to be unencrypted for scanning on server.
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,456
What? That makes no sense in any fashion. Apple states A/B/C. I have questions on the information regarding B/C and you say just accept it all as gospel? LMAO!!!! Not until I can get answers that satisfy me and settle the claims from all the professionals who say “this is a BAD thing!”.

You don't see a difference between asking questions with the assumption someone is being honest (and you just need clarification) and asking questions with the assumption someone is being dishonest (and thus your questions are more accusatory than actually information-seeking)? Alrighty then.

Where am I looking at the future except in response to an Apple answer?

Huh? I can't make any sense out of what you just said.

Still doesn’t answer my question. Apple even claimed this would be the initial setting and would change as the system proved itself.

Would you mind citing the source of this? Even if true, they'd still be taking precautions to avoid accounts being falsely flagged - how is that a bad thing? And even if true, that still doesn't mean they lack confidence in it, but are simply wanting to be sure everything is operating as intended. Would you rather they be complacent?

Yes, and it makes little sense from a legal perspective. This reads more like a CYA. That is why the question.

What's so legally nonsensical about reporting verified CSAM to NCMEC?

This is something @Jayson A pointed out when the topic first came up. Apple will use a second check to make sure something was not seeded into the original database. When I tried looking into who maintains these databases and their sources, this piece doesn’t fit well.

Ok? I'm pretty sure the manual reviewers won't be confused as to whether they're looking at CSAM or not, so I'm not sure what the concern here is.

Potential false positives have the ability to easily destroy someone’s life even if they are innocent. If a system is generating false positives, I know I would like to be notified, not just “surprise!”. Apple feels it can happen. They even put an appeals process in place.

This concern is 100% unfounded. Apple is only reporting confirmed CSAM to NCMEC, and even if they weren't, do you seriously think the police are going to arrest someone without looking at the pictures themselves? The ONLY scenario where your concern is possible is if someone uploads CSAM to someone's account in order to frame them. While I don't think that will be anything close to a common occurrence (if it even happens at all), it's no reason to not report crime any more than not reporting any other crime people can possibly be framed for. We live in an imperfect world, but that doesn't mean you throw the baby out with the bathwater.

BS. You can run the exact same process server side and catch ALL THE CSAM. On device you only catch the small bit in the event they leave the iCloud Photos feature on, and catch none of the items already in iCloud.

And while doing so, Apple would be decrypting and reading everyone's photos. You're ok with that? You seemed so concerned about privacy, but this suggestion puts a huge dent in your credibility on that. And again, as I said, Apple's goal here is obviously not to eradicate every trace of existing CSAM on iCloud but rather to combat the further spread of it on iCloud. And if iPhone users don't have the iCloud photo feature on, they can't upload any photos to iCloud anyway.

You may think so but quite a few think otherwise. If this had not tanked from a marketing announcement perspective it would have been a great argument to “stay in the Apple Environment to be private”.

For those of us who don't have an agenda and simply take Apple's statements at their face value, we see that this move is exactly in line with their commitment to privacy. That's why we find it an astounding irony that people are suggesting Apple use a less private method (server-side scanning) instead, and acting like that's a more private solution. Crazy! Anyway, if Apple was faking all this technology as a PR move, they stink at PR. I think they're making a gutsy move by striking a balanced compromise between continuing to NOT scan at all and scanning in a far more invasive way. Again, iCloud has NEVER been a truly private arena, and for those who want ultimate privacy they should not use iCloud or any other cloud service at all (or at least not for data they're concerned about, such as photos).

Sad you feel that way. You asked for 2-3 and I served up a lot more, none of which you have answered, instead playing the same silly game of “but Apple says” and “That’s silly/dumb/nobody asks that”. Based on your answers, why would I put up more?

Sad that you mischaracterize my answers like that. Anyone can go back and see that's simply not true. I did call the one question silly (to ask of Apple themselves), because it is, but I never said "nobody asks that" (YOU asked it, so why would I say that? LOL!) And you didn't "put up more" even BEFORE I answered them, so that's not the reason, obviously.

Feel what you want. You are entitled to your own opinion. Rather than really try to answer, you just point and claim “you’ll never …”. I had hoped you could at least come up with something I had missed and possibly answer at least one item or add to my knowledge base. You have done neither.

At this point, it is quite clear your problem is not lack of information. I am done discussing this topic with you. You'll only be content in your echo chamber. I am more than willing to call Apple out if I see evidence of wrong-doing, but the best you and others can come up with is "We have questions we didn't like Apple's answers to." Sorry, but that doesn't cut it.

Bye.
 
Last edited:

Mendota

macrumors 6502a
Jan 9, 2019
617
1,209
Omaha
Bad analogy. Your device literally does nothing until you try to upload pictures to the cloud. The on-device hashing system is completely blind and your device knows nothing about whether a match has been found or not, those vouchers are then uploaded to Apple with your photo upload and only THEN does Apple know if it's a match. So literally, this system REQUIRES iCloud to work. No iCloud, no upload, no scan, no vouchers. It's really that simple.
I don't care about the "device" knowing, I care about Apple putting spyware on my personal devices. It does not require iCloud to work. It is working in the background all the time. Apple has said that they will only check and report if iCloud is used, but using iCloud or not doesn't prevent the scanning being done on device. If this system were as benign as you and others are trying to claim, security experts would not have weighed in (and no, not just Snowden) and sounded the warning. This is spyware, no getting around it. It is just that Apple expected their users to be all like you, accepting it because it is Apple.

I can just imagine the outrage that would have been generated by Apple fans if this was Google or Microsoft. Heck, all Microsoft did with the rollout of Windows 10 was set telemetry on by default, and people had a field day over the so-called privacy violation even though there was no personal identification involved.
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,456
I posted it before.
Even recommend trying Websters.
It can be used a number of ways.


Again, the first definition is clearly what you're intending in this context and what ANYONE would assume in this context, and is also what is meant with the word SPYware.

[screenshot of a dictionary definition]


Stop playing games. No one's falling for it.
 

Jayson A

macrumors 68030
Sep 16, 2014
2,671
1,935
I don't care about the "device" knowing, I care about Apple putting spyware on my personal devices. It does not require iCloud to work. It is working in the background all the time. Apple has said that they will only check and report if iCloud is used, but using iCloud or not doesn't prevent the scanning being done on device. If this system were as benign as you and others are trying to claim, security experts would not have weighed in (and no, not just Snowden) and sounded the warning. This is spyware, no getting around it. It is just that Apple expected their users to be all like you, accepting it because it is Apple.

I can just imagine the outrage that would have been generated by Apple fans if this was Google or Microsoft. Heck, all Microsoft did with the rollout of Windows 10 was set telemetry on by default, and people had a field day over the so-called privacy violation even though there was no personal identification involved.
Please point me to where it says the scanning is done regardless of whether iCloud Photos is on or off. Please.
 

MadeTheSwitch

macrumors 65816
Apr 20, 2009
1,193
15,781
I think this topic can be closed; by now the OP has made up his mind. Whether he left or not, it's his decision.
12 pages for this kind of topic is already too much; it's like people have nothing else to do :)
Why should a topic be closed just because you’re done with it? Just because YOU are done with the topic or find it “too much” doesn’t mean others feel that way. Privacy is one of the most important issues of our time and what happens now, will lay the foundation for the future. Thus it is vital that we keep discussing it, keep holding people accountable, and make sure things are done right. Because this could all go really wrong and nefarious in the future.
As I said in, I think, the 2nd or 3rd thread on this topic, the social court has made its ruling based on emotion, influencer hype, and hypothetical what-ifs. Apple publicly stated they're going to implement client-side CSAM scanning rather than burying it deep in the terms of service like most other companies doing it, or hiding it and hoping we never found out like Big Brother likes to do; that was wonderful transparency. The moment end-to-end encryption became a thing, advocates for groups like NCMEC started screaming because it would mean less reporting, which is why companies implemented CSAM scanning in the first place. Would you rather deal with a company that tells you they're going to do it, or a company that hopes you never find out they're doing it?
That’s a false choice. How about neither? I’d rather not get into a situation of picking between the lesser of two evils.
Bad analogy. Your device literally does nothing until you try to upload pictures to the cloud.
Sure, you can say that NOW. But do you know for certain that it will always be that way? Do you know for certain that it will never change? All it takes is a software error, or a different CEO to come along and change that. Perhaps without your knowledge even. So it’s really odd to me that people would defend having their phones turned into spying devices.
 

Mendota

macrumors 6502a
Jan 9, 2019
617
1,209
Omaha
Please point me to where it says the scanning is done regardless of whether iCloud Photos is on or off. Please.
It is right there in their documentation. On-device scanning, running in the background. It supposedly waits until you upload to iCloud to report its findings. And as I have said before, this represents only the first step. Once the spyware is in place, they can do anything. The reason many are pushing back is because once that first step is taken, it is hard to reverse. It is like the point about government and taxes. The so-called income tax was only supposed to be used to pay for the war... That was the first step.
 

keeper

macrumors 6502a
Apr 23, 2008
520
303
It supposedly waits until you upload to iCloud to report its findings.
That is what’s stated, unless you have a big dose of paranoia and are unable to accept the facts provided to you.

“Supposedly” does not meet the definition of a fact.
 

Jayson A

macrumors 68030
Sep 16, 2014
2,671
1,935
It is right there in their documentation. On-device scanning, running in the background. It supposedly waits until you upload to iCloud to report its findings. And as I have said before, this represents only the first step. Once the spyware is in place, they can do anything. The reason many are pushing back is because once that first step is taken, it is hard to reverse. It is like the point about government and taxes. The so-called income tax was only supposed to be used to pay for the war... That was the first step.
"For iCloud accounts which use iCloud Photos, this feature implements a privacy-preserving, hybrid on-device/server pipeline to detect collections of CSAM images being uploaded to iCloud Photos. The first phase runs code on the device to perform a blinded perceptual hash comparison of each photo being uploaded to iCloud Photos against an on-device encrypted database of known CSAM perceptual hashes. However, the result of each blinded match is not known to the device; it can only be determined by the second phase running on iCloud Photos servers, and only if that user’s iCloud Photos account exceeds a threshold of positive matches."

Seems pretty clear to me. ¯\_(ツ)_/¯

"Does this mean Apple is going to scan all the photos stored on my iPhone?
No. By design, this feature only applies to photos that the user chooses to upload to iCloud Photos, and even then Apple only learns about accounts that are storing collections of known CSAM images, and only the images that match to known CSAM. The system does not work for users who have iCloud Photos disabled. This feature does not work on your private iPhone photo library on the device."

Anything else you need?
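For anyone who finds the two-phase description abstract, here is a loose Python sketch of the flow it describes. Every name here is a made-up stand-in: the real system uses NeuralHash and blinded private set intersection, none of which is modeled. The only point is the shape of the pipeline: the device attaches a match voucher it cannot itself read, and the server acts only once a threshold of positive vouchers is exceeded.

```python
import hashlib

# Toy stand-in for a perceptual hash. The real system uses a learned
# perceptual hash (NeuralHash), not a cryptographic digest.
def perceptual_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

KNOWN_HASHES = {perceptual_hash(b"known-bad-image")}  # toy database
MATCH_THRESHOLD = 30  # server acts only past this many matches

def make_voucher(image_bytes: bytes) -> dict:
    """Phase 1 (on device): produce a voucher for each upload.
    Here the match result is stored in the clear; in the real design
    it is blinded, so the device cannot read it."""
    return {"matched": perceptual_hash(image_bytes) in KNOWN_HASHES}

def server_flags_account(vouchers: list) -> bool:
    """Phase 2 (server): count positive vouchers; only an account
    exceeding the threshold is flagged for further review."""
    return sum(v["matched"] for v in vouchers) > MATCH_THRESHOLD

# 100 ordinary photos: no matches, so the account is never flagged.
uploads = [make_voucher(b"vacation-photo-%d" % i) for i in range(100)]
print(server_flags_account(uploads))  # False
```

The sketch also shows why "no iCloud upload, no vouchers" holds in this design: vouchers only exist as part of an upload, so with iCloud Photos off there is nothing for the server to count.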
 

Mendota

macrumors 6502a
Jan 9, 2019
617
1,209
Omaha
That is what’s stated, unless you have a big dose of paranoia and are unable to accept the facts provided to you.

“Supposedly” does not meet the definition of a fact.
I take it the same way I take a lot of what Apple says. Apple said that the hidden "feature" to slow down your phone was to save your battery. "We just forgot to mention it." Apple said that there was nothing wrong with the keyboard, that it was users' fault they were breaking... Apple has said a lot of things that have been proven to be less than honest.
 

Mendota

macrumors 6502a
Jan 9, 2019
617
1,209
Omaha
"For iCloud accounts which use iCloud Photos, this feature implements a privacy-preserving, hybrid on-device/server pipeline to detect collections of CSAM images being uploaded to iCloud Photos. The first phase runs code on the device to perform a blinded perceptual hash comparison of each photo being uploaded to iCloud Photos against an on-device encrypted database of known CSAM perceptual hashes. However, the result of each blinded match is not known to the device; it can only be determined by the second phase running on iCloud Photos servers, and only if that user’s iCloud Photos account exceeds a threshold of positive matches."

Seems pretty clear to me. ¯\_(ツ)_/¯

"Does this mean Apple is going to scan all the photos stored on my iPhone?
No. By design, this feature only applies to photos that the user chooses to upload to iCloud Photos, and even then Apple only learns about accounts that are storing collections of known CSAM images, and only the images that match to known CSAM. The system does not work for users who have iCloud Photos disabled. This feature does not work on your private iPhone photo library on the device."

Anything else you need?
We have been down this road before, and again, I love how they call spying a "feature". This is just an example of how they condescend to their audience. And this is what they said:

This feature implements a privacy-preserving, hybrid on-device/server pipeline to detect collections of CSAM images being uploaded to iCloud Photos. The first phase runs code on the device.

Umm, what part did I miss?
 
  • Like
Reactions: eltoslightfoot

Bandaman

Cancelled
Aug 28, 2019
2,005
4,091
I have and do read all terms of service for all products I use, including Apple, Google, and Microsoft. Again, THEY DO NOT DO DEVICE SCANNING! If it is not on their public servers, it is unknown to them.
I didn't say they do device scanning, I said they scan everything in your OneDrive. Reading helps.
 

keeper

macrumors 6502a
Apr 23, 2008
520
303
I take it the same way I take a lot of what Apple says. Apple said that the hidden "feature" to slow down your phone was to save your battery. "We just forgot to mention it." Apple said that there was nothing wrong with the keyboard, that it was users' fault they were breaking... Apple has said a lot of things that have been proven to be less than honest.
And yet here you are, an Apple customer. How can that be, with your lack of trust in Apple?
 
  • Like
Reactions: Jayson A

Jayson A

macrumors 68030
Sep 16, 2014
2,671
1,935
Umm, what part did I miss?
You're impossible. You just change the context based on your beliefs. Apple said the matching done on device is only done to content that is being uploaded to iCloud, and the device itself has no idea what it's looking for. Only Apple knows, when the files are uploaded to THEIR SERVERS, and even then only once a threshold has been met. You seem to be ignoring all of that.
 

Mendota

macrumors 6502a
Jan 9, 2019
617
1,209
Omaha
I didn't say they do device scanning, I said they scan everything in your OneDrive. Reading helps.
That is not what you said or implied. You are now trying to back out. Most of the people in this thread who are protesting what Apple is doing have acknowledged that public cloud servers, including OneDrive, are scanned. We are not concerned with that. After being called out on your ridiculous statement that Microsoft deletes documents it doesn't like, etc., you are now claiming what we already know: that OneDrive scans what is put on their servers. If you don't upload to OneDrive, Microsoft has no knowledge or scan of it.
 

keeper

macrumors 6502a
Apr 23, 2008
520
303
I use Apple for fun, not serious work, never have. I have never fallen for their "marketing" hype or BS. I run everything because I like tech. I also use Linux as well as Windows, Chromebook, and Android.
But if you only run it for fun, why are you spending so much energy decrying what they are doing? That’s not logical if you have nothing invested in it.
 
  • Angry
Reactions: Euronimus Sanchez

Mendota

macrumors 6502a
Jan 9, 2019
617
1,209
Omaha
You're impossible. Just change the context based on your beliefs. Apple said the matching done on device is only done to content that is being uploaded to iCloud and the device itself has no idea what it's looking for. Only Apple knows when the files are uploaded to THEIR SERVERS and then they only know once a threshold has been met. You seem to be ignoring all of that.
I didn't change the context. I simply copied exactly what they said... I don't want spyware on my devices. It is just that simple.

If someone said we want to install a camera in your house to monitor for child abuse, but the camera will be off most of the time and only come on in the event it "hears" a child crying. It will then check to see why, and if you are not mistreating the child, it will turn off. Oh, and others will decide what constitutes mistreatment.

Yeah, I would not be okay with that either, even though I have never abused a child. I don't want spyware in my house or on my devices. If you are fine with that, good for you.
 

Mendota

macrumors 6502a
Jan 9, 2019
617
1,209
Omaha
But if you only run it for fun why are you spending so much energy decrying what they are doing? That’s not logical if you have nothing invested in it.
I have much invested in dealing with the numerous privacy issues that I see as major concerns of our time. There is a lot of discussion about the role that tech, and big tech in particular, plays in our lives. I am a part of that larger discussion. This issue is very pertinent, because it involves a company saying that your personal device should now have a built-in "policeman" monitoring your activity.
 

Bandaman

Cancelled
Aug 28, 2019
2,005
4,091
That is not what you said or implied. You are now trying to back out. Most of the people in this thread who are protesting what Apple is doing have acknowledged that public cloud servers, including OneDrive, are scanned. We are not concerned with that. After being called out on your ridiculous statement that Microsoft deletes documents it doesn't like, etc., you are now claiming what we already know: that OneDrive scans what is put on their servers. If you don't upload to OneDrive, Microsoft has no knowledge or scan of it.
I'm not backing out of anything; you have poor reading comprehension. I said exactly what I meant in every post. The rest of your comment is just ridiculous. I literally said they scan everything in OneDrive, which is true, and you are held liable for anything they deem nefarious when they scan your files in your OneDrive, whether that is CSAM or other material they don't agree with. This is in their terms of service when you sign up. You are just arguing for the sake of arguing; you must be stressed or something. You need to chill out.
 

dk001

macrumors demi-god
Oct 3, 2014
11,142
15,496
Sage, Lightning, and Mountains
So it seems people are unhappy with Apple scanning on device before data is encrypted for upload to iCloud.

But the same people are happy for it to be scanned on server.
But surely for that to happen, Microsoft etc. must be decrypting your data on server to perform the scan. That seems wrong to me… and worse than Apple.

I wouldn’t say all are happy on server.
Some are accepting.
Some don’t like.
Some are surprised it was being done at all.
 

dk001

macrumors demi-god
Oct 3, 2014
11,142
15,496
Sage, Lightning, and Mountains
The secondary on-server scan doesn’t check to see if a non-CSAM hash was injected into the system; it simply uses the actual image and compares it visually, using a perceptual hashing process, against a separate perceptual hash database. It’s basically to rule out false positives that may have accidentally triggered a match (like the ones of the dog that were forced collisions using software to generate an image with the same hash). The image wouldn’t visually match the CSAM counterpart, so that image would be discarded before humans even step in.

Thought I read it was hash to hash, with the visual check happening after this step if matches ≥ 30.
Thanks.
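The two-database idea being discussed can be sketched in toy Python. Everything below is a hypothetical stand-in (the names, the databases, and the hash functions are invented for illustration). The only point is that an image engineered to collide under one hash function would still have to match under a second, independent hash before any human reviewer sees it.

```python
import hashlib

# Two independent toy "perceptual" hash functions. An image engineered
# to collide under hash_a is vanishingly unlikely to also collide
# under hash_b, which is the point of the second, server-side check.
def hash_a(img: bytes) -> str:
    return hashlib.sha256(img).hexdigest()

def hash_b(img: bytes) -> str:
    return hashlib.blake2b(img).hexdigest()

known = b"known-image-placeholder"
adversarial = b"forced-collision-dog-photo"

# Pretend the attacker managed to force a collision in hash space A,
# so the adversarial image already matches the on-device database.
DB_A = {hash_a(known), hash_a(adversarial)}
DB_B = {hash_b(known)}  # independent server-side database

def passes_both_checks(img: bytes) -> bool:
    """Only images matching in BOTH hash spaces reach human review."""
    return hash_a(img) in DB_A and hash_b(img) in DB_B

print(passes_both_checks(known))        # True
print(passes_both_checks(adversarial))  # False: discarded pre-review
```

This is why a forced-collision image (like the dog examples mentioned above) would be filtered out before human review: it matches in the first hash space but fails the independent second check.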
 

Mendota

macrumors 6502a
Jan 9, 2019
617
1,209
Omaha
I'm not backing out of anything, you have poor reading comprehension. I said exactly what I meant in every post. The rest of your comment is just ridiculous. I literally said they scan everything in OneDrive, which is true, and you are held liable for anything they deem nefarious when they scan your files in your OneDrive, whether this is CSAM or other material they don't agree with. This is in their terms of service when you sign up. You are just arguing for the sake of arguing and you must be stressed or something. You need to chill out.
This is what you said, word for word. You said Microsoft, not OneDrive. This is your exact quote. My reading comprehension is just fine, as is my BS detector.

Literally every company scans everything you have in the cloud. Microsoft even scans all of your Word files and whatnot for topics they don't agree with and will even ban your account if you're writing about something against their beliefs. And Google ... well, Google is Google. Linux would be the only option as far as privacy, but again ... if you are using any cloud services you're right back in the same boat. If you truly want privacy, disconnect yourself from all technology and the internet and go live in an Amish village.
 
  • Like
Reactions: MadeTheSwitch

dk001

macrumors demi-god
Oct 3, 2014
11,142
15,496
Sage, Lightning, and Mountains
Bad analogy. Your device literally does nothing until you try to upload pictures to the cloud. The on-device hashing system is completely blind and your device knows nothing about whether a match has been found or not, those vouchers are then uploaded to Apple with your photo upload and only THEN does Apple know if it's a match. So literally, this system REQUIRES iCloud to work. No iCloud, no upload, no scan, no vouchers. It's really that simple.

So to update your analogy: it's like arriving at the airport and putting your bag in the scanner before you get on the plane. Your iPhone is the bag, the scanner is... well, you know, and the plane is iCloud.

Only if the scanner were in your home.
Now, if Apple set an intermediary server between you and iCloud, your example would be spot on.
 
  • Like
Reactions: Mendota

dk001

macrumors demi-god
Oct 3, 2014
11,142
15,496
Sage, Lightning, and Mountains
You don't see a difference between asking questions with the assumption someone is being honest (and you just need clarification) and asking questions with the assumption someone is being dishonest (and thus your questions are more accusatory than actually information-seeking)? Alrighty then.



Huh? I can't make any sense out of what you just said.



Would you mind citing the source of this? Even if true, they'd still be taking precautions to avoid accounts being falsely flagged - how is that a bad thing? And even if true, that still doesn't mean they lack confidence in it, but are simply wanting to be sure everything is operating as intended. Would you rather they be complacent?



What's so legally nonsensical about reporting verified CSAM to NCMEC?



Ok? I'm pretty sure the manual reviewers won't be confused as to whether they're looking at CSAM or not, so I'm not sure what the concern here is.



This concern is 100% unfounded. Apple is only reporting confirmed CSAM to NCMEC, and even if they weren't, do you seriously think the police are going to arrest someone without looking at the pictures themselves? The ONLY scenario where your concern is possible is if someone uploads CSAM to someone's account in order to frame them. While I don't think that will be anything close to a common occurrence (if it even happens at all), it's no reason to not report crime any more than not reporting any other crime people can possibly be framed for. We live in an imperfect world, but that doesn't mean you throw the baby out with the bathwater.



And while doing so, Apple would be decrypting and reading everyone's photos. You're ok with that? You seemed so concerned about privacy, but this suggestion puts a huge dent in your credibility on that. And again, as I said, Apple's goal here is obviously not to eradicate every trace of existing CSAM on iCloud but rather to combat the further spread of it on iCloud. And if iPhone users don't have the iCloud photo feature on, they can't upload any photos to iCloud anyway.



For those of us who don't have an agenda and simply take Apple's statements at their face value, we see that this move is exactly in line with their commitment to privacy. That's why we find it an astounding irony that people are suggesting Apple use a less private method (server-side scanning) instead, and acting like that's a more private solution. Crazy! Anyway, if Apple was faking all this technology as a PR move, they stink at PR. I think they're going out on a gutsy limb by striking a balanced compromise between continuing to NOT scan at all for CSAM and scanning in the most private way possible. Again, iCloud has NEVER been a truly private arena, and for those who want ultimate privacy they should not use iCloud or any other cloud service at all (or at least not for data they're concerned about, such as photos).



Sad that you mischaracterize my answers like that. Anyone can go back and see that's simply not true. I did call the one question silly (to ask of Apple themselves), because it is, but I never said "nobody asks that" (YOU asked it, so why would I say that? LOL!) And you didn't "put up more" even BEFORE I answered them, so that's not the reason, obviously.



At this point, it is quite clear your problem is not lack of information. I am done discussing this topic with you. You'll only be content in your echo chamber. I am more than willing to call Apple out if I see evidence of wrong-doing, but the best you and others can come up with is "We have questions we didn't like Apple's answers to." Sorry, but that doesn't cut it.

Bye.

Same stuff, different words. You are still providing nada in response to the questions I posted at your request.
It’s like you are trying to convince me you are right and I am wrong, while providing nothing but your stance as proof.
How about a little factual support for your opinion in response to my questions?
Thanks.
 
  • Like
Reactions: MadeTheSwitch