What? That makes no sense in any fashion. Apple states A/B/C. I have questions about the information regarding B/C, and you say just accept it all as gospel? LMAO!!!! Not until I can get answers that satisfy me and settle the claims from all the professionals who say “this is a BAD thing!”.
You don't see a difference between asking questions with the assumption someone is being honest (and you just need clarification) and asking questions with the assumption someone is being dishonest (and thus your questions are more accusatory than actually information-seeking)? Alrighty then.
Where am I looking at the future except in response to an Apple answer?
Huh? I can't make any sense out of what you just said.
Still doesn’t answer my question. Apple even claimed this threshold would only be the initial setting and would change as the system proved itself.
Would you mind citing the source of this? Even if true, they'd still be taking precautions to avoid accounts being falsely flagged - how is that a bad thing? And even if true, that still doesn't mean they lack confidence in it; it may simply mean they want to be sure everything is operating as intended. Would you rather they be complacent?
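(Aside, for concreteness: here's a rough back-of-the-envelope sketch of why a high match threshold makes an accidental flag so unlikely. The per-image false-match rate and the threshold value below are illustrative assumptions, not Apple's published figures.)

```swift
import Foundation

// Illustrative numbers only, not Apple's published figures.
let perImageFalseMatchRate = 1e-6   // assumed per-image false-match probability
let threshold = 30                  // an assumed "initial setting" for flagging

// Crude union bound on the chance that an innocent library of n photos
// produces `threshold` or more false matches: C(n, threshold) * p^threshold,
// computed in log space to avoid floating-point underflow.
func falseFlagUpperBound(photos n: Int) -> Double {
    var logProb = Double(threshold) * log(perImageFalseMatchRate)
    for k in 0..<threshold {
        logProb += log(Double(n - k)) - log(Double(k + 1))
    }
    return exp(logProb)
}

print(falseFlagUpperBound(photos: 100_000)) // roughly 4e-63
```

Even under those made-up numbers, moving the threshold up or down shifts that bound by many orders of magnitude, which is presumably why an "initial setting" would be tuned as the system proved itself.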
Yes, and it makes little sense from a legal perspective. This reads more like a CYA. That is why I asked the question.
What's so legally nonsensical about reporting verified CSAM to NCMEC?
This is something @Jayson A pointed out when the topic first came up. Apple will use a second check to make sure something was not seeded into the original database. When I tried looking into who maintains these databases and what their sources are, this piece didn’t fit well.
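(Aside: to make that "second check" concrete, here's a toy sketch of the intersection approach Apple described, where a hash only counts if it appears in databases from at least two independent child-safety organizations. All names and hashes below are made up.)

```swift
// Toy model of the "two independent sources" check. Every hash and
// name here is invented for illustration.
let orgA: Set<String> = ["hashA", "hashB", "hashC"]       // one source database
let orgB: Set<String> = ["hashB", "hashC", "hashSeeded"]  // a second, independent source

// "hashA" and "hashSeeded" each appear in only one database, so a single
// compromised source cannot inject an entry into the matching set.
let usableHashes = orgA.intersection(orgB)
print(usableHashes) // ["hashB", "hashC"] (order not guaranteed)
```

Under that scheme, an entry seeded into only one source database never reaches the matching set, regardless of where it came from.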
Ok? I'm pretty sure the manual reviewers won't be confused as to whether they're looking at CSAM or not, so I'm not sure what the concern here is.
Potential false positives can easily destroy someone's life even if they are innocent. If a system is generating false positives, I know I would like to be notified, not just “surprise!”. Apple feels it can happen. They even put an appeals process in place.
This concern is 100% unfounded. Apple is only reporting confirmed CSAM to NCMEC, and even if they weren't, do you seriously think the police are going to arrest someone without looking at the pictures themselves? The ONLY scenario where your concern is possible is if someone uploads CSAM to someone's account in order to frame them. While I don't think that will be anything close to a common occurrence (if it even happens at all), it's no reason to stop reporting crime, any more than we'd stop reporting any other crimes people can possibly be framed for. We live in an imperfect world, but that doesn't mean you throw the baby out with the bathwater.
BS. You can run the exact same process server side and catch ALL THE CSAM. On device, you only catch the small bit in the event they leave the iCloud Photos feature on, and you catch none of the items already in iCloud.
And while doing so, Apple would be decrypting and reading everyone's photos. You're ok with that? You seemed so concerned about privacy, but this suggestion puts a huge dent in your credibility on that. And again, as I said, Apple's goal here is obviously not to eradicate every trace of existing CSAM on iCloud but rather to combat the further spread of it on iCloud. And if iPhone users don't have the iCloud photo feature on, they can't upload any photos to iCloud anyway.
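(Aside: a toy sketch of the two scanning models being debated here. perceptualHash is a stand-in, since NeuralHash is not a public API, and knownHashes is a made-up database; none of this is Apple's actual implementation.)

```swift
// Toy contrast of on-device vs. server-side matching. Everything here is
// illustrative; a real perceptual hash survives resizing and recompression,
// which this placeholder does not.
func perceptualHash(_ photo: String) -> String {
    String(photo.hashValue)
}

let knownHashes: Set<String> = [] // hypothetical database of known-CSAM hashes

// On-device model: only photos queued for iCloud upload are ever checked,
// and the check runs before the photo leaves the phone.
func flagBeforeUpload(_ photo: String) -> Bool {
    knownHashes.contains(perceptualHash(photo))
}

// Server-side model: the provider must be able to read every stored photo
// in order to hash it, which is exactly the privacy cost argued above.
func serverSideScan(_ allStoredPhotos: [String]) -> [String] {
    allStoredPhotos.filter { knownHashes.contains(perceptualHash($0)) }
}

print(flagBeforeUpload("vacation.jpg")) // false: not in the (empty) database
```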
You may think so, but quite a few think otherwise. If this had not tanked from a marketing/announcement perspective, it would have been a great argument to “stay in the Apple ecosystem to be private”.
For those of us who don't have an agenda and simply take Apple's statements at face value, this move is exactly in line with their commitment to privacy. That's why we find it an astounding irony that people are suggesting Apple use a less private method (server-side scanning) instead, and acting like that's the more private solution. Crazy! Anyway, if Apple were faking all this technology as a PR move, they stink at PR. I think they're making a gutsy move by striking a balanced compromise between continuing to NOT scan at all and scanning in a far more invasive way. Again, iCloud has NEVER been a truly private arena; those who want ultimate privacy should not use iCloud or any other cloud service at all (or at least not for the data they're concerned about, such as photos).
Sad you feel that way. You asked for 2-3 and I served up a lot more, none of which you have addressed other than playing the same silly game of “but Apple says” and “that’s silly/dumb/nobody asks that”. Based on your answers, why would I put up more?
Sad that you mischaracterize my answers like that. Anyone can go back and see that's simply not true. I did call the one question silly (to ask of Apple themselves), because it is, but I never said "nobody asks that" (YOU asked it, so why would I say that? LOL!). And you didn't "put up more" even BEFORE I answered them, so that's not the reason, obviously.
Feel what you want; you are entitled to your own opinion. Rather than really trying to answer, you just point and claim “you’ll never …”. I had hoped you could at least come up with something I had missed and possibly answer at least one item or add to my knowledge base. You have done neither.
At this point, it is quite clear your problem is not lack of information.
I am done discussing this topic with you. You'll only be content in your echo chamber. I am more than willing to call Apple out if I see evidence of wrong-doing, but the best you and others can come up with is "We have questions we didn't like Apple's answers to." Sorry, but that doesn't cut it.
Bye.