So riddle me this... Apple has been doing facial and object recognition on your devices for quite a while now... since iOS 10, have they not? They told us we could search by people, animals, plants, landscapes, etc. How do we know what other types of recognition they could have been doing behind the scenes without exposing it to us? Obviously they've been working to identify body parts, because soon Messages will be able to warn kids if an image may contain one (this feature is not to be confused with the CSAM stuff). So theoretically, Apple could have been keeping track of how many explicit photos you have, or how many gun photos you have, all along since iOS 10. The "scanning" of your photos has been around for quite a while, so why the sudden uproar now? It feels a little too late in my opinion (and anyone truly worried maybe should have stayed on iOS 9).
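For what it's worth, the on-device classifier isn't even hidden; any app can ask it for labels through Vision's public VNClassifyImageRequest (iOS 13+ / macOS 10.15+). A quick sketch, with the file path as a placeholder for any local photo:

```swift
import Foundation
import Vision

// Ask the on-device Vision model what it sees in a photo.
// (The path is a placeholder; point it at any local image.)
let photoURL = URL(fileURLWithPath: "/path/to/photo.jpg")
let request = VNClassifyImageRequest()
let handler = VNImageRequestHandler(url: photoURL)
try handler.perform([request])

// Print every label the model is reasonably confident about,
// not just the handful of categories Photos exposes in search.
for observation in request.results ?? [] where observation.confidence > 0.5 {
    print(observation.identifier, observation.confidence)
}
```

Run that on a few photos and you'll likely see far more label categories come back than the people/animals/plants/landscapes set the Photos search UI surfaces.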
Not to mention, this type of scanning for object recognition (using AI) is not the same type of scanning used for CSAM (hashes). I really couldn't care less if Apple knows that one of my photos hashes out to "D9CCE882EE690AF3A78C77". That is keeping privacy in mind, as the hash does not identify what was in the photo, nor can it be reversed to regenerate the photo. (Of course, if it's a match against the database, then yes, a human reviewer can see the photo to verify whether it's a true positive or not.)
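To make the hash idea concrete, here's a minimal sketch in Swift. Fair warning: it uses plain SHA-256 for simplicity, whereas Apple's actual system uses a perceptual hash (NeuralHash) so resized or re-compressed copies of a known image still match, and wraps the comparison in fancier cryptography. The value in knownHashes is made up.

```swift
import CryptoKit
import Foundation

// Hypothetical database of hashes of known material (value is made up).
let knownHashes: Set<String> = [
    "d9cce882ee690af3a78c77"
]

// A hash is a one-way fingerprint: it says nothing about what's in
// the photo, and you can't run it backwards to rebuild the image.
func sha256Hex(of data: Data) -> String {
    SHA256.hash(data: data)
        .map { String(format: "%02x", $0) }
        .joined()
}

// The only thing a match reveals is "this exact fingerprint is in
// the database" -- nothing at all about photos that don't match.
func isKnownMatch(_ photoData: Data) -> Bool {
    knownHashes.contains(sha256Hex(of: photoData))
}
```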
Google has been doing CSAM scanning for years now. However, not only do they check hashes against a database of known material, they also use AI to help detect new material that hasn't been identified yet. That second part is where it crosses the line on privacy, and I think Apple knows that.
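Roughly, the difference between the two approaches looks like this (continuing from the hashing sketch above; classifierScore(for:) is a made-up stand-in for a content classifier, not any real Google API):

```swift
// Hypothetical two-stage pipeline in the style attributed to Google.
enum ScanResult {
    case knownMatch    // exact hash hit against the database
    case suspectedNew  // classifier flagged previously unseen material
    case clean
}

// Stand-in for a real ML model; always returns 0 in this sketch.
func classifierScore(for data: Data) -> Double { 0.0 }

func scan(_ photoData: Data) -> ScanResult {
    // Stage 1 can only ever confirm previously identified material.
    if isKnownMatch(photoData) {
        return .knownMatch
    }
    // Stage 2 is the privacy problem: a model has to "look at" the
    // content of every photo, not just compare fingerprints.
    if classifierScore(for: photoData) > 0.9 {
        return .suspectedNew
    }
    return .clean
}
```

The first branch only confirms "this exact fingerprint is already in the database"; the second has to form an opinion about every single photo, which is exactly the part that crosses the line.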