Thanks. Can you be more specific about the topic? What relevant laws do businesses need to obey in this case?
AFAIK, with regard to CSAM, all service providers operating in the US have to report such material if they know it is stored on their servers (I believe the relevant statute is 18 U.S.C. § 2258A, which requires reporting apparent CSAM to NCMEC but does not mandate proactive searching). Apple's approach, as far as I understand it, is meant to satisfy this requirement. Some folks have stated that Apple is not required to actively scan for such material, and I agree with that point. But I would think Apple would get into legal trouble if users of its services were found to be storing such material on Apple's servers and Apple were accused of doing nothing about it. That, I think, is why Facebook and Google run such scans.
From what I've read, Apple already does this kind of CSAM scan when photos are uploaded to iCloud Photos. So why is Apple moving it onto the device? The only logical assumption is that it finally enables E2EE for iCloud Photos uploads; with scanning done only on the server, Apple could not both encrypt the photos end to end and satisfy that legal requirement. I believe this is also the main reason why Apple abandoned E2EE for iCloud Photos previously. I don't believe the FBI is the one pressuring Apple not to do it, because I believe the FBI has no legal standing in this matter.
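To make that concrete, here is a minimal, purely hypothetical Swift sketch (using CryptoKit; the function name and placeholder data are mine, not anything Apple ships) of why E2EE and server-side scanning don't mix: once a photo is encrypted on the device, the server only ever holds ciphertext, and hashing ciphertext tells it nothing about the image inside.

```swift
import CryptoKit
import Foundation

// Hypothetical illustration (not Apple's code): with end-to-end encryption,
// the server only ever sees ciphertext, so there is nothing meaningful for a
// server-side scan to match against a database of known images.
func simulateE2EEUpload(photoData: Data) throws -> Data {
    // The key lives only on the device; under true E2EE, Apple never holds it.
    let deviceKey = SymmetricKey(size: .bits256)

    // The photo is encrypted on the device before it leaves.
    let sealedBox = try AES.GCM.seal(photoData, using: deviceKey)

    // This opaque blob is all the server receives.
    return sealedBox.combined!
}

let photo = Data("...placeholder photo bytes...".utf8)
let blob = try! simulateE2EEUpload(photoData: photo)

// Any hash the server computes is over ciphertext, so it cannot match a
// database of known-CSAM image hashes.
print(SHA256.hash(data: blob))
```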
Apple's proposed approach for CSAM is, for all intents and purposes, equivalent to server-side scanning to me, since Apple has stated that hashing only happens when an upload occurs; otherwise no hashing is done. There is no general scanning of photos as far as I know, and at no point is Apple able to view anything on the device. It is up to anyone to believe what Apple has stated, but I have no reason to think Apple would be lying.
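For what it's worth, here is a rough, hypothetical Swift sketch of that "hash only at upload time" idea. SHA-256 stands in for Apple's perceptual NeuralHash (it is not one), and the SafetyVoucher type and function names are placeholders of mine rather than Apple's API; the real design also involves an encrypted image derivative and a private set intersection check on the server, none of which is modeled here.

```swift
import CryptoKit
import Foundation

// Hypothetical stand-in for Apple's perceptual NeuralHash. A real perceptual
// hash survives resizing and re-encoding; SHA-256 does not. It is used here
// only to keep the sketch self-contained.
func imageHash(_ photoData: Data) -> SHA256.Digest {
    SHA256.hash(data: photoData)
}

// Simplified "safety voucher" (a placeholder type of mine, not Apple's API).
// Apple's real voucher also carries an encrypted image derivative and is
// checked server-side via a private set intersection protocol.
struct SafetyVoucher {
    let hash: SHA256.Digest
}

// The point at issue: the hash is computed only on the upload path. Merely
// viewing or storing a photo on the device never reaches this code.
func uploadToICloudPhotos(_ photoData: Data) -> SafetyVoucher {
    let voucher = SafetyVoucher(hash: imageHash(photoData))
    // ...the voucher would be attached to the (encrypted) upload here...
    return voucher
}

let voucher = uploadToICloudPhotos(Data("...placeholder photo bytes...".utf8))
print(voucher.hash)
```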
Now, many folks have floated 'what if' scenarios, but they miss the fact that Apple already has the ability to 'scan' and report 'illegal' material to the authorities. It could be doing so right now, across all of its devices. Ditto for Google with Android and Microsoft with Windows. But the question is: why would they? It serves Apple no purpose, at least as far as I can see.
To sum it up, I do not think Apple wants to police anyone (which some are alluding to) in this instance. They are just trying to comply with legal requirements while providing features their users find compelling, privacy being one of them. Apple is not being altruistic; they just think that doing so makes for better business.