A secure (cryptographic) hashing algorithm produces a totally different hash if even one pixel is modified, which would make circumventing a detection system trivial: alter a single pixel and the hash no longer matches. To work around that, Apple’s system (and others) applies a form of perceptual hashing, deliberately “fuzzing” the input so that small modifications don’t change the hash. That fuzziness inherently increases the likelihood of hash collisions, where two different images produce the same hash.
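To make the distinction concrete, here’s a minimal sketch in Python contrasting a cryptographic hash with a toy “average hash,” one of the simplest perceptual hashing schemes. This is purely illustrative: Apple’s actual NeuralHash derives its hash from a neural network’s output, not from raw pixel brightness as this toy does.

```python
import hashlib

def average_hash(pixels):
    """Toy perceptual hash: each bit records whether a pixel is above
    the image's mean brightness. Similar images -> similar bit strings."""
    mean = sum(pixels) / len(pixels)
    return ''.join('1' if p > mean else '0' for p in pixels)

def sha256_hash(pixels):
    """Cryptographic hash: any change flips roughly half the output bits."""
    return hashlib.sha256(bytes(pixels)).hexdigest()

# An 8x8 grayscale "image" flattened to 64 pixel values (0-255).
image = [30] * 32 + [220] * 32
tweaked = image.copy()
tweaked[0] += 1  # modify a single pixel

print("cryptographic hashes match:", sha256_hash(image) == sha256_hash(tweaked))   # False
print("perceptual hashes match:   ", average_hash(image) == average_hash(tweaked)) # True
```

The same fuzziness that lets the perceptual hash survive a one-pixel tweak is what opens the door to collisions between genuinely different images.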
The consequences of a hash collision are, of course, pretty dire in this specific case. Even an account being flagged for review could mean massive headaches for an innocent person. Apple tried to design around this by requiring manual review, and by requiring a threshold number of matches (reportedly around 30) before that review would occur, but neither fixes the separate issue of who controls the hash databases.
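As a rough illustration of that threshold mitigation, here’s a sketch assuming a plain set of known hashes and a hypothetical `REVIEW_THRESHOLD`. Apple’s actual protocol used private set intersection with threshold secret sharing, so the server couldn’t even count matches below the threshold; this simplification only shows the gating logic.

```python
REVIEW_THRESHOLD = 30  # hypothetical value; Apple reportedly started around 30

def count_matches(photo_hashes: set[str], known_hashes: set[str]) -> int:
    """Count how many of a user's photo hashes appear in the database."""
    return len(photo_hashes & known_hashes)

def flag_for_human_review(photo_hashes: set[str], known_hashes: set[str]) -> bool:
    # A single collision is never enough on its own; manual review is
    # gated on accumulating at least REVIEW_THRESHOLD matches.
    return count_matches(photo_hashes, known_hashes) >= REVIEW_THRESHOLD
```

The threshold reduces the odds that a handful of random collisions triggers review, but it does nothing about what’s *in* the database being matched against.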
Apple was also going to require a hash to be present in multiple independent databases before it counted, but as we’ve seen recently with Russia and Belarus, for example, it’s possible for a state to develop so much influence over its region that it effectively has puppet states…and it’s not like Russia’s government at this time is particularly known for its love of freedom of expression. So what happens, then, when an oppressive government with puppet states tells Apple to add its preferred hashes or face an import ban? This leaves the system vulnerable to abuse against innocent people for reasons completely unrelated to CSAM.
That’s what the issue was.