If they pass CSAM verified by hash matching on to human reviewers inside Apple, they break the law. Not even the FBI is allowed to do that. Only NCMEC is an allowed recipient under US federal law.
The FBI was given clearance to take over a website serving CSAM in order to catch more users of the site. As such, the FBI has technically distributed CSAM in the past.
Seems to be a gap between what the law appears to say and what actual practice is. Law enforcement's interest is not served by prosecuting moderators or companies acting in good faith merely because they have CSAM in their possession.
There is a difference between moderators manually identifying illegal content in a stream of mostly-legal material and a process where content which has already been matched against the database and classified as almost-certainly-illegal is subjected to further internal review.
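To make that distinction concrete, here's a toy sketch (hypothetical names and placeholder hashes, not Apple's actual pipeline): a general moderation queue sees everything users report, while a hash-match queue only ever contains items that already matched the known database.

```python
# Hypothetical illustration of the two review models, not any real system.

KNOWN_HASH_DB = {"a3f9deadbeef", "b71ccafebabe"}  # placeholder digests

def general_moderation_queue(reported_items):
    # Moderators wade through a stream that is mostly legal content.
    return list(reported_items)

def hash_match_queue(uploaded_items, hash_fn):
    # Only items whose hash already matches the database reach a human,
    # i.e. everything in this queue is presumed illegal before review.
    return [item for item in uploaded_items if hash_fn(item) in KNOWN_HASH_DB]
```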
AFAIK moderators at other organizations are also only reviewing content that has already been flagged somehow. I don't think it makes a difference. It comes down to good faith. If the company follows the recommendations of NCMEC on handling the material (and NCMEC absolutely does provide those recommendations), I doubt they're in any danger at all.
Obviously you could not make the same argument yourself unless you were also a reputable megacorp. There are upsides to being king. In this case, NCMEC wants the actual perps in jail so they're not going to take shots at Apple or its employees on technicalities.
The chance of a match being CSAM is not almost certain, though. Further, Apple only gets a low-resolution version of the image. In any case, presumably such issues have been addressed, as neither the FBI nor NCMEC have raised a stink about it.
> The chance of a match being CSAM is not almost certain, though.
Not according to Apple. They're publicly claiming a one-in-a-trillion false positive rate from the automated matching. Either that's blatant false advertising or they're putting known (as in: more likely than not) CSAM in front of their human reviewers. Can't have it both ways.
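To make the arithmetic concrete, here's a rough Bayes sketch in Python. Only the one-in-a-trillion figure comes from Apple's claim; the detection rate and the prior are invented purely for illustration.

```python
# Rough Bayes check: if the false-flag rate really is 1e-12, essentially every
# account that reaches a human reviewer is a true positive.

p_flag_given_innocent = 1e-12   # Apple's claimed chance of falsely flagging an account
p_flag_given_guilty   = 0.9     # assumed detection rate for accounts holding known CSAM
prior_guilty          = 1e-6    # assumed fraction of accounts holding known CSAM

p_flag = (p_flag_given_guilty * prior_guilty
          + p_flag_given_innocent * (1 - prior_guilty))
p_guilty_given_flag = p_flag_given_guilty * prior_guilty / p_flag

print(f"P(account really holds CSAM | flagged) ≈ {p_guilty_given_flag:.7f}")
# With these assumptions the posterior is effectively 1, so reviewers are
# looking at material that is more likely than not CSAM.
```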
> Further, Apple only gets a low-resolution version of the image.
Which makes zero difference with regard to the content being illegal. Do you think they would overlook you possessing an equally low-resolution version of the same photo?
> In any case, presumably such issues have been addressed, as neither the FBI nor NCMEC have raised a stink about it.
Selective enforcement; what else is new? It's still a huge risk for Apple to take when the ethically superior (and cheaper and simpler) solution would be to encrypt the files on the customer's device without scanning them first, and to store the backups with proper E2E encryption so that Apple has no access to or knowledge of any of the content.
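For what "encrypt on the device, don't scan first" could look like in the simplest case, here's a minimal sketch using the Python cryptography package. The upload call is a hypothetical placeholder, and a real E2E backup scheme (key escrow, recovery, multi-device sync) is far more involved than this.

```python
# Minimal client-side encryption sketch: the server only ever stores ciphertext.

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_for_backup(plaintext: bytes, key: bytes) -> bytes:
    nonce = os.urandom(12)                        # 96-bit nonce, unique per file
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    return nonce + ciphertext                     # the cloud sees only this blob

key = AESGCM.generate_key(bit_length=256)         # stays on the customer's device
blob = encrypt_for_backup(b"...photo bytes...", key)
# upload_to_cloud(blob)  # hypothetical placeholder; the provider never sees the key
```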