> to detect csam, they use hashes of regions of images
> no one talks about how they got the hashes
> the hashes are a proprietary product
> you must pay in to use the hashes to block/filter images
> if you try to make your own hashes, you're admitting to possession of csam
> the owners of the hashes say that the hashes could be used for nefarious purposes if made public (a claim that falls flat once you realise that computing a hash over every region of an image is expensive for a processor, so searching the internet's haystack for a matching needle would be more in the realm of blackhat infrastructure work than something a criminal consumer could do)
> a monopoly controls protection against csam using hashes whose sources it cannot disclose, and it charges a premium for what seems to be a broad societal need
It's really hard not to think that at least some subset of the governmental and nonprofit operations that deal with CSAM are pedophiles.
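For context on what "hashes of regions of images" means above: systems in this space use perceptual hashes, which are designed to stay stable under resizing, cropping, and re-encoding, unlike cryptographic hashes. The actual algorithms are proprietary, so as a stand-in here is a minimal sketch of a generic difference hash (dHash), one of the simplest published perceptual-hash techniques; all function names and parameters are my own, and it is not the algorithm any real filter uses.

```python
# Minimal difference-hash (dHash) sketch -- a generic perceptual hash,
# NOT the proprietary algorithm used by any real CSAM filter.

def dhash(gray, hash_w=8, hash_h=8):
    """Compute a 64-bit dHash over a 2D grayscale image (list of rows of ints).

    Naively downscales to (hash_w + 1) columns x hash_h rows, then emits one
    bit per horizontal pixel pair: 1 if the left pixel is brighter than its
    right neighbour. Brightness *gradients* survive rescaling and re-encoding,
    which is why similar images produce similar hashes.
    """
    h, w = len(gray), len(gray[0])
    # nearest-neighbour downscale to a (hash_h) x (hash_w + 1) grid
    small = [
        [gray[r * h // hash_h][c * w // (hash_w + 1)] for c in range(hash_w + 1)]
        for r in range(hash_h)
    ]
    bits = 0
    for row in small:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes (lower = more similar)."""
    return bin(a ^ b).count("1")
```

Matching is then a nearest-neighbour search in Hamming space over the whole hash database, per image, per candidate region, which is the computational-cost point made in the quote: scanning the open internet this way is an infrastructure-scale job, not a desktop one.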