@aral Now if we can find, say, 50 more absolutely non-CSAM pictures that trigger the system and put those in albums on many iDevices, the whole thing becomes rather useless.
@jwildeboer @aral As far as I understand the issue, only the hashes of these two pictures collide; there is no information on how either hash is classified in the CSAM database.
According to Reddit, the hashes (of every picture on the device) are checked on Apple's servers only, so there is no way to know whether a hash is actually in the DB.
But yes, once hashes are public, one could generate colliding pictures quite easily.
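To illustrate why public hashes make collisions easy: here is a toy sketch against a simplistic "average hash" (block means thresholded at their overall mean), not Apple's NeuralHash. The `average_hash` and `make_collision` functions are hypothetical stand-ins for illustration; the idea is the same, though: if you can compute the hash yourself, you can nudge an innocent image until its hash matches a target.

```python
import numpy as np

HASH_SIZE = 8  # 8x8 = 64-bit toy perceptual hash

def average_hash(img):
    """Toy perceptual hash: 8x8 block means thresholded at their overall mean."""
    h, w = img.shape
    bh, bw = h // HASH_SIZE, w // HASH_SIZE
    means = img[:bh * HASH_SIZE, :bw * HASH_SIZE].reshape(
        HASH_SIZE, bh, HASH_SIZE, bw).mean(axis=(1, 3))
    return means > means.mean()

def make_collision(img, target_hash, margin=2.0, max_iter=50):
    """Shift mismatched blocks just across the threshold until hashes match."""
    out = img.astype(float).copy()
    h, w = out.shape
    bh, bw = h // HASH_SIZE, w // HASH_SIZE
    for _ in range(max_iter):
        means = out[:bh * HASH_SIZE, :bw * HASH_SIZE].reshape(
            HASH_SIZE, bh, HASH_SIZE, bw).mean(axis=(1, 3))
        thr = means.mean()
        cur = means > thr
        if np.array_equal(cur, target_hash):
            return out
        for i in range(HASH_SIZE):
            for j in range(HASH_SIZE):
                if cur[i, j] != target_hash[i, j]:
                    # move this block's mean to just above/below the threshold
                    delta = (thr - means[i, j]) + (margin if target_hash[i, j] else -margin)
                    out[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw] += delta
    return out

rng = np.random.default_rng(0)
a = rng.uniform(0, 255, (64, 64))   # "innocent" picture
b = rng.uniform(0, 255, (64, 64))   # picture whose hash we want to hit
forged = make_collision(a, average_hash(b))
assert np.array_equal(average_hash(forged), average_hash(b))
```

NeuralHash is a neural network rather than block averaging, so real attacks use gradient descent on the network instead of this direct block shifting, but the threat model is identical: a known hash function invites second-preimage construction.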