OK. This is quite technical. But TL;DR - the NeuralHash system #Apple uses for its CSAM detection has been confronted with its first possible collision, found by some good hackers. This dog might be flagged by the system as suspicious. Ouch. Issue 1 at https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX
@aral Now if we can find, say, 50 more absolutely non-CSAM pictures that trigger the system and put those in albums on many iDevices, the whole thing becomes rather useless.
Reddit discussion at https://old.reddit.com/r/MachineLearning/comments/p6hsoh/p_appleneuralhash2onnx_reverseengineered_apple/ HackerNews discussion at https://news.ycombinator.com/item?id=28218391
@jwildeboer @aral wouldn't that be a kind of (D)DoS 😂
@ostfriese Naah, the scanning happens on your iDevice, so no, not a DDoS ;) @aral
@jwildeboer @aral as I wrote: "a kind of". I know it's not a DDoS in the true sense of the word.
@jwildeboer @aral as far as I understand the issue, only the hashes of the two pictures in the issue collide; there is no information about classification against the CSAM database.
According to Reddit, the hashes (of every picture on the device) are checked on Apple's servers only, so there is no way to know if a hash is actually in the DB.
But yes, once hashes are public, one could generate colliding pictures quite easily.
@jwildeboer @aral also, I always wondered why Apple said they will only alert if there are a few matching files... which makes sense if they already knew that there would be hash collisions.
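The point above can be illustrated with a toy sketch: if an account is only flagged once the number of matching hashes crosses a threshold, a single accidental collision stays harmless, while a flood of colliding pictures does not. This is NOT Apple's implementation; all names and the threshold value are made up.

```python
# Hypothetical sketch of threshold-based matching, NOT Apple's actual code.
THRESHOLD = 30  # made-up number of matches required before flagging


def count_matches(device_hashes, database_hashes):
    """Count how many on-device hashes appear in the hash database."""
    return sum(1 for h in device_hashes if h in database_hashes)


def should_flag(device_hashes, database_hashes, threshold=THRESHOLD):
    """Flag only when the match count reaches the threshold."""
    return count_matches(device_hashes, database_hashes) >= threshold


db = {"a1", "b2", "c3"}  # toy stand-in for the hash database

# A single colliding picture stays below the threshold:
print(should_flag(["a1", "x9"], db))  # one match out of 30 needed

# Thirty colliding pictures (the flooding scenario above) cross it:
print(should_flag(["a1", "b2", "c3"] * 10, db))
```

Under this assumption, the threshold trades off false positives from collisions against how many matching files it takes to trigger a report.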
@jwildeboer To be fair, that's a pretty suspicious expression that dog has.
@jwildeboer If we argue against technological shortcomings, we're fighting a losing battle. Tech will probably get better over time.
We need to fight this from a human rights perspective. Privacy, personhood, who owns your device. This kind of stuff.
@claudius Absolutely. But that is not an either/or question. You need both IMHO. The technical arguments against this implementation AND the complete erosion of privacy it introduces.
@jwildeboer That's a good point.
@hans it is pretty shifty-eyed.
@jwildeboer from its facial expression, I know what this 🐕 thinks of the system
[Image description]
@jwildeboer a white and brown dog with a weird expression on his face.
@jwildeboer this means collisions are rather easy. The thing is, if you can get hold of the DB and then generate innocuous pics that match the same hashes, you can flood a target's phone without having to actually use illegal content, with potentially interesting consequences. Denial-of-Privacy attacks... @aral
@jwildeboer Credit where it's due, though: the AI appears correct that the dog _is_ indeed suspicious... probably it's looking straight at an Apple "privacy" advert...
cc @aral FYI. This dog would be classified as possible CSAM by the Apple NeuralHash system. Do NOT save this on your iDevice.