this post was submitted on 28 Aug 2023
1736 points (97.9% liked)
Lemmy.World Announcements
Probably hashing and scanning any uploaded media against some of the known DBs of CSAM hashes.
IIRC that's how Reddit/FB/Insta/etc. handle it.
They're sent to a 3rd party that does the checks. For example https://developers.cloudflare.com/cache/reference/csam-scanning/
The actual DB of hashes isn't released to the public, since it would let those who traffic in such content easily test which of their material doesn't match.
https://protectingchildren.google/#tools-to-fight-csam
That appears to be the one that Facebook and Reddit use.
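As a rough sketch of what upload-time scanning looks like, simplified to exact cryptographic hashes (real systems like PhotoDNA use perceptual hashes that survive resizing and re-encoding, and the hash list itself would be held by a vetted third party, never shipped with the code):

```python
import hashlib

# Hypothetical hash DB for illustration only. In reality this list is
# maintained by organizations like NCMEC/IWF and is not public.
KNOWN_BAD_HASHES = {
    "9f2feb0f1ef425b292f2f94bc8482494df430413aee60f0ca3cad641f4f9fdb5",
}

def scan_upload(data: bytes) -> bool:
    """Return True if the uploaded bytes match a known hash."""
    digest = hashlib.sha256(data).hexdigest()
    return digest in KNOWN_BAD_HASHES

# A single changed byte produces a completely different SHA-256,
# which is exactly why production systems use perceptual hashing instead.
print(scan_upload(b"harmless example bytes"))  # False
```

The check itself is just a set lookup; all the hard work is in choosing a hash function that tolerates benign transformations without matching unrelated images.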
The sad thing is that all we can usually do is make it harder for attackers. Which is absolutely still worth doing, to be clear. But if an attacker wants to cause trouble badly enough, there are always ways around everything. E.g., image detection can be foiled with enough transformation, and account-age limits can be waited out by a patient attacker. Minimum-karma requirements can be botted (easier than ever with AI), and karma is especially easy to bot on Lemmy because you can just spin up an instance with all the bots your heart desires. If posts have to be approved, attackers can even hotlink to innocent images and then swap the image after it's approved.
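To illustrate the "enough transformation" point: perceptual hashes are typically compared by Hamming distance against a threshold, so an attacker who transforms an image until the distance exceeds the threshold evades the match. A toy sketch with made-up 32-bit hash values and a hypothetical threshold:

```python
def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hash values."""
    return bin(a ^ b).count("1")

THRESHOLD = 10  # hypothetical: distances at or below this count as a match

original        = 0b1011_0110_1100_0011_1010_0101_1111_0000
slightly_edited = original ^ 0b0000_0000_0000_0000_0000_0000_0000_0111  # 3 bits flipped
heavily_edited  = original ^ 0b1111_1111_1111_0000_0000_1111_0000_1111  # 20 bits flipped

print(hamming(original, slightly_edited) <= THRESHOLD)  # True: still matches
print(hamming(original, heavily_edited) <= THRESHOLD)   # False: transformed past the threshold
```

Raising the threshold catches more transformed copies but also more innocent images, which is the fundamental trade-off these systems tune.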
Law enforcement can do a lot more than we can, by subpoenaing ISPs or VPNs. But law enforcement is slow and unreliable, so that's also imperfect.