Several months ago Beehaw received a report about CSAM (i.e., Child Sexual Abuse Material). As an admin, I had to investigate in order to verify the report and take the next steps. This was the first time in my life I had ever seen images like these. Not to go into great detail, but the images were of a very young child performing sexual acts with an adult.

The explicit nature of these images, the gut-wrenching shock and horror, the disgust and helplessness were overwhelming to me. Those images are burnt into my mind, and I would love to get rid of them, but I don't know how, or if it is even possible. Maybe time will take them out of my mind.

In my strong opinion, Beehaw must seek a platform where NO ONE will ever have to see these types of images: a software platform that makes it nearly impossible for Beehaw to host CSAM in any way.

If the other admins want to give their opinions about this, then I am all ears.

I simply cannot move forward with the Beehaw project unless this is one of our top priorities when choosing where we are going to go.

Thevenin@beehaw.org · 2 points · 1 year ago

So this just got posted on lemmy.dbzer0. They've got an AI-based CSAM screen up and running with promising initial results. The scanner is built on CLIP, which, as far as I understand it, means it classifies images against written descriptions of what CSAM is or is not, rather than being trained on the material itself.
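
For anyone curious what that looks like in practice, here is a minimal sketch of zero-shot image classification with CLIP, the general technique a scanner like that can build on. This is not the dbzer0 tool's actual code; the Hugging Face transformers library, the checkpoint name, the file name, and the placeholder text labels are all assumptions for illustration.

```python
# Minimal sketch of zero-shot image classification with CLIP.
# NOT the dbzer0 scanner's actual code -- just the general idea:
# score an image against written descriptions instead of training
# a classifier on example images.
from PIL import Image
from transformers import CLIPModel, CLIPProcessor  # assumed dependency

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

# Placeholder prompts; a real deployment would use carefully
# engineered descriptions of allowed vs. disallowed content.
labels = [
    "a photo of ordinary, safe content",
    "a photo of disallowed explicit content",
]

image = Image.open("upload.jpg")  # hypothetical uploaded file
inputs = processor(text=labels, images=image,
                   return_tensors="pt", padding=True)

# CLIP embeds the image and each text prompt into a shared space;
# softmax over the image-text similarities gives per-label scores.
probs = model(**inputs).logits_per_image.softmax(dim=-1)[0]

for label, p in zip(labels, probs.tolist()):
    print(f"{p:.3f}  {label}")
```

The appeal for moderators is that building the filter never requires handling abusive images, only written descriptions; the trade-off is false positives and negatives, so scores like these are a triage signal, not a verdict.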

Could something like this work for Beehaw?

Intelligence_Gap@beehaw.org · 1 point · 1 year ago

I’m sure the mods saw that, and it’s really more of a question for them tbh, but if it works for other Lemmy instances, I’m not sure why it wouldn’t work here.