this post was submitted on 25 Jul 2023
116 points (84.9% liked)

Fediverse


A community to talk about the Fediverse and all its related services using ActivityPub (Mastodon, Lemmy, KBin, etc.).

If you want help moderating your own community, head over to !moderators@lemmy.world!

Learn more at these websites: Join The Fediverse Wiki, Fediverse.info, Wikipedia Page, The Federation Info (Stats), FediDB (Stats), Sub Rehab (Reddit Migration), Search Lemmy


Not the best news in this report. We need to find ways to do more.

[–] etrotta@kbin.social 6 points 1 year ago (1 children)

Here's a link to the report: https://stacks.stanford.edu/file/druid:vb515nd6874/20230724-fediverse-csam-report.pdf
It's dated 2023-07-24, so there's a fair chance it's not the one you were thinking of?

[–] chaogomu@kbin.social 14 points 1 year ago (1 children)

Since the release of Stable Diffusion 1.5, there has been a steady increase in the prevalence of Computer-Generated CSAM (CG-CSAM) in online forums, with increasing levels of realism. This content is highly prevalent on the Fediverse, primarily on servers within Japanese jurisdiction. While CSAM is illegal in Japan, its laws exclude computer-generated content as well as manga and anime.

Nope, seems to be the one. They lump the entire Fediverse together, even though most of the shit they found was in Japan.

The report notes 112 non-Japanese items found, which is a problem, but not a world-shaking one. There may also be issues with federation and deletion orders, which again is a problem, but not a massive one.

Really, what the report seems to boil down to is the fact that moderation is hard. Bad actors will work around any moderation you put in place, so it's a constant game of whack-a-mole. The report glosses over this basic fact, pretends that no one is doing any moderation, and then adds Japan on top.

[–] dustyData@lemmy.world 7 points 1 year ago* (last edited 1 year ago) (1 children)

I can't seem to find the source for that report right now, but there is literal child porn being posted to Instagram. We don't see this kind of alarmist report about it because it's nothing new, foreign, or flashy to the general public. All internet platforms are susceptible to this kind of misuse; the question is what moderation tools and strategies are in place to deal with it. Then there's Tor, where CSAM was used as a basis to discredit the whole technology, and then it turned out that the biggest repository was an FBI honeypot operation.

[–] chaogomu@kbin.social 10 points 1 year ago

Buried in this very report, they note that Instagram and Twitter have vastly more (self-generated) child porn than the Fediverse. But that's deep in section 4, which is on page 8. No one is going to read that far into the report; most will only get through the intro, which is all doom and gloom about decentralized content.