this post was submitted on 25 Jul 2023

Privacy

dingus@lemmy.ml 7 points 1 year ago (edited)

Excellent thread and tough questions to be answered.

I do think documentation of known CSAM hosting instances is really important for people who want to become admins.

As Dana Fried posits, it really should be opt-in and not opt-out.

Also, this isn't the first time the Fediverse has run into this issue. I'm having trouble finding links to support this, but early in Mastodon's life, the largest instance was a Japanese one that hosted a lot of manga images, including images that would be considered CSAM in the US but not in Japan, due to differing laws on drawings of minors in sexual situations. ~~The US bans all of it,~~ (EDIT: corrected, please see TauZero's comment) while Japan has an exception for drawings, seemingly on the idea that "no one is being harmed in a drawing." This created huge problems initially and resulted in the Japanese instance essentially being blacklisted by most Western instances.


Finally, it took some digging, but here is a good link with a breakdown of when Pixiv spun up their own Mastodon instance:

https://ethanzuckerman.com/2017/08/18/mastodon-is-big-in-japan-the-reason-why-is-uncomfortable/

> In April 2017, Pixiv began hosting a Mastodon instance – Pawoo.net – that quickly became the most popular Mastodon server in the world. If you have a Pixiv account, it’s a single click to establish a Pawoo.net account. And if you monitor the feed on pawoo.net, you’ll see that a great deal of content features lolicon, much of it behind content warning tags. In response to the growth of pawoo.net, a number of large, predominantly North American/European Mastodon servers stopped federating posts from the Japanese site, as they were uncomfortable with lolicon appearing as part of their feed. Scala reports that Rochko modified the database on mastodon.social to make it possible to “silence” pawoo.net, so that posts only appear if you explicitly choose to subscribe to users of that server.
>
> Needless to say, not every Mastodon administrator is excited that the protocol is being used to harbor lolicon. The terms of service for mastodon.cloud – the fifth largest Mastodon instance, and the largest based in the US – now explicitly prohibit “lolicon, immoral and indecent child pics”.


In other words, this has been a problem and will continue to be a problem, and I hate to say it, but the Lemmy developers probably need to add more tools to combat it, possibly up to and including making federation opt-in instead of opt-out, so that when you initially set up your server, you have to choose with whom to federate.
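For what it's worth, Lemmy already has a building block for this: an instance allowlist. In older releases it lived in the `lemmy.hjson` config file (newer releases manage it from the admin settings UI). The field names below are from memory and the instance names are just examples, so check your version's `defaults.hjson` before relying on this:

```hjson
federation: {
  enabled: true
  # When the allowlist is non-empty, the server federates ONLY with the
  # instances named here, making federation effectively opt-in.
  allowed_instances: ["lemmy.ml", "mander.xyz"]
  # blocked_instances is the opposite, deny-list approach, and is the
  # opt-out default most admins run with today.
}
```

With the allowlist empty, federation stays opt-out, which is exactly the default the comment above is arguing against.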

It's doubtful small admins could afford the corporate tools that detect and remove CSAM automatically. It's also unlikely that the FBI would just hand such tools to small-potatoes admins; they would probably argue that access to the CSAM hash database could be misused by people who actually want to find CSAM. So some rando won't just be given the same access.
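Those corporate tools are, at their core, perceptual-hash matchers: each uploaded image is hashed and compared against a database of known-bad hashes. Real systems use proprietary hashes like Microsoft's PhotoDNA and tightly controlled databases; the toy difference-hash (dHash) below, with made-up pixel data, is only a sketch of the general matching mechanism:

```python
# Toy sketch of hash-database matching. Real deployments use proprietary
# perceptual hashes (e.g. PhotoDNA) and vetted hash lists; none of the
# values here are real, the pixel data is synthetic.

def dhash(pixels):
    """Difference hash: one bit per horizontal neighbor comparison.

    `pixels` is 8 rows of 9 grayscale values, giving 8 bits per row
    and a 64-bit hash total.
    """
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (left < right)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# The "database" is just a set of hashes of known-bad images.
known = {dhash([[i * j % 17 for j in range(9)] for i in range(8)])}

# A new upload matches if its hash is within a small Hamming distance
# of any database entry (tolerating re-encoding, resizing, etc.).
candidate = dhash([[i * j % 17 for j in range(9)] for i in range(8)])
match = any(hamming(candidate, h) <= 4 for h in known)
```

This also illustrates the access-control worry above: whoever holds the hash list can test arbitrary images against it, which is why the lists are not handed out freely.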

TauZero@mander.xyz 9 points 1 year ago

> The US bans all of it, while Japan has an exception for drawings

Absolutely incorrect. You are thinking of Canada or the UK. In the US, drawings are fine. Rather, it is photorealistic depictions "indistinguishable from that of a minor" that are prohibited, almost presciently pre-empting techniques like deepfakes and Stable Diffusion by 20 years, a rare win for legislators.

dingus@lemmy.ml 3 points 1 year ago

Thanks for the correction!

Kolanaki@yiffit.net 4 points 1 year ago

Decentralization, encryption, and all these other privacy tools are always going to attract those who will use them for sinister shit. The best way to combat it is, unfortunately, giving up some of those expectations of privacy. Which sucks, because there is a lot of good that can also be done, and has to be done, out of sight, such as fighting tyranny or oppression.

dingus@lemmy.ml 9 points 1 year ago (edited)

Fifteen years ago a friend and I discussed this. He posited it like this:

"If you want to help Chinese dissidents, you run a Tor exit node so they have some place to connect to. The problem is that you have no actual control over who gets to use your exit node, and you run the risk of being arrested for child porn because you want to support political dissidents in authoritarian countries."

Not long after that conversation, a man was arrested for child porn for... running a Tor exit node.
