this post was submitted on 31 Aug 2023
376 points (95.4% liked)


I am seeing a lot of fearmongering and misinformation regarding recent events (CSAM being posted in now-closed large lemmy.world communities). I say this as someone who, along with other admins, helped bring attention to the issue when I noticed the content federating out.

Yes, this is an issue, and what has happened in regards to CSAM is deeply troubling, but there are solutions and ideas being discussed and worked on as we speak. This is not just a Lemmy issue but an internet-wide issue that affects all forms of social media. There is no clear-cut solution, but most jurisdictions have some form of safe-harbor policy for server operators acting in good faith.

A good analogy to think of here: someone drops something illegal into your yard, which is open to the public. If someone stumbles upon those items, you aren't going to be hunted down for it unless there is evidence showing you knew about the items and left them there without reporting them, or sold/traded them. If someone comes up to you and says "hey, there's this illegal thing on your property," and you report it, hand it over to the relevant authorities, and check your security cameras (if you have any) and pass the footage along as well, then you'd be fine.

A similar principle exists online, specifically on platforms such as this. Obviously the FBI will raid whoever they want and will find reasons to if they need to, but I can tell you with near certainty that they aren't particularly concerned with a bunch of nerds hosting a (currently) niche piece of software created by 2 communists as a pet project, which gained popularity over the summer because an internet business decided to shoot itself in the foot. They are specifically out to find people who are selling, trading, and making CSAM. Those who knowingly and intentionally host and distribute such content are the ones they are out for blood for.

I get it. This is anxiety-inducing, especially as an admin, but as long as you are preserving and reporting any content that is brought to your attention in a timely manner, and are following development and active mitigation efforts, you should be fine. If you want to know more detail, click the link above.
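To make "preserve and report" a bit more concrete: the gist is to stop serving the material immediately while keeping a record (hash, timestamps, who reported it) that you can hand to the relevant authorities. Here's a rough admin-side sketch of that idea in Python; the quarantine path and function names are made up for illustration, and this isn't Lemmy code, just the general shape of the step:

```python
import hashlib
import json
import shutil
from datetime import datetime, timezone
from pathlib import Path

# Hypothetical directory outside the web root, readable only by the admin account.
QUARANTINE_DIR = Path("/var/lib/quarantine")


def preserve_reported_file(reported_path: str, reporter: str, note: str) -> Path:
    """Move a reported file out of public reach and record the details
    you would hand to law enforcement: hash, timestamp, and who reported it."""
    src = Path(reported_path)
    sha256 = hashlib.sha256(src.read_bytes()).hexdigest()

    QUARANTINE_DIR.mkdir(parents=True, exist_ok=True)
    dest = QUARANTINE_DIR / sha256
    shutil.move(str(src), dest)   # stop serving it immediately
    dest.chmod(0o600)             # restrict access to the admin user

    record = {
        "sha256": sha256,
        "original_path": str(src),
        "reported_by": reporter,
        "reported_at": datetime.now(timezone.utc).isoformat(),
        "note": note,
    }
    (QUARANTINE_DIR / f"{sha256}.json").write_text(json.dumps(record, indent=2))
    return dest
```

Exact handling requirements differ by jurisdiction (some want material preserved for law enforcement, some want it purged on a schedule), so treat this as the idea, not the procedure.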

I am not a lawyer, and of course things vary from country to country, so it's a good idea to check reputable sources on this matter as well.

As well, this is a topic that is distressing for most normal, well-adjusted people, for pretty obvious reasons. I get the anxiety over this, I really do. It's been a rough few days for many of us. But playing into other people's anxiety over this is not helping anyone. What does help is following and contributing to the discussion of potential fixes and mitigation efforts, and taking the time to calmly understand what you, as an operator, are responsible for within your jurisdiction.

Also, if you witnessed the content being discussed here, no one will fault you for taking a step away from Lemmy. Don't sacrifice your mental health over a volunteer project; it's seriously not worth it. And if this has made you question self-hosting Lemmy or any other platform like it, that's valid as well. It should be made clearer that this is a risk you take on when running any kind of website connected to the open internet.

[–] secret_j@kbin.social 12 points 1 year ago (1 children)

I think it's also a good prompt, as a self-hoster, to assess what services you are hosting and what kind of risk profile that exposes you to. Making yourself aware of any regulations or legal implications and their potential consequences (if any) may mean that self-hosting a service becomes much less fun/cool and not worth it.

[–] secret_j@kbin.social 2 points 1 year ago

To expand the conversation (NOTE: I am NOT a lawyer):
People hosting a federated instance in Australia would likely be classed as a Social Media Service and be bound by the relevant safety code on the eSafety Commissioner's site here: https://www.esafety.gov.au/industry/codes/register-online-industry-codes-standards. This is planned to take effect in December 2023 but serves as a guide.

First, perform a risk assessment to determine a Tier (1, 2, or 3), which dictates your required actions. Services that assess between tiers should assume the higher risk, which means you may potentially be classed as higher risk due to the general nature of the content (it's not a private club where conversation stays on one specific topic).

Minimum compliance (assuming you are classed as a Tier 3 Social Media Service) means Section 7, Objective 1, Outcome 1.1 and Outcome 1.5, quoted below.

Should you be determined to be Tier 2 or 1, there is a whole raft of additional actions, including ensuring you are staffed to oversee safety (1.4), child account protections (1.7) (preventing unwanted contact), and active detection of CSAM material (1.8); there's a rough sketch of that last idea after the quoted outcomes below.


1.1 Notifying appropriate entities about class 1A material on their services

If a provider of a social media service:
a) identifies CSEM and/or pro-terror materials on its service; and
b) forms a good faith belief that the CSEM or pro-terror material is evidence of serious and immediate threat to the life or physical health or safety of an adult or child in Australia,
it must report such material to an appropriate entity within 24 hours or as soon as reasonably practicable.

An appropriate entity means foreign or local law enforcement (including Australian federal or state police) or organisations acting in the public interest against child sexual abuse, such as the National Centre for Missing and Exploited Children (who may then facilitate reporting to law enforcement).

Note: Measure 1 is intended to supplement any existing laws requiring social media service providers to report CSEM and pro-terror materials under foreign laws, e.g., to report materials to the National Centre for Missing and Exploited Children and/or under State and Territory laws that require reporting of child sexual abuse to law enforcement.

Guidance: A provider should seek to make a report to an appropriate entity as soon as reasonably practicable in light of the circumstances surrounding that report, noting that the referral of materials under this measure to appropriate authorities is time critical. For example, in some circumstances, a provider acting in good faith may need time to investigate the authenticity of a report, but when a report has been authenticated, an appropriate authority should be informed without delay. A provider should ensure that such report is compliant with other applicable laws such as Privacy Law.
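The part of 1.1 that's easiest to lose track of as a solo admin is the clock. Purely as an illustration (nothing here comes from the code itself beyond the 24-hour figure, and the names are invented), tracking the deadline for an identified item can be as simple as:

```python
from datetime import datetime, timedelta, timezone

# Outcome 1.1: report to an appropriate entity within 24 hours
# (or as soon as reasonably practicable).
REPORTING_WINDOW = timedelta(hours=24)


def reporting_deadline(identified_at: datetime) -> datetime:
    """Latest time the report should go out, per the 24-hour wording."""
    return identified_at + REPORTING_WINDOW


def is_overdue(identified_at: datetime) -> bool:
    """True if the 24-hour window has already passed."""
    return datetime.now(timezone.utc) > reporting_deadline(identified_at)
```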

1.5 Safety by design assessments

If a provider of a social media service:
a) has previously done a risk assessment under this Code and implements a significant new feature that may result in the service falling within a higher risk Tier; or
b) has not previously done a risk assessment under this Code (due to falling into a category of service that does not require a risk assessment) and subsequently implements a significant new feature that would take it outside that category and require the provider to undertake a risk assessment under this Code,
then that provider must (re)assess its risk profile in accordance with clause 4.4 of this Code and take reasonable steps to mitigate any additional risks to Australian end-users concerning material covered by this Code that result from the new feature, subject to the limitations in section 6.1 of the Head Terms.
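On the active-detection requirement mentioned above (1.8, for Tier 1/2 services): in practice this means matching uploads against hash lists of known material, usually with perceptual hashes (e.g. PhotoDNA) that are only distributed to vetted organisations. As a very rough sketch of the shape of that check, using plain SHA-256 and a hypothetical local hash file rather than anything a real system would rely on:

```python
import hashlib
from pathlib import Path

# Hypothetical list of known-bad SHA-256 hashes, one hex digest per line.
# Real deployments use vetted hash lists and perceptual hashing, which are
# only available to approved organisations.
HASH_LIST = Path("/etc/known_hashes.txt")


def load_hash_list() -> set[str]:
    """Read the hash list into a set for constant-time lookups."""
    return {
        line.strip().lower()
        for line in HASH_LIST.read_text().splitlines()
        if line.strip()
    }


def upload_is_flagged(upload: bytes, known_hashes: set[str]) -> bool:
    """Flag an upload whose bytes exactly match a known hash.
    This only illustrates the shape of the check; exact-hash matching is
    trivially evaded by re-encoding, which is why real systems use
    perceptual hashes instead."""
    return hashlib.sha256(upload).hexdigest() in known_hashes
```

Community tooling for exactly this gap was part of the "active mitigation efforts" mentioned in the original post, so it's worth looking at what already exists before rolling your own.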