this post was submitted on 06 Mar 2024
307 points (88.9% liked)

Highlighting the recent report of users and admins being unable to delete images, and how Trust & Safety tooling is currently lacking.

[–] Murvel@lemm.ee 38 points 8 months ago (1 children)

Lemmy devs being man-children when confronted with GDPR compliance.

And if Lemmy is supposed to better Reddit in basic fucking decency, then GDPR compliance is absolutely crucial.

[–] Jumuta@sh.itjust.works 22 points 8 months ago (3 children)

How are you supposed to do GDPR compliance on a federated system, though?

[–] maynarkh@feddit.nl 30 points 8 months ago (1 children)

You are responsible for the data collected by your own instance. If a deletion request comes through, you are responsible for deleting the data from your own instance and for forwarding the deletion request (and the responses) to the other instances you federate with. You are in the clear as long as you don't keep data you legally can't, and have sufficiently informed the other instances of your obligations.
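
For a concrete picture of what "forwarding the deletion request" could look like on the wire, here is a minimal sketch of an ActivityPub Delete activity being POSTed to a peer instance's inbox. The URLs are made up, and a real server would also have to sign the request (HTTP Signatures), which is omitted here.

```python
import json
import requests  # third-party HTTP client

# Hypothetical identifiers; a real instance would use its own actor and object URLs.
ACTOR = "https://example-instance.social/u/alice"
OBJECT = "https://example-instance.social/u/alice"   # deleting the account itself
PEER_INBOX = "https://other-instance.social/inbox"

# A minimal ActivityStreams "Delete" activity, the kind used to federate deletions.
delete_activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "id": f"{ACTOR}#delete",
    "type": "Delete",
    "actor": ACTOR,
    "object": OBJECT,
    "to": ["https://www.w3.org/ns/activitystreams#Public"],
}

# Real federation requires signing this request; skipped in this sketch.
response = requests.post(
    PEER_INBOX,
    data=json.dumps(delete_activity),
    headers={"Content-Type": "application/activity+json"},
    timeout=10,
)
response.raise_for_status()
```

Each federated peer would have to send (or relay) something like this to every instance it knows has a copy of the data.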

[–] Badeendje@lemmy.world 3 points 8 months ago (1 children)

No, if you collected the data and shared it with others, simply informing the others is not enough. This is why the platform needs tools for admins to comply.

A proper method that allows users to nuke their account could already be enough.

[–] maynarkh@feddit.nl 4 points 8 months ago (1 children)

What I mean by informing others is that you have to explicitly forward the deletion request. Not much else you can do I think.

[–] Badeendje@lemmy.world 2 points 8 months ago (1 children)

I get that, but this is where it gets tricky. "There is nothing we can do" was the number one excuse used under the law predating the GDPR. So the GDPR stipulates that you stay responsible, or share responsibility with the other party, if you share the data. Large companies used the old loophole to send data through clearing houses, allowing them to wash their hands of it.

GDPR is really the cranky brother of its predecessors, because there was so much fuckery going on.

And while I doubt admins will be a prime target for privacy watchdogs, it is good that they also have to think about the privacy of their users, since privacy is a basic human right.

[–] maynarkh@feddit.nl 1 points 8 months ago (1 children)

Oh, that's actually neat. But at the same time, that means every instance owner is responsible for the whole of the Fediverse.

I can imagine that would mean non-compliant instances will get defederated at some point? Or ActivityPub will get some compliance features? It's not like the EU is unaware of the Fediverse; they are the main monetary supporters behind Lemmy.

[–] Badeendje@lemmy.world 1 points 8 months ago (1 children)

I have no clue how the jurisprudence would turn out. But keep in mind, this is not about the posts people make. The framework just needs to collect and store as little information that can be considered PII as possible, and it should have a way to remove it.

If deleting your account results in the PII actually being removed (username, IP address, other profile info, whatever data is stored under the hood) and these removals actually get federated, there should not be an issue.

Then admins maybe have to do something if people start posting PII as messages, but that would probably be doxing and up for removal anyway.

So mainly the issues boil down to:

  • is there a way for people to scrub their account? (see the sketch after this list)
  • does the scrubbing remove all the data?
  • is the platform clear about what data is being collected, and is all the collected data actually needed?
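
As an illustration of the first two points, here is a minimal sketch of what "scrubbing" could mean at the database level, assuming a hypothetical users table with the PII columns mentioned above (Lemmy's real schema will differ):

```python
import sqlite3

def scrub_account(db_path: str, user_id: int) -> None:
    """Overwrite the PII of a deleted account instead of keeping it around."""
    conn = sqlite3.connect(db_path)
    try:
        conn.execute(
            """
            UPDATE users
               SET username   = 'deleted_' || id,  -- stable placeholder so threads stay readable
                   email      = NULL,
                   last_ip    = NULL,
                   bio        = NULL,
                   avatar_url = NULL
             WHERE id = ?
            """,
            (user_id,),
        )
        conn.commit()
    finally:
        conn.close()
```

The scrub would then still have to be federated out to other instances, e.g. with a Delete activity like the one sketched earlier in the thread.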
[–] maynarkh@feddit.nl 1 points 8 months ago (1 children)

The issue I see is that if my instance is on the hook for the Fediverse at large, then even if I operate on an allowlist basis, malicious actors can scrape PII and ignore the GDPR, and that would make me the one on the hook for it. Isn't that right?

[–] Badeendje@lemmy.world 1 points 8 months ago

There is plenty of jurisprudence and clarity still needed, so... maybe. Hence the importance of the framework itself being as GDPR-compliant as possible: not storing PII when it isn't necessary, and removing it once it is no longer necessary. (Storing someone's IP for login, post validation, bans etc. should be limited to the period that makes sense, not indefinitely.)
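
A rough sketch of that kind of time-boxed retention, assuming a hypothetical login_events table and a 30-day window (both invented for the example):

```python
import sqlite3
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 30  # assumed retention window; pick whatever period "makes sense"

def purge_old_login_ips(db_path: str) -> int:
    """Drop stored login IPs older than the retention window."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)
    conn = sqlite3.connect(db_path)
    try:
        cur = conn.execute(
            "DELETE FROM login_events WHERE created_at < ?",
            (cutoff.isoformat(),),
        )
        conn.commit()
        return cur.rowcount  # number of rows purged
    finally:
        conn.close()
```

Run on a schedule (cron or a background job), this keeps the data only for as long as the stated purpose requires.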

And in your example, the "malicious" part of the third party probably makes it different. Maybe then it is a data leak.

[–] Badeendje@lemmy.world 12 points 8 months ago* (last edited 8 months ago)
  • By defining all information that is processed and why (see the sketch after this list).
  • By not processing or storing any personally identifiable information (an IP address is PII, for example) without a clearly defined need.
  • By only using stored data for the defined purposes. This also means shielding data that should be shielded.
  • By implementing the mechanics for someone to be forgotten ("delete my account" should delete all info, especially PII).
  • By making sure the mechanics to federate these changes/deletions exist.
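
The first point is essentially a processing register: a record of what is stored, why, whether it is PII, and how long it is kept. As a hedged illustration (all field names and retention periods are invented), it can be as simple as:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class ProcessingRecord:
    field: str                      # what is stored
    purpose: str                    # why it is needed
    is_pii: bool                    # whether it counts as PII
    retention_days: Optional[int]   # None = kept until account deletion

# A hypothetical register; a real instance would document its own fields.
PROCESSING_REGISTER = [
    ProcessingRecord("username", "public identity of the account", is_pii=True, retention_days=None),
    ProcessingRecord("email", "login and password recovery", is_pii=True, retention_days=None),
    ProcessingRecord("last_ip", "abuse prevention and bans", is_pii=True, retention_days=30),
    ProcessingRecord("posts", "content the user chose to publish", is_pii=False, retention_days=None),
]
```

Publishing something like this in the instance's privacy policy, and enforcing the retention values in code, would cover most of the list above.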