Federated services have always had privacy issues, but I expected Lemmy to have the fewest. Instead, it's visibly worse for privacy than even Reddit.

  • Deleted comments remain on the server, merely hidden from non-admins, and the username stays visible (see the sketch after this list)
  • The usernames of deleted accounts remain visible too
  • Everything remains visible on federated servers!
  • When you delete your account, media does not get deleted on any server
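
To make the first two bullets concrete, here's a minimal sketch (made-up names, not Lemmy's actual schema or code) of what soft deletion amounts to:

```python
# Minimal sketch (made-up names, not Lemmy's actual schema or code) of what
# "soft deletion" means: the row is only flagged as deleted and hidden from
# ordinary users, while the text, the username, and every federated copy stay
# in the database.
from dataclasses import dataclass


@dataclass
class Comment:
    author: str
    body: str
    deleted: bool = False  # soft-delete flag; nothing is ever erased


def delete_comment(comment: Comment) -> None:
    """What the user believes happens: the comment is gone."""
    comment.deleted = True  # only the flag changes


def render(comment: Comment, viewer_is_admin: bool) -> str:
    """What actually happens: admins (and federated copies) still see everything."""
    if comment.deleted and not viewer_is_admin:
        return f"{comment.author}: [deleted]"  # the username still leaks
    return f"{comment.author}: {comment.body}"


c = Comment("alice@example.instance", "something I regret posting")
delete_comment(c)
print(render(c, viewer_is_admin=False))  # alice@example.instance: [deleted]
print(render(c, viewer_is_admin=True))   # alice@example.instance: something I regret posting
```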
[-] 0xtero@kbin.social 64 points 1 year ago* (last edited 1 year ago)

First - we're all using alpha/beta software (Lemmy is 0.17.4, Kbin is 0.10.). None of these services are "production quality" software yet, so let's keep that in mind - we're all early adopters.

The points mentioned in the OP are a bad look. Naturally. Users should have the expectation that their data is deleted on request - especially since such a request might be a regulatory privacy request (GDPR-related). It's a clear failure of the software and should be improved and iterated upon.

The expectation shouldn't be "oh well, it's on the Internet, live with it". While Facebook might keep mining your data after a deletion request, our software shouldn't behave like that; we should strive to do better with this stuff.

And finally, ensuring privacy in a federated system is hard. Mastodon suffers from the same problems. We shouldn't give up on the idea, though.

[-] lovesyouandhugsyou@beehaw.org 5 points 1 year ago

But is it solvable at all in principle? The only enforcement mechanism available is defederation, but that just means future posts won't go to that instance; the older posts will still be there. Plus, an instance could simply lie when confirming delete requests, and you'd never know unless the non-deleted posts leaked.
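
To illustrate that last point, here's a tiny hypothetical sketch (not real ActivityPub or Lemmy code): from the outside, an instance that honours a delete request and one that only pretends to look exactly the same.

```python
# Hypothetical sketch (not real ActivityPub or Lemmy code) of why deletion is
# unenforceable across federation: the remote instance controls its own database,
# so it can acknowledge a delete request while quietly keeping the content.
stored_posts = {"post-123": "the comment the author wants gone"}


def handle_delete_activity(activity: dict, honest: bool) -> dict:
    """An honest instance removes the object; a dishonest one just says it did."""
    object_id = activity["object"]
    if honest:
        stored_posts.pop(object_id, None)
    # Either way, the originating instance only ever sees this acknowledgement.
    return {"status": "deleted", "object": object_id}


ack = handle_delete_activity({"type": "Delete", "object": "post-123"}, honest=False)
print(ack)                          # indistinguishable from an honest response
print("post-123" in stored_posts)   # True - the content is still there
```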

[-] Dee_Imaginarium@beehaw.org 5 points 1 year ago

Not really; it's the same as email. Once you send it out and it's on somebody else's server, you can request that they delete it, but that's about it. They have a copy of your message and can do whatever they want with it.

This is not a problem that needs solving, imo; it's the nature of the Internet. If you post something online, you should know there's a chance it'll be there permanently.

[-] Mikina@programming.dev 1 points 1 year ago* (last edited 1 year ago)

Hmm, it's an interesting problem. I'm afraid you're right and there's really nothing left but defederation. On the other hand, that puts us in the same situation as the parsers that could show deleted Reddit messages, or things like the Wayback Machine, which basically do the same thing - so the core logic of the base Lemmy source should be as privacy-respecting as possible.

I remember reading a few years ago that Signal has some way for you to verify that their server is running the same code as the one published (and heavily audited), so you can be 100% sure that there were no modifications. Wouldn't something like that be a solution? It would prevent servers from modifying the code that deletes data. I don't know how it works, and I couldn't find it when I tried looking for it again, but assuming such a thing is possible, each Lemmy instance could just have a verify widget on their VCS, and you could be sure that the instance really does delete your data, since it didn't modify the deletion code.

But this is just theorycrafting; I don't really have enough experience to create something like that, and I can imagine it's not an easy thing. But if anyone knows more details about how the Signal verification works - assuming I didn't just misunderstand something (it's literally a memory of a single sentence from one random article I read while researching the best private messaging apps) - I would love to read more about how it works!
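
For what it's worth, here's a rough sketch of what such a "verify widget" might check. Every endpoint and field name here is made up, and the catch is stated in the comment: without hardware-backed remote attestation, a naive hash check can be faked.

```python
# Theorycrafting sketch only: a naive "verify widget" that compares the hash a
# server claims for its running build against the hash published alongside the
# audited source. Every endpoint and field name here is made up. The catch:
# without hardware-backed remote attestation, a dishonest server can simply
# report the "expected" hash while actually running modified code.
import json
import urllib.request


def fetch_json(url: str) -> dict:
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)


def verify_instance(instance_url: str, published_hash_url: str) -> bool:
    claimed = fetch_json(f"{instance_url}/api/build-info")["binary_sha256"]  # hypothetical endpoint
    published = fetch_json(published_hash_url)["binary_sha256"]              # hash shipped with the release
    return claimed == published


# verify_instance("https://lemmy.example", "https://example.org/releases/0.17.4.json")
```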

But yeah, outside of that, I'm afraid that the following set of features is mutually exclusive:

  • A user is able to delete their data, and it's guaranteed that it is deleted everywhere.
  • If a Lemmy instance dies, its data is not lost.
  • There is not a single centralized authority for anything.

Another option would be to create some kind of reputation system, where self-hosted bots could check for servers that still serve posts and comments that should have been deleted, and flag the offenders. But that's over-engineering anyway, and as I've already said, there's still no way to stop a scraper, or anyone else, from simply copying your data when they see it.
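
A very rough sketch of that bot idea, with everything in it hypothetical: post a canary comment, delete it, wait for federation to settle, then ask each instance whether it still serves the content. As noted above, this can only catch instances that publicly expose the copies; it does nothing against scrapers.

```python
# Hypothetical reputation-bot sketch: check which instances still serve a
# canary comment that was deleted at the source, and flag the ones that do.
import time
from typing import Callable, Iterable, Optional


def honoured_deletion(fetch_remote_copy: Callable[[str, str], Optional[dict]],
                      instance: str, canary_id: str) -> bool:
    """True if the instance no longer serves the deleted canary comment."""
    remote = fetch_remote_copy(instance, canary_id)  # caller supplies the actual fetch logic
    return remote is None or remote.get("deleted") is True


def audit(instances: Iterable[str],
          fetch_remote_copy: Callable[[str, str], Optional[dict]],
          canary_id: str, grace_period_s: int = 3600) -> dict:
    time.sleep(grace_period_s)  # give the deletion time to federate
    return {i: honoured_deletion(fetch_remote_copy, i, canary_id) for i in instances}
```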
