Hello,

The fediverse has a unique problem: when an instance goes offline, its communities will never sync again.

Recently, vlemmy.com shut down. Quite a few of its communities had synced with discuss.online and other instances. Because vlemmy.com is no longer brokering communication, those communities will never be in sync again.
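
To make the mechanics concrete: a remote copy of a community only receives new posts and comments when the home instance pushes them out over federation, so once the home instance stops answering there is nothing left to sync from. Here is a minimal sketch of that liveness check, assuming only the standard fediverse /.well-known/nodeinfo discovery endpoint (the timeout is arbitrary):

```python
import requests

def instance_alive(domain: str, timeout: float = 10.0) -> bool:
    """True if the instance still answers the standard fediverse
    nodeinfo discovery endpoint; False on any network-level failure."""
    try:
        return requests.get(
            f"https://{domain}/.well-known/nodeinfo", timeout=timeout
        ).ok
    except requests.RequestException:
        # DNS failure, connection refused, TLS error, timeout, ...
        return False

print(instance_alive("vlemmy.com"))  # expected False now that the instance is gone
```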

We have several options:

  1. Leave them there. Do nothing.
  2. Leave them there, but make a post explaining that the instance is dead and hope people see it.
  3. Purge the communities. Act like they never existed.
  4. Build some elaborate system to work around vlemmy being gone. This would take a lot of work and collaboration with other instances.

Let me know what makes the most sense to you as users. Are any of you still using vlemmy communities? What about long-term planning? Maybe this isn't an issue now, but what if lemmy.world vanished?

Please, let me know what you think. I'm torn on this one.

Thanks, Jason

[–] Methane_Magnate 2 points 1 year ago (3 children)

This topic is of extreme importance to widespread adoption of the fediverse.

And I'm not seeing much discussion about it, despite having posted about it elsewhere. As an aside, I can't log in at the moment to the account where I brought this up, due to instability or maybe a DDoS; no telling. But the fact that I can't log in shows a weakness that general users won't tolerate. I suspect I've lost another account, along with the community I created and all the attendant work, when fmhy.ml went offline.

While non-tech people will come, there's a good chance they'll leave when lemmy.fmhy.ml, for example, disappears, taking their community and all their contributions with it.

There are many reasons why servers and domains will evaporate. And users will emigrate from an unstable, unreliable environment.

Redundancy and backup are critical. Maybe p2p is the solution, I don't know. The fix is above my pay grade.

But this weakness will be exploited. Pissed off instance owners, blackhats working for moneyed interests to whom the fediverse represents a multi-billion dollar threat, drive failures, lightning strikes, the 'how' doesn't matter. It's simply a matter of 'when,' and what will result.

This concept must become more resilient in order to be viable over the long term.

[–] jgrim 4 points 1 year ago (1 children)

You're welcome to bring your community here. I've survived a few attacks that have taken others down. I'm a software engineer and infrastructure guy. My goal is 99.9% uptime.

I'm also working on some of these issues, or at least the tools to help identify them.
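
As a rough illustration of what an identification tool could look like (this is only a sketch, not the actual tooling being built; it assumes the Lemmy v3 HTTP API's /api/v3/community/list endpoint and the standard nodeinfo endpoint, and it treats one failed probe as "unreachable", which a real tool would want to retry over days):

```python
import requests
from urllib.parse import urlparse

LOCAL_INSTANCE = "discuss.online"  # instance whose known remote communities we scan

def known_remote_communities(instance: str, pages: int = 5):
    """Yield (name, home_domain) for remote communities this instance has federated with."""
    for page in range(1, pages + 1):
        resp = requests.get(
            f"https://{instance}/api/v3/community/list",
            params={"type_": "All", "limit": 50, "page": page},
            timeout=10,
        )
        resp.raise_for_status()
        for view in resp.json()["communities"]:
            community = view["community"]
            if not community["local"]:
                yield community["name"], urlparse(community["actor_id"]).hostname

def home_alive(domain: str) -> bool:
    try:
        return requests.get(f"https://{domain}/.well-known/nodeinfo", timeout=10).ok
    except requests.RequestException:
        return False

if __name__ == "__main__":
    status = {}  # cache one probe per home instance
    for name, home in known_remote_communities(LOCAL_INSTANCE):
        if home not in status:
            status[home] = home_alive(home)
        if not status[home]:
            print(f"!{name}@{home} looks orphaned (home instance unreachable)")
```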

I think many servers will come and go, mostly due to cost. There isn't a great way to cover costs today, and the costs get higher and higher as you grow. I'm in a different situation than most: I'm willing to take losses for a long time before donations catch up.

Based on my experience and observations, things are going to vanish, and I want to find a way to preserve them and respond when they do, without ruining the image of the fediverse or Lemmy.
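
For what "preserve" might mean at its simplest, here is one crude approach, offered only as a sketch and not as the plan: dump whatever copy the local instance still holds of an orphaned community to JSON before anything is purged. It assumes the Lemmy v3 /api/v3/post/list endpoint with its community_name parameter; the community name below is a hypothetical example.

```python
import json
import requests

INSTANCE = "discuss.online"        # instance holding the local copy
COMMUNITY = "example@vlemmy.com"   # hypothetical orphaned community

def snapshot_posts(instance: str, community: str, out_path: str, pages: int = 10) -> int:
    """Save whatever posts the local instance still has for a community."""
    posts = []
    for page in range(1, pages + 1):
        resp = requests.get(
            f"https://{instance}/api/v3/post/list",
            params={"community_name": community, "sort": "New",
                    "limit": 50, "page": page},
            timeout=10,
        )
        resp.raise_for_status()
        batch = resp.json()["posts"]
        if not batch:
            break
        posts.extend(batch)
    with open(out_path, "w") as fh:
        json.dump(posts, fh, indent=2)
    return len(posts)

print(snapshot_posts(INSTANCE, COMMUNITY, "vlemmy_snapshot.json"), "posts saved")
```

Comments would need a similar pass over the comment listing endpoint, and none of this restores federation; it only keeps the content readable.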

[–] Methane_Magnate 2 points 1 year ago

Thank you for the invitation, your efforts, and your financial commitment.

Servers will come and go, and for many reasons. Those to whom computers are black boxes will come, get lost, be easily discouraged, and go elsewhere. For the fediverse to supplant the dominant, monolithic platforms, it has to be stable and simple to use.

I'm not conversant enough to discuss the 'how' of making things secure and redundant. Maybe a p2p architecture, I don't know. But something along those lines that would make the fediverse resistant to decay.

I had restarted the lost community before receiving your kind offer. But if I start another, it will be here.

And again, thank you for all you're doing.
