this post was submitted on 13 Jun 2023
163 points (97.1% liked)

We're still working to find a solution for the posting slowness in large communities.

We have seen that a post does get submitted right away, yet the page keeps 'spinning'.

So right after you click 'Post' or 'Reply', you can refresh the page and your post should be there.

(To be safe, you might want to copy the contents of your post first, so you can paste it again if anything goes wrong.)

deathworlder@lemmy.world 1 points 1 year ago (last edited 1 year ago)

One of the large applications I worked on had the same issue. To solve it, we ended up creating multiple smaller instances and hosting a set of related APIs on each server.

For example, read operations like listing posts and comments could live on one server, while write operations could be clustered on another.

Later, whichever server gets overloaded can be split up again. In our case, 20% of the APIs used around three quarters of the server resources, so we moved those 20% onto 4 large servers and kept the remaining 80% of the APIs on 3 small servers.

This worked for us because the DBs were maintained on separate servers.
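
To make the idea concrete, here is a minimal sketch (in Python, with made-up host names, pool sizes, and ports) of how a front-end router could send read traffic to one pool of servers and write traffic to another. This is not how Lemmy is actually deployed, just an illustration of the read/write split described above.

```python
# Sketch only: routing API requests to separate read and write pools.
# All host names, pool sizes, and the port number are illustrative.
import random

# Hypothetical upstream pools: heavy read traffic gets the larger pool.
READ_POOL = [
    "http://read-1.internal:8536",
    "http://read-2.internal:8536",
    "http://read-3.internal:8536",
    "http://read-4.internal:8536",
]
WRITE_POOL = [
    "http://write-1.internal:8536",
    "http://write-2.internal:8536",
    "http://write-3.internal:8536",
]

READ_METHODS = {"GET", "HEAD"}

def pick_upstream(method: str) -> str:
    """Send reads (list posts, comments, ...) and writes to different pools."""
    pool = READ_POOL if method.upper() in READ_METHODS else WRITE_POOL
    return random.choice(pool)  # naive load balancing, enough for the sketch

if __name__ == "__main__":
    print(pick_upstream("GET"))   # e.g. http://read-2.internal:8536
    print(pick_upstream("POST"))  # e.g. http://write-1.internal:8536
```

In practice this kind of routing would usually live in a reverse proxy such as nginx or HAProxy rather than in application code, but the splitting logic is the same.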

I wonder if a quasi micro-services approach would solve the issue here.

Edit 1: If done properly, this approach can be cost-effective. In some cases it might cost 10 to 20 percent more in server costs, but it will lead to a visible improvement in performance.