this post was submitted on 29 Oct 2023
94 points (93.5% liked)
Programming
A database optimization I made was changing a table's ID generation from a manual scheme (another table held the next usable ID and was updated with every row written) to UUID generation. The table stores data from a daily import, and on each fresh import all previous data is deleted. On some systems more than 10,000,000 entries are imported daily, which used to take 8 hours. Now, with batched inserts and the schema change mentioned above, it's down to about 20 minutes.
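Not my actual code, but a minimal JDBC sketch of the idea: generate the ID client-side with a UUID instead of reading and updating a "next usable id" table, and send the inserts in batches. The table name `daily_import`, its columns `id` and `payload`, the connection URL, and the batch size are all made-up placeholders.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.util.List;
import java.util.UUID;

public class DailyImport {
    // Flush to the database every N rows; a tuning knob, not a magic number.
    private static final int BATCH_SIZE = 1_000;

    public static void importRows(List<String> rows) throws Exception {
        // Hypothetical connection URL; replace with your own.
        try (Connection conn = DriverManager.getConnection("jdbc:postgresql://localhost/appdb")) {
            conn.setAutoCommit(false);

            // Fresh import: all previous data is deleted first, as described above.
            try (PreparedStatement delete = conn.prepareStatement("DELETE FROM daily_import")) {
                delete.executeUpdate();
            }

            String sql = "INSERT INTO daily_import (id, payload) VALUES (?, ?)";
            try (PreparedStatement insert = conn.prepareStatement(sql)) {
                int pending = 0;
                for (String payload : rows) {
                    // ID generated client-side; no extra round trip to an ID-counter table.
                    insert.setObject(1, UUID.randomUUID());
                    insert.setString(2, payload);
                    insert.addBatch();

                    if (++pending == BATCH_SIZE) {
                        insert.executeBatch(); // one round trip for the whole batch
                        pending = 0;
                    }
                }
                if (pending > 0) {
                    insert.executeBatch(); // flush the remainder
                }
            }
            conn.commit();
        }
    }
}
```

The win comes from two places: each row no longer needs a read-and-update of the counter table, and batching turns thousands of per-row round trips into a handful of larger ones.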
TLDR: Reducing the number of queries sent is good. Even though the network is usually fast (milliseconds), a database round trip is still slow compared to the speed of the application itself (clock cycles).
And yeah, there is an option to import only the daily changes, but sadly that isn't supported in every environment.