this post was submitted on 25 Oct 2024 to the Programming community on programming.dev

Hi programmers,

I work from two computers: a desktop and a laptop. I often interrupt my work on one computer and continue on the other, where I don't have access to the uncommitted progress on the first. Frustrating!

Potential solution: use git to auto-save progress.

I'm posting this to get feedback. Maybe I'm missing something and this is overcomplicated?

Here is how it could work:

Creating and managing the separate branch

Alias git commands (such as git checkout) so that I am always on a branch called "[branch]-autosave", where [branch] is the branch I intend to be on, and the autosave branch always branches from it. If the autosave branch doesn't exist, it is created automatically.
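As a rough sketch, the wrapper could look something like this (the function name and the branch-naming scheme here are just my assumptions):

co() {
    base="$1"
    # create the autosave branch off the intended branch if it doesn't exist yet
    git show-ref --verify --quiet "refs/heads/${base}-autosave" \
        || git branch "${base}-autosave" "$base"
    git checkout "${base}-autosave"
}

So running co main would always leave me on main-autosave, branched from main.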

Handling commits

Whenever I commit, the autosave branch would be squash-merged into the underlying branch.
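Roughly (branch names assumed from the example above):

# fold the autosave commits into one real commit on the underlying branch
git checkout my-feature
git merge --squash my-feature-autosave
git commit -m "real commit message"
# restart the autosave branch from the new commit
git branch -f my-feature-autosave my-feature
git checkout my-feature-autosave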

Autosave functionality

I use neovim as my editor, but this could work for other editors.

I will write an editor hook that always pulls the latest from the autosave branch before opening a file.

Another hook will commit and push to origin whenever a file is saved in the editor.
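I haven't written the editor glue yet, but the shell side of those two hooks could be as simple as this (function names are placeholders):

# called from a "before open" editor hook
autosave_pull() {
    git pull --rebase origin "$(git branch --show-current)"
}

# called from an "after save" editor hook
autosave_push() {
    git add -A
    git commit --quiet -m "autosave: $(date -u +%FT%TZ)" || true
    git push origin "$(git branch --show-current)"
}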

This way, whenever I get on either of my devices, it automatically syncs the changes pushed from the other device.

Please share your thoughts.

top 37 comments
[–] NegativeLookBehind@lemmy.world 27 points 3 weeks ago (3 children)

Write code on a machine you can remote into from each computer? Fewer commits, possibly fewer reverts, less chance of forgetting to git pull after switching machines… idk.

[–] lucas@startrek.website 5 points 3 weeks ago (2 children)

Don't even need to remote in to anything, just store your working code on a network share

[–] matcha_addict@lemy.lol -4 points 3 weeks ago (1 children)

I mean... That's kinda what git does, in a way... Right?

[–] Kkmou@lemm.ee 5 points 3 weeks ago* (last edited 3 weeks ago)

Don't think of git as sync storage; it's more for merging work.

If you need to share files between computers use a shared storage.

Always use the right tool for the job. Mount a shared storage or use sync tools: rsync, etc.
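e.g. a one-way sync could be as simple as (paths and hostname made up):

# mirror the working tree to the other machine, skipping git internals
rsync -az --delete --exclude='.git/' ~/projects/myproject/ desktop:~/projects/myproject/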

[–] hakunawazo@lemmy.world 1 points 2 weeks ago

Yes, and use something like GNU Screen to work seamlessly on the other machine again.
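For example (session name is arbitrary):

screen -S work     # start a named session on the shared host
screen -dRR work   # from the other machine: detach it there, reattach here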

[–] matcha_addict@lemy.lol 1 points 3 weeks ago (2 children)

I have considered this approach, but there are several things I had issues with.

  • there is still a degree of latency. It's not a deal breaker, but it is annoying.
  • clipboard programs don't work: they copy to the remote host's clipboard. I bet there's a solution to this, but I couldn't find one in the limited time I spent looking.
  • in the rare case the host is unreachable, I am kinda screwed. Not a deal breaker since it's rare, but the host has to be always on, whereas the git solution only requires it to be on when it syncs.

To address the issues you brought up:

  • fewer commits: this is resolved by squashing every time I make a real commit, which wipes the autosave commits. If I really hated the extra commits, I could amend instead of committing, but I'd rather have the history.
  • forgetting to git pull: the hooks I talked about take care of that, so I'll never have to worry about forgetting again.
[–] actually@lemmy.world 1 points 2 weeks ago

I once used a virtual desktop in the cloud, and I could access it from anywhere. It was just a regular OS that had all my tools, and it was where all my work got done. Ultimately, that remote desktop went away when I changed jobs, but it's something I would consider again.

There is a danger of things going poof or not being accessible; that can't entirely be helped. But a push to a backup repo on each commit would allow an emergency restore, and taking a snapshot of the machine every few days (for example, if it's on AWS) lessens the loss if it does go poof.

To handle the internet going out, have one of your local computers do a regular pull of the backup repo as a cron job.
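Something like this in the crontab would do it (path assumed):

# every 15 minutes, fast-forward the local mirror of the backup repo
*/15 * * * * cd /home/me/backup-repo && git pull --ff-only --quiet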

[–] Strykker@programming.dev 0 points 2 weeks ago (1 children)

Your git solution still has all of these issues, as you need the git server to be alive. For number 3, if you're concerned about the file share being offline, use something like rsync so you keep a local copy that is backed up.

[–] matcha_addict@lemy.lol 1 points 2 weeks ago* (last edited 2 weeks ago)

I don't need the client computers to be alive, only the central server (which could be github.com for example, so not even a server I manage).

[–] demesisx@infosec.pub 14 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

I do this on NixOS. I have a NAS at home where I store most of the files I work on. My computers are internally immutable and almost all the files that change reside solely on the NAS as NFS shares. All of my computers are configured to auto-mount one of its folders at boot. NixOS sees that as an internal drive.
Then I simply navigate to the project folder, where I have a flake and a .envrc file containing the command use flake ., which makes direnv use Nix to provision the dependencies automatically. Whenever I save, those changes are reflected on all computers.
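For anyone curious, the .envrc really is just one line (plus a one-time direnv allow per checkout):

# .envrc in the project root on the NFS share
use flake .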

I like to also version control everything using git and this method allows that transparently.

The only part that I am missing is getting the permissions to align between all computers accessing that same folder. Sometimes I have to create a temp folder that uses rsync to keep up with any changes. If anyone has any pointers, I’m all ears. It rarely gets in my way but does rear its head sometimes. Otherwise, this setup is perfect when I’m at home.

[–] leetnewb@beehaw.org 2 points 2 weeks ago

I use rclone to mount the Linux NAS from my Linux and Windows computers - SFTP backend is usually fine. Then I am uniformly reading/writing the NAS files as the local NAS user.

[–] MajorHavoc@programming.dev 13 points 3 weeks ago (1 children)

I set that up, once. It went poorly for me. Git behaves much better, for me, when used thoughtfully and manually.

What I do now instead is work on certain projects on an SSH-accessible host. This gives the same benefit of having my last state easily accessible, without causing noise in my development tools such as git.

[–] Danitos@reddthat.com 3 points 2 weeks ago

If working on Linux, combine SSH with tmux (and its attach/detach commands) and you have a very solid workflow. Learning tmux has been one of the best tools I've picked up this year.
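The whole loop is basically:

tmux new -s dev       # on the SSH host: start a named session
# detach with Ctrl-b d, or just drop the SSH connection
tmux attach -t dev    # from any machine later: pick up exactly where you left off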

[–] xmunk@sh.itjust.works 9 points 3 weeks ago (1 children)

Git doesn't need to have a single pull source. It's probably worth just configuring the visibility on each machine so you can do peer pulls.

I don't hate the idea of autocommitting in that scenario, though.

[–] matcha_addict@lemy.lol 2 points 3 weeks ago (1 children)

Sorry, but I'm not really following here. Do you mean like git remote add, to have another remote? What would the source be?

[–] sip@programming.dev 4 points 3 weeks ago (1 children)

your machines

git remote add laptop ...

[–] matcha_addict@lemy.lol 2 points 3 weeks ago (1 children)

That would require my machines to be git servers, right? And hence they should also be on, right? Or am I missing something? Most of the time, my laptop is shut off.

[–] xmunk@sh.itjust.works 8 points 3 weeks ago

It's no worries - most people don't realize this, but every git repository is, well, a fully functional git server. Git shell runs over ssh, so as long as your machines have sshd running you should be good.

https://git-scm.com/book/en/v2/Git-on-the-Server-Setting-Up-the-Server
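Concretely, something like this should work once sshd is up (host and path are examples):

# on the desktop: add the laptop as a peer remote over ssh
git remote add laptop ssh://user@laptop.local/home/user/code/myproject
git fetch laptop
git merge laptop/main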

[–] Kissaki@programming.dev 4 points 2 weeks ago* (last edited 2 weeks ago)

I would consider ~~three~~ four approaches.

1. Commit and push manually and deliberately

I commit changes early and often anyway. I also push regularly, seeing the remote as a safe and remote (as in backup) baseline and reference state.

The question would be: do I switch away while I'm still exploring things in the workspace, without committing, and would I want those changes on the other PC? If so, this alone would not be enough.

2. Auto-push all local git references into a separate space on the git remote

Git branches are refs (commit pointers), just like other refs, and refs can live under arbitrary paths: refs/heads/ holds branches. I can replicate and regularly update all my branches under refs/pcreplica/laptop/*, and then on the other PC list or fetch those: individually or all of them, regularly and automatically, or manually.

git push origin 'refs/heads/*:refs/pcreplica/laptop/*'    # push all local branches into a replica namespace
git ls-remote origin                                      # list what ended up on the remote
git fetch origin 'refs/pcreplica/laptop/*:refs/laptop/*'  # on the other PC: fetch the replica refs

3. Auto-push the/a local branch like you suggested

My concern here would be: is only one branch enough? Is only the current branch enough?

4. Remoting into the other system

Are the systems both online? Can I remote into / connect into it when need be?

[–] GetOffMyLan@programming.dev 2 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

Honestly, I would just commit your in-progress work, then on the other machine check it out and reset to the previous commit.

Then you have your in progress work on the new machine with no random commits.

You could set up an alias that commits with the message "switching machines" and pushes.

Similarly, have one that pulls and resets.
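A rough sketch of those two aliases (the alias names are made up):

# "park": snapshot everything and push
git config --global alias.park '!git add -A && git commit -m "switching machines" && git push'
# "resume": pull, then drop the commit but keep the changes
# (assumes the tip really is a "switching machines" commit)
git config --global alias.resume '!git pull && git reset --soft HEAD~1'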

[–] matcha_addict@lemy.lol 1 points 3 weeks ago (1 children)

That doesn't solve the problem of forgetting :(

I could train myself to get in the habit, but maybe auto saving is easier, no?

[–] zlatko@programming.dev 1 points 2 weeks ago

I wonder if jj (Jujutsu) anonymous branches would be something that solves this. I've only read about it; I haven't used jj yet.

[–] solrize@lemmy.world 2 points 3 weeks ago (2 children)

I just manually push and it's fine. Or as the other commenter says, use a single remote machine.

[–] matcha_addict@lemy.lol 1 points 3 weeks ago

Two issues with manual pushing that I have:

  • you have to remember to push
  • you either get more commits than you need, or you have to do extra work to clean them up or put them in separate branches
[–] Mihies@programming.dev 1 points 3 weeks ago (1 children)

A remote machine might not always be possible, such as when you develop mobile apps or when you have more than one monitor available. Sadly, all options have problems. And (auto) pushing is not an option when you work on a team project where pushing non-compilable code is not welcome.

[–] solrize@lemmy.world 1 points 3 weeks ago (1 children)
[–] Mihies@programming.dev 1 points 2 weeks ago

They might work, but then you're bound to being online. Also, different computers might have different configurations, which is something to pay attention to as well. An alternative is syncing the source (Nextcloud sounds like a good fit), but then you might bump into synchronisation conflicts and such. Both ways will produce a lot of traffic unless you redirect build artifacts to a local directory, which might not always be possible.

[–] Matty_r@programming.dev 2 points 3 weeks ago

I used to be in a similar situation. I used VSCode remote development to effectively work from any machine. Another thing I tried was using Nextcloud to watch the working directory, which automatically synchronized files when they changed.

[–] BrianTheeBiscuiteer@lemmy.world 2 points 3 weeks ago

I have a very similar script. I basically have one branch that's only manual commits and a "sister branch" that includes all manual commits plus some automatic ones. I determine what is auto-committed based on a simple test script. The test might be as simple as, "Did it build without errors? Commit."
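In spirit, the gate is just (build command and branch handling assumed):

# auto-commit to the sister branch only if the build succeeds
./build.sh && git commit -am "autosave: build ok"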

[–] CatPoop@lemmy.world 1 points 3 weeks ago

I have a script that runs every 5 minutes and does a robocopy of each local repo to OneDrive, excluding all the git system files. I don't really like the idea of pushing half-finished / broken code.

I rarely need to actually copy stuff back out of OneDrive, but it's good enough on the few occasions I forget to push before changing machines.

[–] talkingpumpkin@lemmy.world 1 points 3 weeks ago

Syncthing or unison might be what you want

[–] Lysergid@lemmy.ml 1 points 3 weeks ago

A post-commit hook to push, plus always squashing when merging feature branches.
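The push half is a tiny hook (standard path; the content is a sketch, and remember to chmod +x it):

#!/bin/sh
# .git/hooks/post-commit: push the current branch after every commit
git push origin "$(git branch --show-current)"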

[–] drew_belloc@programming.dev 1 points 3 weeks ago (1 children)

You could also use Syncthing, so when you are connected to your LAN your work folder will always be synced from one PC to another. That way you won't have a bunch of useless commits in git, and you can still commit and push the changes from any computer.

[–] matcha_addict@lemy.lol 1 points 3 weeks ago (1 children)

When I looked into solutions I thought of Syncthing, but I read comments from people saying they had issues with this approach, especially regarding the .git directory.

[–] SaveMotherEarthEDF@lemmy.world 1 points 3 weeks ago* (last edited 3 weeks ago)

Sorry, what issues? I'm using Syncthing to sync between a macOS laptop, a macOS VM, and my main Linux desktop. I sync my projects folder, which has a lot of git repos. Maybe I've been lucky, but I've had no issues with Syncthing and git so far.

It's really like magic sometimes

[–] Nighed@feddit.uk 0 points 2 weeks ago

For my personal projects I somehow ended up with git living in a OneDrive-synced folder - it carries over changes in general, and then I explicitly commit and push to get things to GitHub etc.