Best way to develop from multiple computers?

In my organization we share several computers and I need to change computers each day. As a scientific programmer, I therefore keep a copy of my code on a USB key and transfer it to whichever computer I am working on that day. Sometimes I also work on my code at home after work.

What suggestions do you have for managing this without the USB key? I presume GitHub, where I would need to create a local repository on each machine?

Best



Solution 1:[1]

Good question.

I think you already know what you need but are a little confused about how it works. Say you use GitHub and have a public repository (your code would live there instead of on your USB key), and you have a Git client (the GitHub client, or Sourcetree, which works well too) on each of your work computers. From there you 'push' your code to the repository when you leave a work computer, and then at the next computer you 'pull' the most current version down to that local machine.
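The push/pull cycle described above can be sketched end to end. A local bare repository stands in for GitHub here so the whole round trip runs offline; with a real remote you would use its clone URL instead, and the file and directory names are illustrative:

```shell
set -e
work=$(mktemp -d)

# "GitHub": a bare repository acting as the shared remote
git init -q --bare "$work/remote.git"

# Machine A: clone, commit, push before leaving for the day
git clone -q "$work/remote.git" "$work/machine-a"
cd "$work/machine-a"
git config user.email a@example.com && git config user.name A
echo 'print("hello")' > analysis.py
git add analysis.py
git commit -q -m "End-of-day commit"
git push -q origin HEAD

# Machine B: pull the current version the next morning
git clone -q "$work/remote.git" "$work/machine-b"
cat "$work/machine-b/analysis.py"   # the code travelled without a USB key
```

On subsequent days the clones already exist, so the cycle reduces to `git pull` when you sit down and `git add`/`git commit`/`git push` when you leave.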

There are plenty of how-to videos online for version-control services such as GitHub or Bitbucket that describe the entire process very well.

Good luck!

Solution 2:[2]

If those are the same computers, any cloud-storage account is also very useful: I personally use OneDrive, which gives me 1 TB of space (included with my Office 365 subscription), and the files are visible to any application as if they were actually on the computer (Windows quickly fetches them in the background).

Apart from that I have a Bitbucket account where I keep my private repositories (I tried a self-hosted Git management application, Gitea, but it consumed too many resources [I/O]).

In order to have the same environment everywhere, I build my own task-specific Docker images. Many people use Vagrant for the same purpose.
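A minimal sketch of the kind of image meant here, assuming a Python project with a `requirements.txt`; the base image and package choices are illustrative, not the author's actual setup:

```dockerfile
# Illustrative dev-environment image: bake the environment in once
# instead of reinstalling it on every machine.
FROM python:3.11-slim

# System packages the project needs
RUN apt-get update && apt-get install -y --no-install-recommends git \
    && rm -rf /var/lib/apt/lists/*

# Pin the Python environment so every computer runs identical versions
COPY requirements.txt /tmp/requirements.txt
RUN pip install --no-cache-dir -r /tmp/requirements.txt

WORKDIR /workspace
CMD ["bash"]
```

Built once (`docker build -t myenv .`) and run with the checked-out code mounted in (`docker run -it -v "$PWD":/workspace myenv`), every machine gets the same toolchain regardless of what is installed on the host.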

The main issue is database migration: whenever possible I export the structure together with the data at the end of a work session and re-sync it when working on a different machine (you can configure your Docker image to pull it during startup). For my pet projects I simply use one remote, hosted database for all the applications on my local machines.

I also use different GitHub and Bitbucket keys on each machine, so I can easily disable access for any one of them if needed.
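Generating a distinct key pair per machine is one `ssh-keygen` call each; the key file name and comment below are placeholders. The public half is what gets pasted into the GitHub/Bitbucket SSH-keys settings page, and deleting it there revokes only that one machine:

```shell
keydir=$(mktemp -d)

# One key pair per machine (ed25519; passphrase omitted here for
# brevity, use one in practice). The comment records which machine
# owns the key so it is easy to find and revoke later.
ssh-keygen -t ed25519 -N "" -C "work-pc-1" -f "$keydir/id_ed25519_work1" -q

# The public half goes into the service's SSH-key settings
cat "$keydir/id_ed25519_work1.pub"
```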

Solution 3:[3]

Being able to develop the same code from multiple computers is usually not a hard problem - we have version control for exactly this. Git push on one machine, pull on another, and all is good. That is enough for web and backend development.

But there is a small number of cases where the project is a bit too complex for this: it involves installing system packages, config files, SSH keys, and datasets, and setting environment variables per branch. Syncing both machines is still possible, but now it will probably cost you an evening.

This happened to me with machine learning / data science projects. Such projects typically require many experiments to train the best ML model, and all of those experiments need to be reproducible, so the configs and settings for each experiment should be saved in a separate branch.

I started developing directly inside Docker images. They can be committed, pushed to a private registry, and pulled back on another machine. Having everything in Docker let me isolate environments and try something new in a separate environment without creating a mess in a single shared one. It also let me move the entire environment from my laptop to a powerful PC.
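That commit/push/pull round trip is the standard Docker CLI; the registry host, container name, and tag below are placeholders, not the author's actual setup:

```shell
# On the laptop: freeze the current state of the running dev
# container "ml-env" as an image, then push it to a private registry
docker commit ml-env registry.example.com/me/ml-env:exp-42
docker push registry.example.com/me/ml-env:exp-42

# On the powerful PC: pull the exact same environment and resume
docker pull registry.example.com/me/ml-env:exp-42
docker run -it registry.example.com/me/ml-env:exp-42
```

Note that `docker commit` captures the container's filesystem but not data in mounted volumes, so anything kept in a volume still needs to be synced separately (e.g. via Git or the database export described in Solution 2).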

I made a small solution by putting a browser-based VS Code, a terminal, a scheduler, and a file browser into a single Docker image. This gave me an isolated, movable, and shareable environment. I described in this article how to move such environments between computers.

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1 Ronan R
Solution 2 Bartosz Pachołek
Solution 3