this post was submitted on 22 Aug 2024
31 points (100.0% liked)

I use two different computers in two different locations, both running Universal Blue.

I was wondering if there is any way to create a backup system where I could back up Computer1 over the internet to Computer2 and continue working as if nothing had happened, with all the user data and installed applications in place. The goal is to transfer only the user data/applications and no system data (that should be the same for both because of uBlue, right?), to keep the backup size small.

To be clear, I need help figuring out the backup part, not the transferring-over-the-internet part.

If I were to back up the directories on Computer1 that store user data, with for example borgbackup, could I restore them on Computer2 and have a working system? Or would there be conflicts because of lower-level things missing, like applications and configs? Which directories would I need, and which could be excluded?
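
For illustration, something like this is what I have in mind (untested sketch; the repo location and directory list are just placeholders):

    # one-time: create an encrypted repo, here on Computer2 via SSH
    borg init --encryption=repokey ssh://user@computer2/./borg-repo

    # recurring: back up only user data, skip caches
    borg create --stats --compression zstd \
        --exclude ~/.cache \
        ssh://user@computer2/./borg-repo::'{hostname}-{now}' \
        ~/Documents ~/Pictures ~/.config ~/.local/share

    # later, on Computer2: list archives and restore one
    borg list ~/borg-repo
    borg extract ~/borg-repo::computer1-2024-08-22T10:00:00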

Is there a better option? Any advice is appreciated!

I also came across btrfs snapshot capabilities and thought they could possibly be used for this. But as far as I understand it, that would mean transferring the whole system and not only the data and applications. Am I missing something?
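
If I understand the docs right, it would look roughly like this (untested; whether only user data gets sent presumably depends on /home being its own subvolume):

    # read-only snapshot of the home subvolume
    sudo btrfs subvolume snapshot -r /home /home/.snapshots/home-2024-08-22

    # the first transfer sends everything...
    sudo btrfs send /home/.snapshots/home-2024-08-22 | \
        ssh computer2 sudo btrfs receive /home/.snapshots

    # ...later ones only send the difference from a common parent snapshot
    sudo btrfs send -p /home/.snapshots/home-2024-08-21 /home/.snapshots/home-2024-08-22 | \
        ssh computer2 sudo btrfs receive /home/.snapshots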

top 12 comments
[–] unreachable@lemmy.world 17 points 3 months ago (3 children)
[–] Deckweiss@lemmy.world 7 points 3 months ago* (last edited 3 months ago)

or 3-way with an always-on server (like a Raspberry Pi or the cheapest VPS with just enough storage) so that you don't have to have both computers on at the same time (that's what I am currently doing and it works great).
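
If you go the Syncthing route, the always-on node can be a single container (a rough sketch with Docker and the official image; the ports and volume path are the documented defaults):

    docker run -d --name=syncthing \
        --restart unless-stopped \
        -v /srv/syncthing:/var/syncthing \
        -p 8384:8384 -p 22000:22000/tcp -p 22000:22000/udp \
        syncthing/syncthing:latest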

[–] gi1242@lemmy.world 5 points 3 months ago

I use Syncthing for this purpose all the time. I seamlessly move between my work PC, home PC, and laptop. I sync my data directories and most of my config settings; some are different per system (monitors, etc.). 10/10, highly recommend.
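
For the per-system bits, an ignore file at the root of the synced folder keeps them local. A rough example of ~/.stignore, assuming ~ is the synced folder (the two paths are just common display-config files, adjust for your desktop):

    // machine-specific settings, never synced
    .config/monitors.xml
    .config/kwinoutputconfig.json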

[–] bobs_monkey@lemm.ee 2 points 3 months ago (1 children)

Question, if you don't mind: is it theoretically possible to use Syncthing on the root directory of a given Arch install, somehow blacklist hardware-specific components, and basically have a running clone between both systems? I'd never heard of Syncthing before this, but it sounds intriguing.

[–] Deckweiss@lemmy.world 5 points 3 months ago* (last edited 3 months ago)

I am not sure about the technical side, but even if it were possible, it would be a nightmare of resolving conflicts manually, since a lot of system files are constantly being written to and read from, and it would mess everything up if Syncthing overwrote a file at the same time.

[–] nehal3m@sh.itjust.works 5 points 3 months ago

As a sysadmin, I would try making the PCs hypervisors and syncing a VM. Might be over-engineered, but I think it would work.

https://pve.proxmox.com/wiki/High_Availability_Cluster
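
Very roughly, the moving parts (a sketch assuming two PVE nodes and a workstation VM with ID 100; proper HA also wants a third vote for quorum):

    # on node1: create the cluster
    pvecm create workcluster

    # on node2: join it (node1's address is a placeholder)
    pvecm add 192.168.1.10

    # live-migrate the workstation VM to the other node as needed
    qm migrate 100 node2 --online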

[–] drwho@beehaw.org 5 points 3 months ago

Syncthing could do it.

[–] RmDebArc_5@sh.itjust.works 3 points 3 months ago

This should work, as on Linux you can also share a home directory. In my experience (using the same home partition for different installations) there might be minor issues, like an additional plasmoid not working on both systems, although that was across two different distros, so you may not experience any issues.

[–] utopiah@lemmy.ml 3 points 3 months ago* (last edited 3 months ago) (1 children)

Regardless of what technical solution you decide to rely on, e.g. borgbackup, Syncthing or rsync, the biggest question is what you actually need. You indeed do not need system files, and you probably don't need applications either (you can fetch those back anyway), so what's left is actually data. You might then want to save your ~ directory, but that might still conflict with some things, e.g. ~/.bashrc or ~/.local, so instead you might want to start with individual applications, e.g. Blender, and see where they implicitly (or you explicitly) save the .blend files and all their dependencies.

How I would do it:

  • over the course of a day, write down each application I'm using, probably a dozen at most (excluding CLI tools)
  • identify for each where its data is stored, and possibly simplify that, e.g. all my Blender files in a single directory with subdirectories
  • using whatever solution I have chosen, synchronize those directories (concrete sketch below)
  • test on the other device while on the same network (should be much faster, with a chance of fixing problems)

Then I would iterate over time. If I often had to move and couldn't really iterate, I would make the entire ~ directory available, even though that's overkill, and only pick from it on an as-needed basis. I would also make sure to exclude some directories that could be large, maybe ~/Downloads.
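
A concrete sketch of steps 3 and 4 with plain rsync over SSH (directory names and host are placeholders):

    # hypothetical data directories identified in step 2
    DIRS=("$HOME/Blender" "$HOME/Documents" "$HOME/Projects")

    # mirror them to the other device over the LAN; try with -n (dry run) first
    rsync -avz --delete --exclude='.cache/' "${DIRS[@]}" user@other-device:~/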

PS: I'd also explore Nix for the system and application side of things, but honestly only AFTER taking care of what's actually unique to you, i.e. your data.

[–] unskilled5117 1 points 3 months ago (1 children)

Thank you for the detailed response! Yes, figuring out what data to include and how to avoid conflicts has been troubling me the most.

I think I might narrow it down with test VMs first, to skip the transfer part, before I actually use it “in production”.

[–] utopiah@lemmy.ml 1 points 3 months ago

Honestly, a very imperfect alternative that's been sufficient for me for years is... Nextcloud for documents.

There are a few dozen documents I need regardless of the device, e.g. national ID, billing template, but the vast, VAST majority of my files I can get from my desktop... which is why I replied to you in depth rather than actually doing it myself. I even wrote some software for a "broader" view on resuming across devices, including offline, namely https://git.benetou.fr/utopiah/offline-octopus, a network of NodeJS HTTP servers, but... same, that's more intellectual curiosity than a pragmatic need. So yes, explore with VMs if you prefer, but I'd argue: remain pragmatic, i.e. focus on what you genuinely need versus an "idealized" system that you don't actually use, yet makes your workflow and setup more complex and less secure.

[–] Fizz@lemmy.nz 1 points 3 months ago

There is the overkill method of Proxmox clustering VMs; you could work from a cloud instance of your distro. There is NixOS, where you would declaratively define your whole system and then back up and import your home folder when switching between the PCs. Since you are already using an immutable distro, you can probably skip Nix and use a OneDrive-type solution set up to sync your home directory. I haven't used it, but other people have suggested Syncthing, and it seems like it would work for your use case and be the simplest option.