Hello everyone,
for the last four days I have been busy trying to finally find a backup solution that works. I’m doing this now because I’m on vacation at the moment and have been putting it off for years.
Actually(!) what I would like is something like Acronis True Image or Macrium Reflect (Windows only): a service that creates incremental backups every hour and saves them to a network drive.
And if the hard drive breaks, or I catch a virus, or drunkenly delete various folders, I can simply restore the previous state with two clicks. Or, if the disk is truly dead, I replace it, boot from the respective live CD, and restore the whole image to the new disk.
Now, I know from having last dealt with this about two years ago that such software simply does not exist for Linux. Back then I gave up and, as a “solution”, installed Proxmox as the host, created a VM with all the hardware I need (including various PCI devices passed through), and installed my Arch on it. Then I configured Proxmox to take a snapshot every hour and wrote a script (more precisely, a pacman hook) that connects to Proxmox via SSH and creates a snapshot with “Pacman Update $Date” as the description, so I can roll back if an update messes up my system.
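For reference, the hook behind that looks roughly like this (the VM ID 100, the host name, and the script path are placeholders for my setup):

```ini
# /etc/pacman.d/hooks/proxmox-snapshot.hook
[Trigger]
Operation = Upgrade
Type = Package
Target = *

[Action]
Description = Creating a Proxmox snapshot before the update ...
When = PreTransaction
Exec = /usr/local/bin/proxmox-snapshot.sh
```

The script it calls is essentially one line: `ssh root@proxmox "qm snapshot 100 pacman_$(date +%Y%m%d_%H%M) --description 'Pacman Update'"`.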
That worked well enough. Unfortunately, performance suffered a bit, and sometimes, when Proxmox started a snapshot, my “real” system froze for a moment. That didn’t happen often, but it still annoyed me.
Since then I back up my hard drive once a month with Clonezilla.
But by now this also gets on my nerves, because (not only out of curiosity and fun but also because of my job) I very often experiment with my system, and sometimes, when I’m done with a “session”, it’s easier to restore a 1:1 image than to uninstall 42389472 packages and hunt down their leftovers. But then I have to boot Clonezilla every time and pull a fresh image, and during that time the machine is unusable.
So, for these reasons, I’m now looking for a proper solution. Since, as mentioned, programs like Macrium or Acronis don’t exist for Linux, it’s clear to me that I have to take a two-pronged approach.
My idea is:
1. I install my system as usual, with my usual partition scheme, but minimal: I create my mount points in fstab, install backup software X (which still has to be found), configure it, and install a browser. Nothing more. Then I pull an image of this state with Clonezilla.
2. Once step 1 is done, I just keep working with my EndeavourOS, and backup software X does its hourly backups/snapshots.
3. ?
4. Profit!
With this solution I can then restore any snapshot/backup after I’ve experimented around. And if for some reason the hard drive breaks, I can just boot Clonezilla, restore the image, and start the OS, where backup software X is already present and configured, and then easily restore a backup/snapshot. After a reboot, voilà, everything is as if nothing had ever happened.
Well, Timeshift (rsync) or Btrfs (or Timeshift with Btrfs) can do exactly that. BUT(!!) neither allows the snapshots to be stored anywhere other than on the machine itself. That is total nonsense, because if the hard disk suddenly gives up, the backups/snapshots are gone with it. So Btrfs as a solution is out of the question. That leaves Timeshift with rsync.
Unfortunately that doesn’t work either, because Timeshift stores the backups/snapshots only locally on the machine (path: /timeshift). Well, I thought I was clever: stop Timeshift including all its services, delete /timeshift, create an NFS mount point at /timeshift, and trick Timeshift. Unfortunately, Timeshift detects this and refuses to back up. A short internet search showed that many people complain about this, but the developer doesn’t want to change it, arguing that it would bring unexpected complications because of hard links, that NFS doesn’t support them (which is not true), etc. (see for example: https://github.com/teejee2008/timeshift/issues/52). So Timeshift is unfortunately also out.
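For the curious, the “trick” was nothing more than an fstab entry like this (server name and export path are placeholders):

```
# /etc/fstab — mount the NFS export where Timeshift expects its local directory
backupserver:/export/timeshift  /timeshift  nfs  defaults,_netdev  0  0
```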
Then, the day before yesterday, I found what looked like the promising hero: BackInTime. Installed it, configured it, made a backup. Great. Everything as desired; since then, various snapshots have been created, which now all sit nicely on my 3x redundant server. Created pacman hooks again, etc., and everything worked. Until yesterday, when I had the idea to test a restore.
So I used Bash to create a few hundred empty test folders and test files in a directory. Then I deleted my Downloads folder (there’s always a lot of junk in there that can go), installed Firefox, VLC, and a few other small programs with pacman, and, as the icing on the cake, broke my pacman.conf so thoroughly that no update was possible anymore.
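The test clutter itself was just a trivial loop like this (the path here points to /tmp so the sketch is harmless to run):

```shell
#!/bin/sh
# Create a few hundred empty test directories and files to check
# whether a restore really removes them again afterwards.
base=/tmp/restore-test
mkdir -p "$base"
for i in $(seq 1 300); do
    mkdir -p "$base/dir_$i"    # 300 empty directories
    touch "$base/file_$i"      # 300 empty files
done
echo "created: $(ls "$base" | wc -l) entries"
```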
After that, full of anticipation, I started BackInTime and restored the last snapshot. Once that was done, I rebooted, and was disappointed…:
- Good: The deleted files from downloads are back.
- Bad: My test folder with all the empty test folders and files is also still there.
- Even worse: The test programs I installed are also still there.
- Worst: The broken pacman.conf is also still there.
God, how I cursed… But then I found that there is an option to prevent exactly that. BUT(!) that doesn’t work either, because it explicitly states that folders which were excluded during backup/snapshot creation will be deleted during the restore. And that means, for example, everything mounted under /mnt … and with it my whole server with 80 TB of data!
So… BackInTime is unfortunately also out.
Then there is duplicity/Déjà Dup. Exactly the same problem/behavior as BackInTime. Also disqualified.
And now I’m really desperate and at a loss. I simply don’t know how to do the following:
- Create hourly incremental backups/snapshots and back them up externally.
- When restoring, also restore exactly the state of the respective backup/snapshot 1:1.
I’m really sorry, you poor people, for making you read all this. But I hope I can be helped.
This text was translated from German to English with DeepL, but reading through it, everything seems right.
Thanks a lot!