KBackup is a user data backup program from KDE. It is not meant as an imaging solution or a way to back up the OS and its configuration. Rather, it is meant for backing up user-created data: files, documents, coding projects and so on. The way it does a backup is that the user selects the files to back up, and KBackup then creates a compressed file in a location of the user's choice. That can be on the same disk, on a USB memory stick, on a mounted network drive or some other location.
Now the problem with KBackup is that it compresses the files and then puts them inside an archive/tar file. The issue with this is that if we want to restore the files, we have to extract them and then uncompress/unzip the individual files. This becomes tedious after some time, as it requires too many commands on the terminal or too many clicks in Dolphin, i.e. the KDE file manager. This was made painfully obvious to me recently when my Acer laptop died.
I like KBackup as it is a GUI solution and allows the selection of files from a tree and the creation of backup profiles. So I can have one backup profile for all of my documents/projects/code and another for settings/configuration.
So what are GUI alternatives where one can restore files using simple copy/move/extract functionality? In the Windows world there are some excellent tools like SyncToy 2.1 or RoboCopy (and its various UIs). What are the alternatives to these in the Linux ecosystem?
Ideally I want a GUI tool like KBackup which copies all the files into a single tar/archive file, but without compressing the individual files inside the backup. Compressing the whole tar/archive with XZ/BZip2/GZip/etc. would be fine, just not the individual files inside the tar.
It would also be fine if the backup works like SyncToy or RoboCopy, with files copied to a destination folder or into a single tar file.
I am not looking for a disk imaging tool or something like Timeshift. I want something that does not start a daemon/service and that I can run when I want. If I want to automate it or schedule it, I would rather use a cron job or task scheduler.
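For reference, the kind of backup described above can be approximated with plain tar: one archive, compressed as a whole stream, so any file manager or tar alone can restore it in a single step. This is only a sketch; all paths below are made up for the example.

```shell
# Illustrative sketch: one tar archive, compressed as a whole (not per file).
# Paths are throwaway examples, not real user data.
mkdir -p /tmp/kb-demo/docs
echo "hello" > /tmp/kb-demo/docs/note.txt

# -J pipes the whole tar stream through xz; files inside stay plain
tar -C /tmp/kb-demo -cJf /tmp/kb-demo/backup.tar.xz docs

# Restoring needs only tar (or Ark/Dolphin), in a single step
mkdir -p /tmp/kb-demo/restore
tar -C /tmp/kb-demo/restore -xJf /tmp/kb-demo/backup.tar.xz
```

Because the compression wraps the whole archive rather than each file, one extract gets everything back at once.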
I will have a look at Vorta. It looks like a promising solution for KDE/Qt-based desktops. While going through the documentation on restoring a backup, it seems to imply that Vorta has to be installed on the new system where the files are to be restored. Is that the case? Can the restoration be done using the file manager/Ark/terminal alone?
I use Syncthing to copy files between my laptops and unRAID server. The server handles backups to local drives, secondary NAS, and cloud.
The server runs 24/7 and backs up when everyone’s sleeping. Syncthing just sits in the background monitoring for changed files. The devices don’t even need to be on the same network to sync files, which is great when traveling.
Edit:
Yes I’m aware Syncthing is not a backup (in the purest sense of the word), but simply a cog in the big backup strategy. My unRAID server handles backup (currently Backblaze and local NAS). I am looking into other apps to handle backup compression/encryption/deduplication as part of the 3-2-1 backup strategy.
Vorta is just a GUI for https://archlinux.org/packages/extra/x86_64/borg/ which handles the actual backups. So to restore anything you will at least need borg installed. AFAIK no other application can read/mount borg repositories.
Which is also a major reason why I stopped using Vorta/borg a while ago. I want simple restores with tools that are readily available on any distro/live environment etc.
Another big reason was that handling the repositories was kinda finicky, especially across major updates of borg, and they broke frequently.
You will also have to back up the borg-related files (configs, cache, …) with a different application than Vorta, and restore those before even attempting to mount the repositories on a different system.
Another issue is that Vorta does not have a lot of active developers. E.g. it was almost a year between the releases of 0.10.3 and v0.11.0. This can and did lead to problems on fast-moving distros like Arch, where new versions of borg can cause problems with Vorta and it takes them quite a while to adapt.
But at least Vorta is in the Arch repo now and not just in the AUR like it used to be.
I am currently also just using KBackup (with compression disabled). Every other backup solution I tried / looked at for Linux turns out to be way overblown / complicated.
I have used rsync and tar for years, as they are standard packages that come with EndeavourOS. I have had to reset a couple of systems and have never lost anything doing it this way. I have a script I run for the backup and it's all local. I can encrypt the archives and then upload them to a cloud service or another device to get them off the property.
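The rsync + tar approach described above can be sketched in a few lines. This is a minimal illustration, not the poster's actual script; the source and destination paths are hypothetical stand-ins.

```shell
# Minimal sketch of the rsync + tar approach; paths are made up for the demo.
SRC=/tmp/rt-demo/source
DEST=/tmp/rt-demo/backup
mkdir -p "$SRC" "$DEST/mirror"
echo "report" > "$SRC/report.txt"

# Archive mode (-a) preserves permissions, times and symlinks
rsync -a "$SRC/" "$DEST/mirror/"

# Roll the mirror into a dated tarball, ready to encrypt/upload elsewhere
tar -C "$DEST" -czf "$DEST/backup-$(date +%F).tar.gz" mirror
```

Restoring from this needs nothing beyond rsync/tar, which ship with essentially every distro and live environment.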
I use backintime which uses rsync under the hood. You specify the folders to backup and a schedule (it needs something like cronie) and to restore you can use the gui or simply copy the files using the file manager.
@thefrog Any good GUI available for rsync that is compatible with KDE Plasma 6? I read some Reddit posts and they were also recommending rsync.
@HBR Yep, a simple backup utility with a GUI to back up files to a memory stick, external drive or NAS drive is sorely missing in Linux. I was wondering whether it would be possible to write a script that takes the tar output of KBackup and then compresses it?
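A post-processing script like that seems plausible, assuming KBackup's per-file compression is disabled so the archive contains plain files. A stand-in tar is used below in place of KBackup's actual output, and the paths are illustrative.

```shell
# Stand-in for a KBackup-produced tar (per-file compression disabled);
# the file and paths here are throwaway examples.
mkdir -p /tmp/kbpost
echo "data" > /tmp/kbpost/file.txt
tar -cf /tmp/kbpost/kbackup-output.tar -C /tmp/kbpost file.txt

# Compress the archive as a whole; xz replaces .tar with .tar.xz
xz -f /tmp/kbpost/kbackup-output.tar
```

Note this only helps if KBackup did not already compress the files individually; re-compressing already-compressed members gains almost nothing.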
I think there is a front end called Gsync or Grsync, but I really don't remember. I don't use a GUI front end for rsync; I just run it as a command, as it's generally much faster than a GUI.
Whenever you are using a sync tool to create backups, think about how you will handle versioning.
Too many times I have seen someone use a sync tool to create a replica and then realize that they lost substantial portions of their data because any issues with the source data get synced to the replica and they have no actual backups.
That is part of what a backup tool gives you as opposed to a sync tool.
True, I have been doing it my way so long I forgot about versioning.
My workaround for this is that I use rsync to create an initial backup, then I tar that backup before the next sync. I also remove the previous files and just make a whole new sync, so I avoid having deletions "sync" over. Yes, I fell victim to this in the past due to my lack of understanding, and that is why I created the method I use. I just make sure to add a date to the tar file name.
Hi, I currently use restic, with Backrest as a web GUI.
Functionality-wise it is similar to vorta+borg, but I personally found restic+backrest easier to approach and manage. See what floats your boat.
I should however mention: please ensure you don't use a transfer utility as your backup utility. Something like Syncthing is great at transferring your files and keeping them up to date across different systems, but it is not a backup strategy.
If you wish to know how I learnt it the hard way, here it is:
I sync folders from my Android device to my laptop via Syncthing. They are usually always in sync when I'm at home. I once accidentally deleted a few gigabytes worth of data from my laptop's storage. It took me a few seconds to realize I had deleted the wrong folder, and another few seconds to realize that it would soon "sync" and wipe the data from my phone too. Fortunately, I quickly picked up my phone and disabled its wifi, and nothing important was deleted.
Yes, I should’ve used versioning which syncthing offers out of the box, but even that doesn’t make it a backup tool.
I switched to vorta+borg post that episode and recently to restic+backrest.
With syncing or plain copying, if any file at the source becomes corrupted without you noticing, you can end up overwriting a good file with a corrupted one at the backup destination, which makes the backup flawed. Depending on the file, it could be weeks or months until you realise it's corrupted, and by then it will be too late and impossible to recover.
With versioning there will be duplicate copies of files (if there are changes, usually detected via file checksum), so if a file becomes corrupted you can still get hold of the good copy prior to it. Even things like a drive or other hardware becoming faulty can cause corruption of a file, and it may not be noticeable until it's too late.
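The checksum-based change detection mentioned above can be seen with a plain sha256sum comparison on a throwaway file; this is the same basic mechanism versioning tools use to decide whether a new copy needs to be kept.

```shell
# Demo of checksum-based change detection on a throwaway file
echo "v1" > /tmp/cksum-demo.txt
OLD=$(sha256sum /tmp/cksum-demo.txt | cut -d' ' -f1)

# Any change (an edit or silent corruption) alters the checksum
echo "v2" > /tmp/cksum-demo.txt
NEW=$(sha256sum /tmp/cksum-demo.txt | cut -d' ' -f1)

[ "$OLD" != "$NEW" ] && echo "file changed, keep a new version"
```

A versioning backup tool keeps the old copy when the checksums differ; a plain sync tool just overwrites it.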
I personally use Borg with a custom script I made that maintains 5 different repositories, automated with a systemd timer, with backups on two different drives. But Vorta is a GUI that uses Borg and can essentially do the same thing I did with Borg.
The only way this could be partially done with tar files is to keep multiple copies with different timestamps, but this will use up a lot of space, as files will be duplicated multiple times, even ones with no changes.
This is so true. From my point of view there are basically 2 ways to do that:
With btrfs/zfs on the receiving side: after each successful copy of data (e.g. with rsync), create a snapshot on the receiving side. With zfs, snapshots are read-only; with btrfs you need to tell it to create read-only snapshots. This prevents accidental damage to the snapshots.
Backup with borg or kopia. They both do versioning out of the box on any filesystem.
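The first option above can be sketched in two commands. This is only a sketch: it requires a btrfs filesystem on the destination and appropriate privileges, and the paths are hypothetical.

```shell
# Sketch only: requires a btrfs destination filesystem and root privileges.
# All paths are hypothetical.
rsync -a --delete /home/user/ /mnt/backup/current/

# After a successful copy, freeze it as a read-only snapshot;
# each dated snapshot becomes an immutable version of the backup
btrfs subvolume snapshot -r /mnt/backup/current \
    "/mnt/backup/snapshots/$(date +%F)"
```

Because btrfs snapshots share unchanged blocks, keeping many dated snapshots costs far less space than keeping full tar copies.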
I have used kopia for 2 years now, after using borg before. I prefer kopia over borg; I like its speed and handling better.
So if rsync is used, can versioning be turned off? For example, I have a big enough USB memory stick and an NFS drive connected to the local LAN. Can I force rsync, using one of its GUIs, to do a full backup and overwrite everything in the destination? Or can rsync create a new tar file with all the required content from $HOME as a backup? I can keep 1 or 2 tar files.
The plan is to set up a schedule, triggered by the user or via a cron job, to copy all the files from $HOME to the USB memory stick every 1st and 3rd weekend, while another cron job, or a user-triggered event, backs up the data to the NFS drive every 2nd and 4th weekend.
If there is a 5th weekend in a month then it would be considered as rest day or day off or as my Jewish friends would say Shabbat.
Doing it via a cron job is optional and not mandatory.
I would just use a systemd timer instead of a cron job.
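A systemd user timer for the "1st and 3rd weekend" schedule described above could look roughly like this; the unit name and the service it triggers are hypothetical.

```ini
# ~/.config/systemd/user/home-backup.timer  (hypothetical unit name)
[Unit]
Description=Home backup on the 1st and 3rd Saturday

[Timer]
# First and third Saturday of each month, at 02:00
OnCalendar=Sat *-*-01..07 02:00:00
OnCalendar=Sat *-*-15..21 02:00:00
Persistent=true

[Install]
WantedBy=timers.target
```

This would be enabled with `systemctl --user enable --now home-backup.timer` and needs a matching `home-backup.service` that runs the actual rsync/tar script. `Persistent=true` makes a missed run fire at next boot, which cron does not do out of the box.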
If versioning is going to be an issue, then you will probably want to use borg for backups. When I looked at my backup script just a few minutes ago, it appears I use rsync for the dotfiles, which are sent to a staging place before I tar them, and for my Videos folder. The rest I use tar for. I add compression to the tar file to help keep the sizes a bit smaller on the backup device.
Take your time to find the solution you feel will work best for you. And no matter what solution you choose, remember to always test your backups, or you don't have backups.