But then I’d need a DVD writer and discs, lol. I don’t even know if I have those things anymore.
I just use Clonezilla to back up and clone my drive weekly to an external NVMe drive. I also back up important files and documents to the cloud, like pCloud or Dropbox. I sometimes use rsync to sync documents to another external drive… Overkill, but I find it’s good to have a lot of backup options in case one backup fails…
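For the documents part, it’s basically a one-liner; a rough sketch (the paths here are just examples, adjust to your own drive and folders):

# Mirror the Documents folder to a second external drive (illustrative paths)
rsync -avh --delete ~/Documents/ /run/media/$USER/ExternalDrive/Documents/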
Absolutely correct!
I have been backing up my /home/User directory and another data partition with Back In Time for years.
In the 3 or 4 years I have been using Linux as my main OS, I ran it on an Intel i5 laptop (with Intel GPU) and now on a desktop with an AMD GPU.
I’ve never had a system upgrade break my system on any distro, including Manjaro and EndeavourOS as Arch derivatives. So I only recently started using BTRFS snapshots for the system partition (/ and everything except /efi), but I have not yet had to restore one. Since I opted for the systemd-boot boot manager while installing EndeavourOS, the simple way of going back to a previous snapshot doesn’t work. Too lazy to convert to GRUB booting.
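For anyone curious, taking a manual snapshot is only a couple of commands; a minimal sketch, assuming / is a btrfs subvolume (the target path is just an example, snapper or btrfs-assistant automate all of this):

# Create a read-only snapshot of the root subvolume, named after today's date
sudo mkdir -p /.snapshots
sudo btrfs subvolume snapshot -r / /.snapshots/root-$(date +%Y-%m-%d)
# List subvolumes/snapshots to verify
sudo btrfs subvolume list /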
What I do have to do from time to time is chroot in to fix Windows having messed up my Linux EFI configuration. This still happens on bigger Windows updates.
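From a live USB that goes roughly like this (a sketch only; the device names are examples and depend on your disk layout):

# Mount the root and EFI partitions from a live environment (example devices)
sudo mount /dev/nvme0n1p2 /mnt
sudo mount /dev/nvme0n1p1 /mnt/efi
# Chroot into the installed system and reinstall the systemd-boot files
sudo arch-chroot /mnt
bootctl install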
TL;DR: Back In Time backups of user files it is for me. Some documents are also available in the cloud.
Cronopete is really great; I use it personally. You can schedule a backup every X amount of time, and if a backup gets missed, it runs as soon as possible once the storage medium is connected…
You can also easily restore whatever you want.
It works in “Time-machine” mode, so you can go back in your backups day by day, month by month.
https://aur.archlinux.org/packages?O=0&SeB=nd&K=cronopete&outdated=&SB=p&SO=d&PP=50&submit=Go
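If you use an AUR helper, installing it is a one-liner (assuming yay; use whichever helper you prefer):

# Install cronopete from the AUR
yay -S cronopete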
I hope this is of some help to everybody.
Thanks to @azxr I have looked at Kopia again. I found that, though I can back up and restore, I can’t actually access the files unless Kopia is installed and “connected” to the same “account”.
So, I searched and found the best ever backup solution: back to basics, simple KISS.
Here is the script that I worked out and that works for me:
#!/bin/bash
# Set the source and destination paths
src="/home/limo/"
dest="/run/media/limo/USB_Backup/rsynchome/"
backup="/run/media/limo/USB_Backup/rsynhomebak/"
# Create the destination folder if it doesn't exist
mkdir -p "$dest"
# Create the backup folder if it doesn't exist
mkdir -p "$backup"
# Run rsync: -a preserves attributes, -v is verbose, -h prints human-readable sizes,
# --delete removes files from the destination that no longer exist in the source,
# --backup moves replaced/deleted files into --backup-dir, with the "~" suffix appended
rsync -avh --delete --backup --suffix=~ --backup-dir="$backup" "$src" "$dest"
What it does is simply copy the files in my home folder to the backup folder on the external USB drive.
If a file gets edited or deleted in the home folder and I then run the script to sync again, it copies the old version to the separate backup folder and adds a “~” after the extension. With each edit it creates the same file again with an extra “~”.
So, I have a few advantages:
- I can just plug the external USB drive into any computer with any OS and read and access the files normally.
- I still have the file even if it is accidentally deleted, and I keep earlier versions of it.
- It only copies new or modified files, not all files.
Since it uses rsync, the sync is one-way only, from the home folder to the external USB drive.
Of course, you need to modify the paths in the script according to your actual paths/user name.
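If you want it to run automatically, a minimal sketch is to save it somewhere like ~/bin/usb-backup.sh (the name is just an example), make it executable, and add a crontab entry; this assumes the USB drive is mounted at that time:

# Make the script executable, then edit the crontab
chmod +x ~/bin/usb-backup.sh
crontab -e
# and add a line like this to run it every day at 20:00:
# 0 20 * * * /home/limo/bin/usb-backup.sh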
I hope you will like this script.
I hope I contributed something useful to the wonderful community.
I use btrfs with snapper and btrfs-assistant to take snapshots of my drives (root and home); my root and home drives are on different SSDs.
I also take daily backups to my Proxmox Backup Server using the proxmox-backup-client, in case my SSDs just die suddenly. Although I don’t keep any really important data on my desktop computer, it is nicer to restore my old drive than to rebuild from scratch.
To use the proxmox-backup-client I wrote a script (with some help) which runs as a cron job once per day. If anyone wants that script, I can send a GitHub link.
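The core of it is a single proxmox-backup-client call, roughly like this sketch (the repository string and credentials are placeholders, and the real script does a bit more):

#!/bin/bash
# Back up the root filesystem to a Proxmox Backup Server datastore (placeholder values)
export PBS_REPOSITORY="backup@pbs@pbs.example.lan:datastore1"
export PBS_PASSWORD="your-api-token-or-password"
proxmox-backup-client backup root.pxar:/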
The only problem I can see with your script is that you’re backing up cache files and trash files. That seems to be a waste of space on the backup drive.
I add
rm -Rf /.cache
at the beginning of my backup script to remove such unnecessary files.
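Another option is to exclude those directories in the rsync call itself, something like this sketch (the exclude patterns are examples, adjust to what you want to skip):

# Skip the cache and trash directories instead of copying them at all
rsync -avh --delete --exclude='.cache/' --exclude='.local/share/Trash/' "$src" "$dest"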
I would like to add that the main issue with a clone backup is the space needed. Clones cover a whole partition, not a directory, so they contain a lot of wasted empty space. And if you only need to go into a backup and recover a file or two, clones/snapshots make this difficult. They are great for taking you back to a moment in time, but not great for extracting just a few files out of a backup.
I am using pCloud for the documents I need to have available anytime, anywhere, no matter what.
For books, music, photos, and my humble scripts, I regularly back up my whole home folder with rsync.
Moreover, I have a very old laptop and sync my main laptop’s folders to it using Resilio Sync (unfortunately Syncthing has no free iPhone app).
So, I feel relatively safe.