Backup Scripts & Aliases - Working with AI

My backup strategy is a little long-winded (it's all rsync, to keep the data raw and unpackaged):

  1. Backup to TrueNAS on local LAN
  2. Backup to Local External Drive from Linux Box
  3. Mirror TrueNAS to Synology
  4. Backup to External Drive attached to Synology
  5. Cloud Backup from Synology via Hyperbackup (encrypted)

All of this was done via aliases in .bash_aliases - well over a dozen of them, often chained together. The cognitive load of having to remember it all was getting ridiculous, so I pushed my aliases file to Claude Code.

It sanity-checked my choices, pointed out some stupid things I'd done (like not verifying my mount points before running rsync, which meant I could have been wiping my rsync destinations), and then rebuilt my .bash_aliases and produced a script to run backups via a simple menu-driven TUI.
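For anyone wanting the gist of the mount-point fix: it boils down to something like this (function name and paths are my own placeholders here, not the actual script):

```shell
#!/usr/bin/env bash
# Guarded rsync: refuse to run unless the destination is a real mount point.
# If /mnt/backup isn't mounted, it's just an empty directory, and
# rsync --delete would cheerfully "mirror" your data into the void.
safe_rsync() {
    local src="$1" dest="$2"
    if ! mountpoint -q "$dest"; then
        echo "ERROR: $dest is not mounted - aborting." >&2
        return 1
    fi
    rsync -aHAX --delete "$src" "$dest"
}

# Usage (hypothetical paths):
# safe_rsync /home/user/ /mnt/truenas/home/
```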

Yes, I could probably have spent a weekend doing this by hand, and still managed to get it wrong, but it did it in less than a minute. Cognitive load lowered by 90%. I'm not a fan of vibe coding - I'd like to always build on and work with a baseline of code whose mechanisms I understand - so this was a reasonable use case, I think. I'm a little blown away by how good Claude is. Of the current crop of AI tools, I think it rates pretty high for the no-nonsense "get sh*t done" approach.
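The menu-driven script it produced is, at heart, just a bash `select` loop. A stripped-down sketch of the idea (job names and paths are placeholders, and the sketch echoes commands rather than running them):

```shell
#!/usr/bin/env bash
# Minimal menu-driven backup runner - a sketch of the idea, not the real script.
run_backup() {
    # Echo instead of executing, so the sketch is safe to play with.
    echo ">>> rsync -a --delete $1 $2"
    # rsync -a --delete "$1" "$2"   # uncomment for real use
}

PS3="Select a backup job (Ctrl-C to quit): "
select job in "Backup to TrueNAS" "Backup to external drive" "Run everything"; do
    case $REPLY in
        1) run_backup ~/data/ /mnt/truenas/data/ ;;
        2) run_backup ~/data/ /mnt/external/data/ ;;
        3) run_backup ~/data/ /mnt/truenas/data/
           run_backup ~/data/ /mnt/external/data/ ;;
        *) echo "Invalid choice" ;;
    esac
done
```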

Genuinely impressed with Claude Code! (ChatGPT was objectively dangerous in comparison.) Would I use it for everything? Heck no, but when it comes to smoothing down rough edges and lowering the mental load with a second pair of eyes, absolutely! Interested to hear if you've used it for anything else!

My backup strategy is much less complicated and uses a few Grsync sessions. Good to hear that Claude Code is useful in some supervised scenarios. I wish they would do some bug triaging, though - https://github.com/anthropics/claude-code/issues :scream:

2 Likes

Yeah, I absolutely wouldn't trust it to run solo in any production environment, but it takes some of the legwork out of tidying up and building tools, as long as you have a decent baseline of coding and scripting knowledge. Do I want AI built into my tools and OS? Hell no. Do I want to selectively use what I feel fits the work I'm doing? Potentially, yes.

2 Likes

I'm not at all interested in Claude the AI, but I am interested in this routine. How long do these 5 steps take? Hours? And how often - once a month?

My backup is 100% grsync-based, so I love reading these stories.

What I'm seeing here is a backup that originates in one or two places, then gets bounced (mirrored) to three more... even into the cloud.

I would say you appear to have far too many vulnerable points... but when I look at myself, I just grsync to a single external drive. Now I feel short-sighted, pinning my backup to an "eggs in one basket" approach.

You really got me thinking... I wonder what "best practices" really are for backing up an OS? Great post.

  • Backup to TrueNAS on local LAN - Run manually whenever my main rig is online. A differential sync takes less than a minute on Gigabit Ethernet.

  • Backup to Local External Drive from Linux Box - Ditto.

  • Mirror TrueNAS to Synology - Manual, whenever my main rig is online. This one runs a little longer, only because it's the entirety of the TrueNAS spread across a handful of shares and datasets, totalling 7TB.

  • Backup to External Drive attached to Synology - Runs via Hyperbackup at 2am nightly; Hyperbackup reports a couple of minutes for a differential sync.

  • Cloud Backup from Synology via Hyperbackup (encrypted) - Goes to Google Drive as an encrypted backup, weekly on Sunday mornings at 3am. Even on 500Mb fibre, it takes a little while if much new data has suddenly been added.

In terms of your own backups: one backup is no backup. Two backups are no backups if neither restores properly (test restores regularly). Three backups in discrete physical locations is ideal (air-gapped, cloud, online - my air-gapped backup is an old WD Black 4TB that gets synced a couple of times a year).

Vorta and Borg are a great option too - I just prefer the control and accessibility of rsync working with raw data that can be accessed on any platform in an emergency, without needing Borg and Vorta installed to browse the encrypted containers.

1 Like

Wow, I appreciate your reasoning - thanks for indulging me, and moreover, kudos for making me see it from another perspective. I ran a popular CLI backup program that gave me GPG-encrypted archives alongside the raw rsync copies, but I've grown lazy. Of course, I saved them both to a single external drive :slight_smile:

No stress - if you want to take a look at my script to see how it all works (and maybe tweak it to fit your setup), give me a poke. Plain and simple: you can never have too many backups :wink: