My trusty ThinkPad E590 died on Friday. A buddy of mine with the same model tested swapping out parts, battery and RAM (he’s a better tech than I am), and determined it can’t even get past POST. Possibly the CPU or the motherboard.
I bought this miniPC several months ago so I could use my printer: https://a.co/d/i7bLZgC
So I want to fully clone the ThinkPad’s drive to the miniPC: btrfs filesystem and 4 years of data. I do web dev, so hundreds of git repos. npx npkill shows over 60GB of space used by node_modules.
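(For the record, the same audit can be done with plain find and du; a rough sketch, with ~/projects standing in for wherever the repos actually live:)

```shell
# List every node_modules directory under ~/projects with its size,
# largest first. -prune stops find from descending into each
# node_modules once it's found, so nested ones aren't double-counted.
find ~/projects -type d -name node_modules -prune -print0 \
  | xargs -0 du -sh 2>/dev/null \
  | sort -rh
```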
I’ve not done this in years, so I’m seeking advice, best practices, what software to use, etc.
There is a reason why node_modules is added by default to .gitignore.
Do you really need all that data on the new drive right from the start? I’d just copy config files, including package.json; npm will be able to fetch all the necessary packages anyway.
I wouldn’t be cloning over those node_modules, and they are indeed in their respective .gitignore files. It’s amazing how much junk accumulates over time. I guess after this I’ll start using pnpm instead of npm/yarn: more efficient, space-saving, and faster.
Used the dd command to write the Rescuezilla ISO to USB, and I made sure the checksum matched. Rescuezilla couldn’t recognize the disks, though; I let it run for a while (at least 15 minutes) and it made no progress.
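For anyone following along, the verify-and-write sequence looks roughly like this (/dev/sdX is a placeholder; double-check the actual device with lsblk before writing, since dd will happily overwrite the wrong disk):

```shell
# Verify the downloaded ISO against its published checksum
# (assumes a rescuezilla.iso.sha256 file sitting next to the ISO).
sha256sum -c rescuezilla.iso.sha256

# Write the ISO raw to the USB stick; conv=fsync forces the data
# to actually hit the device before dd exits.
sudo dd if=rescuezilla.iso of=/dev/sdX bs=4M status=progress conv=fsync
sync
```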
So I chose to go with Clonezilla on USB. Made some progress there, but it failed in the end, as the destination SSD is smaller than the source (1TB source to 500GB destination).
So then I ran GParted (again, ISO to USB) and resized my 1TB drive: shrank /dev/sda to about 410GB plus 20GB of swap, trying to get the unallocated segment up to 500GB. But Clonezilla failed again, stating the destination was smaller than the source.
So it looks like I’ll ultimately just settle for fully reinstalling EndeavourOS, btrfs, bspwm, etc.
I’ve kept a readme.md at the root of my home directory with a list of all the packages I manually installed. I’m just gonna bite the bullet and reinstall tomorrow.
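Since EndeavourOS is Arch-based, that package list can also be generated and replayed with pacman; a sketch, assuming the old install’s pacman database is still readable (pkglist.txt is just a name I picked):

```shell
# On the old install (or a mounted backup of it): dump the names of
# all explicitly installed packages, one per line.
pacman -Qqe > ~/pkglist.txt

# On the fresh install: "-" tells pacman to read package names from
# stdin; --needed skips anything already present.
sudo pacman -S --needed - < ~/pkglist.txt
```

One caveat: AUR packages (listed by pacman -Qqem) aren’t in the official repos, so they’d need to be filtered out of the list and reinstalled separately with an AUR helper.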