Maclean - A script to automate some (safe) cleaning

OK, I got rid of about 3GB in Solus. I am amazed. The Flatpak command went deep like a mole.
PS: when I just ran ./maclean, I got a y/n prompt for every move, so I did have pinpoint control without the flags.
For transparency: I let the tool go to the races uninhibited (answered yes to everything).

Nice.

Your program did not tell me how much I cleared out… luckily I had checked free space before I ran it.

2 Likes

Oh yeah, I shoulda clarified that.
I thought the idea was to ‘block’ other pieces from running/querying at all - which the category flags will do.

I'll probably include this in a coming update.
I already have the mock-up .. but it's pretty dumb.
It just takes a snapshot of the used size of / before and after the script runs and prints the difference.

## Accounting Functions

# Take the same measurement after cleaning and print the difference in MB.
_mc_calcgain() {
    _finsz=$(df -BM / | awk 'NR==2 {print $3}')
    _finsz=${_finsz::-1}                # strip the trailing 'M'
    _difsz=$(( _cursz - _finsz ))
    echo -ne "\nReclaimed $_difsz MB!\n"
}

# Record the 'used' figure for / (in MB) before any cleaning starts.
_mc_calccurr() {
    _cursz=$(df -BM / | awk 'NR==2 {print $3}')
    _cursz=${_cursz::-1}                # strip the trailing 'M'
}
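
For context, the call order is simply: snapshot first, clean, then print the gain. Roughly (the middle step is a placeholder, not an actual Maclean function):

_mc_calccurr        # snapshot used space on / before anything is removed
# ... run the cleaning / deletion steps here ...
_mc_calcgain        # snapshot again and print the difference in MB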

Thanks. :slight_smile:

1 Like

Yeah, I wonder whether it may not account for subtle changes made while the process is running, outside of Maclean’s efforts. Like if someone happens to be downloading something, or moving files about, while running Maclean.

It’ll slow Maclean down, but perhaps checking the size of each specific folder before cleaning it, and accumulating the results, would be a bit more accurate.

3 Likes

Exactly the case.
I first went for something rather .. well, quick and dumb.

The new version is a bit smarter. Not by a whole darn lot. But a bit.

This is pretty much what was done.
I feel it is still a little imprecise but much better than before.
And I don’t think it incurred much of a speed penalty either. :slight_smile:
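
Roughly, the per-folder idea looks like this - a minimal sketch only, with illustrative names rather than the script's actual code: measure each target directory with du just before it is cleaned and keep a running total.

_mc_totalgain=0

# Add a directory's current size (in MB) to the running total just before it is cleaned.
# Assumes the directory's whole contents are about to be removed.
_mc_tally_dir() {
    local _dirsz
    _dirsz=$(du -sm -- "$1" 2>/dev/null | awk '{print $1}')
    _mc_totalgain=$(( _mc_totalgain + ${_dirsz:-0} ))
}

# e.g. _mc_tally_dir "$HOME/.cache/thumbnails"
#      rm -rf -- "$HOME/.cache/thumbnails"
#      echo "Reclaimed roughly ${_mc_totalgain} MB"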

There is also now a -p (package-related) category that was split off from -b (basic), and options can be combined as one might expect (maclean -bdp, etc.).

And a few other things. Like the ~/.config/maclean.conf file where users can more easily configure their own ‘junk’ paths.
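
The file format isn't shown here, but assuming a simple one-path-per-line layout with # comments (purely an illustration, not the script's actual parsing), reading it could look like:

# Hypothetical reader for ~/.config/maclean.conf - the real format may differ.
_mc_conf="${XDG_CONFIG_HOME:-$HOME/.config}/maclean.conf"
_user_junk_paths=()
if [[ -f $_mc_conf ]]; then
    while IFS= read -r _line; do
        [[ -z $_line || $_line == \#* ]] && continue    # skip blanks and comments
        _user_junk_paths+=("${_line/#\~/$HOME}")        # expand a leading ~ to $HOME
    done < "$_mc_conf"
fi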

With that said, the updated script is available for folks to download at the project page.

4 Likes

nice ^^

1 Like

Just trying on my laptop

fresh download

I get ./maclean: line 47: XDG_CONFIG_HOME: unbound variable

1 Like

Thought I would give maclean a go, got an error

./maclean: line 47: XDG_CONFIG_HOME: unbound variable

Easy fix …

Total Recovered: 1.96GiB

1 Like

Great! Thanks for sharing! :sweat_smile:

3 Likes

The error message “XDG_CONFIG_HOME: unbound variable” typically indicates that the environment variable XDG_CONFIG_HOME is not set in your shell session. This variable is part of the XDG Base Directory Specification, which defines where user-specific configuration files should be stored.

If you’re running a script or command that expects this variable to be set and isn’t checking if it’s defined before using it, you might see this error. Here’s how you can resolve this issue:

1. Set the Variable Temporarily:

You can set the XDG_CONFIG_HOME variable temporarily within your terminal session by running:

export XDG_CONFIG_HOME="$HOME/.config"

2. Set the Variable Permanently:

To make this change permanent, you can add the export line to your shell profile file, such as ~/.bashrc, ~/.bash_profile, or ~/.zshrc, depending on the shell you are using. Add this line at the end of the file:

export XDG_CONFIG_HOME="$HOME/.config"

3 Likes

I installed and ran it and got 252 MiB cleaned off my system. . . . I run Stacer, Bleachbit, and Sweeper most all of the time and also Scc in terminal. . . . not a big improvement but I guess it helps a little. . . .

Rich :wink:

2 Likes

Normally we would expect these to exist.

But thanks for reporting it.

The logic was such that it didn't check for the existence of this variable first - it just tried to use it and would then create the equivalent path if that didn't work.

Which would have been fine - but I keep forgetting I leave set -eu on in this script .. which means it will error out and exit immediately if something like a variable cannot be defined. This is a safety measure to make sure none of the delete functions operate on an undefined variable - which could result in unintended paths being removed.

So I have refactored the CONFIG_HOME section ever so slightly so that it will actually check for the XDG variable and if it does not exist then create/use the path we want.

From

_xdg_base_config=$XDG_CONFIG_HOME
if [[ -z $_xdg_base_config ]]; then
    _xdg_base_config=$HOME/.config
fi

To

if [[ -z ${XDG_CONFIG_HOME:-} ]]; then    # ':-' supplies an empty default so set -u does not abort
    _xdg_base_config=$HOME/.config
else
    _xdg_base_config=$XDG_CONFIG_HOME
fi
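
Under set -u the same fallback can also be written as a single default expansion:

_xdg_base_config=${XDG_CONFIG_HOME:-$HOME/.config}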

( Update push to Gitlab to follow… )

3 Likes

I just came across your script.
I didn’t realize that was part of your script.
It may not be elaborate, but it effectively accomplishes its purpose.
I might have approached it somewhat differently myself,
but it’s great to see others crafting their own scripts.

1 Like

I think it may relate to a package on Arch Linux, but some DEs will install/create them by default.

As you say, set -eu is a good catch-all for errors and very useful for testing scripts.

1 Like

I recommend always using set -eu.

Best might be

set -euo pipefail

Using it means coding differently in some ways. A good source for learning is: http://redsymbol.net/articles/unofficial-bash-strict-mode/
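
As a small illustration of what coding differently means in practice (file and variable names here are just examples):

#!/usr/bin/env bash
set -euo pipefail

# Possibly-unset variables need a fallback, or the script aborts under -u:
config_dir=${XDG_CONFIG_HOME:-$HOME/.config}

# Commands that legitimately return non-zero need an explicit escape hatch,
# otherwise -e stops the script (grep exits 1 when nothing matches):
matches=$(grep -c 'pattern' somefile.txt || true)

# And with pipefail, a failure anywhere in a pipeline fails the whole pipeline,
# not just its last command.
echo "config: $config_dir  matches: ${matches:-0}"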

1 Like

I use zsh, but I prefer to put my variables in .xprofile:

# XDG
(( ${+XDG_STATE_HOME}  )) || export XDG_STATE_HOME="$HOME/.local/var"
(( ${+XDG_CONFIG_HOME} )) || export XDG_CONFIG_HOME="$HOME/.config"
(( ${+XDG_CACHE_HOME}  )) || export XDG_CACHE_HOME="$HOME/.cache"
(( ${+XDG_DATA_HOME}   )) || export XDG_DATA_HOME="$HOME/.local/share"

1 Like

I’ve been using the $XDG_* env vars in my scripts for many years, and only now stumbled over the XDG Spec that says you should use certain defaults if they’re not set.

EOS/Cinnamon definitely doesn't set many of these (STATE/CONFIG/CACHE/DATA). I think it should.

Something for my .bashrc for now, like the workaround @vazicebon mentioned.
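
For a .bashrc, the same guard can be written with default expansions, using the fallbacks the spec itself documents:

# XDG base directories - fall back to the spec's defaults if unset
export XDG_CONFIG_HOME="${XDG_CONFIG_HOME:-$HOME/.config}"
export XDG_CACHE_HOME="${XDG_CACHE_HOME:-$HOME/.cache}"
export XDG_DATA_HOME="${XDG_DATA_HOME:-$HOME/.local/share}"
export XDG_STATE_HOME="${XDG_STATE_HOME:-$HOME/.local/state}"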

Excellent script.

Keeps track of what is done, calculates savings, and isn’t overly aggressive. The way the script gives the information is neither too terse nor too verbose.

The one thing I would like to see improved is the handling of orphans.

You get a list of orphans which is all or nothing. This is a bit too coarse for my liking.

Would it be possible to get this list numbered, the way yay or paru present their lists, and then give numbers which you would like to keep or delete?

And if this is too complicated, maybe present the orphans one by one, and you can decide if you want to keep/discard/keep all/discard all?

The script should go into the AUR as well. It has close to 1000 LOC now, so definitely AUR-worthy.

3 Likes

My ~ directory holds less than 1G of files, but ~/.cache held double that amount. I don't want to back up more junk than relevant files. The script also found and deleted some stale Go files that had found their way onto my drive despite my trying to delete make-only dependencies.

I didn’t even know they were there.

Can you do without such a cleaning script? Yeah, sure. Does it make your life easier? Also yes.

Hm.
This may be possible by creating an array or loop.
For now it rightly reports the whole list of orphans, as they will all be removed if the action is affirmed - the idea being that systems should generally not have orphans .. packages that would otherwise be orphans but are desirable should be marked --asexplicit.
Still undecided on what to do here, if anything.
Maybe creating such a loop/array with the option to mark items as explicitly installed before running full orphan removal? :thinking:
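
For illustration only - not Maclean's actual code, and assuming pacman - such a numbered keep/remove pass could look roughly like this:

# Sketch: interactive orphan selection (illustrative names and flow)
mapfile -t _orphans < <(pacman -Qtdq)
if (( ${#_orphans[@]} )); then
    # Present a numbered list, the way yay/paru do
    for i in "${!_orphans[@]}"; do
        printf '%3d) %s\n' "$(( i + 1 ))" "${_orphans[i]}"
    done

    read -rp "Numbers to KEEP (marked --asexplicit), empty for none: " -a _keep
    for n in "${_keep[@]}"; do
        sudo pacman -D --asexplicit "${_orphans[n - 1]}"
    done

    # Whatever is still reported as an orphan gets removed
    _remaining=$(pacman -Qtdq || true)
    [[ -n $_remaining ]] && sudo pacman -Rns $_remaining
fi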

I do not bother backing up dot files and folders. I was not saying unequivocally that a cleanup script has no use for anyone, just that I find little use in such a comprehensive cleanup script. I have always kept my package cache clean. I have a couple of functions in my cleanup script to trim the journal logs to 5 days and delete everything in the ~/.cache folder, but I almost never use those functions. I do not install software unless I am positive I will use it, and have a total of two items from the AUR (downgrade and trizen), so I never think of build dependencies. I guess for those who experiment and install stuff from the AUR regularly, this type of cleaning can come in handy.
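
For reference, minimal versions of those two helpers might look something like this (sketches only; the actual functions in that cleanup script will differ):

# Trim systemd journal logs down to the last 5 days
trim_journal() {
    sudo journalctl --vacuum-time=5d
}

# Wipe everything inside ~/.cache (the :? guard aborts if HOME is somehow unset)
clear_cache() {
    find "${HOME:?}/.cache" -mindepth 1 -delete
}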