OK, I got rid of about 3 GB in Solus. I am amazed. The Flatpak command went deep like a mole.
PS: when I just ran ./maclean, I got a y/n prompt for every action, so I did have pinpoint control without the flags.
For transparency: I let the tool go to the races uninhibited (answered yes to everything).
Nice.
Your program did not tell me how much space I freed… luckily I had checked free space before I scanned.
Oh yeah, I shoulda clarified that.
I thought the idea was to ‘block’ other pieces from running/querying at all - which the category flags will do.
Probably will include this in a coming update.
I already have the mock-up .. but it's pretty dumb.
It just takes a snapshot of the size of / before and after the script runs and prints the difference.
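In sketch form, something like this (simplified - the exact commands and variable names are just illustrative, assuming GNU df and numfmt are available):

# snapshot used space on / before cleaning
_before=$(df --output=used -B1 / | tail -n1)
# ... run the cleaning functions here ...
_after=$(df --output=used -B1 / | tail -n1)
printf 'Freed roughly: %s\n' "$(numfmt --to=iec "$((_before - _after))")"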
Yeah, though I wonder whether it accounts for subtle changes made while the process is running, outside of Maclean's efforts. Like if someone happens to be downloading something, or moving files about, while running Maclean.
It'll slow Maclean down, but perhaps checking specific folder sizes before and after cleaning those folders, and accumulating the result, would be a bit more accurate.
Exactly the case.
I first went for something rather .. well, quick and dumb.
The new version is a bit smarter. Not by a whole darn lot. But a bit.
This is pretty much what was done.
I feel it is still a little imprecise but much better than before.
And I don't think it incurred much of a speed penalty either.
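In rough outline it works like this (again simplified - the paths and variable names below are illustrative, not the real list, and it assumes GNU du and numfmt):

_total_saved=0
for _dir in "$HOME/.cache" /var/tmp; do   # example paths only
    _pre=$(du -sb "$_dir" 2>/dev/null | cut -f1)
    # ... clean $_dir here ...
    _post=$(du -sb "$_dir" 2>/dev/null | cut -f1)
    _total_saved=$(( _total_saved + ${_pre:-0} - ${_post:-0} ))
done
printf 'Approx. space reclaimed: %s\n' "$(numfmt --to=iec "$_total_saved")"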
There is also now a -p (package-related) category that was split off from -b (basic), and options can be combined as one might expect (maclean -bdp, etc).
And a few other things. Like the ~/.config/maclean.conf file where users can more easily configure their own ‘junk’ paths.
With that said, the updated script is available for folks to download at the project page.
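For anyone curious how combinable short options like -bdp are typically wired up in bash, the usual getopts pattern looks roughly like this (a generic sketch, not necessarily what is in the script; the variable names are hypothetical):

opt_b=0 opt_d=0 opt_p=0   # one toggle per category
while getopts "bdp" _opt; do
    case "$_opt" in
        b) opt_b=1 ;;
        d) opt_d=1 ;;
        p) opt_p=1 ;;
        *) echo "usage: maclean [-b] [-d] [-p]" >&2; exit 1 ;;
    esac
done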
The error message “XDG_CONFIG_HOME: unbound variable” typically indicates that the environment variable XDG_CONFIG_HOME is not set in your shell session. This variable is part of the XDG Base Directory Specification, which defines where user-specific configuration files should be stored.
If you’re running a script or command that expects this variable to be set and isn’t checking if it’s defined before using it, you might see this error. Here’s how you can resolve this issue:
1. Set the Variable Temporarily:
You can set the XDG_CONFIG_HOME variable temporarily within your terminal session by running:
export XDG_CONFIG_HOME="$HOME/.config"
2. Set the Variable Permanently:
To make this change permanent, you can add the export line to your shell profile file, such as ~/.bashrc, ~/.bash_profile, or ~/.zshrc, depending on the shell you are using. Add this line at the end of the file:
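export XDG_CONFIG_HOME="$HOME/.config"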
I installed and ran it and got 252 MiB cleaned off my system. . . . I run Stacer, Bleachbit, and Sweeper most all of the time and also Scc in terminal. . . . not a big improvement but I guess it helps a little. . . .
The logic was such that it didn't check for the existence of this directory - it just tried to use it, and would then create the equivalent path if that didn't work.
Which would have been fine - but I keep forgetting I leave set -eu on this script .. which means it will error out and exit immediately if something like an unset variable gets referenced. This is a safety measure to make sure none of the delete functions operate on an undefined variable - which could result in unintended paths being removed.
So I have refactored the CONFIG_HOME section ever so slightly so that it actually checks for the XDG variable and, if it does not exist, creates/uses the path we want.
From
_xdg_base_config=$XDG_CONFIG_HOME
if [[ -z $_xdg_base_config ]]; then
_xdg_base_config=$HOME/.config
fi
To
if [[ -z ${XDG_CONFIG_HOME:-} ]]; then
_xdg_base_config=$HOME/.config
else
_xdg_base_config=$XDG_CONFIG_HOME
fi
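The same check can also be collapsed into a single line with parameter expansion - the :- form substitutes the fallback when the variable is unset or empty, so it stays quiet under set -u:

_xdg_base_config=${XDG_CONFIG_HOME:-$HOME/.config}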
I just came across your script.
I didn’t realize that was part of your script.
It may not be elaborate, but it effectively accomplishes its purpose.
I might have approached it somewhat differently myself, but it's great to see others crafting their own scripts.
Keeps track of what is done, calculates savings, and isn’t overly aggressive. The way the script gives the information is neither too terse nor too verbose.
The one thing I would like to see improved is the handling of orphans.
You get a list of orphans which is all or nothing. This is a bit too coarse for my liking.
Would it be possible to get this list numbered, the way yay or paru present their lists, and then give numbers which you would like to keep or delete?
And if this is too complicated, maybe present the orphans one by one, and you can decide if you want to keep/discard/keep all/discard all?
The script should go into the AUR as well. It has close to 1000 LOC now, so definitely AUR-worthy.
My ~ directory holds less than 1 GB of files, but ~/.cache holds double that amount. I don't want to back up more junk than relevant files. The script also found and deleted some stale Go files that had found their way onto my drive despite my attempts to remove make-only dependencies.
I didn’t even know they were there.
Can you do without such a cleaning script? Yeah, sure. Does it make your life easier? Also yes.
Hm.
This may be possible by creating an array or loop.
For now it rightly reports the orphans as a whole, since they will all be removed if the action is affirmed - the idea being that systems should generally not have orphans .. packages that would otherwise be orphans but are wanted should be marked --asexplicit.
Still undecided on what to do here, if anything.
Maybe creating such a loop/array with the option to mark items as explicitly installed before running full orphan removal?
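Something along these lines, perhaps - a rough sketch only, using pacman's -Qdtq to list orphans and -D --asexplicit to keep one (the prompt wording and variable names are placeholders, not the script's actual code):

mapfile -t _orphans < <(pacman -Qdtq)
(( ${#_orphans[@]} )) || { echo "No orphans found."; exit 0; }
_to_remove=()
for _pkg in "${_orphans[@]}"; do
    read -rp "Orphan $_pkg - [r]emove or [k]eep (mark as explicit)? " _ans
    case "$_ans" in
        k|K) sudo pacman -D --asexplicit "$_pkg" ;;
        *)   _to_remove+=("$_pkg") ;;
    esac
done
(( ${#_to_remove[@]} )) && sudo pacman -Rns "${_to_remove[@]}"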
I do not bother backing up dot files and folders. I was not saying unequivocally that a cleanup script has no use for anyone, just that I find little use in such a comprehensive cleanup script. I have always kept my package cache clean. I have a couple of functions in my cleanup script to trim the journal logs to 5 days and delete everything in the ~/.cache folder, but I almost never use those functions. I do not install software unless I am positive I will use it, and have a total of two items from the AUR (downgrade and trizen), so I never think of build dependencies. I guess for those who experiment and install stuff from the AUR regularly, this type of cleaning can come in handy.