I want to find all folders (within a folder) that are less than 100MB in size and delete them. I'd rather not write a full bash script; there is probably a neat one-line loop for this, but unfortunately my shell knowledge isn't that good.
What I've tried
du -sh * | grep -E "^[0-9]{1,2}M" | xargs -0 rm
This won't work, since the output of du -sh * | grep -E "..." seems to be treated as one single string (xargs -0 expects NUL-separated input, but the pipeline produces newline-separated text).
What I have also tried is
find . -maxdepth 1 -type d -size 100M [-delete]
But I guess the -size flag isn't what I'm looking for.
Best Answer
The simple approach is to find all directories, get their size and delete them if they are under a given threshold:
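The command itself was lost from this copy of the answer. A sketch of the idea, assuming GNU du and a 100MB threshold (the naive-demo paths and file names are illustrative, added here only so the snippet is self-contained):

```shell
# Demo tree (names illustrative): one directory over the threshold, one under.
mkdir -p naive-demo/keep naive-demo/drop
dd if=/dev/zero of=naive-demo/keep/file bs=1M count=150 2>/dev/null

# du prints "SIZE<TAB>NAME"; read splits that back apart.
# Fragile: breaks on names with leading whitespace or embedded newlines.
du -sm -- naive-demo/*/ | while read -r size dir; do
  [ "$size" -lt 100 ] && rm -rf "$dir"
done
```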
However, that will fail on directory names containing newlines or other strange characters. A safer syntax is:
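The safer command was also lost here. One way it might look, assuming a find/sh -c pattern in which each directory name is passed to the size test as an argument rather than parsed out of du's text output (demo paths are illustrative):

```shell
# Demo tree (names illustrative).
mkdir -p safe-demo/keep safe-demo/drop
dd if=/dev/zero of=safe-demo/keep/file bs=1M count=150 2>/dev/null

# Each directory name reaches du as an argument, so newlines and other
# strange characters in names are harmless. find may complain when it
# descends into a directory it just removed; that is ignored here.
find safe-demo -mindepth 1 -type d -exec sh -c '
  [ "$(du -sm "$1" | cut -f1)" -lt 100 ] && rm -rf "$1"
' sh {} \; 2>/dev/null || true
```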
This, however, is a simplistic approach. Consider the following scenario:
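The tree listing for the scenario was lost in this copy. A small script that recreates the layout described below (the file names are assumptions):

```shell
# dir1 holds two subdirectories, each containing an 80M file, so dir1
# itself is about 160M while dir2 and dir3 are about 80M each.
mkdir -p dir1/dir2 dir1/dir3
dd if=/dev/zero of=dir1/dir2/file bs=1M count=80 2>/dev/null
dd if=/dev/zero of=dir1/dir3/file bs=1M count=80 2>/dev/null
```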
Here, we have two subdirectories under dir1, each containing an 80M file. The command above will first find dir1, whose size is >100M, so it will not be deleted. It will then find dir1/dir2 and dir1/dir3 and delete both of them, since they are <100M. The final result will be an empty dir1 whose size, of course, will be <100M, since it is empty.

So, this solution will work fine if you only have a single level of subdirectories. If you have more complex file structures, you need to think about how you want to deal with that. One approach would be to use -depth, which ensures that subdirectories are processed before their parents. This way,
dir1 will be processed after dir2 and dir3, so it will be empty, fail the threshold, and be deleted as well. Whether or not you want this will depend on what exactly you are trying to do.
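The -depth variant was lost from this copy as well. A sketch, reusing the same du-based size test (the depth-demo paths are illustrative):

```shell
# Demo: a parent that only falls under 100M once its children are gone.
mkdir -p depth-demo/dir1/dir2 depth-demo/dir1/dir3
dd if=/dev/zero of=depth-demo/dir1/dir2/f bs=1M count=80 2>/dev/null
dd if=/dev/zero of=depth-demo/dir1/dir3/f bs=1M count=80 2>/dev/null

# -depth makes find visit children before their parent, so dir2 and dir3
# are removed first and dir1 is then small enough to be removed too.
find depth-demo -mindepth 1 -depth -type d -exec sh -c '
  [ "$(du -sm "$1" | cut -f1)" -lt 100 ] && rm -rf "$1"
' sh {} \;
```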