Linux – Efficiently delete large directory containing thousands of files

command-line, files, linux, rm

We have an issue with a folder becoming unwieldy with hundreds of thousands of tiny files.

There are so many files that running rm -rf returns an error; instead, we need to do something like:

find /path/to/folder -name "filenamestart*" -type f -exec rm -f {} \;

This works, but it is very slow and frequently fails by running out of memory.
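(For context, part of the slowness is that `-exec rm -f {} \;` spawns one rm process per file. POSIX find can batch many files into each rm invocation by ending the -exec clause with `+` instead of `\;`. A minimal sketch, using a throwaway temp directory in place of the real path:)

```shell
# Sketch: '+' batches many filenames into each rm call (POSIX find),
# so rm runs once per batch instead of once per file.
dir=$(mktemp -d)                      # stand-in for /path/to/folder
for i in $(seq 1 200); do touch "$dir/filenamestart_$i"; done
find "$dir" -name "filenamestart*" -type f -exec rm -f {} +
rmdir "$dir"
```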

Is there a better way to do this? Ideally I would like to remove the entire directory without caring about the contents inside it.

Best Answer

Using rsync is surprisingly fast and simple.

mkdir empty_dir
rsync -a --delete empty_dir/    yourdirectory/
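The trick is that syncing an empty source directory with --delete makes rsync remove everything in the target for you. An end-to-end sketch, assuming rsync is installed and using hypothetical temp directories in place of yourdirectory:

```shell
# Sketch: empty a directory by syncing an empty one over it with --delete.
target_dir=$(mktemp -d)               # stand-in for yourdirectory
for i in $(seq 1 500); do touch "$target_dir/file_$i"; done

empty_dir=$(mktemp -d)
rsync -a --delete "$empty_dir"/ "$target_dir"/   # target_dir is now empty

rmdir "$empty_dir" "$target_dir"      # both are empty, so rmdir succeeds
```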

@sarath's answer mentioned another fast option: Perl. In some benchmarks it is reported to be even faster than rsync -a --delete.

cd yourdirectory
perl -e 'for(<*>){((stat)[9]<(unlink))}'
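If GNU or BSD find is available, its built-in -delete action is another way to avoid spawning rm at all; a minimal sketch on a throwaway temp directory (note that -delete is a common extension, not part of POSIX find):

```shell
# Sketch: find's -delete action unlinks matches directly, no rm processes.
dir=$(mktemp -d)                      # stand-in for the real directory
for i in $(seq 1 300); do touch "$dir/filenamestart_$i"; done
find "$dir" -name "filenamestart*" -type f -delete
rmdir "$dir"
```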

Sources:

  1. https://stackoverflow.com/questions/1795370/unix-fast-remove-directory-for-cleaning-up-daily-builds
  2. http://www.slashroot.in/which-is-the-fastest-method-to-delete-files-in-linux