Ubuntu – ls command very slow after deletion of thousands of files in a directory

files, filesystem, ls, performance

I have a directory under /home/myuser in which an application stores and deletes millions of temporary files. When I run ls on this directory (which now contains only about a hundred files), it is extremely slow; in fact I get no output at all for several minutes. After some Google searching I ran:

ls -dl ~/mydir/

I get this

drwxrwxrwx 2 myuser myuser 160108544 Oct 12 11:31 /home/myuser/mydir/

which, if I understood correctly, means the directory entries for this directory need to be re-indexed. How do I force such a re-indexing on this directory?
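
For reference, a quick way to confirm the mismatch between the number of entries and the size of the directory index itself (paths as above; ls -f skips the sorting and per-file stat calls that make a plain ls so slow):

ls -f ~/mydir/ | wc -l     # count the entries without sorting or stat-ing each one
stat -c %s ~/mydir/        # size in bytes of the directory index itself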

Best Answer

It depends on the underlying file system type; most file systems do not compact directories after deletions.

For ext2/ext3/ext4, unmount the filesystem and run e2fsck -D to optimize the directories.
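
A minimal sketch of that workflow, assuming an ext4 filesystem and that /home lives on its own partition (the device name /dev/sda2 is only an example; check yours with df /home/myuser):

sudo umount /home              # the filesystem must not be mounted during the optimization
sudo e2fsck -fD /dev/sda2      # -f forces a full check, -D re-indexes/compacts directories
sudo mount /home               # remount once e2fsck finishes

If /home is not a separate filesystem, you will have to do this from a live or recovery environment, since the root filesystem cannot be unmounted while the system is running.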

If the problem recurs, consider using a dedicated file system for that directory, possibly of a different type. I don't know which file systems dynamically compact their directory entries, but Btrfs is advertised as not suffering from this problem by design. A sketch of such a setup follows below.
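
One way to set that up without repartitioning is to back the directory with a loop-mounted image; the image path, size, and choice of Btrfs below are only illustrative, not taken from the original post:

sudo truncate -s 2G /var/mydir.img                      # sparse backing file for the new filesystem
sudo mkfs.btrfs /var/mydir.img                          # format it (another filesystem type would also work)
sudo mount -o loop /var/mydir.img /home/myuser/mydir    # mount it over the churn-heavy directory
sudo chown myuser:myuser /home/myuser/mydir             # give the application's user ownership again

Add a matching entry to /etc/fstab if the mount should survive reboots.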
