I've been using the following command to list the most recently updated files (recursively) within a specific directory and order them by modification time:
$ stat --printf="%y %n\n" $(ls -tr $(find * -type f))
However, within the hierarchy there is one directory full of files that are updated on a near minute-by-minute basis, which makes the above command almost useless for finding files outside of the offending directory.
I tried using the ls -I flag, but to no avail:
$ stat --printf="%y %n\n" $(ls -trI 'bad-dir' $(find * -type f))
Is there a simple way of excluding a single/specific directory using the ls command? Or should I be pushing the search into the find command?
Best Answer
You don't need that extra ls -tr. This is equivalent to your command and faster:

Something like this will exclude a subdirectory of files:
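A sketch of both commands, assuming GNU find and stat, run from the search root, with bad-dir standing in for the noisy directory from the question:

```shell
# Equivalent to the original pipeline: let find enumerate the files,
# have stat print each one's mtime and name, then sort chronologically.
find . -type f -exec stat --printf='%y %n\n' {} + | sort

# Same idea, but drop any file whose path is under bad-dir.
# Note: find still descends into bad-dir; it only filters the results.
find . -type f ! -path './bad-dir/*' -exec stat --printf='%y %n\n' {} + | sort
```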
This will still check every file; if you want to ignore an entire subdirectory, use -prune. You'll have to reorder things slightly so that the pruning happens before we find all the files.

And in the interest of being as efficient as possible, the stat was redundant, given that find is already accessing the file. You're actually hitting the filesystem twice with the find ... | stat ... approach, so here's a more efficient method that has find
doing all the work.

To make this version work, I've had to adapt the printf directives, since stat and find use different ones for the various pieces of filesystem metadata.
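Sketches of those last two variants, assuming GNU find, whose -printf uses %T+ for the modification timestamp and %p for the path, where stat used %y and %n:

```shell
# Prune bad-dir entirely: the -path/-prune test is evaluated before
# -type f, so find never descends into the directory at all.
find . -path ./bad-dir -prune -o -type f -exec stat --printf='%y %n\n' {} + | sort

# Most efficient: no stat process at all. find prints the mtime (%T+)
# and path (%p) itself, so each file is only touched once.
find . -path ./bad-dir -prune -o -type f -printf '%T+ %p\n' | sort
```

With -prune, find skips bad-dir during the directory walk itself, and with -printf the timestamps come from that same walk, so the filesystem is hit only once per file.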