Improve performance of find -exec …

find, performance

I need the list of sub-directories (not files) in a directory so I can pass it to a Java program. So I am using this command to get the list on a Linux machine:

find /some_directory -depth -maxdepth 1 -mindepth 1 -exec basename {} \; > listfile.txt

I then pass listfile.txt to the Java program as an argument. There are some issues with getting the list of directories from the Java program itself, hence this approach. But the above find command takes a lot of time (~35 minutes), as there are more than 200k files.

Can this be optimized or is there a better alternative?

Best Answer

To print only the file name instead of the full path, with GNU¹ find you can replace -exec basename … \; with -printf '%f\n'. The -exec … \; form forks one basename process per matched entry, which is what makes the command so slow with 200k entries; -printf formats the name inside find itself. The %f directive is explained in the GNU find man page:

%f

File's name with any leading directories removed (only the last element).

Also, since you want only directories in your output, you should add the -type d option:

find /some_directory -maxdepth 1 -mindepth 1 -type d -printf '%f\n' > listfile.txt

-depth is superfluous as you're only finding files at one depth (1).
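
If -printf is not available, a middle-ground sketch (not from the original answer; it assumes GNU coreutils basename, which accepts multiple operands with -a) is to batch the arguments with -exec … {} + so that basename is forked only a few times instead of once per directory:

find /some_directory -maxdepth 1 -mindepth 1 -type d -exec basename -a {} + > listfile.txt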

¹ -maxdepth and -mindepth are also GNU extensions but, unlike -printf, they are now found in some other find implementations as well.
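
If you need something that avoids GNU extensions entirely, a rough portable sketch (POSIX sh; note that, unlike find, the glob skips dot-directories and assumes names contain no newlines) is to let the shell do both the listing and the stripping:

for d in /some_directory/*/; do
  d=${d%/}                  # drop the trailing slash left by the glob
  printf '%s\n' "${d##*/}"  # keep only the last path component
done > listfile.txt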
