I am searching for all files containing a specific string on a filer (an old HP-UX workstation).
I do not know where the files are located in the file system (there are many directories, with a huge number of scripts, plain-text and binary files).
Note that the grep -R option does not exist on this system, so I am using find and grep to retrieve which files contain my string:
find . -type f -exec grep -i "mystring" {} \;
I am not satisfied with this command: it is too slow, and it does not print the name and path of the files in which grep matched my string.
Moreover, if there is an error, it is echoed to my console output.
So I thought that I could do better:
find . -type f -exec grep -l -i "mystring" {} 2>/dev/null \;
But it is very slow.
Do you have a more efficient alternative to this command?
Thank you.
Best Answer
The fastest I can come up with is to use xargs to share the load (benchmarked on a directory containing 3631 files).
Your other options would be to streamline the search, either by limiting the file list using find.
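The original find-filtering example is not preserved here; as an illustration (the *.sh pattern is an assumption, chosen only to show the idea of narrowing the file list before grep ever runs):

```shell
# Hand only shell scripts to grep; fewer candidate files means less work.
# Any find predicate (-name, -size, -newer, ...) can be used to prune the list.
find . -type f -name "*.sh" | xargs grep -l -i "mystring" 2>/dev/null
```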
Or by tweaking grep.
You are already using grep's -l option, which causes the file name to be printed and, more importantly, makes grep stop after the first match in each file. The only other thing I can think of to speed things up would be to make sure your pattern is not interpreted as a regex (as suggested by @suspectus) by using the -F option.
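Putting the pieces together, a sketch of the combined command (-F is a standard POSIX grep option that treats the pattern as a fixed string, skipping the regex engine entirely):

```shell
# -F: fixed-string match (no regex interpretation)
# -l: print only matching file names, stop at first match per file
# -i: case-insensitive
# xargs batches file names to minimise the number of grep processes.
find . -type f | xargs grep -F -l -i "mystring" 2>/dev/null
```

For a literal pattern like "mystring" the -F flag changes no results, only the matching cost; it matters most when the pattern contains characters such as . or * that a regex engine would otherwise interpret.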