I am thinking I could do this with wc if it had a recursive option, but I am not sure. I want a grand total of the number of words in all the files under a directory and its subdirectories (not just a per-file word count).
Note that I am doing this on my Mac.
OK, I just tried this command:
find enwiki/ -type f | xargs wc -w > output.txt
The resulting output file has 6425104 lines, indicating that many files. But the total word count at the end was only 381609. Did the grand total of words counted perhaps exceed some maximum allowed in bash? I'm not sure whether that happened or whether I used wc incorrectly.
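The per-batch behavior behind this is easy to reproduce on a small scale. The sketch below uses made-up file names under a hypothetical demo_batches/ directory; the -n 2 flag artificially forces tiny batches the same way the real argument-length limit does with millions of file names.

```shell
# Reproduce the batching on a tiny scale (hypothetical demo files).
mkdir -p demo_batches
printf 'a b c\n' > demo_batches/f1
printf 'd e\n'   > demo_batches/f2
printf 'f\n'     > demo_batches/f3
printf 'g h\n'   > demo_batches/f4

# -n 2 forces at most two file names per wc invocation, mimicking what
# xargs does when millions of names exceed the argument-length limit.
# Each wc invocation prints its own "total" line, so the last "total"
# in the output covers only the final batch, not the whole tree.
find demo_batches -type f | xargs -n 2 wc -w
```

With four files and batches of two, the output contains two separate "total" lines, one per wc invocation.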
Best Answer
Use find to find all the files, then concatenate them with cat and count the words in the concatenated stream with wc:
find enwiki/ -type f -exec cat {} + | wc -w
The issue with your command is that wc will be called multiple times on batches of files if you have many thousands of files to process, and each invocation prints its own total line; the final total you saw covers only the last batch. In the command above, cat will likewise be called multiple times on batches of files, but all output is sent to a single invocation of wc, which therefore prints one grand total.
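As a runnable sketch of that idea (the demo_total/ directory and file names below are made up for illustration): however many times find ends up invoking cat, every byte flows through one wc, so a single number comes out.

```shell
# Tiny demo tree (hypothetical names, for illustration only).
mkdir -p demo_total/sub
printf 'one two three\n' > demo_total/a.txt
printf 'four five\n'     > demo_total/sub/b.txt

# cat may be invoked several times on batches of files, but every byte is
# piped into ONE wc -w, so a single grand total is printed (here: 5).
find demo_total -type f -exec cat {} + | wc -w
```

Because wc reads from a pipe rather than named files, it prints only the count, with no per-file lines and no misleading per-batch totals.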