Lots of text files into one big text file


I want to combine thousands of little text files into one big text file. I have them in directories with the structure: timestamp1/status.txt. For example: 20130430133144/status.txt.
So far, I know that

cat */* > bigtextfile.txt

works for a small number of files. But will it work for a larger number? I wonder whether cat gathers the contents of all the files in memory and only then tries to save them to bigtextfile.txt. If that's the case, I suppose there must be another way to do it: fetch one file, append it to bigtextfile.txt, then fetch another, and so on.

Best Answer

No, cat will not buffer all the files before it starts writing out.

However, if you have a large number of files, you can run into an issue with the arguments passed to cat. The kernel limits the combined size of the argument list (and environment) that can be passed to any program; this limit is known as ARG_MAX, and if the shell's expansion of `*/*` exceeds it, the command fails with "Argument list too long".
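On Linux and other POSIX systems you can query the limit with getconf:

```shell
# Print the maximum combined size (in bytes) of a program's
# argument list and environment (POSIX ARG_MAX).
getconf ARG_MAX
```

The value varies by system, but once the expanded file list is larger than this, the shell refuses to run the command before cat even starts.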
To solve this issue you can do something like this instead:

find . -mindepth 2 -maxdepth 2 -type f -exec cat {} \; > bigtextfile.txt

This calls cat separately for each and every file that find locates, so the argument limit is never hit.
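Spawning one cat process per file can be slow for thousands of files. If your find supports `-exec … {} +` (POSIX does), it packs as many filenames as fit under the argument-size limit into each cat invocation instead, a sketch:

```shell
# Same result, but find batches many filenames into each cat call,
# staying under the argument-size limit automatically.
find . -mindepth 2 -maxdepth 2 -type f -exec cat {} + > bigtextfile.txt
```

One caveat: find does not guarantee any particular order, whereas the shell expands `*/*` in sorted order. If the files must be concatenated chronologically by timestamp directory, sort the paths first, e.g. `find . -mindepth 2 -maxdepth 2 -type f | sort | xargs cat > bigtextfile.txt` (assuming the paths contain no newlines).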
