Bash – Running a command multiple times with arguments (filenames) from a file

bash · command-line · shell-script

I have a file with a long list of filenames (with full paths). I've also got a program I'd like to run multiple times, using one filename from this list as the argument each time. The program I'd like to run is, sadly, a self-made script, so it can only take one filename at a time, and it cannot accept input from stdin (like a list of filenames) – so no piping or input redirection is possible.

What I need, is a command that will run another command (my script), using the lines in the file as argument.

Something like find with the -exec action (find . -name "*.txt" -exec command \;). I guess I actually could use find here, but I'd like the input files to be sorted…
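(For what it's worth, a sorted variant of the find approach could look like the sketch below – this assumes GNU find, sort and xargs, since it relies on their -print0, -z and -0 options:)

# find the files, sort the NUL-delimited list, run the command once per file
$ find . -name "*.txt" -print0 | sort -z | xargs -0 -I{} ./my_command {}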

So what I need is something like this:

for_each_command1 -f list_of_files -c './my_command {}'
for_each_command2 -f list_of_files -exec ./my_command {} \;
for_each_command3 -c './my_command {}' 

My usual way to handle such tasks is with sed - unfortunately, there's a lot of overhead and it's not pretty (but it does work...):

$ wc -l list_of_files
232 list_of_files
$ for (( i=1; i<=232; i++ )); do
>     ./my_command "`sed -n "${i}p" list_of_files`"
> done
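(The same loop can be written less clunkily as a plain while-read over the file, which also avoids invoking sed once per line – a standard shell idiom rather than anything specific to my setup:)

$ while IFS= read -r filename; do    # IFS= and -r keep blanks and backslashes intact
>     ./my_command "$filename"
> done < list_of_files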

So is there a command or shell built-in to handle something like this?

Best Answer

Conveniently, xargs -I does exactly what you want:

$ xargs <my_file_list.txt -I filename ./my_command "filename"
-I replace-str
    Replace occurrences of replace-str in the initial-arguments with names read
    from standard input.
    Also, unquoted blanks do not terminate input items;
    instead the separator is the newline character.
    Implies -x and -L 1.

This means it takes exactly one newline-delimited input line and invokes your command with it as a single argument. Nothing apart from newline is treated as a delimiter, so spaces and other special characters are fine.
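Since you mentioned wanting the input sorted, you can combine this with sort on the way in – a small sketch reusing the same file and command names:

# sort the list first, then run the command once per line
$ sort my_file_list.txt | xargs -I filename ./my_command filename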

Note that if you want to allow newlines in your filenames, a NUL-terminated list (as per Wildcard's answer) will also work in a file.
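A sketch of that variant, assuming GNU xargs for -0 and a hypothetical NUL-delimited list file my_file_list.nul (e.g. one produced with find -print0):

# build a NUL-delimited list, then consume it with xargs -0
$ find /some/dir -type f -print0 > my_file_list.nul
$ xargs -0 <my_file_list.nul -I filename ./my_command filename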