Bash – How to Capture stdout Line by Line in Bash Script

bash, process, stdout

In a bash script, I would like to capture the standard output of a long-running command line by line, so that each line can be analysed and reported while the command is still running. This is the complicated way I can imagine doing it:

# Start long command in a separated process and redirect stdout to temp file
longcommand > /tmp/tmp$$.out &

# loop until the process completes
ps cax | grep longcommand > /dev/null
while [ $? -eq 0 ]
do
    # capture the last lines of the temp file and determine if there is new content to analyse
    tail /tmp/tmp$$.out

    # ...

    sleep 1  # sleep so as not to hog the CPU

    ps cax | grep longcommand > /dev/null
done

I would like to know if there is a simpler way of doing this.

EDIT:

In order to clarify my question, I will add this: longcommand displays its status line by line, once per second, and I would like to catch the output before longcommand completes.

This way, I can potentially kill longcommand if it does not produce the results I expect.
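Something along these lines is what I am after; this is only a sketch, with looks_ok standing in for whatever per-line check is needed:

# Sketch: read longcommand's output through a named pipe so the
# loop runs in the current shell and can kill the background job.
# looks_ok is a placeholder for the real per-line check.
fifo=/tmp/fifo$$
mkfifo "$fifo"

longcommand > "$fifo" &
pid=$!

while IFS= read -r line
do
    if ! looks_ok "$line"; then
        kill "$pid"    # stop longcommand early
        break
    fi
done < "$fifo"

rm -f "$fifo"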

I have tried:

longcommand |
  while IFS= read -r line
  do
    whatever "$line"
  done

But whatever (e.g. echo) only executes after longcommand completes.

Best Answer

Just pipe the command into a while loop. There are a number of nuances to this, but basically (in bash or any POSIX shell):

longcommand |
  while IFS= read -r line
  do
    whatever "$line"
  done

The other main gotcha with this (apart from the IFS handling) is what happens when you try to use variables set inside the loop after it has finished. The loop is actually executed in a sub-shell (just another shell process), so variables set there are not visible to the parent shell; the sub-shell also exits when the loop does, at which point those variables are gone completely. To get around this, you can do:

longcommand | {
  while IFS= read -r line
  do
    whatever "$line"
    lastline="$line"
  done

  # This won't work without the braces.
  echo "The last line was: $lastline"
}
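Another option (not from the original answer, but a common bash-specific idiom) is process substitution, which feeds the loop from the command while keeping the loop in the current shell:

# Process substitution: the loop runs in the current shell,
# so lastline is still set when the loop ends.
while IFS= read -r line
do
    whatever "$line"
    lastline="$line"
done < <(longcommand)

echo "The last line was: $lastline"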

Hauke's example of setting lastpipe in bash is another solution.
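Hauke's answer is not reproduced here, but a minimal sketch of that approach might look like this (lastpipe requires bash 4.2 or later and only takes effect when job control is off, as it is in non-interactive scripts):

#!/bin/bash
# lastpipe runs the last stage of a pipeline in the current shell,
# so variables set inside the loop survive it.
shopt -s lastpipe

longcommand |
  while IFS= read -r line
  do
    lastline="$line"
  done

echo "The last line was: $lastline"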

Update

To make sure you are processing the output of the command 'as it happens', you can use stdbuf to make the process's stdout line-buffered.

stdbuf -oL longcommand |
  while IFS= read -r line
  do
    whatever "$line"
  done

This makes the process write one line at a time into the pipe instead of internally buffering its output into blocks. Beware that the program can override this setting itself. A similar effect can be achieved with unbuffer (part of expect) or script.
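For instance, an equivalent pipeline using unbuffer might look like this (a sketch; unbuffer runs longcommand under a pseudo-terminal, which makes most stdio-based programs default to line buffering):

# unbuffer gives longcommand a pseudo-terminal, so its stdio
# defaults to line buffering even though stdout feeds a pipe.
unbuffer longcommand |
  while IFS= read -r line
  do
    whatever "$line"
  done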

stdbuf is available on GNU and FreeBSD systems. It only affects stdio buffering, and it only works for non-setuid, non-setgid applications that are dynamically linked (since it uses an LD_PRELOAD trick).
