Bash Pipeline – Continue Cascaded Commands After Failure

bash, error handling, pipe

I am running a command and manipulating the output with my own script, but I don't want the main command to stop when my script fails.

For example:

a-command | tee logfile.txt | myscript

When my script fails midway, I look in logfile.txt and find that it breaks off where my script stopped working. What I want is for a-command to continue and for logfile.txt to contain the full log, so that I can debug my script and fix the error.

How can I modify my command so that the last part of the pipeline is treated as optional, or is ignored on error, allowing the first two parts (the command and the tee) to finish their jobs?

[EDIT] a-command is a lengthy task, and my script basically manipulates its output to report status better while the command is running. So I don't want to run my script only after a-command finishes its work.

Best Answer

a-command | tee logfile.txt | { myscript; cat >/dev/null; }

This would run your pipeline as usual at first, until myscript terminates (for whatever reason). At that point, cat would take over reading from tee until there is no more data arriving. The data read by cat is discarded by piping it to /dev/null.

If a-command finishes without myscript ending or failing first, myscript sees end-of-file on its input and presumably terminates. At the point when myscript terminates, cat is started, but as there is no more data to read, it immediately terminates and the pipeline is done.
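To see the behaviour, here is a minimal sketch you can run directly; producer and flaky are hypothetical stand-ins for a-command and myscript:

# a long-running producer: one line per second, ten lines in total
producer() { for i in $(seq 10); do echo "line $i"; sleep 1; done; }

# a consumer that deliberately dies after reading three lines
flaky() { head -n 3; exit 1; }

producer | tee logfile.txt | { flaky; cat >/dev/null; }
wc -l logfile.txt   # prints 10: the full output was logged despite the failure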


Addressing TooTea's comment about making sure that we still get the correct exit status for the pipeline:

a-command | tee logfile.txt | ( myscript; err=$?; cat >/dev/null; exit "$err" )
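The subshell saves myscript's exit status before starting cat and exits with it afterwards, so the last element of the pipeline reports myscript's result rather than cat's. A minimal sketch of checking that status from the calling shell, with pipefail enabled so failures of a-command or tee are not masked either (a-command and myscript are the placeholders from the question):

set -o pipefail
a-command | tee logfile.txt | ( myscript; err=$?; cat >/dev/null; exit "$err" )
status=$?   # non-zero if a-command, tee, or myscript failed
printf 'pipeline exit status: %s\n' "$status"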