I have a bash script containing a group of commands in curly braces { ... }. The group contains some initial echo commands and then one loop. At each iteration the loop executes various slow commands (basically curl plus some extra parsing). Each iteration is slow (because of the network interaction) but prints one line (of Python code); as far as I can see, there should be no buffering issue coming from the commands themselves, because they finish their job and exit.

The whole group of commands is piped to python -u (I also tried with tail -f as a check), and the whole loop is executed before anything is read by python -u or tail -f.

I know how to unbuffer (when possible) a single command with tools such as stdbuf, but I don't think that can help here, because the issue seems to come from the command grouping rather than from any particular command.

Any hint?
Best Answer
(Note to future readers: the tone of exasperation here is not for the question, but for the mistakes I made trying to answer it and the multiple edits they entailed.)
Oh, for pity's sake. The problem is in tail -f. It's not the pipe, it's not the group. It's tail. As in, chasing our own tails! The group pipes just fine once tail is out of the picture.
tail -f failed because it doesn't output right away for some reason. I'm not sure why python -u is failing, but I don't think it's anything in the script. Maybe try unbuffer with it. Try your script with cat, at least, and verify that it's unbuffered in that case.

Earlier failed attempt intentionally left here so future readers can make sense of the comments.
This script exhibits the same kind of buffering problem you're getting:
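Neither of the two scripts survives in this copy of the answer, so what follows is a hedged reconstruction of the first one, mirroring the question's own setup (the sleep/echo pair stands in for the slow curl calls):

```shell
#!/bin/bash
# Reconstruction of the "problem" script: the group's stdout is piped directly
# into python -u. Nothing is printed until the loop ends, because python reads
# a non-interactive stdin script up to EOF before executing any of it.
{
    echo 'print("header")'
    for i in 1 2 3; do
        sleep 1                      # stand-in for a slow curl call
        echo "print($i)"            # one generated line of Python per iteration
    done
} | python -u
```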
This one does not. Output inside the group is redirected to stderr, then stderr from the whole group is piped to the command. Since it's stderr, it's unbuffered.
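A reconstruction of that second script, on the same assumptions as above (and keeping in mind that this attempt was later retracted): every echo inside the group writes to stderr, and { ... } 2>&1 | cmd then routes the group's stderr into the pipe.

```shell
#!/bin/bash
# Reconstruction of the stderr variant: each echo inside the group writes to
# stderr, and '{ ... } 2>&1 | cmd' routes the group's stderr into the pipe
# feeding the consumer.
{
    echo 'print("header")' >&2
    for i in 1 2 3; do
        sleep 1                      # stand-in for a slow curl call
        echo "print($i)" >&2        # written to stderr this time
    done
} 2>&1 | python -u
```

Note the order: the pipe connects the group's stdout to the consumer first, then 2>&1 duplicates stderr onto that same pipe.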
Adapted from Wang HongQin's answer in this question. The difficulty was in finding a way to unbuffer the pipe with braces rather than an explicit command. Had to fiddle around a while to get the redirection working properly.