Force a shell script to fflush
I was wondering if it was possible to tell bash that all calls to echo or printf should be followed by a call to fflush() on stdout or stderr, respectively?
A quick and dirty solution would be to write my own printf implementation that did this and use it in lieu of either built-in, but it occurred to me that I might not need to.
I'm writing several build scripts that run at once, and for debugging I really need to see the messages they write in order.
If commands use stdio and are connected to a terminal, their output is flushed per line. Otherwise you'll need to use something like stdbuf on the commands in a pipeline: http://www.pixelbeat.org/programming/stdio_buffering/
tl;dr: instead of printf ..., use stdbuf -o0 printf ... (unbuffered) or stdbuf -oL printf ... (line-buffered) in the script.
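For instance, here's a rough sketch of how that might look for a whole build script in a pipeline (build.sh and build.log are just placeholder names):

# Line-buffer the script's stdout even though it goes into a pipe.
# stdbuf preloads a library that adjusts stdio buffering, so it only
# affects programs that use stdio and don't set their own buffering.
stdbuf -oL ./build.sh 2>&1 | tee build.log

# Or disable output buffering entirely:
stdbuf -o0 ./build.sh 2>&1 | tee build.log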
If you force the file to be read, it seems to cause the buffer to flush. Both of the following work for me.
Either read the data into a useless variable:
x=$(<$logfile)
Or do a UUOC (useless use of cat):
cat $logfile > /dev/null
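If it helps, here's a small sketch of how that trick might be used while the build scripts are running (the build.log path is an assumption; point it at your own log):

logfile=build.log                # assumed path to one of the build logs
while sleep 1; do
    cat "$logfile" > /dev/null   # force a read, per the trick above
    tail -n 5 "$logfile"         # then show the most recent lines
done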
Maybe "stty raw" can help with some other tricks for end-of-lines handling. AFAIK "raw" mode turns off line based buffering, at least when used for serial port ("stty raw < /dev/ttyS0").