How to atomically print lines from multiple background processes in a bash script?
In a bash function I'm writing, I start multiple remote commands via ssh and run them in separate background processes. Each of these processes produces many lines of text, which are merged together and then sorted. My problem is that sometimes these lines get mixed together: one line starts printing, and before it finishes, another line starts printing on the same line.
My question is: what is the simplest way to make the output atomic, so that individual lines don't blend together? (Interleaving whole lines is fine; I just want the columns to line up.) One idea I had was to save the output of each parallel background process and then merge the results serially, but I haven't been able to get this to work (that method would be fine for me if I knew how to do it properly). For reference, here is an outline of the type of script I'm trying to write:
foo() {
    (
        pids=()
        for x in "$@"
        do
            (
                ssh "$x" 'some-high-latency-command-with-200-lines-of-data-output'
            ) &
            pids+=( "$!" )    # remember each background subshell's PID
        done
        for x in "${pids[@]}"
        do
            wait "$x"
        done
    ) 2> /dev/null
}
I would redirect each ssh run to its own file and merge them afterward. I also wouldn't use a wait loop; wait by itself waits for all backgrounded processes, or you can say wait ${pids[*]} if you really want just the ssh processes. A minimal sketch of that approach follows.
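Here is one way it might look (untested; it assumes mktemp -d is available and reuses the placeholder command from the question). Because the files are merged only after wait returns, every file is complete before anything is printed:

foo() {
    (
        tmpdir=$(mktemp -d) || return 1   # scratch directory for per-host output
        pids=()
        i=0
        for x in "$@"
        do
            # each ssh writes to its own file, so its lines cannot interleave
            ssh "$x" 'some-high-latency-command-with-200-lines-of-data-output' > "$tmpdir/$i" &
            pids+=( "$!" )
            i=$(( i + 1 ))
        done
        wait "${pids[@]}"                 # wait only for the ssh processes
        cat "$tmpdir"/*                   # merge the complete files serially
        rm -rf "$tmpdir"
    ) 2> /dev/null
}

The merge order of the glob is arbitrary beyond nine hosts, but since you sort the combined output afterward anyway, that shouldn't matter.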
I finally stumbled onto a solution that appears to work without creating files. Apparently, if I assign the output of ssh to a variable with declare, the lines are preserved, and using echo to print from this variable appears to be atomic. See the following:
foo() {
    (
        pids=()
        for x in "$@"
        do
            (
                # capture the whole multi-line output first...
                declare output=$(ssh "$x" 'some-command-with-multiline-output')
                # ...then print it in a single echo
                echo "$output"
            ) &
            pids+=( "$!" )
        done
        wait ${pids[*]}
    ) 2> /dev/null
}
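Usage is unchanged; for example (host1 etc. are placeholder host names):

foo host1 host2 host3 | sort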
Use some program to recompose the lines locally before printing them, for instance:
ssh "$x" 'some-high-latency-command-with-200-lines-of-data-output' | perl -pe1
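Here, perl -pe1 is just a pass-through filter (the program 1 is a no-op): it reads its input line by line and re-emits each complete line, so partial chunks arriving from ssh are recomposed locally before being printed. A sketch of how that might be dropped into the question's scaffolding:

foo() {
    (
        pids=()
        for x in "$@"
        do
            (
                # the perl filter reassembles whole lines before printing
                ssh "$x" 'some-high-latency-command-with-200-lines-of-data-output' | perl -pe1
            ) &
            pids+=( "$!" )
        done
        wait ${pids[*]}
    ) 2> /dev/null
}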