Unstable bash statement
I have code in my bash scripts that behaves erratically:

# check that every line of the check_list file is present in my_prog's output
MY_LIST=`./my_prog`
for l in $(cat check_list); do
    if ! echo -n "$MY_LIST" | grep -q -x "$l"; then
        die "Bad line: '$l'"
    fi
done

Within my large collection of scripts, this snippet reports "Bad line: 'smthng'" with a probability of roughly 1/5000. I was never able to reproduce the failure by running the snippet in isolation, only inside the full script suite.
However, this version seems to work reliably:

# check that every line of the check_list file is present in my_prog's output
./my_prog > my_list
for l in $(cat check_list); do
    if ! grep -q -x "$l" my_list; then
        die "Bad line: '$l'"
    fi
done
The reason I don't like the second version is that it uses an intermediate file, "my_list". What could be causing the unstable behaviour of the first version?
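For what it's worth, the intermediate file can also be avoided without command-substitution-plus-echo. The following is only a sketch — my_prog and die are hypothetical stand-ins defined inline so it runs on its own. It uses printf '%s\n' instead of echo (printf does not interpret leading dashes or backslashes in the data, which is one behavioural difference between the two versions, though whether that relates to the instability is speculation) and reads check_list with a redirect instead of $(cat ...):

```shell
#!/bin/bash
# Hypothetical stand-ins so this sketch is self-contained;
# in the real scripts, ./my_prog and die already exist.
my_prog() { printf 'alpha\nbeta\ngamma\n'; }
die() { echo "$*" >&2; exit 1; }

my_list=$(my_prog)                       # capture the output once
while IFS= read -r l; do
    # printf avoids echo's -n/-e option parsing and backslash
    # handling; -- guards grep against patterns starting with a dash
    if ! printf '%s\n' "$my_list" | grep -q -x -- "$l"; then
        die "Bad line: '$l'"
    fi
done < <(printf 'alpha\nbeta\n')         # stands in for check_list
echo "all lines found"
```

Reading the loop input via a redirect also avoids word splitting: $(cat check_list) splits on all whitespace, so a check_list line containing a space would be checked as two separate patterns.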
Instead of calling grep for every line in your check_list, you can run one awk program that reads your program's output once, then verifies each line of check_list against it:

awk '
FILENAME == ARGV[1] {seen[$0]; next}
!($0 in seen) {
    print "bad line: " $0
    exit 1
}
' <(./my_prog) check_list
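To see the awk idiom in action — checking that every line of a check list appears in some output — here is a self-contained run with inline stand-in data in place of ./my_prog and check_list:

```shell
#!/bin/bash
# Stand-in data: the output contains a, b, c; the check list expects
# a and d, so "d" should be reported as a bad (missing) line.
out=$(awk '
FILENAME == ARGV[1] {seen[$0]; next}
!($0 in seen) {
    print "bad line: " $0
    exit 1
}
' <(printf 'a\nb\nc\n') <(printf 'a\nd\n'))
rc=$?
echo "$out"
```

The FILENAME == ARGV[1] test is how the program tells the first input (the reference set it loads into the seen array) apart from the second (the lines to verify).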
Or, use comm to list the lines of check_list that are missing from your program's output:

missing=$( comm -23 <(sort -u check_list) <(./my_prog | sort -u) )
if [ -n "$missing" ]; then
    echo "bad lines: "
    echo "$missing"
    die
fi
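As a quick self-contained check of the comm idiom: comm -23 suppresses columns 2 (lines unique to the second input) and 3 (common lines), leaving only lines unique to the first input — with sorted inputs, that is exactly the check-list entries absent from the output. Stand-in data below:

```shell
#!/bin/bash
# Check list contains a and b; output contains a and c.
# "b" is unique to the check list, so it should be reported missing.
missing=$(comm -23 <(printf 'a\nb\n' | sort -u) <(printf 'a\nc\n' | sort -u))
echo "missing: $missing"
```

Note that comm requires both inputs to be sorted, which is why each side goes through sort -u first.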
I don't know what's wrong with the first version, but you can easily eliminate the temporary file. Note that you'll have to correct the logic — I did not fully understand it; you'll probably want to update a variable in the inner loop and decide whether to die after the inner loop finishes.
./my_prog | while read -r i; do
    for l in $(cat check_list); do
        if ! echo "$i" | grep -q -x "$l"; then
            die "Bad line: '$i', '$l'"
        fi
    done
done
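One caveat with piping into while: in bash, each segment of a pipeline runs in a subshell, so a die (exit) inside the loop only leaves the subshell, and variables set in the loop vanish afterwards. Feeding the loop through process substitution keeps it in the current shell. A minimal sketch, with a hypothetical my_prog stand-in:

```shell
#!/bin/bash
# Hypothetical stand-in for ./my_prog so the sketch runs on its own.
my_prog() { printf 'x\ny\n'; }

count=0
# done < <(...) keeps the while loop in the current shell, so
# assignments made inside it survive the loop (and exit would
# terminate the whole script, not just a subshell).
while IFS= read -r i; do
    count=$((count + 1))
done < <(my_prog)
echo "read $count lines"
```

With `my_prog | while ...` instead, count would still be 0 after the loop.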