Bash: Terminate on Timeout/File Overflow while Executing Command
I'm writing a mock-grading script in bash. It's supposed to execute a C program whose output I redirect to a file. I'm trying to (1) make it time out after a certain duration and also (2) terminate if the output file reaches a certain size limit. Not sure how to go about either of these. Any help? Thanks.
There's a GNU coreutils command, timeout, to handle timeouts.
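As a minimal sketch of that (assuming GNU coreutils timeout is available; ./student_prog and output.txt are placeholder names, not from the question):
timeout 30s ./student_prog > output.txt   # allow at most 30 seconds
if [ $? -eq 124 ]; then                   # GNU timeout exits 124 when the time limit is hit
    echo "program timed out" >&2
fi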
Investigate ulimit -f 32 to set the maximum file size (here 16 KiB; the limit is counted in 512-byte blocks).
Objection: ulimit is [not] suitable because I have to create other files as well; I need to limit only one of them.
Counter: Unless the program must create a big file and a little file and you have to limit just the little file, you can use a sub-shell to good effect:
(
    ulimit -f 32                      # applies only inside this sub-shell
    timeout 10m command arg > file
)
The limit on file size is restricted to the commands in the sub-shell (which is marked by the pair of parentheses).
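For the grading script itself, a hedged sketch of how the two pieces might be combined (the program name ./student_prog, the output file, and the 10-second / 16 KiB limits are placeholders, not from the question):
(
    ulimit -f 32                      # 32 * 512 bytes = 16 KiB output cap, sub-shell only
    timeout 10s ./student_prog > output.txt
)
status=$?
if [ "$status" -eq 124 ]; then
    echo "terminated: time limit exceeded" >&2
elif [ "$status" -ne 0 ]; then
    # hitting the file-size limit usually kills the program with SIGXFSZ,
    # so it also shows up here as a nonzero status
    echo "terminated abnormally (status $status)" >&2
fi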
You can use the timeout command, e.g. (here -s 9 sends SIGKILL once the 5-second limit expires):
timeout -s 9 5s ./c_program > file
To check the file size, you can stat the file and then branch on the result:
limit=1234                    # bytes
size=$(stat -c "%s" file)     # GNU stat: %s prints the size in bytes
if [ "$size" -gt "$limit" ]; then
    exit
fi
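Note that a one-shot check like this only notices the overflow after the program has finished; to actually terminate the program as soon as the file grows too large, one option is to poll its size in the background. A rough sketch under the same assumptions (./c_program, file, and the one-second polling interval are illustrative):
limit=1234                                # bytes
./c_program > file &
pid=$!
while kill -0 "$pid" 2>/dev/null; do      # loop while the program is still running
    size=$(stat -c "%s" file 2>/dev/null || echo 0)
    if [ "$size" -gt "$limit" ]; then
        kill "$pid"                       # stop the program once the file is too big
        break
    fi
    sleep 1
done
wait "$pid"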
See also here if you can't use these GNU tools, or here for some other inspiration.
This starts yourcommand, redirecting its output via dd to youroutputfile with a limit of 10,000,000 bytes; once that many bytes have been written, dd terminates and yourcommand receives SIGPIPE on its next write:
yourcommand | dd of=youroutputfile bs=1 count=10000000 &
This waits 5 seconds and then kills yourcommand if it has not already terminated:
sleep 5
kill %yourcommand
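One caveat: with bs=1, dd copies a byte at a time, which is slow for large outputs. If GNU head is available, head -c caps the size the same way but more efficiently (this variant is my own substitution, not part of the original answer); the sleep/kill step stays the same:
yourcommand | head -c 10000000 > youroutputfile &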