Script for finding average runtime of a program
I found partial solutions on several sites and pieced them together, but I still couldn't get it working.
Here is what I am doing:
I am running a simple Java program from the Terminal and need to find its average runtime.
What I am doing is running the command several times, finding the total time, and then dividing that total by the number of runs.
I would also like to capture the program's output rather than have it displayed on standard output.
Here is my current code and the output.
Shell Script:
startTime=$(date +%s%N)
for ((i = 0; i < $runTimes; i++))
do
    java Program test.txt > /dev/null
done
endTime=$(date +%s%N)
timeDiff=$(( $endTime - $startTime ))
timeAvg=$(( $timeDiff / $runTimes ))
echo "Avg Time Taken: "
echo $timeAvg
Output:
./run: line 12: 1305249784N: value too great for base (error token is "1305249784N")
The line number 12 in the error is off because this code is part of a larger file; it refers to the line where timeDiff is evaluated.
I appreciate any help, and apologize if this question is redundant or off-topic.
On my machine, I don't see what the %N format for date is getting you, as the last seven digits seem to be zeros, BUT it does make a much bigger number for the arithmetic to evaluate, e.g. 1305250833570000000.
In fact, the trailing N in your error token suggests your date doesn't support %N at all and is passing the N through literally, which is what breaks the arithmetic.
Do you really need nanosecond precision? I'll bet if you go with just %s it will be fine.
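For instance, here's a minimal sketch of that change. It also appends each run's output to a file instead of discarding it, since you said you wanted to keep the output (program_output.txt is just an illustrative name, and it assumes runTimes is set earlier in your file, as in the original). Note that shell arithmetic is integer-only, so with %s the average comes out in whole seconds:
startTime=$(date +%s)
> program_output.txt    # start with an empty capture file
for ((i = 0; i < $runTimes; i++))
do
    # append each run's output instead of sending it to /dev/null
    java Program test.txt >> program_output.txt
done
endTime=$(date +%s)
timeDiff=$(( $endTime - $startTime ))
# integer division: whole seconds per run
timeAvg=$(( $timeDiff / $runTimes ))
echo "Avg Time Taken: $timeAvg"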
Otherwise you look to be on the right track.
P.S. Oh yeah, minor point:
echo "Avg Time Taken: $timeAvg"
is a simpler way to achieve your required output ;-)
Option 2. You could take out the date calculations altogether and turn your loop into a small script. Then you can use a built-in feature of the shell:
time myJavaTest.sh
will give you details like:
real 0m0.049s
user 0m0.016s
sys 0m0.015s
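For reference, myJavaTest.sh could be a sketch as small as this (runTimes=10 is just a placeholder iteration count, and the C-style loop assumes bash):
#!/bin/bash
# myJavaTest.sh - run the program a fixed number of times;
# `time ./myJavaTest.sh` then reports the total, and dividing
# the "real" figure by runTimes gives the average per run.
runTimes=10
for ((i = 0; i < runTimes; i++))
do
    java Program test.txt > /dev/null
done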
I hope this helps.