How to calculate the execution time in C?
How can I calculate the execution time in the following code:
#include <stdio.h> /* Core input/output operations */
#include <stdlib.h> /* Conversions, random numbers, memory allocation, etc. */
#include <math.h> /* Common mathematical functions */
#include <time.h> /* Converting between various date/time formats */
#include <sys/time.h>
#define PI 3.1415926535 /* Known value of PI */
#define NDARTS 128 /* Number of darts thrown */
double pseudo_random(double a, double b) {
double r; /* Random number */
r = ((b - a) * ((double) rand()/(double) RAND_MAX)) + a;
return r;
}
int main (int argc, char *argv[]) {
int n_procs, /* Number of processors */
llimit, /* Lower limit for random numbers */
ulimit, /* Upper limit for random numbers */
n_circle, /* Number of darts that hit the circle */
i; /* Dummy/Running index */
double pi_sum, /* Sum of PI values from each WORKER */
x, /* x coordinate, between -1 & +1 */
y, /* y coordinate, between -1 & +1 */
z, /* Sum of x^2 and y^2 */
error; /* Error in calculation of PI */
clock_t start_time, /* Wall clock - start time */
end_time; /* Wall clock - end time */
struct timeval stime, starttime1, endtime1;
struct timeval tv1, tv2, diff;
llimit = -1;
ulimit = 1;
n_circle = 0;
printf("\n Monte Carlo method of finding PI\n\n");
printf(" Number of processors : %d\n", n_procs);
printf(" Number of darts : %d\n\n", NDARTS);
gettimeofday(&tv1, NULL);
gettimeofday(&stime, NULL);
srand(stime.tv_usec * stime.tv_usec * stime.tv_usec * stime.tv_usec);
for (i = 1; i <= NDARTS; i++) {
x = pseudo_random(llimit, ulimit);
y = pseudo_random(llimit, ulimit);
z = pow(x, 2) + pow(y, 2);
if (z <= 1.0) {
n_circle++;
}
}
pi_sum = 4.0 * (double)n_circle/(double)NDARTS;
pi_sum = pi_sum / n_procs;
error = fabs((pi_sum - PI)/PI) * 100;
gettimeofday(&tv2, NULL);
double timeval_subtract (result, x, y)
{
result = ((double) x - (double) y ) / (double)CLOCKS_PER_SEC;
}
double result1 = timeval_subtract(&diff, &tv1, &tv2);
printf(" Known value of PI : %11.10f\n", PI);
printf(" Average value of PI : %11.10f\n", pi_sum);
printf(" Percentage Error : %10.8f\n", error);
printf(" Time : \n", clock() );
printf(" Start Time : \n",&tv1);
printf(" End Time :\n", &tv2);
printf(" Time elapsed (sec) : \n", result1 );
return 0;
}
I used the timeval_subtract function, and when I execute the code I get:
Monte Carlo method of finding PI
Number of processors : 16372
Number of darts : 128
Known value of PI : 3.1415926535
Average value of PI : 0.0002004184
Percentage Error : 99.99362048
Time :
Start Time :
End Time :
Time elapsed (sec) :
First, I couldn't find the mistake in the number of processors (it should be 1).
Second, and this is the most important point: why do I get Time, Start Time, End Time and Time elapsed empty?
Because you don't have adequate format strings for them. You need something starting with a '%', like:
printf(" Time :%d \n", clock() );
n_procs is never initialized; the 16372 that gets printed just happens to be whatever was previously on the stack.
The C standard library doesn't provide functionality to query the processor count or high-performance timers, so you will have to look at other means of querying this. For instance, both the POSIX and Windows APIs provide such functionality.
Edit: see Programmatically find the number of cores on a machine for how to initialize n_procs. Seeing how you use gettimeofday, you're probably on some Unix variant; n_procs = sysconf(_SC_NPROCESSORS_ONLN); is probably what you want.
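For illustration, a minimal sketch of that initialization on a POSIX system (the fallback to 1 is a defensive addition of mine, not part of the original answer):
#include <unistd.h> /* sysconf, _SC_NPROCESSORS_ONLN */
long n_procs = sysconf(_SC_NPROCESSORS_ONLN); /* processors currently online */
if (n_procs < 1) {
n_procs = 1; /* sysconf returns -1 on failure; assume a single processor */
}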
Try this:
printf(" Time : %lu\n", clock() );
printf(" Start Time : %lds %ldus\n", tv1.tv_sec, tv1.tv_usec);
printf(" End Time : %lds %ldus\n", tv2.tv_sec, tv2.tv_usec);
And instead of:
double timeval_subtract (result, x, y)
use the following, which returns the time difference in microseconds:
long timeval_subtract (struct timeval * result, struct timeval * x, struct timeval * y)
{
/* convert both timestamps to microseconds, then subtract */
long usec = x->tv_sec * 1000000L + x->tv_usec;
usec -= (y->tv_sec * 1000000L + y->tv_usec);
/* split the difference back into seconds and microseconds */
result->tv_sec = usec / 1000000L;
result->tv_usec = usec % 1000000L;
return usec;
}
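A minimal usage sketch with the question's variables (note that the later timestamp is passed first, otherwise the difference comes out negative):
struct timeval tv1, tv2, diff;
gettimeofday(&tv1, NULL);
/* ... the dart-throwing loop ... */
gettimeofday(&tv2, NULL);
long usec = timeval_subtract(&diff, &tv2, &tv1);
printf(" Time elapsed (sec) : %f\n", usec / 1000000.0);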
Depending on the difference between the two timestamps x and y, the return value of timeval_subtract (not the value stored in result!) might be wrong due to an overflow.
Assuming a long is 32 bits wide, the overflow occurs for differences larger than about 2147 s (LONG_MAX microseconds); moreover, with a 32-bit long the intermediate product x->tv_sec * 1000000L already overflows for present-day epoch timestamps. With a 64-bit long (which should be the case on 64-bit machines), the overflow would occur much later ... ;-)
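One way to sidestep this regardless of the width of long is to do the arithmetic in a fixed 64-bit type. A sketch of that idea (my own variant, not from the answer above):
#include <stdint.h> /* int64_t */
int64_t timeval_diff_usec (const struct timeval *x, const struct timeval *y)
{
/* widen to 64 bits before multiplying, so 32-bit platforms don't overflow */
return ((int64_t) x->tv_sec - (int64_t) y->tv_sec) * 1000000
+ ((int64_t) x->tv_usec - (int64_t) y->tv_usec);
}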
I'd try the following:
int timeval_subtract ( struct timeval *result, struct timeval *x, struct timeval *y ) {
/* perform the carry for the later subtraction by updating y */
if ( x->tv_usec < y->tv_usec ) {
int nsec = ( y->tv_usec - x->tv_usec ) / 1000000 + 1;
y->tv_usec -= 1000000 * nsec;
y->tv_sec += nsec;
}
if ( x->tv_usec - y->tv_usec > 1000000 ) {
int nsec = ( x->tv_usec - y->tv_usec ) / 1000000;
y->tv_usec += 1000000 * nsec;
y->tv_sec -= nsec;
}
/* compute the difference; tv_usec is now certainly positive */
result->tv_sec = x->tv_sec - y->tv_sec;
result->tv_usec = x->tv_usec - y->tv_usec;
/* return 1 if the difference is negative, i.e. x < y */
return x->tv_sec < y->tv_sec;
}
void Start ( struct timeval *timer_profiling ) {
if ( timer_profiling == NULL ) return;
gettimeofday ( timer_profiling , NULL ); /* record the start timestamp */
}
void End ( struct timeval *timer_profiling , char *msg ) {
struct timeval res;
struct timeval now;
gettimeofday ( &now , NULL );
if ( msg == NULL ) return;
timeval_subtract ( &res , &now , timer_profiling );
/* format as "milliseconds,remaining microseconds"; plain integer arithmetic
replaces the original round() calls, which were no-ops on an already
integral division result */
sprintf ( msg , "[ %ld,%.3ld ms ]" , res.tv_sec * 1000 + res.tv_usec / 1000 , res.tv_usec % 1000 );
}
Call Start(&s) with an allocated struct timeval s, then retrieve the elapsed time as a string by calling End(&s, buff).
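For example, a brief usage sketch (the buffer size and the timed work are placeholders):
struct timeval s;
char buff[64]; /* large enough for the "[ ... ms ]" message */
Start ( &s );
/* ... code to be timed ... */
End ( &s , buff );
printf ( "Elapsed: %s\n" , buff );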