
Bad Results: time(NULL) and clock()

#include <stdio.h>
#include <time.h>

int main (void) {

    printf("Clock ticks per second: %d\n", CLOCKS_PER_SEC);
    double check = clock();
    int timex = time(NULL);

    for (int x = 0; x <= 500000; x++) {

        printf(".");

    }
    puts("\n");

    printf("Total Time by Clock: %7.7f\n", (clock() - check) / CLOCKS_PER_SEC );
    printf("Total Time by Time: %d\n", time(NULL) - timex);

    getchar();
}

When I execute the above code I get results like:

Total Time by Clock: 0.0108240

Total Time by Time: 12

I would like clock() to report a number as close as possible to what time() reports.

The total time shown above was measured on a MacBook; the same code works fine on my Windows laptop.

The CLOCKS_PER_SEC macro expands to 1000 on the PC and 1,000,000 on the Mac.


clock() on Windows returns wall-clock time. clock() on *nix systems returns the CPU time your program has spent, which is not going to be much here: the process is most likely blocked doing I/O.
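
For example, here is a minimal sketch (assuming a POSIX system, where sleep() comes from <unistd.h>; on Windows the contrast will not show up, since clock() is wall time there): the process sleeps for two seconds, time() advances by about two seconds, but clock() barely moves, because a blocked process accrues almost no CPU time.

#include <stdio.h>
#include <time.h>
#include <unistd.h>   /* POSIX sleep() */

int main(void) {
    clock_t c0 = clock();
    time_t  t0 = time(NULL);

    sleep(2);  /* blocked: no CPU time consumed */

    printf("CPU seconds:  %f\n", (double)(clock() - c0) / CLOCKS_PER_SEC);
    printf("Wall seconds: %.0f\n", difftime(time(NULL), t0));
    return 0;
}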


printf() to the console makes a system call on each invocation, and time spent blocked while the console redraws, etc., does not count toward process (CPU) time.

Do some heavy computation there instead:

#include <math.h>  /* for sqrt() */

double sum = 0.0;
for (long long x = 0; x <= 5000000000LL; x++)
    sum += sqrt((double)x);  /* accumulate so the call is not optimized away */
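
Putting that loop into the timing harness from the question gives a runnable sketch; with no blocking I/O in the loop, the two timers should roughly agree (the loop bound is arbitrary, so tune it to your machine):

#include <math.h>
#include <stdio.h>
#include <time.h>

int main(void) {
    clock_t c0 = clock();
    time_t  t0 = time(NULL);

    /* CPU-bound work instead of console I/O */
    double sum = 0.0;
    for (long long x = 0; x <= 5000000000LL; x++)
        sum += sqrt((double)x);

    printf("sum = %f\n", sum);  /* use the result so the loop is not elided */
    printf("Total Time by Clock: %7.7f\n", (double)(clock() - c0) / CLOCKS_PER_SEC);
    printf("Total Time by Time:  %.0f\n", difftime(time(NULL), t0));
    return 0;
}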


time() returns a time_t. When you assign that to an int it is possible that you lose information. What happens if you use time_t throughout?

#include <stdio.h>
#include <time.h>

int main(void) {
    time_t timex = time(0);
    /* ... */
    /* keep the arithmetic in time_t; cast only when printing */
    printf("%d", (int)(time(0) - timex));
}
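
As a follow-up, the portable way to take the difference of two time_t values is difftime(), which returns the elapsed seconds as a double, so no cast to int is needed at all; a small sketch:

#include <stdio.h>
#include <time.h>

int main(void) {
    time_t timex = time(0);
    /* ... */
    /* difftime() handles the subtraction however time_t is represented */
    printf("%.0f", difftime(time(0), timex));
}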
