I'm learning C (and Cygwin) and trying to finish a simple remote execution system for an assignment, but the C timer difference keeps coming back as 0 ms.
One simple requirement is: 'the client will report the time taken by the server to respond to each query.'
I've tried searching around and implementing other working solutions, but I always get 0 as the result.
Here is a snippet of what I have:
#include <stdio.h>
#include <string.h>
#include <strings.h>   /* bzero */
#include <time.h>
#include <unistd.h>    /* read, write, sleep */

/* sendline/recvline are 1024-byte buffers and 'sock' is a connected
   socket descriptor, all set up earlier in the program. */
for (;;)
{
    //- Reset loop variables
    bzero(sendline, 1024);
    bzero(recvline, 1024);
    printf("> ");
    fgets(sendline, 1024, stdin);

    //- Handle program 'quit'
    sendline[strcspn(sendline, "\n")] = 0;
    if (strcmp(sendline, "quit") == 0) break;

    //- Process & time command
    clock_t start = clock(), diff;
    write(sock, sendline, strlen(sendline) + 1);
    read(sock, recvline, 1024);
    sleep(2);
    diff = clock() - start;
    int msec = diff * 1000 / CLOCKS_PER_SEC;
    printf("%s (%d s/%d ms)\n\n", recvline, msec / 1000, msec % 1000);
}
I've also tried using floats instead of ints, and multiplying by 10000 instead of dividing by 1000, just to see if there is any glimmer of a value, but it always comes back as 0. Clearly something about how I'm implementing this is wrong, but after a lot of reading I can't work it out.
- EDIT -
Printout of the values:
clock_t start = clock(), diff;
printf("Start time: %lld\n", (long long) start);
//process stuff
sleep(2);
printf("End time: %lld\n", (long long) clock());
diff = clock() - start;
printf("Diff time: %lld\n", (long long) diff);
printf("Clocks per sec: %d", CLOCKS_PER_SEC);
Result:
Start time: 15
End time: 15
Diff time: 0
Clocks per sec: 1000
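For what it's worth, that output is consistent with what clock() is defined to measure: processor time used by the program, not elapsed wall-clock time. Time spent blocked in read() or sleep() consumes essentially no CPU, so the counter barely moves between the two calls. A minimal standalone sketch (not part of the program above, just an illustration) that shows the contrast:

#include <stdio.h>
#include <time.h>
#include <unistd.h>

int main(void)
{
    //- clock() around a blocking wait: the process uses almost no CPU time
    clock_t a = clock();
    sleep(2);
    clock_t b = clock();
    printf("around sleep(2):  %ld ticks\n", (long)(b - a));  /* typically 0 */

    //- clock() around CPU-bound work: the counter advances
    clock_t c = clock();
    volatile unsigned long n = 0;
    for (unsigned long i = 0; i < 100000000UL; i++) n += i;
    clock_t d = clock();
    printf("around busy loop: %ld ticks\n", (long)(d - c));  /* usually > 0 */
    return 0;
}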
- FINAL WORKING CODE -
#include <sys/time.h>

//- Setup clock
struct timeval start, end;

//- Start timer
gettimeofday(&start, NULL);

//- Process command
/* Process stuff */

//- End timer
gettimeofday(&end, NULL);

//- Calculate difference in microseconds
//  (subtract the seconds first so the multiply can't overflow a 32-bit long)
long usec = (end.tv_sec - start.tv_sec) * 1000000L +
            (end.tv_usec - start.tv_usec);

//- Convert to milliseconds
double msec = (double) usec / 1000;

//- Print result (3 decimal places)
printf("\n%s (%.3fms)\n\n", recvline, msec);
Note: I have read [this question](http://stackoverflow.com/questions/18436734/c-unix-millisecond-timer-returning-difference-of-0?rq=1) and its solution did not work. – Blake
What is the raw value of 'clock_t'? 'printf("%lld\n", (long long) start)' – chux
The problem is in your assignment: you are assigning the result to an 'int', which ends up as 0 for any value less than 1. Use 'float msec = float(clock() - start) / CLOCKS_PER_SEC;' instead. –
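For completeness, that last suggestion written with a C-style cast (and scaled to milliseconds, as the variable name implies) would look roughly like the line below. It avoids the integer truncation, but since clock() only counts CPU time, the time spent blocked waiting for the server's reply would still read as (nearly) 0, which is why the gettimeofday() version above was needed.

float msec = (float)(clock() - start) * 1000.0f / CLOCKS_PER_SEC;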