Timing
-
I am writing a Windows application in Visual C++, and I want a program loop to take exactly one thirtieth of a second per iteration. How can I make the program find the time since it started running (or similar), accurate to a millisecond or a microsecond? I have found many time-related functions, but they all give the time only to the nearest whole second.
-
GetTickCount
will report with millisecond accuracy - but it's lumpier than that. It used to change at ~19 ms intervals, though I think it's finer now. If you want *exactly* 1/30 of a second, then Windows is the wrong OS for you. After that, it boils down to how exact you need to be. You could use SetTimer (..., 33, ...), or waitable timers, or... Do you want no less than 33 ms between events? Thirty events spread fairly evenly over each second, where you can handle some jitter? Short version: define "exact". Iain.
I have now moved to Sweden for love (awwww). If you're in Scandinavia and want an MVP on the payroll (or are happy with a remote worker), or need contract work done, give me a job! http://cv.imcsoft.co.uk/[^]
-
QueryPerformanceCounter can give better resolution than GetTickCount, time(), or SetTimer. But you will never get a loop that takes exactly 1/30 s in Windows - you can get close on average, but not exactly, all the time. Windows just isn't that kind of OS.
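As a hedged illustration (not from the post above): in modern C++ the portable way to get the same kind of high-resolution, monotonic reading is std::chrono::steady_clock, which MSVC implements on top of QueryPerformanceCounter. The helper name `elapsed_us` here is invented for the sketch.

```cpp
#include <chrono>

// Microseconds between two steady_clock readings. On Windows/MSVC,
// steady_clock is backed by QueryPerformanceCounter, so this gives
// sub-millisecond "time since some fixed point" measurements.
long long elapsed_us(std::chrono::steady_clock::time_point start,
                     std::chrono::steady_clock::time_point end)
{
    return std::chrono::duration_cast<std::chrono::microseconds>(end - start)
        .count();
}
```

Record one reading at program start and subtract it from later readings to get "time since the program started running".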
-
Hey warrior, how goes on the great invasion of Sweden? :-D
If the Lord God Almighty had consulted me before embarking upon the Creation, I would have recommended something simpler. -- Alfonso the Wise, 13th Century King of Castile.
This is going on my arrogant assumptions. You may have a superb reason why I'm completely wrong. -- Iain Clarke
[My articles]
-
Use the multimedia timer with a callback function; it is the most accurate timing you can have in Windows. Look up the following in the VC++ help: TIMECAPS, timeGetDevCaps, timeBeginPeriod, timeSetEvent.
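For illustration, here is a portable sketch of the periodic-callback pattern that timeSetEvent provides (the `PeriodicTimer` class is invented for this sketch, not a Windows API; on Windows you would instead call timeBeginPeriod(1) and then timeSetEvent with TIME_PERIODIC):

```cpp
#include <atomic>
#include <chrono>
#include <functional>
#include <thread>

// Fire `cb` every `period_ms` milliseconds until stop() is called.
// Mimics the callback semantics of timeSetEvent(period_ms, 1, cb, 0,
// TIME_PERIODIC), using a worker thread instead of the multimedia timer.
class PeriodicTimer {
public:
    void start(int period_ms, std::function<void()> cb) {
        running_ = true;
        worker_ = std::thread([this, period_ms, cb] {
            auto next = std::chrono::steady_clock::now();
            while (running_) {
                next += std::chrono::milliseconds(period_ms);
                cb();
                // Absolute deadline, so timing errors don't accumulate.
                std::this_thread::sleep_until(next);
            }
        });
    }
    void stop() {
        running_ = false;
        if (worker_.joinable()) worker_.join();
    }
private:
    std::atomic<bool> running_{false};
    std::thread worker_;
};
```

The callback runs on the timer's own thread - the same caveat applies to timeSetEvent, whose callback runs outside your main thread.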
-
Going great - see the change in my sig! Iain.
I have now moved to Sweden for love (awwww).
-
This "event setting" sounds complicated, like trapping errors. My program uses this method, and it seems to work:
NEXTFRAME: t=GetTickCount();                        // timestamp at start of frame
InvalidateRect(wn,0,0); RedrawWindow(wn,0,0,RDW_UPDATENOW);
for(j=0;j<4;j++) {if(!isdigit(MFN[i+j])) goto DEF; FN[j]=MFN[i+j];}  // copy 4-digit frame number out of the filename
fn=(FN[0]-'0')*1000+(FN[1]-'0')*100+(FN[2]-'0')*10+FN[3]-'0'+1;
if(fn>9999) goto DEF;
FN[3]++; if(FN[3]>'9') {FN[3]='0'; FN[2]++;}        // increment the digits with carry
if(FN[2]>'9') {FN[2]='0'; FN[1]++;}
if(FN[1]>'9') {FN[1]='0'; FN[0]++;}
MFN[i]=FN[0]; MFN[i+1]=FN[1]; MFN[i+2]=FN[2]; MFN[i+3]=FN[3];
if(!readfile(I,MFN,wn)) goto DEF;                   // read from file MFN to image I, running under Windows window wn
do {} while(GetTickCount()-t<33); goto NEXTFRAME;   // spin until 33 ms have passed
But the loop in the last line looks like it ties up the computer until it is time to display the next frame, instead of letting another process run.
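For comparison, a hedged sketch of the same frame loop with the spin replaced by sleeping until an absolute deadline, in portable C++ (the function name `run_paced` and the callback are invented for the sketch; on Windows, Sleep or a waitable timer would play the same role of yielding the CPU):

```cpp
#include <chrono>
#include <functional>
#include <thread>

// Run `frames` iterations of `work`, pacing each to `frame_ms` ms.
// sleep_until blocks this thread so other processes can run, instead
// of burning CPU in a do{}while spin on GetTickCount().
// Returns total elapsed milliseconds.
long long run_paced(int frames, int frame_ms,
                    const std::function<void()>& work)
{
    using clock = std::chrono::steady_clock;
    auto start = clock::now();
    auto deadline = start;
    for (int i = 0; i < frames; ++i) {
        work();                                   // draw / read the frame
        deadline += std::chrono::milliseconds(frame_ms);
        std::this_thread::sleep_until(deadline);  // waits 33 ms minus work time
    }
    return std::chrono::duration_cast<std::chrono::milliseconds>(
               clock::now() - start).count();
}
```

Because the deadline is advanced by a fixed step rather than re-read from the clock, small overruns in one frame don't push every later frame back.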
-
I would have thought that, in these times when desktop computers run at a gigahertz and more, it would be routinely easy for a C++ program to time events to the microsecond.
-
It's really not that complicated once you have done it a couple of times. The do-while loop that you are using is extremely wasteful in terms of CPU utilization. The proper way to block a thread is with WaitForMultipleObjects. Also, you should never block the main thread of your application. The best solution would really be to use a background thread to draw on a bitmap and use BitBlt to transfer the image to the screen every 33 ms. As a side note, the values returned by GetTickCount are really only accurate to about 15.6 ms, so using it to time 33 ms isn't going to work well on a loaded machine. The advantage of the multimedia timer is that you can truly have 1 ms accuracy. And by using a background thread and proper thread synchronization, your code will scale properly on faster and multi-core machines and still work correctly on a slower or heavily loaded machine.
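A hedged, portable sketch of that structure (the `FramePipe` type is invented for illustration): a background thread produces a frame into a back buffer, and the consumer blocks on a condition variable - the role WaitForMultipleObjects plays in the Windows version - until a frame is ready, then copies it out (the stand-in for BitBlt).

```cpp
#include <condition_variable>
#include <mutex>
#include <thread>
#include <utility>
#include <vector>

// Background thread "renders" into a back buffer; the consumer blocks
// (0% CPU, unlike a spin loop) until a frame is ready, then takes it.
struct FramePipe {
    std::mutex m;
    std::condition_variable ready;
    std::vector<int> back;       // stand-in for the off-screen bitmap
    bool has_frame = false;

    void produce(std::vector<int> pixels) {           // worker thread side
        {
            std::lock_guard<std::mutex> lk(m);
            back = std::move(pixels);
            has_frame = true;
        }
        ready.notify_one();
    }
    std::vector<int> consume() {                      // display thread side
        std::unique_lock<std::mutex> lk(m);
        ready.wait(lk, [this]{ return has_frame; });  // blocks, no busy-wait
        has_frame = false;
        return back;
    }
};
```

The 33 ms pacing would then sit on the consumer side (e.g. a condition-variable wait with a deadline), while file reading and decoding happen on the worker, so a slow read never stalls the display thread's message loop.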
-
In this case I need to wait not 33 ms absolutely, but (33 ms minus the time taken to read that frame from a file).
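One way to get exactly that: timestamp before the read, then sleep only for whatever is left of the 33 ms budget. A hedged sketch in portable C++ (the function name `sleep_remainder` is invented; on Windows, Sleep(remaining) or a waitable timer would do the final wait):

```cpp
#include <algorithm>
#include <chrono>
#include <thread>

// Sleep for whatever is left of a `frame_ms` budget after the work that
// began at `work_started`. If reading the file already took the whole
// budget (or more), don't sleep at all. Returns the ms requested to sleep.
long long sleep_remainder(std::chrono::steady_clock::time_point work_started,
                          int frame_ms)
{
    auto elapsed = std::chrono::duration_cast<std::chrono::milliseconds>(
        std::chrono::steady_clock::now() - work_started).count();
    long long remaining = std::max<long long>(0, frame_ms - elapsed);
    std::this_thread::sleep_for(std::chrono::milliseconds(remaining));
    return remaining;
}
```

So the frame loop becomes: take the timestamp, read and display the frame, then call this with a 33 ms budget - the wait shrinks automatically as the file read gets slower.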