AS3: Synchronize Timer event to actual time?
I plan to use a timer event to fire every second (for a clock application).
I may be wrong, but I assume there will probably be a (very slight) sync issue with the actual system time. For example, the timer event might fire when the system time's milliseconds are at 500 instead of 0, meaning the seconds will be partially 'out of phase', if you will.
Is there a way to either synchronize the timer event to the real time, or get some kind of system time event to fire when a second ticks in AS3?
Also, if I set a Timer to fire every 1000 milliseconds, is that interval guaranteed, or can there be some offset depending on application load?
These are probably negligible issues but I'm just curious.
Thanks.
You can get the current time using Date. If you really wanted to, you could control your timer's jitter by aligning it with what Date returns. I'm not certain this would even be an issue, certainly not if your application isn't kept running for long periods, and even then I'm not sure the error would build up very quickly.
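If it helps, here is a rough sketch of that alignment idea: delay the Timer's start by the remainder of the current second, read from Date.milliseconds, so its first tick lands near a full second (the setTimeout bootstrap and the names here are my own illustration, not part of the question):

    import flash.events.TimerEvent;
    import flash.utils.Timer;
    import flash.utils.setTimeout;

    // How far into the current second are we? Delay the Timer's start
    // by the remainder so its first tick lands near a full second.
    var untilNextSecond:int = 1000 - new Date().milliseconds;
    var clockTimer:Timer = new Timer(1000); // then tick once per second

    setTimeout(startClock, untilNextSecond);

    function startClock():void {
        clockTimer.addEventListener(TimerEvent.TIMER, onTick);
        clockTimer.start();
    }

    function onTick(e:TimerEvent):void {
        trace(new Date().toTimeString()); // ideally close to the :000 ms mark
    }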
Note that OS timers are usually only accurate to within a few milliseconds, so you may need a different approach entirely if you need that kind of accuracy.
I just thought of a simple way to at least increase the accuracy of the timer: reducing the Timer interval reduces the maximum margin of error.
Setting the Timer to fire every 100ms instead of 1000ms, for example, caps the maximum error at 99ms instead of 999ms. Depending on the accuracy/performance trade-off required, these values can be tweaked.
If instead the timer frequency must remain 1 second, we can create a temporary initializing timer that fires at very quick intervals (~1ms). At each tick we keep track of the current time and the previous time. If the seconds value changed (the real clock ticked over), we start the main Timer (with 1000ms) at that instant. This would ensure that the 1-second timer starts when the real time's seconds change, with an error in the single-digit milliseconds instead of three-digit milliseconds.
Of course, again, there will be some lag between when the time change is detected and when the timer actually gets started. But at least the accuracy can be increased while retaining a Timer of 1s.
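Something like the following sketch (syncTimer/mainTimer are my own names; also note that in practice Flash timers won't truly fire every 1ms, since Timer resolution is bounded by the runtime's frame rate, so the real margin of error is closer to one frame interval):

    import flash.events.TimerEvent;
    import flash.utils.Timer;

    var syncTimer:Timer = new Timer(1);    // temporary initializing timer
    var mainTimer:Timer = new Timer(1000); // the real 1-second clock timer
    var lastSeconds:int = new Date().seconds;

    syncTimer.addEventListener(TimerEvent.TIMER, onSyncTick);
    syncTimer.start();

    function onSyncTick(e:TimerEvent):void {
        var nowSeconds:int = new Date().seconds;
        if (nowSeconds != lastSeconds) {
            // The real clock's seconds just changed: stop polling and
            // start the 1s timer at this instant so it stays aligned.
            syncTimer.stop();
            mainTimer.addEventListener(TimerEvent.TIMER, onClockTick);
            mainTimer.start();
        }
        lastSeconds = nowSeconds;
    }

    function onClockTick(e:TimerEvent):void {
        trace(new Date().toTimeString());
    }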
I will try this out some time this week and report back.
Unfortunately, I don't think Timer is guaranteed to stay at the same rate. It's very likely it will drift over a period of time (e.g. if you set it for 1000ms, it may fire every 990ms; a very small difference, but over time it adds up). I think you should do as you said: fire every 100ms or so, and then check the Date object to determine whether one second has passed yet.
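For example, something along these lines (a rough sketch; the 100ms interval and updateClockDisplay are illustrative stand-ins, not a prescribed API):

    import flash.events.TimerEvent;
    import flash.utils.Timer;

    var pollTimer:Timer = new Timer(100); // poll well below the 1s resolution
    var displayedSecond:int = -1;

    pollTimer.addEventListener(TimerEvent.TIMER, onPoll);
    pollTimer.start();

    function onPoll(e:TimerEvent):void {
        var now:Date = new Date();
        // Only advance the clock when the wall-clock second actually
        // changes, so any Timer drift never accumulates into the display.
        if (now.seconds != displayedSecond) {
            displayedSecond = now.seconds;
            updateClockDisplay(now);
        }
    }

    function updateClockDisplay(time:Date):void {
        trace(time.toTimeString()); // stand-in for the real clock UI update
    }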