Timing Accuracy of "Move" command
Posted: Wed Nov 09, 2016 8:27 pm
If I move an object to the points of a certain graphic in, say, 3 seconds, the actual time to travel the course is always about 0.25 seconds more. The extra time does not scale: if I move in 30 seconds, it is still about 0.25 seconds more.
Guess what it is in 300 seconds.
This was carefully measured with a digital stopwatch. If I use the ticks to measure the timing, I get "accurate" readings. It seems the ticks are in cahoots with LC seconds, but not real seconds. In other words:
on mouseUp
   put the ticks into tt
   wait 5 seconds
   answer the ticks - tt
end mouseUp
This gives 300 in the dialog (60 ticks per second times 5 seconds). But the measured wall-clock time is, just guess, about 0.25 seconds longer. It is obviously a conspiracy.
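One way to check whether the ticks and the milliseconds are telling the same story is to sample both clocks across the same wait. A quick sketch; since there are 60 ticks per second, the two readings should agree:

```
on mouseUp
   -- sample both clocks across the same wait
   put the ticks into startTicks
   put the milliseconds into startMs
   wait 5 seconds
   put the ticks - startTicks into dTicks
   put the milliseconds - startMs into dMs
   -- dTicks * 1000 / 60 should be close to dMs if the clocks agree
   answer dTicks && "ticks vs" && dMs && "ms"
end mouseUp
```

If the two readings agree but a stopwatch still shows ~0.25 s extra, the overhead is outside the wait itself.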
There have been many threads where users did indeed require pretty good accuracy, and many methods to get pretty close. All well and good.
If the ticks are just a little bit longer than they ought to be, the difference would scale. Why is there a constant extra quarter second?
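If it really is a fixed startup cost in the move command, timing the same path at several durations against the milliseconds should show the same ~0.25 s surplus each time. A sketch, assuming a graphic named "ball" and a path graphic named "path" (both placeholder names):

```
on mouseUp
   repeat for each item d in "1,3,10"
      put the milliseconds into m
      -- "ball" and "path" are hypothetical object names
      move graphic "ball" to the points of graphic "path" in d seconds
      put d && "s requested:" && (the milliseconds - m) / 1000 && "s measured" & return after tReport
   end repeat
   answer tReport
end mouseUp
```

A constant surplus at every duration would point to per-move setup overhead rather than a slow clock.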
Craig