Guess what it is in 300 seconds.
The script below was carefully timed with a digital stopwatch. If I use the ticks to measure the timing, I get "accurate" readings. It seems the ticks are in cahoots with LC seconds, but not with real seconds. In other words:
Code:
on mouseUp
   put the ticks into tt
   wait 5 seconds -- 5 seconds should be exactly 300 ticks (60 ticks per second)
   answer the ticks - tt
end mouseUp
That gives 300 in the dialog. But the measured time is, just guess, about 0.25 seconds longer. It is obviously a conspiracy.
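For what it is worth, here is a sketch that runs the same test with the milliseconds alongside the ticks, just to check whether LC's two internal clocks at least agree with each other; the milliseconds give much finer resolution than the 60-per-second ticks:

Code:
on mouseUp
   -- cross-check the ticks (60 per second) against the milliseconds (1000 per second)
   put the milliseconds into tm
   put the ticks into tt
   wait 5 seconds
   put the ticks - tt into tickDelta
   put the milliseconds - tm into msDelta
   answer "ticks:" && tickDelta & cr & "milliseconds:" && msDelta
end mouseUp

If both report almost exactly 5 seconds while the stopwatch says 5.25, then the discrepancy is in the clock they share, not in one property or the other.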
There have been many threads where users genuinely needed good timing accuracy, and many methods for getting pretty close. All well and good.
If the ticks were just a little bit longer than they ought to be, the difference would scale with the length of the wait. So why is there a constant extra quarter second?
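If anyone cares to repeat the stopwatch test at a few different lengths, a sketch along these lines should do it. Start the stopwatch on the first beep and stop it on the second, once per duration, and see whether the extra bit stays near a quarter second or grows with the wait:

Code:
on mouseUp
   -- run the same stopwatch test at several durations;
   -- a fixed overhead should stay near 0.25 s, a slow clock would scale with the wait
   repeat for each item tDuration in "1,5,10"
      beep -- start the stopwatch here
      put the ticks into tt
      wait tDuration seconds
      beep -- stop the stopwatch here
      put tDuration && "seconds reported as" && (the ticks - tt) && "ticks" & cr after msg
      wait 2 seconds -- a breather before the next round
   end repeat
end mouseUp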
Craig