1. Field of the Invention
The present invention relates in general to a data processing system and, in particular, to a method, system, and computer program product for implementing a high resolution monotonic system clock.
2. Description of the Related Art
On computing systems with a high resolution time of day clock that is routinely adjusted to synchronize system time with a network standard time, applications can observe that time has moved backwards. This typically happens as a result of the normal operation of the network time protocol. On many computer systems, such as an AIX system, for example, the time adjustment must be made as discrete increments or decrements to a real time clock.
To minimize the visibility of such time corrections, systems generally break the required correction into a number of smaller sub-corrections. In this fashion, the time change is performed more gradually, but over a relatively long period of time. On an AIX system, for example, setting the time of day backwards is handled by setting the time of day back one millisecond every ten milliseconds until the system time has been corrected. A two millisecond correction applied in this fashion thus becomes a discrete one millisecond correction performed at the next regularly scheduled timer tick, which occurs every ten milliseconds. Then, ten milliseconds after the first correction was made, at the time of the next timer tick, the second one millisecond correction is applied and the total correction is complete.
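The gradual correction scheme described above can be sketched as follows. This is an illustrative simulation, not the actual AIX implementation; the function name and parameters are hypothetical.

```python
# Hypothetical sketch of the gradual correction described above: a
# pending backwards correction is applied one millisecond at a time,
# once per ten-millisecond timer tick, until it is exhausted.

def apply_gradual_correction(pending_correction_ms, max_step_ms=1):
    """Yield the sub-correction applied at each timer tick until done."""
    while pending_correction_ms > 0:
        step = min(max_step_ms, pending_correction_ms)
        pending_correction_ms -= step
        yield step

# A two millisecond correction is split into two one millisecond
# sub-corrections, one per ten-millisecond tick.
steps = list(apply_gradual_correction(2))
```

Because one sub-correction is applied per tick, an N millisecond correction takes roughly N ticks, i.e. ten times N milliseconds, to complete.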
On systems without a high resolution clock, the time correction does not cause a problem. Such systems rely on regularly scheduled timer ticks, such as at ten millisecond intervals, and keep track of the time of day simply by adding ten milliseconds to a global time value at each tick. When time is being adjusted backwards, systems such as these simply add nine milliseconds to the global value at each tick until the total backwards time correction has been applied. Any program that references such a system timer can never observe time going backwards; it can only observe time going forward, albeit at varying rates, including standing still.
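The tick-driven clock described above can be illustrated with a short simulation (all names here are hypothetical): the global time value advances ten milliseconds per tick normally, and only nine milliseconds per tick while a backwards correction is being absorbed.

```python
# Sketch of a tick-driven low resolution clock: 10 ms is added per
# tick normally; 9 ms is added per tick while 1 ms of a pending
# backwards correction is being absorbed.

def tick_clock(num_ticks, backwards_correction_ms=0):
    """Return the global time value observed after each timer tick."""
    times = []
    now_ms = 0
    remaining = backwards_correction_ms
    for _ in range(num_ticks):
        if remaining > 0:
            now_ms += 9          # absorb 1 ms of the correction
            remaining -= 1
        else:
            now_ms += 10         # normal tick
        times.append(now_ms)
    return times

times = tick_clock(5, backwards_correction_ms=3)
# The sequence is monotonically non-decreasing: time never goes back,
# it only moves forward at a varying rate.
assert all(b >= a for a, b in zip(times, times[1:]))
```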
Once, however, a high resolution hardware clock is added, such as the timebase register on PowerPC systems, for example, any program that observes the current time more frequently than once every millisecond can easily observe when time has been adjusted backwards. Unfortunately, a program that observes such a backwards adjustment will often fail.
One solution to this problem would be to optionally provide a low resolution time value to any application that requests it. The application will no longer see time going backwards, but it will see time standing still for ten milliseconds at a time and then jumping forward. This solution is not always practical.
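The stair-step behavior of such a low resolution value can be illustrated by quantizing a high resolution time to the most recent timer tick (a hypothetical sketch; the function name is illustrative).

```python
# Hypothetical illustration of the low resolution alternative: the
# reported time is quantized down to the most recent 10 ms timer
# tick, so a caller sees time stand still between ticks and then
# jump forward by a full tick.

def low_res_time(raw_time_ms, tick_ms=10):
    """Quantize a high resolution time to the last timer tick."""
    return (raw_time_ms // tick_ms) * tick_ms

# Readings taken at 0, 4, 9, 10, 19, and 20 ms of raw time.
readings = [low_res_time(t) for t in (0, 4, 9, 10, 19, 20)]
```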
Another solution is to arithmetically adjust the high resolution time value through an additional software layer that accurately prorates the correction over an interval before the discrete correction is made to the hardware clock. However, this solution presents numerous implementation difficulties, including the processing time required for the sixty-four bit divides and multiplies it entails, which are expensive operations when emulated on thirty-two bit hardware.
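A minimal sketch of such a prorating layer, under assumed names and parameters (none of which appear in the source), might look like the following. The multiply and divide on the last line are the sixty-four bit operations noted above as expensive when emulated on thirty-two bit hardware.

```python
# Illustrative sketch (all names hypothetical) of prorating a pending
# backwards correction over an interval: the raw high resolution
# clock value is adjusted arithmetically so the correction appears
# gradually, before the discrete change is made to the hardware clock.

def prorated_time(raw_ticks, interval_start, interval_len, correction_ticks):
    """Return the adjusted time, spreading correction_ticks over interval_len."""
    elapsed = raw_ticks - interval_start
    if elapsed >= interval_len:
        return raw_ticks - correction_ticks   # correction fully applied
    # Apply the fraction elapsed/interval_len of the correction:
    # a 64-bit multiply followed by a 64-bit divide.
    return raw_ticks - (correction_ticks * elapsed) // interval_len
```

As long as the correction is smaller than the interval over which it is spread, the adjusted value remains monotonically non-decreasing even though the raw clock will eventually be stepped backwards.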