Accurate time measurements using integrated tapped CMOS delay lines have been obtained in low voltage semiconductor environments. Rahkonen et al., Time Interval Measurements Using Integrated Tapped CMOS Delay Lines, PROCEEDINGS OF THE 32ND MIDWEST SYMPOSIUM ON CIRCUITS AND SYSTEMS, pp. 202-204 (14-16 Aug. 1989), describe a time-to-digital converter wherein a start signal is propagated on a slow tapped delay line and a stop signal is propagated on an integrated, relatively fast tapped delay line. The stop signal latches the outputs of the slow delay line, and when the stop signal catches the start signal, the position of the start signal is detected by a coder and converted to a binary output word. Calibration of the time-to-digital converter is accomplished either by measuring a known time interval, calculating the tap size, and scaling the results computationally, or by biasing adjustable delay elements to a proper unit delay. In either case, the calibration depends upon the environmental conditions affecting the delay lines and the flip-flops.
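The vernier principle described above can be illustrated with a minimal behavioral sketch: the stop signal gains on the start signal by the delay difference per tap, so the tap at which it catches up encodes the interval. The function name, the integer picosecond units, and the specific tap delays below are illustrative assumptions, not details of the referenced converter:

```python
def vernier_tdc(interval_ps, tau_slow_ps, tau_fast_ps, num_taps):
    """Model a vernier time-to-digital converter.

    Returns the tap index at which the stop signal catches the start
    signal; the measured interval is approximately
    tap * (tau_slow_ps - tau_fast_ps), so the resolution equals the
    per-tap delay difference between the two lines.
    """
    for tap in range(0, num_taps + 1):
        start_arrival = tap * tau_slow_ps               # start launched at t = 0
        stop_arrival = interval_ps + tap * tau_fast_ps  # stop launched later
        if stop_arrival <= start_arrival:               # stop has caught the start
            return tap
    return None  # interval exceeds the delay line's measurement range

# Illustrative delays: 1050 ps slow taps, 1000 ps fast taps => 50 ps resolution.
# A 250 ps interval is caught at tap 5, i.e. 5 * 50 ps.
tap = vernier_tdc(250, 1050, 1000, 64)
word = format(tap, "06b")  # binary output word produced by the coder
```

With these assumed delays, a 50 ps delay difference per tap yields the fifty-picosecond-order resolution discussed below.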
To measure short time durations, on the order of hundreds of nanoseconds, continuously and accurately within a reasonable tolerance on the order of fifty picoseconds or fewer, across changing environments and at low power, the Rahkonen et al. device requires modification and calibration. Continuous measurements, moreover, permit in situ control of the high frequency oscillating signal.
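The first calibration approach noted earlier, measuring a known time interval, calculating the tap size, and scaling subsequent results computationally, can be sketched as follows; the function names and the reference values are hypothetical:

```python
def calibrate_tap_size_ps(known_interval_ps, measured_tap):
    """Estimate the effective per-tap resolution by dividing a known
    reference interval by the raw tap count it produced."""
    return known_interval_ps / measured_tap

def scale_measurement_ps(tap, tap_size_ps):
    """Convert a raw tap count into a time interval using the
    calibrated per-tap size."""
    return tap * tap_size_ps

# Hypothetical calibration: a known 500 ps reference interval yields
# a raw tap count of 10, so each tap corresponds to 50 ps.
tap_size = calibrate_tap_size_ps(500, 10)

# A later raw reading of tap 7 then scales to 350 ps.
interval = scale_measurement_ps(7, tap_size)
```

Because the per-tap delay drifts with the environmental conditions affecting the delay lines and flip-flops, such a calibration must be repeated as conditions change, which motivates the continuous measurement capability discussed above.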