The traditional method of calibrating the time-base axis of a measurement instrument, such as an oscilloscope, has been to apply a signal having known frequency characteristics to the vertical axis of the instrument. Typically, these signals are precise time markers, square waves, or sine waves generated by a separate instrument. The appropriate timing and linearity adjustments are then made in the time-base circuit to provide a reasonably accurate sweep rate. This method of calibration has been adequate because the time base of a given oscilloscope has been compatible with its bandwidth or rise time. That is, no faster sweep rate was provided than that which would provide an undistorted display of a signal passed through the associated vertical amplifier system.
The current digital trends and the analysis of logic signals have necessitated faster and more precise sweep signals, while less emphasis has been placed on bandwidth or rise time. It is commonplace today to find oscilloscopes having bandwidths on the order of 100 to 200 megahertz and time-base sweep rates on the order of 1 to 0.5 nanoseconds per graticule division. Calibration or accuracy checks of these time bases are difficult using present methods because the vertical amplifier channels simply do not have the capability of passing calibration signals having a frequency of 1 or 2 gigahertz.