Analog-to-digital converters are well known circuits in the electronics arts. In electronics, an analog-to-digital converter (ADC) is a system that converts an analog signal into a digital signal. The analog signal is typically a voltage domain signal, but it could be a current or charge domain signal. A digital-to-analog converter (DAC) performs the reverse function. An ADC may also provide an isolated measurement, such as when an electronic device converts an input analog voltage or current to a digital number proportional to the magnitude of that voltage or current.
The conversion involves quantization of the input, so it necessarily introduces a small amount of error. Furthermore, instead of continuously performing the conversion, an ADC does the conversion periodically, sampling the input. The result is a sequence of digital values that have been converted from a continuous-time and continuous-amplitude analog signal to a discrete-time and discrete-amplitude digital signal. Note that the input is sometimes already discrete in time, for example when it is held ahead of the converter by a circuit called a "sample-and-hold."
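The two operations described above, sampling and quantization, can be illustrated with a short sketch. This is not from the source; the 1 V reference, 8-bit resolution, and ramp input are arbitrary assumptions chosen to make the half-LSB error bound visible.

```python
# Illustrative sketch only: ideal uniform quantization by an N-bit ADC.
# The 1 V reference and 8-bit resolution are hypothetical parameters.

def quantize(v, v_ref=1.0, bits=8):
    """Map an analog voltage in [0, v_ref) to an integer N-bit code."""
    levels = 2 ** bits
    lsb = v_ref / levels                  # size of one quantization step
    return min(int(v / lsb), levels - 1)  # clamp at full scale

def reconstruct(code, v_ref=1.0, bits=8):
    """Ideal DAC: map a code back to the midpoint of its quantization step."""
    lsb = v_ref / (2 ** bits)
    return (code + 0.5) * lsb

# Sampling takes the signal only at discrete instants; quantizing each
# sample yields a discrete-time, discrete-amplitude sequence.
samples = [n / 1000 for n in range(1000)]   # a slow ramp, 0 V .. 0.999 V
codes = [quantize(v) for v in samples]
errors = [abs(reconstruct(c) - v) for c, v in zip(codes, samples)]

# The quantization error never exceeds half an LSB for in-range inputs.
assert max(errors) <= (1.0 / 256) / 2 + 1e-12
```

The assertion at the end reflects the "small amount of error" the text describes: for an ideal uniform quantizer the error is bounded by half of one least significant bit.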
An ADC is defined by its bandwidth and its signal-to-noise ratio. The bandwidth of an ADC is characterized primarily by its sampling rate. The dynamic range of an ADC is influenced by many factors, including the resolution, linearity and accuracy (i.e., how well the quantization levels match the true analog signal), aliasing and jitter. The dynamic range of an ADC is often summarized in terms of its effective number of bits (ENOB), i.e., the number of bits of each measure it returns that are on average not noise. An ideal ADC has an ENOB equal to its resolution. ADCs are chosen to match the bandwidth and required signal-to-noise ratio of the signal to be quantized. If an ADC operates at a sampling rate greater than twice the bandwidth of the signal, then perfect reconstruction is possible given an ideal ADC and neglecting quantization error. The presence of quantization error limits the dynamic range of even an ideal ADC. If the dynamic range of the ADC exceeds that of the input signal, however, the effects of quantization error may be neglected, resulting in an essentially perfect digital representation of the input signal.
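The relationship between ENOB and dynamic range stated above can be made concrete with the standard conversion between effective bits and signal-to-noise-and-distortion ratio (SINAD). The sketch below is illustrative only; the 12-bit and 68 dB figures are hypothetical examples.

```python
def ideal_sqnr_db(bits):
    """Signal-to-quantization-noise ratio, in dB, of an ideal N-bit ADC
    driven by a full-scale sine-wave input: 6.02 * N + 1.76 dB."""
    return 6.02 * bits + 1.76

def enob(sinad_db):
    """Effective number of bits recovered from a measured SINAD in dB."""
    return (sinad_db - 1.76) / 6.02

# An ideal ADC has an ENOB equal to its resolution: the only noise
# present is the quantization error itself.
assert abs(enob(ideal_sqnr_db(12)) - 12) < 1e-9

# A real converter falls short: a nominally 12-bit ADC that measures
# 68 dB SINAD delivers about 11 effective bits.
print(round(enob(68.0), 2))  # → 11.0
```

The first assertion expresses the text's claim that an ideal ADC has an ENOB equal to its resolution; the second line shows how noise, jitter, and nonlinearity reduce the effective bit count of a practical converter.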
A time-mode analog-to-digital converter is a special type of ADC that utilizes a voltage-to-frequency converter to convert the input voltage signal into an oscillating signal with a frequency proportional to the voltage of the input signal. A frequency counter is then used to convert that frequency into a digital count proportional to the input signal voltage. Longer integration times allow for higher resolutions. Conversely, the speed of the converter can be improved by sacrificing resolution. The two parts of the time-mode ADC may be widely separated, with the frequency signal passing through an optoisolator or transmitted wirelessly. Some such ADCs use sine wave or square wave frequency modulation while others use pulse frequency modulation.
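The two-stage conversion just described can be sketched as a minimal behavioral model, assuming an ideal linear voltage-to-frequency characteristic. The 100 kHz/V gain and the gate times are hypothetical parameters, not from the source.

```python
def v_to_f(v, gain_hz_per_volt=100_000.0):
    """Hypothetical ideal voltage-to-frequency converter: f = k * v.
    The 100 kHz/V gain is an illustrative assumption."""
    return gain_hz_per_volt * v

def count_frequency(freq_hz, gate_s):
    """Frequency counter: whole output cycles seen during the gate time."""
    return int(freq_hz * gate_s)

def time_mode_adc(v, gate_s, gain_hz_per_volt=100_000.0):
    """Digital count proportional to the input voltage."""
    return count_frequency(v_to_f(v, gain_hz_per_volt), gate_s)

# A 0.5 V input counted over a 10 ms gate yields a count of 500.
assert time_mode_adc(0.5, 0.01) == 500

# Integrating ten times longer resolves ten times finer voltage steps,
# trading conversion speed for resolution, as the text notes.
assert time_mode_adc(0.5, 0.1) == 5000
```

Because the counter's input is just a frequency, the two stages can be separated by an optoisolator or a wireless link without affecting the count, which is the property the paragraph above highlights.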
The transfer function of the voltage-to-frequency converter used in traditional time-mode ADCs, however, is relatively nonlinear over a wide frequency tuning range. This degrades the linearity and dynamic range of the ADC. There is thus a need for a time-mode ADC that incorporates a voltage-to-frequency converter exhibiting a substantially linear response over a wide frequency tuning range, so that the linearity and dynamic range of the ADC are significantly improved.