Voltage-to-frequency converters are known in the prior art for use in analog-to-digital converters. Thus, prior art analog-to-digital converters are known to include a multiplexer providing successive voltage levels to a signal conditioning and buffer amplifier for output to a voltage-to-frequency converter. A counter is connected to the output of the voltage-to-frequency converter to provide a digital output count, representing the input analog voltage level.
Typically, auto-zeroing circuits may be included to determine, for each counting and measuring cycle, an output offset voltage from the signal conditioning and buffer amplifier. A synchronizing signal, referred to hereinafter as a voltage convert request signal, requests performance of a conversion. A known analog-to-digital converter of this type performs 1200 conversions per second. Thus, in such a converter a duration of approximately 800 microseconds is available for each cycle of measurement. Where 200-300 microseconds are required by the multiplexer to provide the next input level to the amplifier and buffer, as well as for the buffer to settle and provide a stable output, approximately 500 microseconds remain to digitize the signal output by the voltage-to-frequency converter.
That is, approximately 500 microseconds are available for measurement of the frequency output by the voltage-to-frequency converter.
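The timing budget described above may be reproduced by a short calculation. The 1200 conversions-per-second rate and the 200-300 microsecond multiplexer/buffer settling figures are taken from the text; the remaining values follow arithmetically:

```python
# Timing budget for one conversion cycle of the prior-art converter.
# The conversion rate and settling figures are those stated in the text;
# the derived values merely reproduce the stated budget.

CONVERSIONS_PER_SECOND = 1200

# Total time available per conversion cycle, in microseconds
# (~833 us, stated in the text as approximately 800 us).
cycle_us = 1e6 / CONVERSIONS_PER_SECOND

# Time consumed selecting the next channel and letting the buffer settle.
mux_settle_us = (200.0, 300.0)

# Remaining window available to measure the VFC output frequency
# (the text approximates this remainder as 500 us).
measure_us = [cycle_us - t for t in mux_settle_us]

print(f"cycle: {cycle_us:.0f} us")
print(f"measurement window: {measure_us[1]:.0f}-{measure_us[0]:.0f} us")
```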
For charge-balanced voltage-to-frequency converters, there is provided a two-cycle sequence of operation. In a first cycle of operation, the reset cycle, an input capacitance of an integrator is charged, for a fixed time interval, by a current derived from an input voltage source and by a current from a current source. In a second cycle of operation, the integration cycle, the charge is dissipated by integrating the current derived from the input voltage source. During this discharge cycle, a comparator monitors the output voltage level from the integrator and, when that level matches a predetermined threshold, generates an output pulse. Generation of the output pulse completes the conversion and initiates a new sequence.
The frequency of the output pulses is determined by the sum of the time periods for both the reset and integration cycles.
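The two-cycle relationship above may be sketched numerically. In the model below, `i_in` is the current derived from the input voltage, `i_ref` is the current dispensed by the current source during the fixed reset interval `t_reset`; these names and the component values are illustrative assumptions, as the source gives no circuit values:

```python
def vfc_period(i_in, i_ref, t_reset):
    """Period of one reset + integration sequence of a charge-balanced VFC.

    During the reset cycle the integrator accumulates a net charge of
    (i_ref - i_in) * t_reset; during the integration cycle the input
    current i_in removes that charge, taking
        t_int = (i_ref - i_in) * t_reset / i_in.
    The total period is t_reset + t_int = t_reset * i_ref / i_in.
    """
    t_int = (i_ref - i_in) * t_reset / i_in
    return t_reset + t_int

def vfc_frequency(i_in, i_ref, t_reset):
    """Output pulse frequency: reciprocal of the two-cycle period."""
    return 1.0 / vfc_period(i_in, i_ref, t_reset)

# Illustrative values (not from the source): i_ref = 1 mA, t_reset = 1 us.
# An input current of 10 uA then yields 10 kHz, and 100 uA yields 100 kHz,
# i.e. the output frequency is proportional to the input-derived current.
print(vfc_frequency(10e-6, 1e-3, 1e-6))   # 10000.0 Hz
print(vfc_frequency(100e-6, 1e-3, 1e-6))  # 100000.0 Hz
```

Note that the period is the sum of both cycle times, as stated, so the frequency tracks the input current only once a complete sequence has elapsed at the new input level.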
It is known that when the input voltage changes from one level to another, the frequency of the output pulses changes from a value representing the first voltage level to a value representing the second voltage level. The frequency typically requires one complete cycle of operation to stabilize at the value representing the second voltage level. When the voltage-to-frequency converter operates at frequencies ranging from 10 kHz to 100 kHz, where 10 kHz represents a zero input voltage level, such a change may require passage of 100-200 microseconds before a stable output frequency is obtained. Thus, approximately 20% of the time available for frequency measurement and conversion is consumed in settling to the new frequency value.
Loss of 100 microseconds out of a 500 microsecond available time period thus represents a 20% loss of measurement resolution. There is accordingly a need in prior art circuits to reduce the amount of time consumed by voltage-to-frequency converters in settling from one frequency to another upon changes in the voltage levels input thereto.
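The 20% figure can also be expressed in pulse counts. Assuming the 100 kHz full-scale output frequency and the approximately 500 microsecond measurement window stated above (a worst-case illustration, since lower frequencies yield fewer counts):

```python
# Resolution loss due to settling, using the figures stated in the text.
window_us = 500.0        # time available to count VFC output pulses
settle_us = 100.0        # time lost waiting for the frequency to settle
f_full_scale_hz = 100e3  # full-scale VFC output frequency

counts_ideal = f_full_scale_hz * window_us * 1e-6              # 50 pulses
counts_usable = f_full_scale_hz * (window_us - settle_us) * 1e-6  # 40 pulses

loss = settle_us / window_us
print(f"counts: {counts_usable:.0f} of {counts_ideal:.0f} "
      f"({loss:.0%} of the window lost to settling)")
```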