In the past, the change-over from one range to another in a dual-slope analog-to-digital converter was accomplished by changing the deintegrate reference voltage, thereby changing the converter's internal gain. For example, a 2 volt range would use a 1 volt reference and a 0.2 volt range would use a 0.1 volt reference (in A/D converters using 20,000 clock counts for the maximum period of the deintegrate cycle). The buffer and/or integrator gain was changed to maintain the internal signal level.
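The range-change scheme described above can be sketched numerically. This is an illustrative model only: the function names and the assumed 10,000-count integrate period are assumptions chosen so that both range/reference pairs reach the 20,000-count maximum deintegrate period given in the text; they are not taken from the source.

```python
MAX_DEINTEGRATE_COUNTS = 20_000  # maximum deintegrate period (from the text)

def deintegrate_counts(v_in: float, v_ref: float,
                       integrate_counts: int = 10_000) -> int:
    """Dual-slope relation: the input is integrated for a fixed number of
    clock counts, then the reference de-integrates the same charge, so the
    count is integrate_counts * v_in / v_ref (assumed model)."""
    return round(integrate_counts * v_in / v_ref)

# A 2 V range with a 1 V reference and a 0.2 V range with a 0.1 V
# reference both reach the same 20,000-count maximum, because only the
# ratio v_in / v_ref enters the count.
print(deintegrate_counts(2.0, 1.0))   # 20000
print(deintegrate_counts(0.2, 0.1))   # 20000
```

Because the converter digitizes the ratio of input to reference, halving or dividing the reference by ten rescales the range without altering the count structure, which is why the scheme works at all.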
The use of very low reference voltages in conventional A/D converters makes the meter susceptible to noise. Further, since the integration period during the deintegrate cycle is not synchronous with the AC line, the converter is quite sensitive to AC line pick-up. For example, with a 0.1 volt reference in 20,000 clock counts, the sensitivity at full scale is 5 microvolts per digit, which doubles the converter's sensitivity to noise. Thus, conventional A/D converters are generally limited to reference voltages larger than 0.1 volt.
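The 5-microvolt figure follows from the arithmetic the text states: the 0.1 volt reference spread over the 20,000-count period. A minimal sketch of that calculation, with a hypothetical helper name:

```python
def volts_per_digit(v_ref: float, max_counts: int = 20_000) -> float:
    """Reference-referred resolution: reference voltage divided by the
    maximum count in the period (the calculation implied by the text)."""
    return v_ref / max_counts

# 0.1 V reference over 20,000 counts -> 5 microvolts per digit
print(volts_per_digit(0.1))   # approximately 5e-06
# A 1 V reference over the same period is ten times coarser, hence ten
# times less noise-sensitive per digit.
print(volts_per_digit(1.0))   # approximately 5e-05
```

The smaller the volts-per-digit step, the smaller the noise or line pick-up amplitude needed to toggle the least significant digit, which is the limitation the paragraph describes.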
In conventional A/D converters, the gain cannot be changed without changing the zero offset, because a change in gain also changes the effect of the zero-offset correction scheme.
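One way to see the gain/zero interaction is with a hypothetical model in which the stored zero correction is applied after the gain stage, so it is only valid at the gain for which it was measured. The function and variable names below are illustrative assumptions, not the scheme of any particular converter.

```python
def reading(v_in: float, gain: float, v_offset: float,
            stored_correction: float) -> float:
    """The input offset passes through the gain stage, but the stored
    correction does not, so the correction only cancels the offset at the
    calibration gain (assumed model)."""
    return gain * (v_in + v_offset) - stored_correction

offset = 0.003                     # hypothetical input-referred offset, volts
corr = 1.0 * (0.0 + offset)        # zero correction measured at gain = 1

print(reading(0.0, 1.0, offset, corr))    # zeroed at the calibration gain
print(reading(0.0, 10.0, offset, corr))   # residual zero error after a gain change
```

With the gain changed from 1 to 10, the same stored correction leaves a residual of roughly 0.027 volts at the output, illustrating why a conventional converter must re-zero whenever its gain changes.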