There are many electronic applications for a unity gain amplifier, a circuit also often referred to as a unity gain buffer, as well as for amplifiers that have a gain greater than unity. One application that places relatively high demands for good performance upon such an amplifier or buffer is the ‘front end’ of a laboratory quality, high accuracy DC DVM (Digital Volt Meter). Nowadays, high accuracy usually means seven to nine digits of resolution, with perhaps just a few counts of error in the least significant digit. Most DC DVMs also measure AC, with reduced accuracy, for pure sinusoids at frequencies up to about one megahertz, or in some cases as high as ten megahertz. The ‘front end’ of a DVM is the location after any input attenuation, but before the actual measurement mechanism.
It is immediately clear that any unity gain buffer in such a location does indeed have stringent performance requirements, since it is generally impossible to subsequently distinguish between the actual input to be measured and an error introduced by the buffer. Whether the gain is EXACTLY unity is probably not the real issue, since calibration can be expected to take care of any uniform gain error. Instead, there are other issues, such as stability and ageing: does the buffer maintain whatever gain it has over time and with variations in temperature, etc.? Assuming those concerns can be adequately addressed, another important issue is linearity. That is, does the gain remain the same for inputs of various amplitudes?
One mechanism that causes such behavior in some amplifiers, regardless of their gain, is a change in output voltage as input voltages that actually are equal vary over their allowed range. An amplifier's resistance to this phenomenon has a name: Common Mode Rejection, or CMR. It is commonly expressed as a ratio, and values in the range of 80-120 dB are common. The internal causes within the amplifier that limit its CMR can also operate to make it appear that the gain of the amplifier is not constant over different levels of input. (The common-mode error can be modeled as an input that affects the output, and an incorrect output will appear to be the result of the wrong gain.) Gain that is not constant when it should be is an issue of non-linearity.
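The dB figure maps directly onto an input-referred error, which is what makes CMR a useful way to bound this kind of non-linearity. The sketch below shows the arithmetic; the helper names are ours, not from any particular library:

```python
import math

def cmrr_db_to_fraction(cmrr_db: float) -> float:
    """Convert a CMR figure in dB to the fraction of the common-mode
    voltage that leaks through to the output as error."""
    return 10 ** (-cmrr_db / 20)

def cm_error_volts(v_cm: float, cmrr_db: float) -> float:
    """Input-referred error caused by a common-mode voltage v_cm."""
    return v_cm * cmrr_db_to_fraction(cmrr_db)

# An amplifier at the low end of the common 80-120 dB range:
# 80 dB means 1 part in 10,000 of the common-mode voltage appears
# as error, so a 10 V common-mode input contributes about 1 mV.
print(cm_error_volts(10.0, 80.0))   # about 0.001 V
```

For an eight digit instrument, 1 mV of uncorrected error on a 10 V input is catastrophic, which is why garden-variety CMR figures are nowhere near good enough here.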
This lack of linearity can be a serious problem, as the input buffer to an eight digit DVM probably needs to faithfully reproduce inputs over at least the range of ±10 VDC, and preferably ±12 VDC. Good stability is wasted if the circuit cannot operate with sufficient linearity to avoid introducing error ahead of the actual measurement circuitry, and a suspected or known lack of linearity is very difficult to correct once it is on the loose! The preferred choice is to not introduce any non-linearity in the first place. On the other hand, as purveyors of fine quality electronic test equipment that must compete in the marketplace, we also experience the urge to control the manufacturing cost of our products, among which are laboratory grade high accuracy DVMs. It seems that we need a low cost, high linearity, fairly wide range unity gain amplifier, with a decided emphasis on high linearity. To be specific, we'd like a unity gain buffer having a CMR of 140 dB or more, or at least 15 ppb (parts per billion, with a ‘b’!) linearity over ±12 VDC, for the lowest possible cost, and one that will also work respectably up to, for one particular case of interest, 150 kilohertz for AC measurements. We can also readily appreciate that there are comparable situations in other applications where the amplifier is expected to have a gain greater than unity. What to do?
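It is worth a quick arithmetic check on how these two ways of stating the target compare; the helper functions below are our own illustrative names, under the assumption that linearity error is dominated by the common-mode mechanism described above:

```python
import math

def cmrr_db_to_ppb(cmrr_db: float) -> float:
    """Fractional common-mode error implied by a CMR figure,
    expressed in parts per billion of the common-mode voltage."""
    return 10 ** (-cmrr_db / 20) * 1e9

def ppb_to_cmrr_db(ppb: float) -> float:
    """CMR in dB needed to hold common-mode error to a given ppb level."""
    return -20 * math.log10(ppb * 1e-9)

# 140 dB of CMR leaves 100 ppb of the common-mode voltage as error;
# over a 12 V input that is roughly 1.2 microvolts.
print(cmrr_db_to_ppb(140.0))                  # about 100 ppb
print(cmrr_db_to_ppb(140.0) * 1e-9 * 12.0)    # about 1.2e-6 V

# Holding linearity to 15 ppb would demand roughly 156.5 dB of CMR.
print(ppb_to_cmrr_db(15.0))                   # about 156.5 dB
```

Note that the two specifications are not equivalent: 140 dB corresponds to about 100 ppb, while 15 ppb corresponds to roughly 156.5 dB, so the 15 ppb linearity figure is the more demanding of the two.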