1. Field of the Invention
The present invention pertains to a method and a device for determining the relative value (ratio) between the amplification factor of a signal normalizer before and after that amplification factor is changed, and in particular relates to a method and a device for determining the amplification factor ratio of a signal normalizer used in determination devices in order to improve determination accuracy.
2. Discussion of the Background Art
A widely used method of determining the voltage amplitude of alternating-current signals that are the subject of determination in determination devices such as alternating-current volt-ammeters and LCR meters, etc., is to set up an analog-digital converter (A/D converter hereafter) as an alternating-current voltage determination device and, after the signals that are the subject of determination have been converted to discrete values, to calculate the signal components of the desired frequency by numeric operation. Determination errors in this case are attributed mainly to quantization errors, linearity errors due to bit weighting errors, and thermal noise; errors increase in particular when the voltage amplitude of the signal input to the A/D converter is small. For example, errors of as much as 108 ppm are produced when signals are input such that the maximum input voltage to an A/D converter with an accuracy of 18 bits is 1/10th the full-scale voltage of the A/D converter. On the other hand, as a result of the increased accuracy required of components that are the subject of determination, etc., the determination accuracy of the above-mentioned LCR meter must be 10 ppm or less when determining alternating-current signal amplitude.
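The numeric operation mentioned above can be illustrated with a short sketch that recovers the amplitude of the desired frequency component from the discrete values by evaluating a single DFT bin. This is only an illustrative example (the function name and signal parameters are not taken from the specification), and it assumes coherent sampling over an integer number of periods:

```python
import math

def tone_amplitude(samples, sample_rate, freq):
    """Estimate the voltage amplitude of one frequency component of
    uniformly sampled data by evaluating a single DFT bin.
    Assumes coherent sampling over an integer number of periods."""
    n = len(samples)
    re = sum(s * math.cos(2 * math.pi * freq * k / sample_rate)
             for k, s in enumerate(samples))
    im = sum(s * math.sin(2 * math.pi * freq * k / sample_rate)
             for k, s in enumerate(samples))
    return 2.0 * math.hypot(re, im) / n

# Illustrative signal: a 50 Hz sine of amplitude 1.0 V, sampled at 1 kHz for 1 s
fs, f0 = 1000, 50
data = [1.0 * math.sin(2 * math.pi * f0 * k / fs) for k in range(fs)]
amp = tone_amplitude(data, fs, f0)  # recovers an amplitude close to 1.0
```

In a real determination device the samples would come from the A/D converter rather than being synthesized, and the errors discussed above (quantization, linearity, thermal noise) would appear in the recovered amplitude.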
A method for reducing the effect of linearity errors of the A/D converter on determination values when determining the voltage amplitude of input signals over a broad voltage range is to set up, in front of the A/D converter, a voltage converter that raises or lowers the signal voltage by a predetermined ratio, so that fluctuations in the voltage amplitude of the signals input to the A/D converter are kept within a predetermined range. If the voltage amplitude of the signals that are the subject of determination is large enough, it is determined directly by the A/D converter; if this voltage amplitude is so small that it would cause linearity errors in the A/D converter, voltage conversion by the voltage converter is performed on the signals that are the subject of determination. The voltage amplitude of the signals that are the subject of determination during voltage conversion is obtained by multiplying the voltage amplitude of the signals input from the voltage converter to the A/D converter, as determined by the A/D converter, by the inverse of the conversion ratio of this converter. Consequently, the conversion ratio must be known with an accuracy superior to that of the A/D converter. The voltage converter is therefore generally made so that this accuracy requirement is satisfied as far as possible, or else a standard signal source is set up, the voltage amplitude of signals obtained through the voltage converter is determined by the A/D converter, and the error in the conversion ratio is corrected by calculation after the determination.
When a transformer is used as an example of a voltage converter, large transformers are expensive, depending on the frequency band that is used, and except in the case of ½ voltage division, there is a problem with accuracy because of coil resistance and leakage inductance.
Another method, which does not have the above-mentioned restrictions, is to set up an amplifier 12 (signal normalizer hereafter) having multiple amplification factors in stages in front of the input part of A/D converter 11, as in the case of the alternating-current voltage determination device 10 in FIG. 1. Signal normalizer 12 has the function of amplifying the signals input from signal generator 13 so that their voltage amplitude stays within a predetermined range, and outputting them. As a result, the maximum input voltage to A/D converter 11 is always close to the full-scale voltage of A/D converter 11, regardless of the voltage amplitude of the input signals, and the effect of linearity errors of A/D converter 11 is reduced. Moreover, although this method uses an amplifier, even popular amplifiers on the market have advanced markedly in integration and performance, so marked space savings are obtained regardless of the frequency used.
Depending on the width of the voltage range to be determined, there are cases in which multiple amplification factors of the normalizer are set up in stages in order to divide this voltage range finely. In this case, unless the ratio between the amplification factor of the signal normalizer before it is changed and the amplification factor after it has been changed (simply amplification factor ratio hereafter) is known within a desired range of accuracy, the linearity of the determination values will be broken at the point where the amplification factor of the signal normalizer is changed, and new linearity errors will be produced. The error-reducing effect of the normalizer will therefore not be obtained.
For instance, when the amplification factor of signal normalizer 12 is set using a popular type of resistor network for the part comprising signal normalizer 12, an accuracy of only 100 ppm at most can be realized. Therefore, by setting up normalizer 12, the effect of linearity errors of A/D converter 11 on the determination values is reduced, but a new linearity error is produced owing to the insufficient accuracy of the amplification factor ratio of signal normalizer 12. Consequently, in order to obtain appropriate results when using a signal normalizer for the purpose of reducing the effect of linearity errors of the A/D converter on determinations, it is necessary to determine the amplification factor ratio of the signal normalizer precisely, so that the linearity of the determinations is maintained.
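The effect described above can be illustrated numerically. The sketch below uses hypothetical values (none of these numbers come from the specification) to show how a 100 ppm error in the assumed amplification factor ratio appears as a step of the same size in the determination values at the range-change boundary:

```python
# Illustrative sketch: an error in the assumed amplification factor ratio
# appears directly as a discontinuity at the range-change boundary.
true_gain_ratio = 10.0
assumed_ratio = 10.0 * (1 + 100e-6)   # ratio known only to 100 ppm

v_in = 0.1                            # input amplitude at the boundary (V)
low_range = v_in                      # determined directly at gain 1x
high_range = v_in * true_gain_ratio / assumed_ratio  # determined at gain 10x,
                                      # then divided by the assumed ratio

step_ppm = (high_range / low_range - 1) * 1e6  # roughly a -100 ppm step
```

A determination device aiming for 10 ppm accuracy therefore cannot tolerate a ratio known only to 100 ppm.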
One method for determining the amplification factor ratio of signal normalizer 12 is to change only the amplification factor of signal normalizer 12 while keeping constant the amplitude of the signals input to signal normalizer 12, to determine the output amplitude of signal normalizer 12 with A/D converter 11 before and after this change, and to perform the determination based on the ratio of these results. For instance, when determinations are performed with signal normalizer amplification factors of 1× and 10×, first the amplification factor of signal normalizer 12 is set at 1× and signals are input from signal generator 13 to signal normalizer 12 so that the maximum input voltage to A/D converter 11 is 1/10th the full-scale voltage of A/D converter 11. The voltage of the determination frequency component is calculated from the determination results of A/D converter 11 at this time and serves as V1. Then the amplification factor of signal normalizer 12 is set at 10×. The voltage amplitude of the signals input to signal normalizer 12 is not changed, and therefore the maximum input voltage to A/D converter 11 is the full-scale voltage of A/D converter 11. The voltage of the determination frequency component is calculated from the determination results of A/D converter 11 at this time and serves as V2. The amplification factor ratio obtained from the determination results is thus found as V2/V1.
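This conventional procedure can be sketched as follows. The fragment below models only the quantization of an ideal 18-bit A/D converter (linearity errors are omitted for brevity, and all names and values are hypothetical): V1 is determined at 1/10th of full scale, V2 at full scale, and the ratio V2/V1 is formed:

```python
def quantize(v, full_scale=1.0, bits=18):
    """Ideal A/D conversion: round v to the converter's quantization step.
    Models quantization only; linearity errors are omitted."""
    step = 2.0 * full_scale / (1 << bits)
    return round(v / step) * step

source_amp = 0.1              # generator set so gain 1x gives 1/10 of full scale

v1 = quantize(1 * source_amp)   # gain 1x:  A/D input is 1/10 full scale
v2 = quantize(10 * source_amp)  # gain 10x: A/D input reaches full scale
ratio = v2 / v1                 # close to 10, but V1 carries the larger
                                # relative quantization error
```

Even in this idealized model the V1 determination, made at 1/10th of full scale, contributes most of the error in the ratio; with real converter linearity errors included, the situation is worse, which is the drawback discussed next.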
With the above-mentioned method, the maximum input voltage to A/D converter 11 during the V1 determination is 1/10th the full-scale voltage of the A/D converter. Consequently, the determination of V1 contains the linearity error of A/D converter 11; the amplification factor ratio therefore lacks accuracy, and the error-reducing effect of normalizer 12 is not adequately obtained, even if the voltage amplitude of the input signals is determined while concomitantly using signal normalizer 12.
The present invention includes a method and a device for determining, at a predetermined accuracy, the ratio between the amplification factor of a signal normalizer before the amplification factor is changed and the amplification factor after it is changed, its purpose being to obtain the appropriate error-reducing results with this signal normalizer, based on the amplification factor ratio as determined using this method or device, when the determination device has a signal normalizer for the purpose of reducing linearity errors.
Moreover, the invention prevents the scale of the circuit from becoming very large with an increase in the determination frequency by using a signal normalizer.
The invention also allows for reduced cost by constructing the present invention so that inexpensive parts can be used.
In a first aspect, the invention includes a method for determining the amplification factor ratio before and after changing the amplification factor of a signal normalizer that amplifies the voltage amplitude of signals input from a signal generation means so that it is within a predetermined range and outputs that voltage amplitude, by using a signal generation means, which adds two sine-wave signals of the same frequency and voltage amplitude to form the output and controls the output voltage amplitude at any phase relationship between said two sine-wave signals by a phase control means, and an alternating-current voltage determination means, comprising an adjustment step wherein adjustment is performed so that, when the above-mentioned phase relationship is reversed so that the above-mentioned sine-wave signals cancel each other, the output voltage amplitude of the above-mentioned signal generation means stays within a predetermined range, and
the step wherein the amplification factor of the above-mentioned signal normalizer and the output voltage amplitude of the above-mentioned signal generation means are each changed so that they are inversely proportional to each other, and a difference is obtained between the output voltage amplitude of the above-mentioned signal normalizer before the above-mentioned change and that after the above-mentioned change.
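The principle behind these steps can be sketched numerically: the sum of two sine waves of equal amplitude a and phase difference φ has amplitude 2a·cos(φ/2), so a phase control means can set the generator output anywhere between 0 (phase reversed, signals cancelling) and 2a. In the example below (illustrative only; the function name and values are assumptions, not from the specification), the output is reduced to exactly 1/10th, which is the amplitude change that would accompany a 10× increase in normalizer gain so that the input to the determination means stays unchanged:

```python
import math

def summed_amplitude(a, phi):
    """Amplitude of a*sin(wt) + a*sin(wt + phi), which equals 2*a*|cos(phi/2)|."""
    return 2.0 * a * abs(math.cos(phi / 2.0))

a = 0.5  # per-signal amplitude (arbitrary illustrative value)

full = summed_amplitude(a, 0.0)        # in phase: amplitude 2a = 1.0
nulled = summed_amplitude(a, math.pi)  # phase reversed: signals cancel, ~0

# Choosing phi so that 2*cos(phi/2) = 0.2 drops the generator output to
# exactly 1/10th of `full`; raising the normalizer gain 10x at the same
# time would leave the input to the determination means unchanged.
phi = 2.0 * math.acos(0.1)
tenth = summed_amplitude(a, phi)       # ~0.1
```

Because the amplitude is set by a phase relationship rather than by resistive division, the accuracy of the amplitude change does not depend on component tolerances in the same way as the resistor-network normalizer discussed in the background art.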
Moreover, a second aspect of the invention includes a step wherein the output voltage amplitude of the above-mentioned signal generation means is amplified.
Furthermore, in a third aspect of the invention, the above-mentioned adjustment step includes a step wherein the voltage amplitude of at least one of the above-mentioned two sine-wave signals is controlled, and the method further comprises:
the step wherein the amplification factor in the above-mentioned amplification step is determined and
the step wherein the output voltage amplitude of the above-mentioned signal generation means that has been adjusted by the above-mentioned adjustment step is determined.
A fourth aspect of the invention includes
the step wherein one of the above-mentioned two sine-wave signals is produced by the synthesis of two or more sine-wave signals with the same frequency and
the step wherein the voltage amplitude of the above-mentioned two or more sine-wave signals is controlled in the above-mentioned adjustment step.
Moreover, a fifth aspect of the invention includes a device for determining the amplification factor ratio before and after changing the amplification factor of a signal normalizer that amplifies the voltage amplitude of signals that are input from a signal generation means so that it is within a predetermined range and outputs that voltage amplitude and that consists of a signal generation means, which adds two sine-wave signals of the same frequency and the same voltage amplitude to form the output and controls the output voltage amplitude at any phase relationship between said two sine-wave signals by a phase control means, and an alternating-current voltage determination means, comprising
an adjustment means by which adjustment is performed so that, when the above-mentioned phase relationship is reversed so that the above-mentioned sine-wave signals cancel each other, the output voltage amplitude of the above-mentioned signal generation means stays within a predetermined range, and
a means by which the amplification factor of the above-mentioned signal normalizer and the output voltage amplitude of the above-mentioned signal generation means are each changed so that they are inversely proportional to each other, and a difference is obtained between the output voltage amplitude of the above-mentioned signal normalizer before the above-mentioned change and that after the above-mentioned change.
In a sixth aspect of the invention, the signal generation means includes a means by which the output voltage amplitude of the above-mentioned signal generation means is amplified.
In a seventh aspect of the invention, the above-mentioned adjustment means is a means whereby the voltage amplitude of at least one of the above-mentioned two sine-wave signals is controlled and comprises
a means by which the amplification factor of the above-mentioned amplification means is determined and
a means by which the output voltage amplitude of the above-mentioned signal generation means that has been adjusted by the above-mentioned adjustment means is determined.
In an eighth aspect of the invention, one of the above-mentioned two sine-wave signals is a synthetic signal of two or more sine-wave signals with the same frequency, and the above-mentioned adjustment means is a means for controlling the voltage amplitude of the above-mentioned two or more sine-wave signals.