1. Field of the Invention
The present invention relates to a method for monitoring a sensor device that provides an analog output signal on an output line, which is fed to an analysis unit, wherein a test signal is applied to the sensor output in a monitoring phase and is received and evaluated in the analysis unit. The invention also relates to the sensor device and an analysis unit with monitoring means suitable for performing the method of the invention.
2. Prior Art
Analog sensors, such as pressure sensors, are generally connected to an analysis and control unit by three lines. Two of the lines supply electrical power to the sensor electronics and one line carries the electrical sensor output signal. This sensor signal is usually affected by fluctuations of the power supply, so that the analysis circuits are frequently ratiometric. The output voltage of the sensor is then related to the power supply voltage, since supply fluctuations appear as a shift or offset of the output signal. The analog/digital converter in the analysis unit uses this same power supply voltage as its reference for further processing or reading of the output voltage. In this way power supply fluctuations are not present in the measurement result, and the reference voltage need not be held to a predetermined value within narrow tolerances.
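The cancellation of supply fluctuations in a ratiometric arrangement can be illustrated with a minimal sketch. The function names, the 10%–90% output span, and the 12-bit converter width are illustrative assumptions, not taken from the present disclosure:

```python
def adc_reading(v_out: float, v_supply: float, bits: int = 12) -> int:
    """Ratiometric conversion: the supply voltage serves as the
    converter's reference, so the digital code depends only on the
    ratio v_out / v_supply."""
    full_scale = (1 << bits) - 1
    code = round((v_out / v_supply) * full_scale)
    return max(0, min(code, full_scale))  # clamp to converter range

def ratiometric_sensor_output(measurand: float, v_supply: float) -> float:
    """Hypothetical sensor whose output scales with the supply,
    spanning 10% to 90% of the supply over a measurand of 0..1."""
    return (0.1 + 0.8 * measurand) * v_supply

# The same measurand yields the same code at a nominal 5.0 V supply
# and at a sagged 4.5 V supply, because the ratio is unchanged.
code_nominal = adc_reading(ratiometric_sensor_output(0.5, 5.0), 5.0)
code_sagged = adc_reading(ratiometric_sensor_output(0.5, 4.5), 4.5)
print(code_nominal == code_sagged)
```

The design point is that the supply voltage cancels out of the ratio, which is why, as stated above, the reference voltage need not be held within narrow tolerances.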
However, this method is only satisfactory when the reference potential of the analog/digital converter is compatible with the output voltage shift. In practice, however, a voltage drop arises across the unavoidable transmission impedance of the supply line, interface and electrical components, which changes the output voltage shift. Measurement errors result from these effects. If the transmission impedance remains constant over time, this measurement error may be eliminated by an initial calibration. When the transmission impedance drifts or fluctuates with time, for example because of temperature dependence, the calibration must be repeated from time to time.
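The effect of a drifting transmission impedance on a one-time calibration can be sketched as follows. The resistance and current values are invented for illustration and do not appear in the disclosure:

```python
def measured_voltage(v_true: float, i_line: float, r_line: float) -> float:
    """The voltage seen at the analysis unit: the true sensor output
    plus the drop across the transmission impedance r_line."""
    return v_true + i_line * r_line

I_LINE = 0.010  # assumed line current, 10 mA

# Initial calibration with r_line = 0.5 ohm: the offset is measured
# once at a known zero output and subtracted thereafter.
offset = measured_voltage(0.0, I_LINE, 0.5)

# Later, temperature drift raises r_line to 0.8 ohm. The stale
# offset no longer matches, leaving a residual error of
# 10 mA * (0.8 - 0.5) ohm = 3 mV.
reading_stale = measured_voltage(2.000, I_LINE, 0.8) - offset

# Repeating the calibration at the drifted impedance removes the error.
offset_fresh = measured_voltage(0.0, I_LINE, 0.8)
reading_fresh = measured_voltage(2.000, I_LINE, 0.8) - offset_fresh
```

This is why, as the passage notes, a single initial calibration suffices only for a time-invariant impedance, while a drifting impedance requires repeated calibration.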
Most existing sensor devices have monitoring circuitry for testing for possible short circuits or line interruptions. A "signal-range-checking" method for displacement sensors is described in DE 28 05 876 A1. These displacement sensors produce output signals with amplitudes in a predetermined amplitude range, which depend on a measured displacement. A threshold value switch determines whether the amplitude leaves the predetermined amplitude range and transmits an error signal to the analysis unit in the case of a fault.
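Such a signal-range check can be sketched in a few lines. The band limits below are hypothetical values for a 5 V system and are not taken from DE 28 05 876 A1:

```python
def range_check(v_out: float, lo: float = 0.25, hi: float = 4.75) -> bool:
    """Signal-range check: a valid sensor output stays inside the
    band [lo, hi]. A short circuit pins the line near a supply rail
    and a line interruption typically floats it outside the band,
    so an out-of-band amplitude signals a fault."""
    return lo <= v_out <= hi

# A mid-range signal passes; a line shorted to ground or to the
# 5 V supply fails the check.
print(range_check(2.5), range_check(0.0), range_check(5.0))
```

Note that this check only detects gross faults: an in-band but wrong output voltage, such as one shifted by a drifting line impedance, passes undetected.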
EP 0 635 135 B1 describes a method of monitoring a rotation speed measuring device with at least two rotation speed sensors. In this method a test pulse is fed to one of the rotation speed sensors from the microprocessor acting as the analysis unit over a special test line during a testing phase. The level of the other rotation speed sensor is then monitored. A short circuit is detected by a level change of the other rotation speed sensor.
In other existing test methods a definite operating state is produced (e.g. the pressure of the medium to be measured by a pressure sensor is reduced to zero pressure) and then the offset of the output signal is measured. This type of test, however, only tests the output signal range corresponding to the definite operating state and moreover is not usable in all sensors.
Sensors are also known that perform an initialization test after the operating voltage has been applied to them. In this initialization test, e.g., half the power supply voltage is applied to the sensor output, whereby the dependence of the output signal on the power supply voltage, especially in ratiometric sensors, can be established.
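The half-supply initialization test described above can be sketched as a simple pass/fail check. The tolerance of 2% of the supply voltage is an assumed figure for illustration:

```python
def init_test_passes(v_out: float, v_supply: float,
                     rel_tol: float = 0.02) -> bool:
    """Initialization test for a ratiometric sensor: during start-up
    the sensor drives its output to half the supply voltage. The
    check passes if the observed output lies within rel_tol of
    v_supply / 2, confirming the expected supply dependence."""
    return abs(v_out - 0.5 * v_supply) <= rel_tol * v_supply

# An output of 2.5 V on a 5 V supply passes; 2.2 V does not.
print(init_test_passes(2.5, 5.0), init_test_passes(2.2, 5.0))
```

Because this test runs only once after power-up, it cannot catch faults that develop later during operation, which motivates the drawback discussed next.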
It has proven especially disadvantageous that the known monitoring methods can only detect gross faults, such as short circuits and line interruptions, or that the tested operating range is too small. Thus it is desirable to be able to start the testing phase not only when the sensor device is turned on, but at any arbitrary time point. Otherwise the initialization test time of the sensor adds to the initialization time of the analysis unit, which leads to an undesirable delay prior to operational readiness.