1. Field of the Invention
The present invention relates to a method of signal noise reduction, and more particularly, to a method of signal noise reduction that may be applied to digital chips in a computer.
2. Description of the Related Art
Electrical signals from typical electronic devices, such as computers, often suffer from signal bounce. In order to ensure correct signals, one method involves performing a delay process on a signal that may contain noise. For example, when a digital signal changes (e.g. from 1 to 0), noise may occur. Please refer to FIG. 1. Between times t0 and t1, the received digital signals D0 and D1 are both 1; at time t2, D2=0, which means the digital signal has changed. Therefore, at times t3 and t4, regardless of what the signals D3 or D4 are, the received signal remains D1; only if the signal D5 is still 0 at time t5 is the received signal changed from 1 to 0.
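The delay-and-resample scheme of FIG. 1 can be sketched for a single signal as follows. This is a minimal illustration, not the disclosed implementation; `DELAY_SAMPLES` and all names below are assumptions chosen to mirror the t2-to-t5 delay in the figure.

```c
/* Illustrative sketch of the delay process of FIG. 1: when a sample
   differs from the confirmed value, the output is held for a fixed
   delay and the input is re-checked only when the delay expires. */
#define DELAY_SAMPLES 3  /* delay from t2 to t5, in sample periods (assumed) */

typedef struct {
    int confirmed; /* currently confirmed (received) value */
    int waiting;   /* nonzero while the delay is running */
    int timer;     /* sample periods remaining until the re-check */
} debounce_t;

static void debounce_init(debounce_t *d, int initial) {
    d->confirmed = initial;
    d->waiting = 0;
    d->timer = 0;
}

/* Feed one raw sample (D0, D1, ...); returns the received value. */
static int debounce_step(debounce_t *d, int sample) {
    if (d->waiting) {
        /* Samples during the delay (D3, D4) do not change the output. */
        if (--d->timer == 0) {
            d->waiting = 0;
            if (sample != d->confirmed)
                d->confirmed = sample; /* still changed at t5: accept it */
        }
    } else if (sample != d->confirmed) {
        d->waiting = 1;             /* change detected (t2): start delay */
        d->timer = DELAY_SAMPLES;
    }
    return d->confirmed;
}
```

Feeding the FIG. 1 sequence D0..D5 = 1, 1, 0, x, x, 0 into `debounce_step` returns 1 for the first five samples and 0 at D5, matching the behavior described above.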
This method is usually applied to lower-speed signals from general-purpose input/output (GPIO) ports, but it can also be applied to higher-speed signals, such as signals transmitted between a CPU and a south bridge chip in a computer.
Please refer to FIG. 2. FIG. 2 is a schematic drawing of how the prior art processes the signal change time points of three pins P1, P2, P3. When a signal changes, the signal can be confirmed only after a period of delay. As FIG. 2 shows, the signal change time points of the three pins usually differ, so the traditional method maintains a separate delay time count for each pin P1, P2, P3, which requires dedicated hardware resources and timer-based software for every pin. However, since there may be hundreds of GPIO pins, the computer must waste considerable resources supporting them. Furthermore, whenever a GPIO pin is added, additional program code must be added, which is inefficient and prone to numerous errors.
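The per-pin bookkeeping criticized above can be sketched as follows. This is only an illustration of how the prior-art cost scales with pin count; `NUM_PINS`, the structure layout, and the servicing loop are assumptions, not taken from the disclosure.

```c
/* Illustrative sketch of the prior art of FIG. 2: every monitored pin
   carries its own timer state, so hundreds of GPIO pins mean hundreds
   of timers plus the software that must service all of them. */
#define NUM_PINS 3  /* P1, P2, P3 in FIG. 2; real systems may have hundreds */

typedef struct {
    int confirmed; /* confirmed value for this pin */
    int waiting;   /* nonzero while this pin's delay is running */
    int timer;     /* this pin's remaining delay count */
} pin_debounce_t;

static pin_debounce_t pins[NUM_PINS]; /* one timer per pin */

/* Called every sample period: each pin's timer must be checked and
   decremented individually, even when most pins are idle. */
static void service_all_pins(const int samples[NUM_PINS], int delay) {
    for (int i = 0; i < NUM_PINS; i++) {
        pin_debounce_t *p = &pins[i];
        if (p->waiting) {
            if (--p->timer == 0) {
                p->waiting = 0;
                if (samples[i] != p->confirmed)
                    p->confirmed = samples[i]; /* change persisted: accept */
            }
        } else if (samples[i] != p->confirmed) {
            p->waiting = 1;      /* change detected: start this pin's delay */
            p->timer = delay;
        }
    }
}
```

Because each pin's change point is independent, the loop above must run for every pin on every sample period, which is the resource cost the present invention seeks to avoid.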
Therefore, it is desirable to provide a method of signal noise reduction to mitigate and/or obviate the aforementioned problems.