The present invention relates generally to the field of data communication, and more particularly to the field of testing to verify that the bit error rate of a chip in a data communication system is acceptable.
Transmitter/receiver (Tx/Rx) chips transmit and receive digital data. Systems that use Ethernet, Fibre Channel, ATM, and other data communication standards that transmit bits serially through a single pair of wires often incorporate these chips. The Tx/Rx chips often include analog blocks to provide functions, such as clock multiplication and clock recovery, that cannot be performed fast enough digitally. The analog blocks introduce jitter, however, which can increase the bit error rate (BER), the ratio of errored bits to total bits transmitted.
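The ratio above can be stated concretely. The following is an illustrative sketch, not part of the invention; the function name and the sample bit sequences are hypothetical.

```python
def bit_error_rate(transmitted, received):
    """Return the fraction of bit positions that differ between the
    transmitted and recovered bit streams (errors per bit)."""
    assert len(transmitted) == len(received)
    errors = sum(t != r for t, r in zip(transmitted, received))
    return errors / len(transmitted)

tx = [1, 0, 1, 1, 0, 0, 1, 0]
rx = [1, 0, 1, 0, 0, 0, 1, 1]  # two of the eight bits corrupted
print(bit_error_rate(tx, rx))  # → 0.25
```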
FIG. 1 shows an example of a transmitted data stream with jitter. Ideally, the receiver circuit will sample the received data in the middle of each bit position because the signal is most stable at this point. The clocks used for sampling predict the middle of the bit position from edges (or level transitions) of previously-received data. Jitter in the data stream, however, can move edges of the signal and fool the clock into sampling the data stream either too soon or too late, causing errors in the recovered data stream.
For example, during period C of FIG. 1, an edge in the recovered data stream occurs before the corresponding edge of the transmitted data stream, resulting in jitter represented by -Δt₁. The opposite situation arises in period F, where the edge in the recovered data stream occurs after the corresponding edge of the transmitted data stream by an amount represented by Δt₂.
FIG. 1 also shows a situation where the jitter has caused a bit error. In period G, as in period F, the edge in the recovered data stream occurs after the corresponding edge of the transmitted data stream by an amount represented by Δt₃. The amount Δt₃ is great enough to push the rising edge in the recovered data stream past period G. As a result, the bit level is one in the transmitted data stream, but remains zero in the recovered data stream, resulting in an error.
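The mechanism in periods F and G can be sketched numerically: the receiver samples at the center of each bit period, so a rising edge delayed by more than half a bit period lands past the sampling point and the old level is read. This is a minimal illustration (times in unit intervals); the function and parameter names are hypothetical, not from the patent.

```python
BIT_PERIOD = 1.0  # one unit interval

def sample_bit(edge_time, bit_index, level_before=0, level_after=1):
    """Sample at the center of the given bit period and return the
    signal level seen there, given a single level transition at
    edge_time."""
    sample_time = (bit_index + 0.5) * BIT_PERIOD
    return level_after if edge_time <= sample_time else level_before

# Ideal edge at the start of bit 6 (cf. period G): the sample sees a one.
print(sample_bit(edge_time=6.0, bit_index=6))  # → 1
# Jitter delays the edge by 0.6 UI, past the sampling point: a bit error.
print(sample_bit(edge_time=6.6, bit_index=6))  # → 0
```

A delay smaller than half a bit period (as in period F) leaves the sample unchanged; only when the shift crosses the sampling point, as in period G, does an error result.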
Bit errors pose problems for modern data communication systems, which require high reliability at high speeds and at low cost. High reliability generally requires extensive testing of chips, but the need to keep costs low constrains the testing to a short time period with low-frequency equipment. As test times increase, so does the cost of testing. Also, equipment for testing at high frequencies is more expensive than equipment for testing at low frequencies.
Certain conventional tests ensure that the final IC package works properly when assembled on a circuit board, such as an Ethernet board, and confirm that the physical wiring on the board is working correctly. This process does not test the analog blocks separately, however, nor stress the chips to determine the extremes at which they can operate, and such limitations may not result in a fully tested product.
Another method for determining reliability tests the chips only long enough to provide significant indications of the chip's performance. Modern bit error rate requirements, however, may demand over an hour of testing for each chip. For instance, a system with a bit error rate of 10⁻¹² errors per bit or less at a system bit rate of 1 Gbit/s would generate at most 3.6 errors per hour. This testing time exceeds the time available to test the chip. Avoiding this expense requires a testing system that can, in a short period of time, reliably verify the bit error rate of a chip in a data communication system.
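The 3.6 errors-per-hour figure follows directly from the quoted rates. The arithmetic, as a back-of-the-envelope check:

```python
# At a BER of 1e-12 errors per bit and a line rate of 1 Gbit/s,
# how many errors does the system produce per hour?
bit_rate = 1e9           # bits per second (1 Gbit/s)
ber = 1e-12              # errors per bit
seconds_per_hour = 3600

# bits sent in one hour, times errors per bit
errors_per_hour = bit_rate * seconds_per_hour * ber
print(errors_per_hour)   # roughly 3.6 errors per hour
```

Observing a statistically meaningful number of errors at this rate would therefore require hours of test time per chip, which motivates the accelerated verification discussed here.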