Advances in semiconductor design and fabrication have resulted in integrated circuits (ICs) with a wide range of functions and a large number of inputs and outputs. Complex packages having hundreds of pins may be required for housing such integrated circuits. Hybrid circuits may also require packages with large pin counts.
In ATE systems, the device(s) being tested are typically mounted in a test socket that provides a signal path to the pins of the device under test (DUT). For purposes of this disclosure, a socket is intended to include fixtures used to provide an interface to devices that may not have physical “pins” (e.g., surface-mount devices or individual die).
An individual integrated circuit or hybrid circuit typically has a number of points of physical contact for signal input that mate to a test socket. This collection of points will be referred to as an “interface plane.” The physical points of the interface plane may be distributed in a three-dimensional space, and are not restricted to a geometric plane. The interface plane serves as a reference for the timing relationships for signals associated with the points (pins).
FIG. 1 shows a block diagram for a typical ATE test system 100. A DUT 105 with an array of pins 110 has a number of input pins coupled to test signal lines 115. Each of the signal lines 115 is coupled to a driver 125 belonging to an array of drivers 120. The array of drivers 120 is coupled to a controller 130 that determines the nature of the signals applied by the drivers, as well as the timing relationships between the signals.
In a test system, the signal lines 115 may have different lengths and may also have different impedance characteristics. The differences between the signal paths may alter the timing relationships between signals at the interface plane of the DUT 105, in comparison to the timing relationships at the output of the array of drivers 120. It is thus desirable to measure the differences in signal timing and apply corrections to the signal sources by calibrating the complete test setup.
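The magnitude of this skew can be estimated from first principles. The sketch below is illustrative only: the propagation velocity (roughly two-thirds the speed of light, typical for printed-circuit traces) is an assumption and is not taken from this disclosure.

```python
# Illustrative sketch: timing skew introduced by unequal signal-path
# lengths, assuming a propagation velocity of ~2/3 the speed of light
# (a typical figure for PCB traces; an assumption, not from this text).

C = 299_792_458.0          # speed of light in vacuum, m/s
VELOCITY = 2.0 / 3.0 * C   # assumed propagation velocity in the trace

def path_delay(length_m: float) -> float:
    """One-way propagation delay, in seconds, for a trace of given length."""
    return length_m / VELOCITY

def skew(length_a_m: float, length_b_m: float) -> float:
    """Timing skew, in seconds, between two paths of different lengths."""
    return abs(path_delay(length_a_m) - path_delay(length_b_m))

# A 10 cm length mismatch yields roughly half a nanosecond of skew.
print(skew(0.50, 0.60))
```

Even a modest length mismatch therefore produces skew on the order of the edge timing resolution the tester must guarantee, which motivates the calibration discussed next.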
Time domain reflectometry (TDR) is a common method for determining the timing relationships between pins of a circuit. In conventional TDR, the pins of the DUT 105 are all opened or shorted to ground and a test edge may be applied to each pin in turn. The impedance discontinuity of an open circuit will produce a positive reflection, whereas a short circuit will produce a negative reflection. The time required for the return of the reflected test edge at each pin provides information that may be used to adjust the timing of the input signals for the pins so that they arrive at the interface plane with the desired phase relationships.
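The correction step can be sketched as follows (the function names and example numbers are illustrative, not from this disclosure): each measured round-trip reflection time is halved to obtain a one-way delay, and each driver is then given an added delay so that every edge arrives at the interface plane simultaneously.

```python
# Sketch of the conventional TDR timing correction described above.
# Names and numbers are illustrative assumptions, not from the disclosure.

def one_way_delays(round_trip_times: dict[str, float]) -> dict[str, float]:
    """Halve each measured round-trip reflection time to get a one-way delay."""
    return {pin: t / 2.0 for pin, t in round_trip_times.items()}

def driver_offsets(round_trip_times: dict[str, float]) -> dict[str, float]:
    """Per-pin added driver delay: faster paths receive a larger added
    delay so all edges reach the interface plane at the same time."""
    delays = one_way_delays(round_trip_times)
    slowest = max(delays.values())
    return {pin: slowest - d for pin, d in delays.items()}

# Example round-trip times in nanoseconds for three hypothetical pins.
print(driver_offsets({"A1": 2.0, "A2": 2.6, "B1": 2.2}))
```

In this example the slowest path (A2) receives no added delay, while the faster paths are padded out to match it.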
Conventional TDR systems and techniques are limited in accuracy by the rise time of the test edge. Since the timing of a reflected edge is determined through the detection of the edge, there is an inherent ambiguity in the measured arrival time of an edge having a finite rise time.
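A standard first-order model (an illustrative assumption, not taken from this disclosure) makes the limitation concrete: for an approximately linear edge, an amplitude uncertainty dV on a swing V translates into a timing uncertainty of roughly rise_time × dV / V at the detection threshold.

```python
# Hedged first-order model of the rise-time limit: for a linear ramp,
# amplitude uncertainty at the threshold maps directly into timing
# uncertainty. An illustrative sketch, not from the disclosure.

def timing_ambiguity(rise_time_s: float, dv: float, swing: float) -> float:
    """Approximate edge-detection timing uncertainty for a linear ramp."""
    return rise_time_s * dv / swing

# A 1 ns edge with 5% amplitude uncertainty gives ~50 ps of ambiguity.
print(timing_ambiguity(1e-9, 0.05, 1.0))
```

Under this model, the only ways to tighten the measurement are a faster test edge or a cleaner threshold, which is precisely the accuracy limit noted above.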
Another difficulty with current systems is that, in addition to the differences in signal delay between different drivers and the interface plane, there are also typically differences in the lengths of the paths traveled by the signals used in TDR calibration. Signal paths are also commonly analyzed in a piecewise fashion, which produces a cumulative error that increases with the number of test segments.
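The growth of the piecewise error can be illustrated with an assumed error model (not from this disclosure): if each segment measurement carries an independent random error of standard deviation sigma, the end-to-end estimate over n segments has an RMS error of sigma × sqrt(n), while a systematic per-segment bias accumulates linearly as n × bias.

```python
# Illustrative error model for piecewise path analysis (an assumption,
# not from the disclosure): independent random errors add in quadrature,
# while a common per-segment bias adds linearly.

import math

def random_error_growth(sigma: float, n_segments: int) -> float:
    """RMS error of a sum of n independent segment measurements."""
    return sigma * math.sqrt(n_segments)

def systematic_error_growth(bias: float, n_segments: int) -> float:
    """Worst-case error when every segment carries the same bias."""
    return bias * n_segments

print(random_error_growth(1.0, 9))      # grows as sqrt(n)
print(systematic_error_growth(0.5, 9))  # grows as n
```

Either way, the aggregate error grows with the number of segments, which is why a calibration that measures the complete path at once is preferable.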