Circuit designers of multi-gigabit systems face a number of challenges as advances in technology mandate increased performance in high-speed components. For example, chip-to-chip data rates have traditionally been constrained by the bandwidth of the input/output (I/O) circuitry in each component. However, process enhancements (e.g., increased transistor bandwidth) and innovations in I/O circuitry have eased that constraint, forcing designers to also consider the effects of the transmission channels over which data is sent between the chips.
At a basic level, data transmission between components within a single semiconductor device or between two devices on a printed circuit board may be represented by the system 10 shown in FIG. 1A. In FIG. 1A, a transmitter 12 (e.g., a microprocessor) sends data over channel 16 (e.g., a copper trace on a printed circuit board or “on-chip” in a semiconductor device) to a receiver 14 (e.g., another processor or memory). When data is sent from an ideal transmitter 12 to a receiver 14 across an ideal (lossless) channel 16, all of the energy in a transmitted pulse will be contained within a single time cell, which is referred to hereinafter as a unit interval (UI).
However, real transmitters and real transmission channels do not exhibit ideal characteristics. Due to a number of factors, including, for example, the limited conductivity of copper traces, the dielectric medium of the printed circuit board (PCB), and the discontinuities introduced by vias, the initially well-defined digital pulse will tend to spread or disperse as it passes through the channel 16. This is shown in FIG. 1B. As shown, a single pulse of data 105a is sent by the transmitter 12 during a given UI (e.g., UI3). However, because of the effect of the channel 16, this data pulse becomes spread 105b over multiple UIs at the receiver 14, i.e., some portion of the energy of the pulse is observed outside of the UI in which the pulse was sent (e.g., in UI2 and UI4). This residual energy outside of the UI of interest may perturb a pulse otherwise occupying the neighboring UIs, in a phenomenon referred to as intersymbol interference (ISI).
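The dispersion of FIG. 1B can be illustrated with a simple discrete-time sketch. The single-pole low-pass response used here is a hypothetical stand-in for the lossy channel 16, and the sample counts are arbitrary; the point is only that a pulse confined to one UI at the transmitter deposits energy in neighboring UIs at the receiver:

```python
import numpy as np

# Hypothetical discrete-time sketch of pulse dispersion (cf. FIG. 1B).
# One unit interval (UI) = 8 samples; a single pulse is sent in UI3.
samples_per_ui = 8
num_uis = 6
pulse = np.zeros(num_uis * samples_per_ui)
pulse[3 * samples_per_ui:4 * samples_per_ui] = 1.0  # ideal pulse in UI3

# A simple single-pole low-pass impulse response stands in for the channel.
t = np.arange(4 * samples_per_ui)
h = np.exp(-t / 6.0)
h /= h.sum()  # normalize to unit DC gain

received = np.convolve(pulse, h)[:pulse.size]

# Energy observed per UI: an ideal channel confines it all to UI3;
# here part of it spills into UI4 and beyond (intersymbol interference).
energy = [np.sum(received[i * samples_per_ui:(i + 1) * samples_per_ui] ** 2)
          for i in range(num_uis)]
```

With these placeholder values, most of the received energy still falls in UI3, but a measurable fraction appears in UI4 and UI5, which is precisely the residual energy that perturbs neighboring symbols.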
Because of the potentially negative impact of ISI on the reliability of data transfer and detection at the receiver 14, such data transfer is often simulated in a computer system using simulation software. The design of a high-speed system 10 typically involves iterations of circuit-level simulation to ascertain whether the system 10 performs within suitable tolerances, which in turn requires a waveform vector suitable for simulation. Simulation is a valuable tool in the semiconductor industry, where it is generally very expensive to design and produce a given integrated circuit. The use of simulation software allows the circuit designer to verify the operation and margins of a circuit design before incurring the expense of actually building and testing the circuit. Through the use of simulations, design errors or risks are hopefully identified early in the design process and resolved prior to fabrication. Unfortunately, modeling and simulation of realistic waveforms that accurately reflect subtle characteristics of a signal is difficult. It is generally necessary to define a waveform in a circuit simulator such as SPICE™. This requires transistors, resistors, and other discrete components to be considered electronically, even if they have not yet actually been constructed or laid out. Such component-level consideration takes considerable time and effort.
Further, modeling and simulation may not provide a suitably accurate picture of how the system 10 will process real signals. Realistic data signals are not ideal; they suffer from various sources of amplitude noise and timing jitter, which may vary randomly from one unit interval to the next. Regardless of their source or type, the effects of amplitude noise and timing jitter are difficult to simulate quickly and efficiently in the context of a system 10.
The difficulty of simulating noise- or jitter-affected signals depends strongly on the characteristics of the degradation. Signals in any transmission medium experience both random and deterministic degradation. Random degradation, in the form of Gaussian-distributed amplitude noise and timing jitter stemming from thermal and shot noise, requires statistical quantification. Deterministic amplitude noise and timing jitter, by contrast, are linked to several sources, including power supply noise, inter-channel crosstalk, impedance discontinuities, component variance, and, at high frequencies, the response of the channel. These factors result in a variety of observable characteristics, ranging from periodicity to uncorrelated bounded randomness. Modeling these noise components correctly requires the ability to designate their probability distributions during the noise generation stage and then inject or superimpose these effects onto the underlying signal in a way that reflects what occurs in the actual system. The final success or robustness of a particular design depends, to a large measure, on the achieved realism of the simulation environment.
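As an illustrative sketch (not the disclosed technique itself), the superposition described above can be composed per edge: a Gaussian term for the random jitter, a sinusoidal term for a periodic deterministic source such as supply ripple, and a bounded uniform term for uncorrelated bounded effects. All magnitudes and the 10 Gb/s unit interval below are placeholder assumptions:

```python
import numpy as np

# Illustrative composition of a per-edge timing-jitter sequence from the
# component distributions named above; all values are placeholder seconds.
rng = np.random.default_rng(seed=1)
num_edges = 10_000
ui = 100e-12  # assumed 10 Gb/s unit interval

random_jitter = rng.normal(0.0, 1e-12, num_edges)            # thermal/shot (Gaussian)
periodic_jitter = 2e-12 * np.sin(2 * np.pi * np.arange(num_edges) / 37.0)  # e.g., supply ripple
bounded_jitter = rng.uniform(-0.5e-12, 0.5e-12, num_edges)   # uncorrelated bounded

total_jitter = random_jitter + periodic_jitter + bounded_jitter

# Ideal edge times perturbed by the superimposed jitter.
edge_times = np.arange(num_edges) * ui + total_jitter
```

Each component's distribution is designated independently at generation time and the terms are then summed, mirroring the injection of effects onto the underlying signal described above.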
To date, industry-standard simulators do not provide the level of amplitude noise and timing jitter generation control necessary to model a realistic communication channel, though some jitter-adding features have recently become available. Agilent's Advanced Design System (ADS) tool, Synopsys's Hspice, and Synapticad's WaveformerPRO all offer stock waveforms with additive jitter, but these features are limited in several ways. First, in the cases of ADS and Hspice, the jitter exhibited by the waveform may take only a few standard forms: it may exhibit either a Gaussian probability distribution or a periodic jitter distribution (e.g., a sinusoidal distribution), but combinations of random and periodic jitter distributions are limited both in the number of permitted jitter sources per signal and in the peak magnitude of the jitter. In addition, there is no clear mechanism for adding amplitude noise in the time domain. WaveformerPRO claims even less, allowing the user to define a peak-to-peak jitter value but offering no control over the statistical characteristics of the jitter. While all three tools provide jittery clock sources, only Agilent's tool allows jitter to be added to random data sequences. And while the random data may be manually altered by the user, the length of a user-defined sequence is limited to 2³²−1 bits. So while one can find clock and random data sources exhibiting a limited selection of jitter characteristics, a tool has yet to emerge that provides the user the ability to produce simulatable waveforms of arbitrary data patterns, of arbitrary length, exhibiting arbitrary jitter and amplitude noise characteristics.
It is possible to formulate piecewise linear functions (PWLs) with tools like Matlab, as well as within SPICE-based simulators, by carefully constructing time and voltage vectors in which a voltage amplitude is designated for each step in time. But approximating Gaussian-distributed noise and jitter, as well as other specific noise distributions, over hundreds or thousands of cycles using known methods is daunting.
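A minimal sketch of such a PWL construction follows, here for a short data pattern with Gaussian jitter applied to each transition edge. The bit pattern, rise time, jitter sigma, and voltage levels are hypothetical placeholders, and the output format is the generic time/voltage point list accepted by SPICE-based simulators:

```python
import numpy as np

# Minimal sketch: build a SPICE-style PWL (time, voltage) vector for a
# jittered data pattern. All parameter values are hypothetical.
rng = np.random.default_rng(seed=7)
bits = [0, 1, 1, 0, 1, 0, 0, 1]
ui = 200e-12        # unit interval
t_rise = 20e-12     # linear edge transition time
sigma_j = 3e-12     # Gaussian edge jitter (standard deviation)
v_hi, v_lo = 1.0, 0.0

times, volts = [0.0], [v_hi if bits[0] else v_lo]
for i in range(1, len(bits)):
    if bits[i] != bits[i - 1]:               # transition: place a jittered edge
        t_edge = i * ui + rng.normal(0.0, sigma_j)
        times += [t_edge, t_edge + t_rise]   # hold, then ramp over t_rise
        volts += [volts[-1], v_hi if bits[i] else v_lo]
times.append(len(bits) * ui)
volts.append(volts[-1])

# Emit one PWL point per line, one (time, voltage) pair each.
pwl = "\n".join(f"{t:.6e} {v:.3f}" for t, v in zip(times, volts))
```

Even in this toy form, every transition requires an explicit draw from the jitter distribution and two hand-placed PWL points; extending the same bookkeeping to thousands of cycles with multiple superimposed noise distributions is exactly the daunting exercise described above.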
Another challenge in simulating realistic signaling environments stems from the underlying statistical assumption that sufficient samples of the behavior to be characterized are readily available; it is therefore becoming necessary to include more and more cycles in each simulation. Because each individual noise component is very small relative to the overall cycle period, fine voltage and timing resolution are necessary. While the timing resolution of a simulation may be enhanced by decreasing the time span between each calculation (i.e., the simulated time step), doing so increases both the simulation run time and the memory requirement. When fine simulation resolution is coupled with a greater number of simulated cycles, the result is an enormous amount of data and prohibitively lengthy simulation times. It is not uncommon for transistor-level transient (time-based) simulations to run for hours or even days, or to fail outright due to a lack of memory resources.
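A back-of-the-envelope calculation, using assumed illustrative numbers rather than figures from any particular simulator, shows how quickly the data volume grows when fine resolution meets long simulations:

```python
# Assumed illustrative numbers: fine time resolution combined with many
# simulated cycles inflates transient-simulation data volume.
ui = 100e-12          # assumed 100 ps unit interval (10 Gb/s)
time_step = 10e-15    # assumed 10 fs simulated time step (to resolve ~ps jitter)
num_cycles = 100_000  # cycles needed for meaningful jitter statistics

points_per_node = int(round(ui / time_step)) * num_cycles
bytes_per_node = points_per_node * 8  # one double-precision value per point

# 10,000 points per UI times 100,000 UIs is one billion points, i.e.,
# roughly 8 GB of waveform data for a single circuit node.
```

Multiplying by the dozens or hundreds of nodes monitored in a transistor-level simulation makes the memory-exhaustion failures noted above unsurprising.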
Perhaps the strongest argument for developing full signals with noise and jitter is the impact of ISI, mentioned briefly above. While unbounded Gaussian noise and jitter lead to long-term bit errors, ISI and the corresponding data-dependent jitter (DDJ) may, depending upon the bandwidth of the channel, dominate the short-term signal degradation. Recent papers have proposed methods for predicting the DDJ distribution from the relationship between the data rate and the channel bandwidth; see J. Buckwalter et al., "Predicting Data-Dependent Jitter," IEEE Transactions on Circuits and Systems II: Express Briefs, vol. 51, no. 9, pp. 453-457 (September 2004).
With the foregoing background in hand, it should be apparent that an improved signal simulation technique would at least allow for the generation of signals of various kinds and lengths, with good computational efficiency, and allow for the formation of a signal for simulation in which amplitude noise and timing jitter of any resolution are easily and realistically modeled. The disclosed techniques achieve such results in a manner easily implemented in a typical computerized system or other computerized circuit simulation software package.