Due to the complexity of modern electronic circuitry, designs for electronic circuits are often subjected to a rigorous verification process in order to detect errors. Often, simulation of the design for the circuitry is used to help locate and detect errors in the design and thus debug and/or verify the correctness of the design. Simulation of a design may occur at low levels of abstraction, e.g., at a “switch-level.” Switch-level simulations typically include active circuit elements (e.g., transistors) and passive circuit elements (e.g., resistors, capacitors, and inductors).
Simulation also may occur at a higher level of abstraction, e.g., a “behavioral level.” Behavioral level simulations typically use a hardware description language (HDL) that describes the functionality of a single circuit element or group of circuit elements. Examples of typical behavioral level simulation languages are Very High Speed Integrated Circuit HDL (VHDL) and Verilog. Using Verilog, for example, electronic circuitry is described as a set of modules, each of which is part of a design of a circuit, circuits, or a device. Modules written in an HDL may be used to represent circuit elements of hardware ranging from simple gates to complete systems.
A design, e.g., a simulation or logic design, may be simulated on more than one type of simulator in order to facilitate verification of the design and/or the simulator. For example, a design may be simulated on a hardware accelerator and/or a software simulator. The hardware accelerator is a dedicated computer designed for a specific purpose, such as verifying designs. A software simulator is a computer program (which may be written in an HDL) that may run on a standard computer.
Often, a design is first simulated on a reference simulator, which provides reference, or “golden,” data. The golden data is used as a reference or standard (i.e., an “expected” output) for later simulations and is compared against the results of test simulators. These simulations may be performed on a different type of simulator and/or after the design has undergone modifications or re-implementation.
FIG. 1 shows a system for verifying a simulation on two simulators, namely, a reference simulator and a test simulator. The system includes a design (20), user data (22), a compiler (28), a simulation image (30), a reference simulator (32), a test simulator (34), a golden data repository (40), and a comparator (42).
The design (e.g., Verilog modules) (20) and the user data (22) are compiled by the compiler (28) producing a simulation image (30), i.e., a binary executable file. The design (20) includes instructions for simulating circuit elements by executing the simulation image (30) on the reference simulator (32) and the test simulator (34). The user data (22) includes stimuli (24) and comparison data (26).
The stimuli (24) include instructions for stimulating (or exercising) particular portions of a simulation image (30) during the simulation in order to generate the test data (38) and the golden data (36). For example, the design (20) may include a particular circuit element, such as a gate. A designer may include instructions in the stimuli (24) so that a stimulus (e.g., a value of “1”) is sent as an input to the gate at a particular point of the simulation (e.g., 500 seconds into the simulation), or during a particular cycle of the simulation (e.g., the 100th cycle of the simulation). The effect of the stimulus (such as the value of “1”) generates a particular output of the gate, which may be included in the test data (38) and/or the golden data (36).
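The stimulus mechanism described above can be illustrated with a minimal sketch. The `Stimulus` class and `apply_stimuli` function below are hypothetical names chosen for illustration, not part of any actual simulator; the sketch assumes signal values are held in a simple dictionary keyed by signal name.

```python
# Hypothetical sketch: scheduling a stimulus for a particular simulation cycle.
# The names Stimulus and apply_stimuli are illustrative only.
from dataclasses import dataclass

@dataclass
class Stimulus:
    signal: str   # name of the signal to drive (e.g., a gate input)
    cycle: int    # simulation cycle at which to apply the value
    value: int    # value driven onto the signal

def apply_stimuli(stimuli, cycle, signals):
    """Drive every stimulus scheduled for this cycle onto the signal map."""
    for s in stimuli:
        if s.cycle == cycle:
            signals[s.signal] = s.value
    return signals

# A designer's stimulus: drive a "1" onto the gate's input at the 100th cycle.
stimuli = [Stimulus(signal="gate_in", cycle=100, value=1)]
signals = apply_stimuli(stimuli, cycle=100, signals={})
```

The gate's resulting output at that cycle is what would then be captured in the test data and/or golden data.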
The comparison data (26) includes instructions that specify particular values of interest to the designer for comparison purposes. For example, the designer may want to compare the reference simulator output of the aforementioned gate at a particular point of the simulation to the test simulator output of the aforementioned gate at that point of the simulation. The comparison data (26) includes instructions from the user data (22) to specify which signals of the design simulation are used for comparison purposes and at what point the signals are compared.
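A comparison-data entry, as described above, essentially names a signal and the point at which it is to be compared. The sketch below is an assumed representation (a list of signal/cycle pairs, with waveforms stored as nested dictionaries); the function name `select_for_comparison` is illustrative.

```python
# Hypothetical comparison data: each entry names a signal of interest and the
# cycle at which the reference and test outputs should be compared.
comparison_data = [
    {"signal": "gate_out", "cycle": 100},
    {"signal": "gate_out", "cycle": 500},
]

def select_for_comparison(waveform, comparison_data):
    """Pick out the values named by the comparison data from a waveform record.

    waveform maps a signal name to a {cycle: value} dictionary.
    """
    return [(c["signal"], c["cycle"], waveform[c["signal"]][c["cycle"]])
            for c in comparison_data]

# Select the gate's recorded output at the two cycles of interest.
selected = select_for_comparison({"gate_out": {100: 1, 500: 0}}, comparison_data)
```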
Executing the simulation image (30) on the reference simulator (32) generates golden data (36), which is stored in the golden data repository (40). Executing the simulation image (30) on the test simulator (34) generates test data (38). The comparator (42) receives test data (38) and the golden data (36) as input. The golden data is typically obtained from the golden data repository (40).
The comparator (42) compares the test data (38) to the golden data (36) and generates output in the form of a comparison result, which may be used to evaluate (i.e., debug and/or verify) the test simulator (34) and/or the design (20). For example, for the aforementioned gate, the comparator (42) may use the comparison data (26) to compare the output of the gate at the 100th cycle of a simulation for both the reference simulator (32) and the test simulator (34). If the comparison shows a difference between the output of the gate on the reference simulator (32) and the output of the gate on the test simulator (34), this may be an indication that the test simulator is defective.
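The comparator's role can be sketched as follows. This is a minimal illustration, assuming golden and test data are stored as `{signal: {cycle: value}}` dictionaries; the `compare` function and the data shapes are assumptions, not an actual comparator implementation.

```python
# Minimal comparator sketch: report every (signal, cycle) where the test
# simulator's output differs from the golden (reference) output.
def compare(golden, test, comparison_data):
    mismatches = []
    for entry in comparison_data:
        sig, cyc = entry["signal"], entry["cycle"]
        g = golden[sig][cyc]   # expected value from the reference simulator
        t = test[sig][cyc]     # observed value from the test simulator
        if g != t:
            mismatches.append((sig, cyc, g, t))
    return mismatches

# Golden data says the gate output is 1 at cycle 100; the test run produced 0.
golden = {"gate_out": {100: 1}}
test = {"gate_out": {100: 0}}
result = compare(golden, test, [{"signal": "gate_out", "cycle": 100}])
```

A non-empty result here corresponds to the mismatch described above, which may indicate a defect in the test simulator or the design.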
Often, the design (20) is simulated on the test simulator (34) well after the design (20) is simulated on the reference simulator (32). For example, after the design (20) is first simulated on the reference simulator (32), the test simulator (34) is re-implemented (e.g., written in a different computer language, debugged, etc.) and then the design is simulated on the test simulator (34) days, weeks, months, etc. later. In this example, the resulting test data (38) generated by the test simulator (34) is compared to the same golden data (36) previously generated by the reference simulator (32).
FIG. 2 shows a flowchart for a simulation verification using the system shown in FIG. 1. Initially, the design and user data for the design are obtained (Step 80). Typically, the design and user data are created by a designer or team of designers using industry-standard tools, such as a testbench, designed for design verification.
Then, the design is compiled (with the user data) to generate a simulation image (Step 82). The simulation image is loaded onto the reference simulator (Step 84) and simulated (Step 86). Once the simulation is complete on the reference simulator, the golden data from the simulation is stored in the golden data repository (Step 88). Generally, the golden data is a record of values of signals (i.e., waveforms) from the simulation. For example, a particular waveform may represent the output value of a gate at specific points during the simulation.
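The waveform record described above can be sketched as a simple data structure. This is illustrative only: real golden data repositories typically use dedicated waveform file formats, and the `store_waveform` helper and `run_001` identifier are assumptions.

```python
# Sketch of golden data as a waveform record: for each signal, the value
# observed at each sampled cycle of the reference simulation.
golden_data = {
    "gate_out": {100: 1, 500: 0},   # gate output at cycles 100 and 500
}

def store_waveform(repository, run_id, waveforms):
    """Record a run's waveforms in a simple in-memory 'golden data repository'."""
    repository[run_id] = waveforms
    return repository

repo = store_waveform({}, "run_001", golden_data)
```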
Once the golden data has been stored, the simulation image is loaded onto the test simulator (Step 90) and simulated (Step 92). As a result of simulation on the test simulator, test data is generated. From the test data and the golden data, the comparator selects golden data and test data for comparison according to the instructions of the comparison data (which are included in the simulation image as executable code) (Step 94). The selected golden data and selected test data are then compared to obtain a comparison result (Step 96).
Next, the designer debugs the test simulator and/or design using the comparison result (Step 98). For example, the designer may detect an error if a mismatch between selected test data and selected golden data occurs, after which the designer may correct and recompile the test simulator, and continue verification of the simulation.
If the designer wishes to change the stimuli and/or comparison data, the designer may create new user data and recompile the design to generate another simulation image. Likewise, while a simulation is running on the test simulator, if the designer needs to revise stimuli or comparison data for the simulation, the designer may halt the ongoing simulation, revise the stimuli and/or comparison data, recompile the design and user data, and begin simulation again.