
Contents

- Introducing Comparescan
  - Features
  - Applications
  - Organization of this Guide
- Getting Started
  - Starting Comparescan
  - Specifying Input Files
    - Specifying Simulation Results Databases for Comparison
    - Specifying a Comparescan Rules File
    - Specifying a Comparescan Error Database for Viewing
  - Using Automatically Generated Files
    - Using the State File
    - Using the Preferences File
    - Using X Resources
  - Understanding Comparescan Exit Status Codes
- Comparescan Tutorial
  - Accessing the Tutorial Source Files
    - Simulation Results Databases
    - Comparescan Rules Files
  - Generating a Comparescan Error Database
  - Viewing a Comparescan Error Database
    - Viewing Errors in Hierarchical Order
    - Viewing Errors in Signalscan Waves
    - Viewing Multiple Objects in Signalscan Waves
    - Viewing Errors in Time Order
  - Exiting Comparescan
  - Giving More Information in the Rules File
    - Reviewing the demo.rules Rules File
    - Using the demo.rules Rules File
    - Viewing Clocked Miscompares in Signalscan Waves
- Comparescan Graphical User Interface
  - Comparescan Window
  - Menu Bar
    - File Menu
    - Tools Menu
    - View Menu
    - Window Menu
  - Button Bar
- Application Examples
  - Making Absolute Comparisons
    - Specifying the Filename of an Object
    - Comparing Two Objects from the Same File
    - Comparing Two Objects on a Condition
  - Making Clocked Comparisons
    - Using One Clock and Two Simulation Files
    - Using One Clock and One Simulation File
    - Using Two Clocks and Two Simulation Files
    - Using Two Clocks and One Simulation File
    - Performing a Clock Compare with Timing Checks
  - Checking Stability
  - Specifying Objects in the Design Hierarchy
  - Command Option Examples
    - Specifying Start and End Times for Golden and Test Files
    - Specifying Start and End Times for Comparison
    - Specifying a Time Shift
    - Specifying compare Command Tolerance Windows
  - Special Syntax Examples
    - Commenting
    - Continuing Long Lines
    - Specifying Both a Golden and a Test File
    - Specifying Filenames Using UNIX Filename Syntax
- Writing Comparescan Rules
  - Default Comparison Rules
  - Absolute and Clocked Comparisons
  - Rules File Parsing and Execution
- Command Syntax and Options
  - General Command Syntax
  - Command Summary
  - General Command Options
  - datafile1
  - datafile2
  - compare
  - clkdef
  - clkcompare
  - stability
  - sequencetime
  - nosequencetime
  - statemapping
  - threshold
  - report
  - savedata
  - loaddata
  - translate
- Frequently Asked Questions
  - Comparescan Input
  - Comparescan Output
    - How do I view a comparison report that I have generated?
    - Can I print a report of my errors?
    - Can I run the tool in batch mode?
  - Graphical User Interface (GUI)
    - How do I run the program without the user interface?
    - Why can’t Comparescan allocate colors?
    - How do I change the colors in the GUI?
    - Why can’t I read the text in the GUI?
  - Comparescan Features
    - How can I compare a digital simulation with an analog simulation?
    - How do I match an X value?
    - How do I compare only the top-level signals in my design?
    - How do I compare modules at different levels in the hierarchy?
    - What if I want a different tolerance window for every signal?
  - Comparescan Error Messages
    - What does “Out of Memory” mean?
    - What does “fork failed” mean?
- Index

Comparescan User Guide
Comparescan Tutorial
The entire_design.rules file is a text file that contains a rule that compares all of the signals in any input design:
compare .
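The argument to compare names the point in the design hierarchy at which the comparison starts, and . denotes the entire design, so this single rule compares every signal. A rules file can contain any number of such rules. The sketch below is illustrative only: the hierarchical paths are hypothetical, borrowing the hello scope and o1 signal from this tutorial, and the authoritative path syntax is described in the Writing Comparescan Rules chapter:
compare .
compare hello
compare hello.o1
In this tutorial, only the single compare . rule is used.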
Therefore, when your comparescan command completes, your current working directory will contain the following additional files:
■ entire_design.csd is a binary Comparescan error database.
■ entire_design.rpt is a text version of the information in the Comparescan error database.
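Both files were written by the batch run in the previous step, which passed the two simulation results databases and the rules file to comparescan together with the -save option, which is what wrote the entire_design.csd error database. The command line below is only a sketch: golden.trn and test.trn are placeholder names for the tutorial's simulation files, and the exact argument order is given in the Getting Started chapter:
comparescan golden.trn test.trn entire_design.rules -save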
The entire_design.rpt file contains the following comparison information in text format:
■ A list of the objects that were compared and the mismatch errors that occurred
■ A list of the objects that could not be compared and the reason why each object could not be compared
■ The number of objects that were compared
■ The number of objects that matched
■ The number of objects that did not match
■ The total number of each kind of mismatch error and the line number in the rules file of the rule that resulted in each type of error
■ A list of the mismatches that occurred and the simulation time at which each mismatch occurred
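Because entire_design.rpt is plain text, you can also scan it with standard UNIX tools before opening the GUI. The pattern in the sketch below is a guess at the report's wording, so substitute whatever term your report actually uses:
grep -n mismatch entire_design.rpt | more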
You can load both of your simulation results databases into Signalscan Waves, select all of the mismatched objects in both designs, organize the mismatches in the Waveform window, and then scroll to the mismatch times given in the text report. However, if you view the binary error database in the Comparescan GUI, you can do all of the above tasks automatically, as described below.
Viewing a Comparescan Error Database
At the operating system command prompt, enter:
comparescan entire_design.csd
This opens the Comparescan GUI and loads the error database that you created with the -save option in the previous step. As shown in Figure 3-1, the error summary and individual errors are listed just as they are in the text report (entire_design.rpt), but in this case, you can click on an individual error to display the mismatch in the Signalscan Waves window.
Figure 3-1 Comparescan Window Showing entire_design.csd
The first line of text in the Comparescan window is the Summary message. This message tells you how many signals were processed, how many of the processed signals matched, and how many of the processed signals did not match.
The Info messages give you information about the comparisons, such as the reason why a given comparison could not be performed, and provide a summary of the types of errors that were found.
The Error messages list compare errors such as missing objects. Detailed mismatch errors for each object are reported below the Info error summaries.
Viewing Errors in Hierarchical Order
By default, detailed errors are displayed hierarchically, as shown in Figure 3-2.
This example shows 67 absolute compare errors: 30 in scope hello and 37 in scope top. The letter M that appears before each scope in the list indicates that these objects are modules.
To view the compare errors that occurred under scope hello:
1. Click on the scope name hello.
The list of signals and sub-scopes in scope hello that contain compare errors is displayed, as shown in Figure 3-2. The number of compare errors for each signal or sub-scope appears to the left of its name.
You can see that the object named c1 is a module because of the M symbol next to its name. You can also see that the object named o1 is a signal or variable because of the waveform symbol next to its name.
Figure 3-2 View Objects in a Subscope
2. Click on the signal o1.
The detailed list of compare errors for this signal is displayed, as shown in Figure 3-3. The golden simulation signal name is given first, then the secondary simulation signal name, and then the time window in which the compare error occurred.