For verification purposes, a reference model is developed and used in the result validation methodology described in this report.
Background
IC Design Flow
These system specifications are often provided by customers and are known as the high-level representation of the system. The design team works with the verification team and performs behavioral simulations to verify the functional and logical behavior of the circuit.
Verification and Validation in IC Design Flow
After the physical design is completed, the physical layout of the system is obtained and physical verification is performed to confirm that the layout correctly implements the intended design. After the prototype passes these verifications, the design flow enters the final phase, where packaging and testing are carried out.
Problem Statements
For a general-purpose processor design, the coverage of the functional verification performed should include all functionalities implemented, through multiple stages of simulation, verification, and pre-tape-out evaluation (Gupta and Harakchand, 2014). By structuring the functional verification components into modular pieces such as sequences, these components, also known as Verification Intellectual Property (VIP), can be reused for verification at different levels and even across different projects.
Aims and Objectives
Report Overview
RISC-V Computer Architecture
RISC-V Base Instruction Set Architecture
The basic architecture of the RISC-V instruction set also includes 32 general-purpose registers in the system, which are 32 bits wide and can be used without any restrictions, with the exception of the x0 register, which is physically grounded and returns zero whenever it is read. Each general-purpose register among the 32 registers has an alias that corresponds to their use in the standard RISC-V Application Binary Interface (ABI).
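The x0 behavior described above can be sketched in a few lines. This is a behavioral model in Python (the project's own reference model is written elsewhere, and this sketch is only illustrative): writes to x0 are discarded, so reads from it always return zero.

```python
# Behavioral sketch of the RISC-V register file: 32 general-purpose
# registers, each 32 bits wide, with x0 hard-wired to zero.
class RegisterFile:
    def __init__(self):
        self.regs = [0] * 32  # x0..x31

    def read(self, index):
        # x0 always reads as zero because writes to it are discarded.
        return self.regs[index] & 0xFFFFFFFF

    def write(self, index, value):
        # Writes to x0 are silently ignored.
        if index != 0:
            self.regs[index] = value & 0xFFFFFFFF

rf = RegisterFile()
rf.write(0, 123)   # attempt to overwrite x0 has no effect
rf.write(5, 42)    # x5 (ABI alias t0)
print(rf.read(0))  # 0
print(rf.read(5))  # 42
```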
RISC-V Computer Organization
- Program Counter
- Instruction Memory
- Register File
- Control Unit
- ALU Control Unit
- Arithmetic Logic Unit
- Data Memory Unit
- Immediate Generation Unit
The instruction address obtained from the program counter is sent to the instruction memory to fetch the corresponding 32-bit instruction code from program memory.
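The fetch step described above can be sketched as follows, assuming a word-per-address instruction memory and sequential flow; the instruction encodings are illustrative and not taken from the report.

```python
# Minimal sketch of instruction fetch: the program counter supplies a
# byte address, the instruction memory returns the 32-bit instruction
# word, and the PC advances by 4 (one word) for the next fetch.
def fetch(pc, imem):
    instr = imem[pc]      # 32-bit instruction code at address pc
    next_pc = pc + 4      # sequential flow: one instruction = 4 bytes
    return instr, next_pc

imem = {0: 0x00500093, 4: 0x00008067}  # illustrative instruction words
instr, pc = fetch(0, imem)
```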
Datapath Flow
- R-Type Instruction Datapath Flow
- I-Type Instruction Datapath Flow
- S-Type Instruction Datapath Flow
- SB-Type Instruction Datapath Flow
- U-Type Instruction Datapath Flow
- UJ-Type Instruction Datapath Flow
The program counter provides the instruction address used to access and retrieve the corresponding instruction (add) from the instruction memory. The program counter provides the instruction address used to access the corresponding instruction (branch if equal) from the instruction memory.
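Tracing either datapath starts from the same decode step: the fetched 32-bit word is split into its instruction fields. A sketch of that field extraction for RV32I, using the standard bit positions:

```python
# Extract the RV32I instruction fields used when following the R-type
# (e.g. add) and SB-type (e.g. beq) datapaths.
def decode_fields(word):
    return {
        "opcode": word & 0x7F,
        "rd":     (word >> 7)  & 0x1F,
        "funct3": (word >> 12) & 0x7,
        "rs1":    (word >> 15) & 0x1F,
        "rs2":    (word >> 20) & 0x1F,
        "funct7": (word >> 25) & 0x7F,
    }

# add x3, x1, x2 encodes to 0x002081B3 in standard RV32I.
fields = decode_fields(0x002081B3)
```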
Processor Pipelining
- Pipeline Implementation
- Structural Hazards
- Data Hazards
- Control Hazards
- Data Forwarding
- Pipeline Stalling
- Pipeline Flushing
For example, the pipeline register that separates the instruction fetch (IF) and instruction decode (ID) stages is named the IF/ID pipeline register. Flushing the information on the pipeline can be done by loading zero values into the pipeline registers.
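The flush-by-zeroing idea above can be sketched by modelling a pipeline register as a set of named fields; loading zeros turns the in-flight instruction into a bubble. Field names here are illustrative.

```python
# Sketch of pipeline flushing: zeroing every field of a pipeline
# register (here IF/ID) replaces the in-flight instruction with a bubble.
IF_ID = {"instr": 0x002081B3, "pc": 8}  # illustrative contents

def flush(pipeline_reg):
    for field in pipeline_reg:
        pipeline_reg[field] = 0  # zero values act as a nop/bubble

flush(IF_ID)
```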
Summarized Review for RISC-V Computer Architecture
Functional Verification
- SystemVerilog as Functional Verification Language
- Functional Verification Requirements
- Functional Verification Technologies
- Functional Verification Approaches
- Summarized Review for Functional Verification
Reusability of the verification methodology is one of the most emphasized aspects of functional verification. Automating test case execution allows verification to be performed without manually entering input data into the design under test. In simulation-based verification, inputs are fed into the design under test one at a time, with the scoreboard verifying the correctness of the design's behavior.
Therefore, the project will use simulation-based verification, whereby input stimuli are generated and driven to the design. Since manual test stimulus generation is involved, this form of verification allows specific functionality of the design to be tested. It provides insight into features of the design that have not yet been adequately verified and tracks the functional coverage of the verification process.
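The drive-and-compare loop described above can be sketched with a trivial stand-in operation; `ref_xor` and `dut_xor` are hypothetical placeholders for the reference model and the design under test, not names from the project.

```python
# Sketch of simulation-based checking: stimuli are driven one at a
# time and each response of the design under test is compared against
# the reference model's expected value.
def ref_xor(a, b):
    return (a ^ b) & 0xFFFFFFFF       # reference (expected) behavior

def dut_xor(a, b):
    return (a ^ b) & 0xFFFFFFFF       # assume the DUT is correct here

stimuli = [(0x0F, 0xF0), (0xFFFFFFFF, 0xFFFFFFFF), (1, 2)]
mismatches = sum(1 for a, b in stimuli if dut_xor(a, b) != ref_xor(a, b))
```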
Emulation-based verification verifies the gate-level model or RTL representation of the design mapped onto an FPGA through emulation.
Universal Verification Methodology
UVM Testbench Architecture
UVM provides class libraries that supply generic utilities such as configuration databases, transaction-level modeling, and component hierarchy. These building blocks allow for the rapid development of well-constructed, reusable verification components and test environments. In a typical UVM verification environment, the environment can be built using readily available UVM classes.
UVM class components have a well-established standard communication infrastructure, allowing verification components to send data packets between each other and work synchronously (Chip Verify, n.d.).
UVM Component Class
- UVM Testbench
- UVM Test
- UVM Environment
- UVM Agent
- UVM Sequencer
- UVM Driver
- UVM Monitor
- UVM Scoreboard
The agent is another hierarchical component that groups the verification components dealing with a specific design under test interface. The sequencer is a verification component that generates sequence items as data transactions and sends them to the driver component for execution. The driver is an active component within the verification environment that drives the sequence items obtained from the sequencer to the design under test via the interface.
The sequence items received by the driver component are mapped to signal-level formats compatible with the interface and driven to the design under test. The monitor captures signals from the design under test at the interface and converts them into transaction-level sequence items. From the data transactions received from the monitor component via an analysis port, the actual values produced by the design under test are compared against the expected values.
The input stimuli to be driven to the design under test are also sent to the reference model.
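The scoreboard flow just described can be sketched in Python (the project itself uses SystemVerilog/UVM, so this is only an illustration of the idea): expected transactions from the reference model and actual transactions from the monitor are queued and compared in order.

```python
from collections import deque

# Sketch of the scoreboard: two in-order streams of transactions are
# collected and compared pairwise.
expected_q = deque()
actual_q = deque()

def write_expected(txn):
    expected_q.append(txn)   # from the reference model

def write_actual(txn):
    actual_q.append(txn)     # from the monitor's analysis port

def compare_all():
    errors = 0
    while expected_q and actual_q:
        if expected_q.popleft() != actual_q.popleft():
            errors += 1
    return errors

write_expected({"rd": 3, "value": 7})
write_actual({"rd": 3, "value": 7})
errors = compare_all()
```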
UVM Transaction Base Class
- UVM Sequence Items
- UVM Sequence
Verification Flow
- Design Specification
- Testbench Architecture Planning
- Functional Verification Environment
- Reference Model Development
- Simulation and Verification
- Verification Analysis
Since the project uses UVM for verification, a UVM testbench architecture is used for the design verification. The reference model provides the expected data for functional verification of the design under test. Simulation-based verification is performed to check the functionality of the design under test.
After the simulation-based verification phase, verification analysis is performed to determine the adequacy and thoroughness of the design verification. Design verification requires extensive knowledge and understanding of the functionalities of the design to be verified. The revised implementation enables synchronized operation between the reference model and the design under test.
When sufficient verification has been performed on the design, the design verification proceeds to the verification close phase.
Project Timeline
INSTR_TYPE=
When a test case simulation is repeated, the verification environment searches the test repository for pre-existing test cases. This feature can also be bypassed with the command-line argument "+FORCE_GEN", forcing the verification environment to execute the instruction code generation function and update the test repository. The test log documentation contains the instructions executed and the internal states of the pipeline registers.
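The repository lookup with a forced-regeneration bypass can be sketched as below; the repository is modelled as a dict keyed by seed, and the generation step is a hypothetical stand-in for the actual instruction code generation function.

```python
# Sketch of test-repository reuse: return a stored test case for a seed
# unless regeneration is forced (mirroring the "+FORCE_GEN" plusarg).
def get_test(seed, repo, force_gen=False):
    if not force_gen and seed in repo:
        return repo[seed]                  # reuse pre-existing test case
    repo[seed] = f"test_for_seed_{seed}"   # regenerate and update repo
    return repo[seed]

repo = {7: "cached_test_7"}
t_reuse = get_test(7, repo)                 # hits the repository
t_fresh = get_test(7, repo, force_gen=True) # bypasses and regenerates
```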
Self-Checking Bug Detection
To test the fault detection capability of the implemented design verification methodology, faults are deliberately introduced into the design under test. With one such change to the source code, the program counter of the design under test increments by only 2; as a result of this modification, the design under test produces an incorrect target address.
The testbench identifies the discrepancy in the register address accessed by the design under test and provides relevant information for debugging. When an arithmetic right-shift instruction is executed, the ALU of the design under test produces an incorrect result. Because the stall control signal is not acknowledged, the pipeline flow of the design under test is not stopped, as observed through the resulting program counter mismatch.
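The arithmetic right-shift fault can be reproduced by swapping a sign-extending shift for a logical one, which is a plausible form of the injected bug (the report does not state the exact change, so this is an assumption). On a negative operand the two behaviors diverge, which the scoreboard then flags.

```python
# Correct arithmetic right shift on 32-bit values: the sign bit is
# replicated into the vacated positions.
def sra32(value, amount):
    signed = value - 0x100000000 if value & 0x80000000 else value
    return (signed >> amount) & 0xFFFFFFFF

# Buggy variant: a logical shift zero-fills instead of sign-extending.
def srl32(value, amount):
    return (value & 0xFFFFFFFF) >> amount

# For 0x80000000 >> 4 the results differ (0xF8000000 vs 0x08000000),
# so a reference-model comparison detects the fault.
```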
Constrained-Random Verification
In cases where no logical errors are detected in the outcome, such detailed analysis may be unnecessary. The test log provides a more accessible view of specific internal conditions of the design, allowing test result analysis and verification to be performed without waveform debugging. Test cases are generated and stored in the test repository by listing the test seed values in "SEED.txt".
By including the "-coverage" argument, functional coverage and code coverage analysis can be performed on the running simulation. The command-line argument "+BATCH_TEST" allows all test seeds generated and listed in "SEED.txt" to be run as a batch test. Including all test seeds allows a comprehensive coverage analysis of the constrained-random verification performed.
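The batch flow over the seed file can be sketched as follows; `run_sim` is a hypothetical stand-in for the actual simulator invocation, and the seed values are illustrative.

```python
# Sketch of batch testing: read every seed listed in SEED.txt and run
# one simulation per seed, collecting the results.
def run_batch(seed_lines, run_sim):
    results = {}
    for line in seed_lines:
        seed = line.strip()
        if seed:                      # skip blank lines in the file
            results[seed] = run_sim(seed)
    return results

# In practice seed_lines would come from open("SEED.txt").
results = run_batch(["101\n", "202\n", "\n"], run_sim=lambda s: "PASS")
```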
Since the case statements in the reference model design all assign a specific value for every expected case, the default branches are expected never to be exercised.
Conclusion
For the functional verification of the reference model, a self-checking mechanism was introduced in the UVM scoreboard component. The self-checking mechanism performs functional verification for major design functionalities, including pipeline establishment, pipeline flushing, and implemented instruction functionality. Intentional errors were also introduced in the design under test and reference model to test the error detection capability of the verification environment.
Simulation results were collected and explained to exercise the self-checking mechanism and to test the bug detection capability. For constrained-random verification, different test seeds with different specifications were created and tested. The unexercised code is the result of default cases in multiple case statements, where all expected cases have been assigned appropriately.
The project is considered complete, with sufficient functional verification indicated by full functional coverage and high code coverage.
Recommendations
Aside from using a higher-confidence reference model, another recommendation is to partition the verification process into multiple levels. Lower-level verification may place more emphasis on the correctness of each unit's functionality, while higher-level verification may place more emphasis on the overall correctness and interconnection of the lower-level components. In this project, the functional coverage criteria include pipeline stalling, pipeline flushing, and the instructions executed.
The lack of complex functional coverage points or cross-coverage points makes full functional coverage easy to achieve. Better-planned functional coverage would allow more complex test case scenarios to be included in the test plan, leading to better verification. A more complex functional coverage criterion would also necessitate a more sophisticated test generator algorithm.
Finally, formal property verification can also be included for specific properties of the RISC-V microprocessor architecture, such as pipeline stalling and pipeline flushing.
- ADD: Perform addition on the contents stored in source registers rs1 and rs2 and store the result in destination register rd.
- SUB: Perform subtraction on the contents stored in source registers rs1 and rs2 and store the result in destination register rd.
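The addition and subtraction descriptions above can be modelled on 32-bit values as a behavioral sketch of the reference semantics (RV32I wraps on overflow rather than trapping); this is not the report's own reference model code.

```python
# Reference semantics for 32-bit register-register add and subtract:
# results wrap modulo 2^32, matching RV32I behavior.
MASK = 0xFFFFFFFF

def rv_add(rs1, rs2):
    return (rs1 + rs2) & MASK   # wraparound, no overflow trap

def rv_sub(rs1, rs2):
    return (rs1 - rs2) & MASK
```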