Functional verification

Functional verification is the task of verifying that a logic design conforms to its specification. [1] Functional verification attempts to answer the question "Does this proposed design do what is intended?" [2] This is a complex task that consumes the majority of time and effort (up to 70% of design and development time) [1] in most large electronic system design projects. Functional verification is part of the more encompassing design verification, which, besides functional verification, considers non-functional aspects like timing, layout and power. [3]

Background

Although the number of transistors on a chip has increased exponentially according to Moore's law, the number of engineers and the time available to produce designs have increased only linearly. As design complexity increases, the number of coding errors also increases. Most logic coding errors come from careless coding (12.7%), miscommunication (11.4%), and microarchitecture challenges (9.3%). [1] Electronic design automation (EDA) tools were therefore developed to keep up with the growing complexity of transistor design. Hardware description languages such as Verilog and VHDL were introduced together with the EDA tools. [1]

Functional verification is very difficult because of the sheer volume of possible test cases that exist in even a simple design. Frequently there are more than 10^80 possible tests to comprehensively verify a design – far more than could ever be run in a lifetime. The effort is equivalent to program verification, which is NP-hard or even worse, and no solution has been found that works well in all cases. It can, however, be attacked by many methods. None of them are perfect, but each can be helpful in certain circumstances.
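
To see where such numbers come from: exhaustively testing a design whose inputs and internal state together span n bits requires 2^n test cases. A design with roughly 266 such bits therefore already admits 2^266 ≈ 10^80 input combinations – on the order of the estimated number of atoms in the observable universe – and real designs routinely carry far more state than that.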

Types

There are three types of functional verification, namely dynamic functional, hybrid dynamic functional/static, and static verification. [1]

Simulation-based verification (also called 'dynamic verification') is widely used to "simulate" the design, since this method scales up very easily. Stimulus is provided to exercise each line in the HDL code. A test bench is built to functionally verify the design by applying meaningful scenarios and checking that, given certain inputs, the design performs to specification.

A simulation environment is typically composed of several types of components: a stimulus generator that produces input vectors, a driver that applies them to the design under test, a monitor that observes the design's outputs, and a checker that compares those outputs against expected behavior, as sketched below.
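
The sketch below illustrates this structure in SystemVerilog for a hypothetical 8-bit adder; the module, signal names, and reference model are assumptions made for illustration, not a standard environment.

```systemverilog
// Minimal simulation environment sketch for a hypothetical 8-bit adder DUT.
module adder (input  logic [7:0] a, b,
              output logic [8:0] sum);
  assign sum = a + b;
endmodule

module tb;
  logic [7:0] a, b;
  logic [8:0] sum;

  adder dut (.a(a), .b(b), .sum(sum));  // design under test

  initial begin
    repeat (100) begin
      // Stimulus generator + driver: apply random input vectors.
      a = $urandom_range(0, 255);
      b = $urandom_range(0, 255);
      #1;  // let combinational logic settle
      // Monitor + checker: compare the DUT output with a reference model.
      if (sum !== a + b)
        $error("mismatch: %0d + %0d gave %0d", a, b, sum);
    end
    $display("simulation finished");
    $finish;
  end
endmodule
```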

Different coverage metrics are defined to assess that the design has been adequately exercised. These include functional coverage (has every functionality of the design been exercised?), statement coverage (has each line of HDL been exercised?), and branch coverage (has each direction of every branch been exercised?).
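
As an illustration, functional coverage is commonly collected with SystemVerilog covergroups. The sketch below assumes the same two 8-bit inputs as above; the bins are chosen purely for illustration and do not constitute a real coverage plan.

```systemverilog
// Functional coverage sketch: has the stimulus exercised the corner cases?
module cov_demo;
  logic [7:0] a, b;

  covergroup adder_cg;
    cp_a : coverpoint a { bins zero = {0}; bins max = {255}; bins mid = {[1:254]}; }
    cp_b : coverpoint b { bins zero = {0}; bins max = {255}; bins mid = {[1:254]}; }
    axb  : cross cp_a, cp_b;  // combinations of both inputs' corner cases
  endgroup

  adder_cg cg = new();

  initial begin
    repeat (1000) begin
      a = $urandom;
      b = $urandom;
      cg.sample();  // record which coverage bins this stimulus hit
    end
    $display("functional coverage: %0.1f%%", cg.get_coverage());
  end
endmodule
```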

Related Research Articles

Software testing is the act of examining the artifacts and the behavior of the software under test by validation and verification. Software testing can also provide an objective, independent view of the software to allow the business to appreciate and understand the risks of software implementation. Test techniques include, but are not necessarily limited to, static analysis of the code and requirements and dynamic execution of the program with selected test inputs.

VHDL – Hardware description language

The VHSIC Hardware Description Language (VHDL) is a hardware description language (HDL) that can model the behavior and structure of digital systems at multiple levels of abstraction, ranging from the system level down to that of logic gates, for design entry, documentation, and verification purposes. Since 1987, VHDL has been standardized by the Institute of Electrical and Electronics Engineers (IEEE) as IEEE Std 1076, the latest version of which is IEEE Std 1076-2019. To model analog and mixed-signal systems, an IEEE-standardized HDL based on VHDL called VHDL-AMS has been developed.

In computer engineering, a hardware description language (HDL) is a specialized computer language used to describe the structure and behavior of electronic circuits, and most commonly, digital logic circuits.

System on a chip – Micro-electronic component

A system on a chip or system-on-chip is an integrated circuit that integrates most or all components of a computer or other electronic system. These components almost always include on-chip central processing unit (CPU), memory interfaces, input/output devices and interfaces, and secondary storage interfaces, often alongside other components such as radio modems and a graphics processing unit (GPU) – all on a single substrate or microchip. SoCs may contain digital and also analog, mixed-signal and often radio frequency signal processing functions.

Application-specific integrated circuit – Integrated circuit customized for a specific task

An application-specific integrated circuit is an integrated circuit (IC) chip customized for a particular use, rather than intended for general-purpose use – for example, a chip designed to run in a digital voice recorder or a high-efficiency video codec. Application-specific standard product chips are intermediate between ASICs and industry standard integrated circuits like the 7400 series or the 4000 series. ASICs are typically fabricated as metal–oxide–semiconductor (MOS) integrated circuit chips.

Electronic design automation (EDA), also referred to as electronic computer-aided design (ECAD), is a category of software tools for designing electronic systems such as integrated circuits and printed circuit boards. The tools work together in a design flow that chip designers use to design and analyze entire semiconductor chips. Since a modern semiconductor chip can have billions of components, EDA tools are essential for their design; this article in particular describes EDA specifically with respect to integrated circuits (ICs).

In the context of hardware and software systems, formal verification is the act of proving or disproving the correctness of a system with respect to a certain formal specification or property, using formal methods of mathematics. Formal verification is a key incentive for formal specification of systems, and is at the core of formal methods. It represents an important dimension of analysis and verification in electronic design automation and is one approach to software verification. The use of formal verification enables the highest Evaluation Assurance Level (EAL7) in the framework of Common Criteria for computer security certification.
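
In hardware design, the properties to be proven are often written as assertions. The following is a sketch of a SystemVerilog assertion that a formal tool could prove exhaustively over all reachable states; the request/grant handshake and the four-cycle window are illustrative assumptions, not taken from any particular design.

```systemverilog
// Property sketch for a hypothetical arbiter handshake.
module arbiter_props (input logic clk, rst_n, req, gnt);
  // Every request must be granted within 1 to 4 clock cycles.
  property p_req_granted;
    @(posedge clk) disable iff (!rst_n)
      req |-> ##[1:4] gnt;
  endproperty
  a_req_granted : assert property (p_req_granted);
endmodule
```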

In software project management, software testing, and software engineering, verification and validation (V&V) is the process of checking that a software system meets specifications and requirements so that it fulfills its intended purpose. It may also be referred to as software quality control. It is normally the responsibility of software testers as part of the software development lifecycle. In simple terms, software verification is: "Assuming we should build X, does our software achieve its goals without any bugs or gaps?" On the other hand, software validation is: "Was X what we should have built? Does X meet the high-level requirements?"

Formal equivalence checking process is a part of electronic design automation (EDA), commonly used during the development of digital integrated circuits, to formally prove that two representations of a circuit design exhibit exactly the same behavior.
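
As a toy illustration, the two modules below describe the same 2-to-1 multiplexer at different levels of abstraction; an equivalence checker would formally prove that their outputs agree for every input. Both modules are assumptions made for this example, not drawn from any particular design flow.

```systemverilog
// Behavioral RTL description of a 2-to-1 multiplexer.
module mux_rtl (input logic sel, a, b, output logic y);
  assign y = sel ? a : b;
endmodule

// Gate-level description of the same function.
module mux_gates (input logic sel, a, b, output logic y);
  logic n_sel, t1, t2;
  not g0 (n_sel, sel);
  and g1 (t1, sel, a);
  and g2 (t2, n_sel, b);
  or  g3 (y, t1, t2);
endmodule
```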

SystemVerilog – Hardware description and hardware verification language

SystemVerilog, standardized as IEEE 1800, is a hardware description and hardware verification language used to model, design, simulate, test and implement electronic systems. SystemVerilog began as a set of extensions to Verilog; since 2008, Verilog has been part of the same IEEE standard. It is commonly used in the semiconductor and electronic design industry as an evolution of Verilog.

Integrated circuit design – Engineering process for electronic hardware

Integrated circuit design, or IC design, is a sub-field of electronics engineering, encompassing the particular logic and circuit design techniques required to design integrated circuits, or ICs. ICs consist of miniaturized electronic components built into an electrical network on a monolithic semiconductor substrate by photolithography.

Runtime verification is a computing system analysis and execution approach based on extracting information from a running system and using it to detect, and possibly react to, observed behaviors satisfying or violating certain properties. Some very particular properties, such as freedom from data races and deadlocks, are typically desired to be satisfied by all systems and may be best implemented algorithmically. Other properties can be more conveniently captured as formal specifications. Runtime verification specifications are typically expressed in trace predicate formalisms, such as finite state machines, regular expressions, context-free patterns, and linear temporal logics, or extensions of these. This allows for a less ad hoc approach than normal testing. However, any mechanism for monitoring an executing system is considered runtime verification, including verifying against test oracles and reference implementations. When formal requirements specifications are provided, monitors are synthesized from them and infused within the system by means of instrumentation.

Runtime verification can be used for many purposes, such as security or safety policy monitoring, debugging, testing, verification, validation, profiling, fault protection, and behavior modification. It avoids the complexity of traditional formal verification techniques, such as model checking and theorem proving, by analyzing only one or a few execution traces and by working directly with the actual system; it thus scales up relatively well and gives more confidence in the results of the analysis, at the expense of less coverage. Moreover, through its reflective capabilities runtime verification can be made an integral part of the target system, monitoring and guiding its execution during deployment.

Logic simulation is the use of simulation software to predict the behavior of digital circuits described in hardware description languages. Simulation can be performed at varying degrees of physical abstraction, such as at the transistor level, gate level, register-transfer level (RTL), electronic system level (ESL), or behavioral level.

Intelligent Verification, including intelligent testbench automation, is a form of functional verification of electronic hardware designs used to verify that a design conforms to specification before device fabrication. Intelligent verification uses information derived from the design and specification(s) to expose bugs in and between hardware IPs. Intelligent verification tools require considerably less engineering effort and user guidance to achieve verification results that meet or exceed the standard approach of writing a testbench program.

In the automated design of integrated circuits, signoff checks is the collective name given to a series of verification steps that the design must pass before it can be taped out. This implies an iterative process involving incremental fixes across the board using one or more check types, and then retesting the design. There are two types of signoff: front-end signoff and back-end signoff. After back-end signoff, the chip goes to fabrication. After listing all the features in the specification, the verification engineer writes coverage for those features to identify bugs and sends the RTL design back to the designer. Bugs, or defects, can include issues like missing features and errors in design. When coverage reaches the target percentage, the verification team signs off the design. By using a methodology like UVM, OVM, or VMM, the verification team develops a reusable environment. Nowadays, UVM is more popular than the others.

Construction and Analysis of Distributed Processes

CADP is a toolbox for the design of communication protocols and distributed systems. CADP is developed by the CONVECS team at INRIA Rhone-Alpes and connected to various complementary tools. CADP is maintained, regularly improved, and used in many industrial projects.

Random test generators are a type of computer software that is used in functional verification of microprocessors. Their primary use lies in providing input stimulus to a device under test.
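
At their core, such generators rely on constrained randomization. The sketch below shows the idea in SystemVerilog; the transaction fields and constraints are illustrative assumptions, not the interface of any real tool.

```systemverilog
// Constrained-random stimulus sketch for a hypothetical bus transaction.
class bus_txn;
  rand bit [31:0] addr;
  rand bit [7:0]  data;
  // Constrain stimulus to a legal, interesting region of the input space.
  constraint c_addr { addr inside {[32'h0000_0000 : 32'h0000_FFFF]}; }
  constraint c_data { data != 8'h00; }
endclass

module gen_demo;
  initial begin
    bus_txn t = new();
    repeat (5) begin
      if (!t.randomize()) $error("randomization failed");
      $display("addr=%h data=%h", t.addr, t.data);
    end
  end
endmodule
```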

Device driver synthesis and verification

Device drivers are programs which allow software or higher-level computer programs to interact with a hardware device. These software components act as a link between the devices and the operating systems, communicating with each of these systems and executing commands. They provide an abstraction layer for the software above and also mediate the communication between the operating system kernel and the devices below.

Random testing is a black-box software testing technique in which programs are tested by generating random, independent inputs. The outputs are compared against the software specification to verify whether each test passes or fails. When no specification is available, the exceptions of the language are used instead: if an exception arises during test execution, there is a fault in the program. Random testing is also used as a way to avoid biased testing.

This article discusses a set of tactics useful in software testing. It is intended as a comprehensive list of tactical approaches to software quality assurance (more widely known simply as quality assurance) and general application of the test method.

References

  1. Molina, A.; Cadenas, O. (8 September 2006). "Functional verification: approaches and challenges". Latin American Applied Research. 37. ISSN 0327-0793. Archived from the original on 16 October 2022. Retrieved 12 October 2022.
  2. Rezaeian, Banafsheh. "Simulation and Verification Methodology of Mixed Signal Automotive ICs". CiteSeerX 10.1.1.724.527.
  3. Stroud, Charles E.; Chang, Yao-Chang (2009). "Chapter 1 – Introduction". Design Verification. pp. 1–38. doi:10.1016/B978-0-12-374364-0.50008-4. ISBN 978-0-12-374364-0. Archived from the original on 12 October 2022. Retrieved 11 October 2022.