FAN algorithm


The FAN algorithm is an algorithm for automatic test pattern generation (ATPG). It was invented in 1983 by Hideo Fujiwara and Takeshi Shimono at the Department of Electronic Engineering, Osaka University, Japan. [1] It was the fastest ATPG algorithm of its time and was subsequently adopted by industry. FAN reduces the number of backtracks during search by adopting new heuristics such as unique sensitization and multiple backtracing. [1] Unique sensitization determines as many signal values as possible that are uniquely implied by the current state of the search. Multiple backtracing traces more than one path concurrently, which is more efficient than backtracing along a single path. To keep the number of backtracks small, it is important to detect the nonexistence of a solution as early as possible: once it is known that no solution exists, the algorithm should backtrack immediately and avoid the subsequent unnecessary search. These heuristics lead to early detection of inconsistencies and so decrease the number of backtracks. The FAN algorithm has been covered in several books [2] [3] [4] and in many conference papers, such as those of the ACM/IEEE Design Automation Conference. [5]
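As a rough illustration of multiple backtracing, the following Python sketch traces several (signal, value) objectives backward through a toy gate-level netlist in a single pass. The netlist format and function names are invented for this example; the actual algorithm also weights requests by controllability measures and stops backtracing at head lines rather than at primary inputs.

    from collections import defaultdict, deque

    # Each gate: output signal -> (gate type, input signals).  Invented toy netlist.
    NETLIST = {
        "f": ("AND", ["c", "d"]),
        "c": ("OR",  ["a", "b"]),
        "d": ("NOT", ["b"]),
    }

    def multiple_backtrace(objectives):
        # Trace all (signal, desired value) objectives backward at once,
        # counting how often each primary input is asked for 0 or 1.
        counts = defaultdict(lambda: [0, 0])   # input -> [requests for 0, requests for 1]
        queue = deque(objectives)
        while queue:
            sig, val = queue.popleft()
            if sig not in NETLIST:             # primary input: record the request
                counts[sig][val] += 1
                continue
            gtype, ins = NETLIST[sig]
            if gtype == "NOT":
                queue.append((ins[0], 1 - val))
            else:
                # AND/OR, simplified heuristic: forward the request to every
                # input (exact for the controlling cases, approximate otherwise)
                queue.extend((i, val) for i in ins)
        return dict(counts)

    # Requiring f = 1 asks the fanout stem "b" for both 0 (via the NOT) and
    # 1 (via the OR).  The conflict surfaces before any search decision is
    # made, which is exactly what helps avoid backtracks.
    print(multiple_backtrace([("f", 1)]))      # {'a': [0, 1], 'b': [1, 1]}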


Related Research Articles

A* is a graph traversal and pathfinding algorithm, which is used in many fields of computer science due to its completeness, optimality, and optimal efficiency. Given a weighted graph, a source node and a goal node, the algorithm finds the shortest path from source to goal.
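A compact sketch of A* in Python over a small, invented weighted graph with an admissible heuristic:

    import heapq

    def a_star(graph, h, start, goal):
        # graph: node -> [(neighbor, edge_cost)]; h: admissible heuristic estimate.
        frontier = [(h[start], 0, start, [start])]     # (f, g, node, path)
        best_g = {start: 0}
        while frontier:
            f, g, node, path = heapq.heappop(frontier)
            if node == goal:
                return g, path
            for nxt, cost in graph.get(node, []):
                ng = g + cost
                if ng < best_g.get(nxt, float("inf")):
                    best_g[nxt] = ng
                    heapq.heappush(frontier, (ng + h[nxt], ng, nxt, path + [nxt]))
        return None

    graph = {"s": [("a", 1), ("b", 4)], "a": [("b", 2), ("g", 5)], "b": [("g", 1)]}
    h = {"s": 3, "a": 2, "b": 1, "g": 0}
    print(a_star(graph, h, "s", "g"))   # (4, ['s', 'a', 'b', 'g'])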

Constraint satisfaction problems (CSPs) are mathematical questions defined as a set of objects whose state must satisfy a number of constraints or limitations. CSPs represent the entities in a problem as a homogeneous collection of finite constraints over variables, which is solved by constraint satisfaction methods. CSPs are the subject of research in both artificial intelligence and operations research, since the regularity in their formulation provides a common basis to analyze and solve problems of many seemingly unrelated families. CSPs often exhibit high complexity, requiring a combination of heuristics and combinatorial search methods to be solved in a reasonable time. Constraint programming (CP) is the field of research that specifically focuses on tackling these kinds of problems. Additionally, the Boolean satisfiability problem (SAT), satisfiability modulo theories (SMT), mixed integer programming (MIP) and answer set programming (ASP) are all fields of research focusing on the resolution of particular forms of the constraint satisfaction problem.
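Backtracking search is the classic complete method for CSPs. The following minimal Python sketch, run on an invented three-vertex graph-coloring instance, shows the assign/backtrack structure:

    def solve(variables, domains, constraint, assignment=None):
        # constraint(var1, val1, var2, val2) -> True if the pair is consistent.
        assignment = assignment or {}
        if len(assignment) == len(variables):
            return assignment
        var = next(v for v in variables if v not in assignment)
        for val in domains[var]:
            if all(constraint(var, val, v2, a2) for v2, a2 in assignment.items()):
                result = solve(variables, domains, constraint, {**assignment, var: val})
                if result:
                    return result
        return None   # no value works: the caller backtracks

    # Color a triangle's vertices so that adjacent vertices differ.
    edges = {("x", "y"), ("y", "z"), ("x", "z")}
    adjacent = lambda a, b: (a, b) in edges or (b, a) in edges
    differ = lambda v1, a, v2, b: a != b if adjacent(v1, v2) else True
    print(solve(["x", "y", "z"], {v: ["r", "g", "b"] for v in "xyz"}, differ))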

ATPG is an electronic design automation method or technology used to find an input sequence that, when applied to a digital circuit, enables automatic test equipment to distinguish between the correct circuit behavior and the faulty circuit behavior caused by defects. The generated patterns are used to test semiconductor devices after manufacture, or to assist with determining the cause of failure. The effectiveness of ATPG is measured by the number of modeled defects, or fault models, detectable and by the number of generated patterns. These metrics generally indicate test quality and test application time. ATPG efficiency is another important consideration that is influenced by the fault model under consideration, the type of circuit under test, the level of abstraction used to represent the circuit under test, and the required test quality.
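As a toy illustration of the task (not of any particular ATPG algorithm), the following Python sketch finds, by brute force, an input vector on which a two-gate circuit and a copy of it with an assumed stuck-at-0 fault disagree; practical algorithms such as FAN avoid this exhaustive enumeration:

    from itertools import product

    def circuit(a, b, c, stuck=None):
        # y = (a AND b) OR c; stuck=("n1", v) forces the internal net n1 to v,
        # modelling a stuck-at fault at the AND gate's output.
        n1 = a & b
        if stuck and stuck[0] == "n1":
            n1 = stuck[1]
        return n1 | c

    def find_test(stuck):
        # Look for an input vector that distinguishes the fault-free
        # circuit from the faulty one.
        for a, b, c in product([0, 1], repeat=3):
            if circuit(a, b, c) != circuit(a, b, c, stuck):
                return (a, b, c)
        return None   # the fault is undetectable

    # (1, 1, 0): a = b = 1 activates the fault, c = 0 makes it observable.
    print(find_test(("n1", 0)))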


In computer science, a node d of a control-flow graph dominates a node n if every path from the entry node to n must go through d. Notationally, this is written as d dom n. By definition, every node dominates itself.
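Dominator sets can be computed as an iterative data-flow fixpoint of dom(n) = {n} ∪ ⋂ dom(p) over the predecessors p of n. A short Python sketch on an invented control-flow graph:

    def dominators(cfg, entry):
        nodes = set(cfg)
        preds = {n: {p for p in cfg if n in cfg[p]} for n in nodes}
        dom = {n: set(nodes) for n in nodes}       # start from "everything dominates"
        dom[entry] = {entry}
        changed = True
        while changed:                             # iterate to a fixpoint
            changed = False
            for n in nodes - {entry}:
                new = {n} | (set.intersection(*(dom[p] for p in preds[n]))
                             if preds[n] else set())
                if new != dom[n]:
                    dom[n], changed = new, True
        return dom

    # entry -> a -> b and entry -> b: "a" does not dominate "b".
    cfg = {"entry": ["a", "b"], "a": ["b"], "b": []}
    print(dominators(cfg, "entry"))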

In theoretical computer science, the subgraph isomorphism problem is a computational task in which two graphs G and H are given as input, and one must determine whether G contains a subgraph that is isomorphic to H. Subgraph isomorphism is a generalization of both the maximum clique problem and the problem of testing whether a graph contains a Hamiltonian cycle, and is therefore NP-complete. However certain other cases of subgraph isomorphism may be solved in polynomial time.

In computer science, symbolic execution (also symbolic evaluation or symbex) is a means of analyzing a program to determine what inputs cause each part of a program to execute. An interpreter follows the program, assuming symbolic values for inputs rather than obtaining actual inputs as normal execution of the program would. It thus arrives at expressions in terms of those symbols for expressions and variables in the program, and constraints in terms of those symbols for the possible outcomes of each conditional branch. Finally, the possible inputs that trigger a branch can be determined by solving the constraints.
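The following hand-rolled Python miniature conveys the idea: it enumerates the path constraints of a small branching function and brute-forces a tiny input domain for a witness of each path. Real symbolic executors pass the collected constraints to an SMT solver instead of searching:

    def paths():
        # Path constraints for:
        #     if x > 5:
        #         if x < 10: A
        #         else:      B
        #     else:          C
        return {
            "A": [lambda x: x > 5, lambda x: x < 10],
            "B": [lambda x: x > 5, lambda x: not (x < 10)],
            "C": [lambda x: not (x > 5)],
        }

    for label, constraints in paths().items():
        witness = next((x for x in range(-50, 50)
                        if all(c(x) for c in constraints)), None)
        print(label, "reached when x =", witness)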

In computer science and mathematical optimization, a metaheuristic is a higher-level procedure or heuristic designed to find, generate, tune, or select a heuristic that may provide a sufficiently good solution to an optimization problem or a machine learning problem, especially with incomplete or imperfect information or limited computation capacity. Metaheuristics sample a subset of solutions which is otherwise too large to be completely enumerated or otherwise explored. Metaheuristics may make relatively few assumptions about the optimization problem being solved and so may be usable for a variety of problems.
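Simulated annealing is one classic metaheuristic; this bare-bones Python sketch minimizes an arbitrary bumpy objective, with all parameters chosen purely for illustration:

    import math, random

    def anneal(f, x=0.0, temp=5.0, cooling=0.995, steps=5000):
        best_x, best = x, f(x)
        for _ in range(steps):
            cand = x + random.uniform(-1, 1)
            delta = f(cand) - f(x)
            # Accept improvements always, worsenings with a temperature-
            # dependent probability; this lets the search escape local minima.
            if delta < 0 or random.random() < math.exp(-delta / temp):
                x = cand
                if f(x) < best:
                    best_x, best = x, f(x)
            temp *= cooling
        return best_x, best

    bumpy = lambda x: x * x + 3 * math.sin(5 * x)   # many local minima
    print(anneal(bumpy))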


Model-based testing is an application of model-based design for designing and optionally also executing artifacts to perform software testing or system testing. Models can be used to represent the desired behavior of a system under test (SUT), or to represent testing strategies and a test environment.

The graph isomorphism problem is the computational problem of determining whether two finite graphs are isomorphic.
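For very small graphs the definition can be checked directly by trying every vertex bijection, as in this Python sketch over two invented example graphs:

    from itertools import permutations

    def isomorphic(g, h):
        # g, h: adjacency sets keyed by vertex.  Brute force: only for tiny graphs.
        if len(g) != len(h):
            return None
        gv, hv = list(g), list(h)
        for perm in permutations(hv):
            m = dict(zip(gv, perm))
            # m is an isomorphism iff it maps edges to edges and non-edges
            # to non-edges.
            if all((m[b] in h[m[a]]) == (b in g[a]) for a in g for b in g):
                return m
        return None

    p3 = {1: {2}, 2: {1, 3}, 3: {2}}                   # a path on 3 vertices
    q3 = {"a": {"b"}, "b": {"a", "c"}, "c": {"b"}}     # the same graph, relabeled
    print(isomorphic(p3, q3))                          # e.g. {1: 'a', 2: 'b', 3: 'c'}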

Anatoly Abramovich Shalyto is a Russian scientist, doctor of sciences, and professor. In 2008 he received an award from the Russian government for his achievements in education and for his development of "Switch-technology", a technology for automata-based programming. He is also an initiator of the Open Project Documentation Initiative.

In electronic design, wire routing, commonly called simply routing, is a step in the design of printed circuit boards (PCBs) and integrated circuits (ICs). It builds on a preceding step, called placement, which determines the location of each active element of an IC or component on a PCB. After placement, the routing step adds wires needed to properly connect the placed components while obeying all design rules for the IC. Together, the placement and routing steps of IC design are known as place and route.

Placement is an essential step in electronic design automation: the portion of the physical design flow that assigns exact locations for various circuit components within the chip's core area. An inferior placement assignment will not only affect the chip's performance but might also make it non-manufacturable by producing excessive wire-length, which is beyond available routing resources. Consequently, a placer must perform the assignment while optimizing a number of objectives to ensure that a circuit meets its performance demands. Together, the placement and routing steps of IC design are known as place and route.

In computer science and operations research, a memetic algorithm (MA) is an extension of the traditional genetic algorithm (GA) or, more generally, of an evolutionary algorithm (EA). It may provide a sufficiently good solution to an optimization problem. It uses a suitable heuristic or local search technique to improve the quality of solutions generated by the EA and to reduce the likelihood of premature convergence.
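A minimal Python sketch of the idea on the one-max toy problem: a plain genetic algorithm whose offspring are additionally refined by a single greedy bit-flip local-search step (the "meme"); the population size, rates, and problem are all illustrative:

    import random

    N, POP, GENS = 30, 12, 40
    fit = lambda s: sum(s)                 # one-max: count the 1-bits

    def one_flip(s):
        # The "meme": one greedy bit-flip local-search step.
        i = random.randrange(N)
        t = s.copy()
        t[i] ^= 1
        return t if fit(t) > fit(s) else s

    pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(POP)]
    for _ in range(GENS):
        nxt = []
        for _ in range(POP):
            a = max(random.sample(pop, 3), key=fit)    # tournament selection
            b = max(random.sample(pop, 3), key=fit)
            cut = random.randrange(1, N)
            child = a[:cut] + b[cut:]                  # one-point crossover
            if random.random() < 0.1:
                child[random.randrange(N)] ^= 1        # mutation
            nxt.append(one_flip(child))                # local refinement
        pop = nxt
    print(max(map(fit, pop)), "of", N)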

Search-based software engineering (SBSE) applies metaheuristic search techniques such as genetic algorithms, simulated annealing and tabu search to software engineering problems. Many activities in software engineering can be stated as optimization problems. Optimization techniques of operations research such as linear programming or dynamic programming are often impractical for large scale software engineering problems because of their computational complexity or their assumptions on the problem structure. Researchers and practitioners use metaheuristic search techniques, which impose few assumptions on the problem structure, to find near-optimal or "good-enough" solutions.


A rapidly exploring random tree (RRT) is an algorithm designed to efficiently search nonconvex, high-dimensional spaces by randomly building a space-filling tree. The tree is constructed incrementally from samples drawn randomly from the search space and is inherently biased to grow towards large unsearched areas of the problem. RRTs were developed by Steven M. LaValle and James J. Kuffner Jr. They easily handle problems with obstacles and differential constraints and have been widely used in autonomous robotic motion planning.
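A bare 2-D RRT sketch in Python on the unit square with no obstacles; the step size and iteration budget are arbitrary, and a collision check would be inserted where indicated:

    import math, random

    def rrt(start, goal, step=0.05, iters=2000, goal_tol=0.05):
        tree = {start: None}                        # node -> parent
        for _ in range(iters):
            sample = (random.random(), random.random())
            near = min(tree, key=lambda n: math.dist(n, sample))
            d = math.dist(near, sample)
            if d == 0:
                continue
            # Extend the nearest node a fixed step toward the sample.
            new = (near[0] + step * (sample[0] - near[0]) / d,
                   near[1] + step * (sample[1] - near[1]) / d)
            # An obstacle/collision check on the segment near -> new belongs here.
            tree[new] = near
            if math.dist(new, goal) < goal_tol:     # reached: walk parents back
                path = [new]
                while tree[path[-1]] is not None:
                    path.append(tree[path[-1]])
                return path[::-1]
        return None

    path = rrt((0.1, 0.1), (0.9, 0.9))
    print(len(path) if path else "no path found", "waypoints")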

In computer science and formal methods, a SAT solver is a computer program which aims to solve the Boolean satisfiability problem. Given a formula over Boolean variables, such as "(x or y) and (x or not y)", a SAT solver outputs whether the formula is satisfiable, meaning that there are possible values of x and y which make the formula true, or unsatisfiable, meaning that there are no such values. For the example formula, setting x to true satisfies both clauses, so the solver should return "satisfiable". Since the introduction of algorithms for SAT in the 1960s, modern SAT solvers have grown into complex software artifacts involving a large number of heuristics and program optimizations to work efficiently.
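A tiny DPLL-style solver in Python, with CNF clauses given as lists of integer literals (positive for a variable, negative for its negation); the example encodes the formula above with x = 1 and y = 2:

    def sat(clauses, assignment=None):
        assignment = assignment or {}
        # Simplify: drop satisfied clauses, prune literals that are false.
        simplified = []
        for clause in clauses:
            vals = [assignment.get(abs(l)) for l in clause]
            if any(v is not None and v == (l > 0) for l, v in zip(clause, vals)):
                continue                      # clause already true
            rest = [l for l, v in zip(clause, vals) if v is None]
            if not rest:
                return None                   # clause false: conflict, backtrack
            simplified.append(rest)
        if not simplified:
            return assignment                 # every clause satisfied
        var = abs(simplified[0][0])           # branch on a free variable
        for val in (True, False):
            result = sat(simplified, {**assignment, var: val})
            if result is not None:
                return result
        return None

    # (x or y) and (x or not y), with x = 1 and y = 2.
    print(sat([[1, 2], [1, -2]]))             # {1: True}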

Test compression is a technique used to reduce the time and cost of testing integrated circuits. The first ICs were tested with test vectors created by hand. It proved very difficult to get good coverage of potential faults, so design for testability (DFT) based on scan and automatic test pattern generation (ATPG) was developed to explicitly test each gate and path in a design. These techniques were very successful at creating high-quality vectors for manufacturing test, with excellent test coverage. However, as chips got bigger and more complex, the ratio of logic to be tested per pin increased dramatically, and the volume of scan test data caused a significant increase in test time and in required tester memory. This raised the cost of testing.

In computer science, conflict-driven clause learning (CDCL) is an algorithm for solving the Boolean satisfiability problem (SAT). Given a Boolean formula, the SAT problem asks for an assignment of variables so that the entire formula evaluates to true. The internal workings of CDCL SAT solvers were inspired by DPLL solvers. The main difference between CDCL and DPLL is that CDCL's backjumping is non-chronological.

Hideo Fujiwara is a Japanese computer scientist who made significant contributions to ATPG algorithms. Among his contributions, he invented the FAN algorithm in 1983, which was the fastest ATPG algorithm of its time and was adopted by industry.

References

  1. Fujiwara, Hideo; Shimono, Takeshi (December 1983). "On the acceleration of test generation algorithms". IEEE Transactions on Computers. C-32 (12): 1137–1144. doi:10.1109/TC.1983.1676174. S2CID 9547677.
  2. Fujiwara, Hideo (September 1985). Logic Testing and Design for Testability. MIT Press. ISBN 9780262561990.
  3. Abramovici, Miron; Breuer, Melvin A.; Friedman, Arthur D. (1990). Digital Systems Testing and Testable Design. IEEE Press. ISBN 9780780310629.
  4. Jha, Niraj; Gupta, Sandeep (2003). Testing of Digital Systems. Cambridge University Press. ISBN 0521773563.
  5. Kirkland, Tom; Mercer, M. Ray (1987). "A topological search algorithm for ATPG". Proceedings of the 24th ACM/IEEE Design Automation Conference (DAC '87). pp. 502–508. doi:10.1145/37888.37963. ISBN 0818607815.