This is true cutting-edge circuit design from industry which may lead to a corporate relationship with Mentor Graphics. It is a book for professionals which also has some use as a graduate-level text. It clusters well with many recent and upcoming titles in the heart of my signing target area. The MS is camera-ready, and the authors are adding more introductory and comprehensive material to broaden its market further.
Preface

The semiconductor industry, driven by ever-increasing demands for higher performance and reliability, as well as greater functionality and speed, continuously introduces new higher-density technologies and new integrated circuits. These circuits, like any other complex systems, not only have to meet performance and functionality requirements, but they also have to be manufacturable. In particular, they have to be highly testable in order to meet extremely high and constantly growing quality requirements. The quality of testing is often defined as the number of faulty chips that pass the test per one million chips declared good. Many microelectronics companies have already set their testing quality goals to less than 100 dpm (defects per million), and there is intensive ongoing research to lower this number to less than 10 dpm, as targeted in the Six Sigma program pioneered by Motorola.
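As a small illustration of the dpm metric defined above (the numbers below are invented, not taken from the book), dpm is simply the ratio of test escapes to chips declared good, scaled to one million:

```python
def dpm(escapes: int, chips_declared_good: int) -> float:
    """Defective parts per million: faulty chips that passed the test,
    per one million chips declared good."""
    return escapes / chips_declared_good * 1_000_000

# e.g. 12 test escapes among 400,000 chips shipped as good
print(dpm(12, 400_000))  # 30.0 dpm
```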
Many integrated circuits are produced in large volume and very often operate at high speeds. Since their manufacturing yield strongly depends on the silicon area, and their performance is directly related to the delays on critical paths, it is essential that the testing strategy provide high fault coverage without significant area overhead or performance degradation in order to build reliable and competitive products. It is a well-known fact that the costs associated with detecting faults rise thousands of times from the time the product is specified to the time it is released to customers. This is why the most effective way to prevent costly prototyping turns is to consider testing issues as early in the design cycle as possible. The tremendous practical importance of this problem has generated an immense amount of research aimed at developing testing schemes of the highest quality. The increasing complexity of VLSI circuits, in the absence of a corresponding increase in the number of input and output pins, has made structured design for testability (DFT) and built-in self-test (BIST) two of the most important concepts in testing, and they have profoundly influenced the area in recent years [bar87]. Scan design is a good example of structured DFT: in the test mode, all memory elements are connected in scan chains, through which test vectors can be shifted in and out. This solution enhances the controllability and observability of the circuit, and, as far as testing of combinational stuck-at faults is concerned, the circuit can be treated as a combinational network.
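The shift-in/shift-out behavior of a scan chain described above can be sketched as follows; the function name and bit ordering are illustrative assumptions, not a model of any particular scan implementation:

```python
from collections import deque

def scan_shift(chain, serial_in):
    """Shift a serial bit stream through a scan chain.

    `chain` holds the current flip-flop values (head = scan-in side);
    returns the updated chain and the bits observed at scan-out.
    """
    chain = deque(chain)
    scan_out = []
    for bit in serial_in:
        scan_out.append(chain.pop())   # last flip-flop drives scan-out
        chain.appendleft(bit)          # new bit enters at scan-in
    return list(chain), scan_out

# Load the pattern 1,0,1 into a 3-bit chain initially holding zeros;
# the chain's original contents appear serially at the output.
state, observed = scan_shift([0, 0, 0], [1, 0, 1])
```

Shifting a test vector in thus simultaneously shifts the previous contents out for observation, which is what makes scan both a controllability and an observability mechanism.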
In BIST, the original circuit designed to perform the system function is appended with additional modules for the generation of test patterns and the compaction of test responses [bar87]. Thus, the BIST approach can be applied at all levels of testing, from wafer and device testing to system and field testing. It is widely accepted that appending these modules to the original circuit satisfies the high fault coverage requirement while reducing the dependence on expensive testing equipment. However, it is also agreed that this solution compromises a circuit's area and performance, as it inevitably introduces either a hardware overhead or additional delays and increased latency. These delays may be excessive for high-speed circuits used in several new applications such as broadband packet switching, digital signal processing (DSP) for the asynchronous transfer mode (ATM), new generations of floating-point processors, and others. Therefore, BIST schemes are evaluated thoroughly on the basis of the fault coverage they provide, the area overhead they require, and the performance penalty they introduce. A more detailed survey of existing DFT and BIST schemes is provided in Chapter 1. Further information can be found in [abr90], [aga93], [aga93b], and [bar87].
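As a sketch of the kind of hardware pattern generator traditionally appended to a circuit for BIST, here is a small Fibonacci LFSR; the feedback polynomial (x^4 + x + 1) and the bit ordering are arbitrary example choices, not those of any scheme discussed in the book:

```python
def lfsr_patterns(seed: int, taps: int, width: int, count: int):
    """Generate `count` states of a Fibonacci LFSR, a common BIST
    test-pattern generator.

    `taps` is a bit mask of feedback positions; taps=0b1001 with
    width=4 realizes a primitive polynomial, so a nonzero seed
    cycles through all 15 nonzero 4-bit states.
    """
    state = seed
    patterns = []
    for _ in range(count):
        patterns.append(state)
        feedback = bin(state & taps).count("1") & 1   # XOR of tapped bits
        state = ((state >> 1) | (feedback << (width - 1))) & ((1 << width) - 1)
    return patterns
```

The appeal of LFSRs is exactly the overhead trade-off the preface mentions: a maximal-length sequence from a handful of flip-flops and XOR gates, at the price of extra hardware on the chip.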
With the cost of testing becoming a significant part of the cost of new microelectronics products, with the inevitable challenges of new deep-submicron technologies, with the increasing role of hardware-software codesign, and, last but not least, with ever-changing customer expectations, the demand for new solutions and tools appears to be relentless. In particular, the unquestionable proliferation of high-performance data-path architectures clearly demonstrates how inadequate existing BIST schemes can be if they are to provide non-intrusive, at-speed testing and yet guarantee the portability of test procedures. Paradoxically, although the vast majority of data-path architectures consist of powerful building blocks such as adders, multipliers, or arithmetic and logic units (ALUs) offering very high computational potential, existing data-path BIST schemes are unfortunate examples of having sophisticated modules on the chip while remaining unable to translate this advantage into efficient non-intrusive testing schemes.
The approach presented in Chapters 2 through 8 is fundamentally different from the solutions introduced so far. It uses several generic building blocks, which are already in the data path, as well as its very flexible and powerful control circuitry, to generate patterns and compact test responses. This permits the design of complex software-based, and thus very portable, BIST functions. These functions produce test vectors in the form of control signals, such as the type of ALU operation, the addresses of registers, or the input to shifters, rather than data, as is done in other systems. In such an environment, the need for extra hardware is either entirely eliminated or drastically reduced, test vectors can be easily distributed to different modules of the system, test responses can be collected in parallel, and there is virtually no performance degradation. Furthermore, the approach can be used for at-speed testing, thereby providing a capability to detect failures that may not be caught by conventional low-speed testing. These characteristics make this method an exceptionally attractive testing scheme for a wide range of circuits, including high-performance DSP systems, microprocessors, and microcontrollers.
In the following chapters we will discuss several new fundamental concepts and practical scenarios concerned with test generation, test application, and test-response compaction performed by means of the building blocks of high-performance data paths. We will show that even the simplest modules provide a very high potential for the integration of their features into a new generation of efficient and portable BIST schemes. As the described techniques rest predominantly on arithmetic operations, these schemes will be jointly referred to as the arithmetic built-in self-test (ABIST) methodology. We will demonstrate that the ABIST paradigm virtually eliminates the traditional dichotomy between the functional mode and the testing mode, as testing will be based on regular operations, with no interference with the circuit structure. It can be expected that ABIST will create the next integration platform, where off-line and on-line BIST schemes will be merged.

Chapter 2 introduces several test generation schemes that can be easily implemented in data paths based on adders, multipliers, and ALUs. These schemes may replace commonly used LFSR-based test-pattern generators and can mimic several widely used generation techniques. In particular, a new approach to generating pseudo-exhaustive test patterns by means of arithmetic operations is described. The resultant test patterns provide complete state coverage on subspaces of contiguous bits.
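The arithmetic flavor of such generators can be sketched with nothing more than an accumulator and an adder; the increment and word width below are illustrative assumptions, not the parameters the book derives:

```python
def additive_patterns(seed: int, increment: int, width: int, count: int):
    """Produce test patterns with a single accumulator and adder:
    x[i+1] = (x[i] + increment) mod 2**width.

    Any odd increment is coprime with 2**width, so the sequence
    visits all 2**width states before repeating.
    """
    mask = (1 << width) - 1
    state = seed
    patterns = []
    for _ in range(count):
        patterns.append(state)
        state = (state + increment) & mask
    return patterns
```

Because the adder already exists in the data path, the generator costs no extra hardware; only a short program driving the ALU is needed.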
The Accumulator-Based Compaction (ABC) scheme for parallel compaction of test responses is the subject of Chapter 3. We will demonstrate that the ABC scheme offers a quality of compaction similar to that of the best compactors based on multiple input signature registers (MISRs) or cellular automata (CA) of the same size. The presented characteristics can be used to estimate the fault coverage drop for a given circuit under test (CUT) characterized by its detection profile. The impact of the compactor's internal faults on the compaction quality is also examined.
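A minimal sketch of accumulator-based compaction, assuming a 1's-complement (end-around-carry) adder, which is one common configuration; the exact adder variant analyzed in Chapter 3 may differ:

```python
def abc_signature(responses, width):
    """Compact a stream of test responses into one `width`-bit
    signature by repeated addition, feeding the carry-out back
    into bit 0 (end-around carry)."""
    mask = (1 << width) - 1
    acc = 0
    for r in responses:
        acc += r & mask
        while acc > mask:                      # fold carry back in
            acc = (acc & mask) + (acc >> width)
    return acc & mask

# A fault is detected if the faulty circuit's signature differs
# from the fault-free one; aliasing occurs when they coincide.
```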
Compaction schemes can also be used to perform fault diagnosis. Faults, especially single ones, can be easily identified by collecting signatures and comparing them with a dictionary of precomputed signatures. Chapter 4 examines the relationship between the size of the compactor, the size of the circuit (which determines the number of faults), and the quality of diagnostic resolution, measured as the percentage of faults that have unique signatures. Moreover, an adaptive procedure to facilitate fault diagnosis in scan-based designs is also described. When running successive test experiments, it uses the ABC scheme to identify all scan flip-flops that are driven by erroneous signals.
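Dictionary-based diagnosis as described above amounts to a reverse lookup from an observed signature to the faults that could have produced it; the fault names and signature values below are hypothetical:

```python
def diagnose(observed_signature, signature_dict):
    """Return the faults whose precomputed signature matches the
    observed one. An empty list means the fault is not in the
    dictionary; more than one entry means the diagnostic
    resolution is not unique for this signature."""
    return [fault for fault, sig in signature_dict.items()
            if sig == observed_signature]

# Hypothetical dictionary of precomputed fault signatures.
dictionary = {"f1": 0x3A, "f2": 0x91, "f3": 0x3A}
```

Here faults f1 and f3 alias to the same signature, so with this compactor only two of the three faults would be uniquely diagnosable, illustrating the resolution metric Chapter 4 studies.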
Chapter 5 addresses testing of those data-path building blocks which play a key role in implementing more complex ABIST functions. The blocks analyzed are components forming multiply-and-accumulate structures; that is, the ALUs, multipliers, and register files. In addition, testability of simple microcontrollers is also discussed.
Chapter 6 deals with fault simulation techniques customized for ABIST applications. It starts with a method exploiting the hierarchy inherent in data paths, continues with an approach taking advantage of the architectural regularity of several building blocks, and concludes with a comparison of the described techniques with the best gate-level fault simulation tools.
Perspectives for the integration of the ABIST approach with behavioral synthesis are examined in Chapter 7. A survey of methodologies for incorporating ABIST elements into the high-level synthesis process is accompanied by an analysis of the relationship between input-subspace state coverage and the structural fault coverage of various data-path building blocks.
In Chapter 8, several case studies are presented. First, schemes aimed at testing random logic accessible through multiple scan chains are examined. Next, the ABIST implementation of memory test algorithms is discussed, along with customized arithmetic test-response compaction schemes adapted for this particular application. A scheme intended to enhance the testability of digital decimators is subsequently described. This scheme is built around the circuitry used for normal operation: it exploits operations offered by the already existing functional blocks of the decimators to perform basic testing functions. Finally, yet another scheme to encapsulate test responses is shown. It employs the leaking integrators appearing in a variety of DSP circuits. A quantitative characterization of this circuit acting as a compactor of test responses is provided, together with modifications leading to a very competitive compaction quality.
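A leaking integrator of the kind mentioned above can be sketched as follows; the update rule y ← y − (y ≫ k) + x, the shift amount k, and the word width are assumptions chosen for illustration, not the book's exact design:

```python
def leaky_integrator_signature(responses, k, width):
    """Compact test responses with a leaking integrator:
    y <- y - (y >> k) + x, truncated to `width` bits.

    The leak term (y >> k) is what distinguishes this from a
    plain accumulator: old responses decay geometrically
    instead of persisting in the running sum."""
    mask = (1 << width) - 1
    y = 0
    for x in responses:
        y = (y - (y >> k) + (x & mask)) & mask
    return y
```

Because the integrator already exists for the DSP function, reusing it as a compactor follows the same zero-overhead philosophy as the rest of the ABIST approach.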
In several places throughout this book we will use an assembly-level language. It will allow brief programs to be written for the various data-path test scenarios presented in the chapters devoted to the ABIST methodology. A detailed description of the language is included in Appendix B. We urge the reader to spend a few minutes studying this section so that the test software will be easily comprehended. Furthermore, a careful analysis of some of the test programs may reveal interesting implementation details illustrating the effectiveness of the software-based self-test mechanisms.
This book is based on the results of research in ABIST, some of which have been presented in IEEE publications. We would like to acknowledge the IEEE for granting us permission to use material from these works. We would like to thank our students at McGill University who contributed to portions of various chapters in this book. Mark Kassab developed the fault simulator described in Chapter 6 and performed hundreds of experiments on different data-path architectures. Sanjay Gupta ran several experiments on additive test-pattern generators and helped us typeset Appendix A. Kasia Radecka provided valuable assistance in running fault simulations on ALUs, multipliers, and microcontroller units. Nilanjan Mukherjee contributed to Chapter 7. We very much appreciate the many helpful discussions we had with Saman Adham of Bell-Northern Research, Greg Young of Texas Instruments, and Aiman El-Maleh of Mentor Graphics. Robert Aitken of Hewlett Packard, Vivek Chickermane of IBM Microelectronics, and Sanjay Patel of Mentor Graphics provided useful suggestions and comments. It is our pleasure to acknowledge the support we received from the Cooperative Research and Development grant from the Natural Sciences and Engineering Research Council of Canada and Northern Telecom in the early stages of the research project leading to this book. Our special thanks go to Rod Favaron and Mark Olen of Mentor Graphics Corporation for providing support to complete this project. Last but not least, we would like to express our gratitude to Danusia Rajska for her help in the preparation of the manuscript.
Jerzy Tyszer

From the Back Cover:
This book introduces test and design engineers to new techniques that can be used to improve the testing and quality of a wide range of circuits. It explains what arithmetic built-in self-test (ABIST) is and how it can be used in a wide variety of circuits. It shows how ABIST can support new design methodologies that rely on hardware/software co-design, DSP cores, and embedded processors.

Audience: hardware/embedded system designers, test engineers, researchers working on IC/core testing, and graduate students.
Book Description: Prentice Hall, 1997. Hardcover.