{"title":"DISTRACTIONS IN DESIGN FOR TESTABILITY AND BUILT-IN SELF-TEST","authors":"C. Stroud","doi":"10.1109/TEST.1991.519787","DOIUrl":null,"url":null,"abstract":"DFT techniques, such as LSSD, emerged as a result of the rapid increase in circuit density and complexity in the LSI circuits of the 1970’s and provided high fault coverage at the device level of testing. BIST emerged circa 1980 with the widespread development of VLSI circuits and provided potentially high fault coverage at all levels of testing, from the device level through system diagnostics. One would think that, with the increased complexity of VLSI devices and circuit boards of the 1990’s, DFT and BIST techniques would be an integral part of current design methodologies. Yet, in some cases, circuit designers continue to be reluctant to incorporate DFT and BIST in their designs. This reluctance has traditionally been attributed to the area and performance penalties associated with DFT and BIST techniques. But, as a VLSI designer, CAD tool developer, and disciple of BIST, I believe that the history of DFT and BIST techniques has been marked by developments, related to VLSI design and testing, which have distracted designers from the incorporation of these techniques. For example, by the end of the 1970s, CAD tools were in place to provide automatic implementation and test pattern generation for LSSD. Area and performance penalties were minimized and could be further reduced by using partial scan design techniques. CAD tools for implementation and support of LSSD were developed to the point that DFT was probably the most automated aspect of system level VLSI design. Then along came BIST. BIST was attractive in that it offered an at-speed testing capability which could be use at all levels of testing since test pattern generation and output response compaction circuitry were an inherent part of the BIST scheme. And BIST provided elegant solutions to testing regular structures, such as memories, which had posed problems to LSSD. Throughout the 1980s, different BIST approaches for regular structures were developed based on the type of structure and BIST approaches for general sequential logic were proposed. But the various BIST techniques were disputed with respect to their effectiveness versus their area overhead, performance penalty, and difficulty of implementation. As a result of the trade-offs associated with the different approaches, the design community became confused to the extent that a certain level of expertise was required to effectively implement BIST in a given VLSI design. Projects with local experts in DFT and BIST were able to make essential plans and decisions while projects without this expertise suffered to the point of failing to implement any DFT. By the mid1980s, CAD tools for the automation of B E T in regular structures were developed but the lack of a generally accepted approach for sequential logic hindered the development of tools that would automate the various BIST implementations required for a complete VLSI design. Then along came high-level synthesis. High-level synthesis provided implementation of the gate or transistor level design of a VLSI circuit from a behavioral description of the circuit. The development of high-level synthesis was, for the most part, performed by people with little background in DFT and BIST. 
As a result, minimization of the resultant logic was the primary goal of high-level synthesis and testability considerations were ignored under the assumption that minimized logic was testable logic. In addition, high-level synthesis took the designer to higher levels of abstraction and away from an intimate knowledge of the gate level structure of the circuit that had previously served the designer in making informed decisions with respect to DFT and BIST approaches. The level of design automation provided by high-level synthesis made the CAD tools for DFT and the emerging tools for BET look crude by comparison. The reduced design intervals afforded by high-level synthesis gave designers even less time to consider DFT and BIST for their designs. This created the problem of educating the synthesis tool developers, as well as designers, in the need for and importance of DFI and BIST so that these techniqucs would be incorporated into the synthesis systems. By the end of the 1980s, BIST approaches were being sought and proposed which could be parameterized and automated, and highlevel synthesis developers were beginning to acknowledge the need for the integration of DFT and BIST into the synthesis tools. Then along came Boundary Scan. Boundary Scan provided a solution to the problem of testing interconnect on high density surface mount circuit boards and, being based on LSSD, did not require major development in order to be a viable approach. But to encourage acceptance of the additional pins and package size, Boundary Scan was sold to designers as providing access to BIST. This access to BIST was not a problem in itself but did provide another distraction to the extent that some designers were under the impression that if they incorporated Boundary Scan, their VLSI designs would be self-testing as well. Therefore, the reasons for the reluctance on the part of some designers to incorporate DFT and BIST techniques in their designs may be due more to historical reasons than to any specific technical reasons. Ken Parker (“Testability: barriers to acceptance”, Design & Test of Computers, Oct. 1986) stated it best when he said: “Indeed, the community of DFT [and BIST] partisans may well be responsible for many of them.” Development of local DFT/ BIST experts within projects may be the best way to avoid future distractions.","PeriodicalId":272630,"journal":{"name":"1991, Proceedings. International Test Conference","volume":"25 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1991-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"1991, Proceedings. International Test Conference","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/TEST.1991.519787","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
DFT techniques, such as LSSD, emerged as a result of the rapid increase in circuit density and complexity in the LSI circuits of the 1970s and provided high fault coverage at the device level of testing. BIST emerged circa 1980 with the widespread development of VLSI circuits and provided potentially high fault coverage at all levels of testing, from the device level through system diagnostics. One would think that, with the increased complexity of VLSI devices and circuit boards of the 1990s, DFT and BIST techniques would be an integral part of current design methodologies. Yet, in some cases, circuit designers continue to be reluctant to incorporate DFT and BIST in their designs. This reluctance has traditionally been attributed to the area and performance penalties associated with DFT and BIST techniques. But, as a VLSI designer, CAD tool developer, and disciple of BIST, I believe that the history of DFT and BIST techniques has been marked by developments, related to VLSI design and testing, which have distracted designers from incorporating these techniques. For example, by the end of the 1970s, CAD tools were in place to provide automatic implementation and test pattern generation for LSSD. Area and performance penalties were minimized and could be further reduced by using partial scan design techniques. CAD tools for the implementation and support of LSSD were developed to the point that DFT was probably the most automated aspect of system-level VLSI design.

Then along came BIST. BIST was attractive in that it offered an at-speed testing capability which could be used at all levels of testing, since test pattern generation and output response compaction circuitry were an inherent part of the BIST scheme. And BIST provided elegant solutions for testing regular structures, such as memories, which had posed problems for LSSD. Throughout the 1980s, different BIST approaches for regular structures were developed based on the type of structure, and BIST approaches for general sequential logic were proposed. But the various BIST techniques were disputed with respect to their effectiveness versus their area overhead, performance penalty, and difficulty of implementation. As a result of the trade-offs associated with the different approaches, the design community became confused to the extent that a certain level of expertise was required to effectively implement BIST in a given VLSI design. Projects with local experts in DFT and BIST were able to make essential plans and decisions, while projects without this expertise suffered to the point of failing to implement any DFT. By the mid-1980s, CAD tools for the automation of BIST in regular structures were developed, but the lack of a generally accepted approach for sequential logic hindered the development of tools that would automate the various BIST implementations required for a complete VLSI design.

Then along came high-level synthesis. High-level synthesis provided implementation of the gate- or transistor-level design of a VLSI circuit from a behavioral description of the circuit. The development of high-level synthesis was, for the most part, performed by people with little background in DFT and BIST. As a result, minimization of the resultant logic was the primary goal of high-level synthesis, and testability considerations were ignored under the assumption that minimized logic was testable logic. In addition, high-level synthesis took the designer to higher levels of abstraction and away from an intimate knowledge of the gate-level structure of the circuit that had previously served the designer in making informed decisions with respect to DFT and BIST approaches. The level of design automation provided by high-level synthesis made the CAD tools for DFT and the emerging tools for BIST look crude by comparison. The reduced design intervals afforded by high-level synthesis gave designers even less time to consider DFT and BIST for their designs. This created the problem of educating the synthesis tool developers, as well as designers, in the need for and importance of DFT and BIST so that these techniques would be incorporated into the synthesis systems. By the end of the 1980s, BIST approaches were being sought and proposed which could be parameterized and automated, and high-level synthesis developers were beginning to acknowledge the need for the integration of DFT and BIST into the synthesis tools.

Then along came Boundary Scan. Boundary Scan provided a solution to the problem of testing interconnect on high-density surface-mount circuit boards and, being based on LSSD, did not require major development in order to be a viable approach. But to encourage acceptance of the additional pins and package size, Boundary Scan was sold to designers as providing access to BIST. This access to BIST was not a problem in itself, but it did provide another distraction to the extent that some designers were under the impression that if they incorporated Boundary Scan, their VLSI designs would be self-testing as well.

Therefore, the reluctance on the part of some designers to incorporate DFT and BIST techniques in their designs may be due more to historical reasons than to any specific technical reasons. Ken Parker ("Testability: barriers to acceptance", Design & Test of Computers, Oct. 1986) stated it best when he said: "Indeed, the community of DFT [and BIST] partisans may well be responsible for many of them." Development of local DFT/BIST experts within projects may be the best way to avoid future distractions.
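The at-speed test pattern generation and output response compaction that the abstract credits to BIST are conventionally realized with a linear feedback shift register (LFSR) generating pseudo-random patterns and a multiple-input signature register (MISR) folding the responses into a signature. The following is only a minimal software sketch of that idea; the register width, polynomial taps, pattern count, and the toy circuit under test are hypothetical choices for illustration and do not come from the paper.

    # Illustrative model of a BIST scheme: LFSR pattern generator + MISR compactor.
    # All parameters below (width, taps, seed, toy circuit) are hypothetical.

    def lfsr_patterns(seed, taps, width, count):
        """Yield `count` pseudo-random test patterns from a Fibonacci-style LFSR."""
        state = seed
        for _ in range(count):
            yield state
            feedback = 0
            for t in taps:
                feedback ^= (state >> t) & 1
            state = ((state << 1) | feedback) & ((1 << width) - 1)

    def misr_compact(responses, taps, width):
        """Fold a stream of output responses into a single signature (MISR model)."""
        signature = 0
        for r in responses:
            feedback = 0
            for t in taps:
                feedback ^= (signature >> t) & 1
            signature = (((signature << 1) | feedback) ^ r) & ((1 << width) - 1)
        return signature

    def circuit_under_test(x):
        """Toy 8-bit combinational function standing in for the logic under test."""
        return ((x ^ (x >> 3)) + 0x5A) & 0xFF

    if __name__ == "__main__":
        WIDTH, TAPS, SEED, COUNT = 8, (7, 5, 4, 3), 0x1, 255  # hypothetical polynomial
        patterns = list(lfsr_patterns(SEED, TAPS, WIDTH, COUNT))
        signature = misr_compact((circuit_under_test(p) for p in patterns), TAPS, WIDTH)
        print(f"fault-free signature: {signature:#04x}")

In hardware, the fault-free signature would be computed once and stored; at test time the same pattern sequence is applied at speed and the resulting signature is compared against the stored value, which is what lets the same BIST circuitry serve device, board, and system-level testing.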