DISTRACTIONS IN DESIGN FOR TESTABILITY AND BUILT-IN SELF-TEST

C. Stroud
{"title":"DISTRACTIONS IN DESIGN FOR TESTABILITY AND BUILT-IN SELF-TEST","authors":"C. Stroud","doi":"10.1109/TEST.1991.519787","DOIUrl":null,"url":null,"abstract":"DFT techniques, such as LSSD, emerged as a result of the rapid increase in circuit density and complexity in the LSI circuits of the 1970’s and provided high fault coverage at the device level of testing. BIST emerged circa 1980 with the widespread development of VLSI circuits and provided potentially high fault coverage at all levels of testing, from the device level through system diagnostics. One would think that, with the increased complexity of VLSI devices and circuit boards of the 1990’s, DFT and BIST techniques would be an integral part of current design methodologies. Yet, in some cases, circuit designers continue to be reluctant to incorporate DFT and BIST in their designs. This reluctance has traditionally been attributed to the area and performance penalties associated with DFT and BIST techniques. But, as a VLSI designer, CAD tool developer, and disciple of BIST, I believe that the history of DFT and BIST techniques has been marked by developments, related to VLSI design and testing, which have distracted designers from the incorporation of these techniques. For example, by the end of the 1970s, CAD tools were in place to provide automatic implementation and test pattern generation for LSSD. Area and performance penalties were minimized and could be further reduced by using partial scan design techniques. CAD tools for implementation and support of LSSD were developed to the point that DFT was probably the most automated aspect of system level VLSI design. Then along came BIST. BIST was attractive in that it offered an at-speed testing capability which could be use at all levels of testing since test pattern generation and output response compaction circuitry were an inherent part of the BIST scheme. 
And BIST provided elegant solutions to testing regular structures, such as memories, which had posed problems to LSSD. Throughout the 1980s, different BIST approaches for regular structures were developed based on the type of structure and BIST approaches for general sequential logic were proposed. But the various BIST techniques were disputed with respect to their effectiveness versus their area overhead, performance penalty, and difficulty of implementation. As a result of the trade-offs associated with the different approaches, the design community became confused to the extent that a certain level of expertise was required to effectively implement BIST in a given VLSI design. Projects with local experts in DFT and BIST were able to make essential plans and decisions while projects without this expertise suffered to the point of failing to implement any DFT. By the mid1980s, CAD tools for the automation of B E T in regular structures were developed but the lack of a generally accepted approach for sequential logic hindered the development of tools that would automate the various BIST implementations required for a complete VLSI design. Then along came high-level synthesis. High-level synthesis provided implementation of the gate or transistor level design of a VLSI circuit from a behavioral description of the circuit. The development of high-level synthesis was, for the most part, performed by people with little background in DFT and BIST. As a result, minimization of the resultant logic was the primary goal of high-level synthesis and testability considerations were ignored under the assumption that minimized logic was testable logic. In addition, high-level synthesis took the designer to higher levels of abstraction and away from an intimate knowledge of the gate level structure of the circuit that had previously served the designer in making informed decisions with respect to DFT and BIST approaches. 
The level of design automation provided by high-level synthesis made the CAD tools for DFT and the emerging tools for BET look crude by comparison. The reduced design intervals afforded by high-level synthesis gave designers even less time to consider DFT and BIST for their designs. This created the problem of educating the synthesis tool developers, as well as designers, in the need for and importance of DFI and BIST so that these techniqucs would be incorporated into the synthesis systems. By the end of the 1980s, BIST approaches were being sought and proposed which could be parameterized and automated, and highlevel synthesis developers were beginning to acknowledge the need for the integration of DFT and BIST into the synthesis tools. Then along came Boundary Scan. Boundary Scan provided a solution to the problem of testing interconnect on high density surface mount circuit boards and, being based on LSSD, did not require major development in order to be a viable approach. But to encourage acceptance of the additional pins and package size, Boundary Scan was sold to designers as providing access to BIST. This access to BIST was not a problem in itself but did provide another distraction to the extent that some designers were under the impression that if they incorporated Boundary Scan, their VLSI designs would be self-testing as well. Therefore, the reasons for the reluctance on the part of some designers to incorporate DFT and BIST techniques in their designs may be due more to historical reasons than to any specific technical reasons. Ken Parker (“Testability: barriers to acceptance”, Design & Test of Computers, Oct. 1986) stated it best when he said: “Indeed, the community of DFT [and BIST] partisans may well be responsible for many of them.” Development of local DFT/ BIST experts within projects may be the best way to avoid future distractions.","PeriodicalId":272630,"journal":{"name":"1991, Proceedings. 
International Test Conference","volume":"25 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1991-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"1991, Proceedings. International Test Conference","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/TEST.1991.519787","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1

Abstract

DFT techniques, such as LSSD, emerged as a result of the rapid increase in circuit density and complexity in the LSI circuits of the 1970s and provided high fault coverage at the device level of testing. BIST emerged circa 1980 with the widespread development of VLSI circuits and provided potentially high fault coverage at all levels of testing, from the device level through system diagnostics. One would think that, with the increased complexity of VLSI devices and circuit boards of the 1990s, DFT and BIST techniques would be an integral part of current design methodologies. Yet, in some cases, circuit designers continue to be reluctant to incorporate DFT and BIST in their designs. This reluctance has traditionally been attributed to the area and performance penalties associated with DFT and BIST techniques. But, as a VLSI designer, CAD tool developer, and disciple of BIST, I believe that the history of DFT and BIST techniques has been marked by developments, related to VLSI design and testing, which have distracted designers from the incorporation of these techniques. For example, by the end of the 1970s, CAD tools were in place to provide automatic implementation and test pattern generation for LSSD. Area and performance penalties were minimized and could be further reduced by using partial scan design techniques. CAD tools for implementation and support of LSSD were developed to the point that DFT was probably the most automated aspect of system-level VLSI design. Then along came BIST. BIST was attractive in that it offered an at-speed testing capability which could be used at all levels of testing, since test pattern generation and output response compaction circuitry were an inherent part of the BIST scheme. And BIST provided elegant solutions to testing regular structures, such as memories, which had posed problems for LSSD. 
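The point above, that test pattern generation and output response compaction are inherent parts of the BIST scheme, can be illustrated with a small software model. The sketch below is not from the paper: the register widths, tap positions, and seed are illustrative assumptions. A Fibonacci LFSR plays the role of the on-chip pattern generator and a MISR-style register compacts the circuit's responses into a single signature:

```python
def lfsr_patterns(seed, taps, width, count):
    """Yield pseudo-random test patterns from a Fibonacci LFSR."""
    state = seed
    for _ in range(count):
        yield state
        # XOR the tapped bits to form the feedback bit
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1
        state = ((state << 1) | fb) & ((1 << width) - 1)

def misr_signature(responses, taps, width):
    """Compact a stream of output responses into one signature,
    modeling a multiple-input signature register (MISR)."""
    sig = 0
    for r in responses:
        fb = 0
        for t in taps:
            fb ^= (sig >> t) & 1
        # Shift with feedback, then XOR in the next response word
        sig = (((sig << 1) | fb) ^ r) & ((1 << width) - 1)
    return sig
```

In hardware, both blocks are simply shift registers with a few XOR gates, which is why at-speed BIST can be cheap in area; the final signature is compared against a fault-free reference, at the cost of a small aliasing probability.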
Throughout the 1980s, different BIST approaches for regular structures were developed based on the type of structure, and BIST approaches for general sequential logic were proposed. But the various BIST techniques were disputed with respect to their effectiveness versus their area overhead, performance penalty, and difficulty of implementation. As a result of the trade-offs associated with the different approaches, the design community became confused to the extent that a certain level of expertise was required to effectively implement BIST in a given VLSI design. Projects with local experts in DFT and BIST were able to make essential plans and decisions, while projects without this expertise suffered to the point of failing to implement any DFT. By the mid-1980s, CAD tools for the automation of BIST in regular structures were developed, but the lack of a generally accepted approach for sequential logic hindered the development of tools that would automate the various BIST implementations required for a complete VLSI design. Then along came high-level synthesis. High-level synthesis provided implementation of the gate- or transistor-level design of a VLSI circuit from a behavioral description of the circuit. The development of high-level synthesis was, for the most part, performed by people with little background in DFT and BIST. As a result, minimization of the resultant logic was the primary goal of high-level synthesis, and testability considerations were ignored under the assumption that minimized logic was testable logic. In addition, high-level synthesis took the designer to higher levels of abstraction and away from an intimate knowledge of the gate-level structure of the circuit that had previously served the designer in making informed decisions with respect to DFT and BIST approaches. The level of design automation provided by high-level synthesis made the CAD tools for DFT and the emerging tools for BIST look crude by comparison. 
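As an illustration of the structure-specific memory BIST approaches mentioned above, the sketch below models one widely used march algorithm, March C-. The choice of algorithm and the read/write callback interface are this sketch's assumptions, not details from the paper:

```python
def march_c_minus(read, write, n):
    """March C- over an n-word memory of 0/1 values; returns the sorted
    list of addresses that failed any read.
    Elements: up(w0); up(r0,w1); up(r1,w0); down(r0,w1); down(r1,w0); down(r0)."""
    faults = set()
    up = range(n)
    down = range(n - 1, -1, -1)
    for a in up:                                  # M0: initialize with 0s
        write(a, 0)
    for order, exp, new in ((up, 0, 1), (up, 1, 0),
                            (down, 0, 1), (down, 1, 0)):
        for a in order:                           # M1-M4: verify, then flip
            if read(a) != exp:
                faults.add(a)
            write(a, new)
    for a in down:                                # M5: final read of 0s
        if read(a) != 0:
            faults.add(a)
    return sorted(faults)
```

A hardware memory BIST controller implements exactly this kind of address-sequencing state machine, which is why memory BIST was straightforward to parameterize and automate while general sequential logic was not.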
The reduced design intervals afforded by high-level synthesis gave designers even less time to consider DFT and BIST for their designs. This created the problem of educating the synthesis tool developers, as well as designers, in the need for and importance of DFT and BIST so that these techniques would be incorporated into the synthesis systems. By the end of the 1980s, BIST approaches were being sought and proposed which could be parameterized and automated, and high-level synthesis developers were beginning to acknowledge the need for the integration of DFT and BIST into the synthesis tools. Then along came Boundary Scan. Boundary Scan provided a solution to the problem of testing interconnect on high-density surface-mount circuit boards and, being based on LSSD, did not require major development in order to be a viable approach. But to encourage acceptance of the additional pins and package size, Boundary Scan was sold to designers as providing access to BIST. This access to BIST was not a problem in itself but did provide another distraction, to the extent that some designers were under the impression that if they incorporated Boundary Scan, their VLSI designs would be self-testing as well. Therefore, the reluctance on the part of some designers to incorporate DFT and BIST techniques in their designs may be due more to historical reasons than to any specific technical reasons. Ken Parker ("Testability: barriers to acceptance", Design & Test of Computers, Oct. 1986) stated it best when he said: "Indeed, the community of DFT [and BIST] partisans may well be responsible for many of them." Development of local DFT/BIST experts within projects may be the best way to avoid future distractions.