{"title":"PIDDN: Pair-Image-Based Defect Detection Network With Template for PCB Inspection","authors":"Qixing Jiang;Xiaojun Wu;Jinghui Zhou;Jun Cheng","doi":"10.1109/TCPMT.2025.3543396","DOIUrl":null,"url":null,"abstract":"The detection of defects has consistently posed a significant obstacle in the domain of machine vision, particularly in the context of the production of printed circuit board (PCB) components. To advance this technology, we present pair-image-based defect detection network (PIDDN), a novel framework that enables precise detection of object bounding boxes by utilizing a pair of images, with one designated as a template. The PIDDN approach involves the utilization of a Siamese neural network to simultaneously encode the features of the pair image, followed by the implementation of a template feature fusion network (TFFN) for integration. In addition, we introduce a template feature rectification module (TFRM) that aligns the feature maps of the pair-images attentively. We evaluate PIDDN on the DeepPCB dataset, and it achieves an impressive mean average precision (mAP) score of 99.6%. Furthermore, we present PairPCB, a complex and realistic dataset collected from real-world PCB production scenarios, to validate the effectiveness of template images. Extensive experiments demonstrate that PIDDN outperforms mainstream object detection algorithms with a 4.1% improvement in mAP. 
Code will be available at: <uri>https://github.com/QixingJiang/PIDDN</uri>.","PeriodicalId":13085,"journal":{"name":"IEEE Transactions on Components, Packaging and Manufacturing Technology","volume":"15 4","pages":"830-841"},"PeriodicalIF":2.3000,"publicationDate":"2025-02-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Components, Packaging and Manufacturing Technology","FirstCategoryId":"5","ListUrlMain":"https://ieeexplore.ieee.org/document/10897994/","RegionNum":3,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citations: 0
Abstract
Defect detection has long been a significant challenge in machine vision, particularly in the production of printed circuit board (PCB) components. To advance this technology, we present the pair-image-based defect detection network (PIDDN), a novel framework that precisely detects object bounding boxes from a pair of images, one of which is designated as a template. PIDDN uses a Siamese neural network to encode the features of the paired images simultaneously, followed by a template feature fusion network (TFFN) that integrates them. In addition, we introduce a template feature rectification module (TFRM) that attentively aligns the feature maps of the paired images. We evaluate PIDDN on the DeepPCB dataset, where it achieves a mean average precision (mAP) of 99.6%. Furthermore, we present PairPCB, a complex and realistic dataset collected from real-world PCB production scenarios, to validate the effectiveness of template images. Extensive experiments demonstrate that PIDDN outperforms mainstream object detection algorithms with a 4.1% improvement in mAP. Code will be available at: https://github.com/QixingJiang/PIDDN.
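The core idea in the abstract — a Siamese encoder that processes a test image and a defect-free template with shared weights, then fuses the two feature maps — can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the encoder, the fusion rule, and all shapes here are simplified placeholders standing in for the actual backbone and TFFN.

```python
import numpy as np

def encode(image, weights):
    """Toy shared-weight 'encoder': one linear projection plus ReLU.
    Both images of a pair are encoded with the SAME weights, which is
    the defining property of a Siamese network."""
    return np.maximum(image @ weights, 0.0)

def fuse(test_feat, template_feat):
    """Toy stand-in for template feature fusion: concatenate the test
    features with their element-wise difference from the template, so a
    downstream detector can reason about deviations from the reference."""
    return np.concatenate([test_feat, test_feat - template_feat], axis=-1)

rng = np.random.default_rng(0)
w = rng.standard_normal((64, 16))            # shared encoder weights
test_img = rng.standard_normal((1, 64))      # flattened test image
template_img = rng.standard_normal((1, 64))  # flattened defect-free template

fused = fuse(encode(test_img, w), encode(template_img, w))
print(fused.shape)  # (1, 32)
```

The difference channel is what makes the template useful: when the test image matches the template exactly, that half of the fused vector is all zeros, so any nonzero response there signals a potential defect region.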
Journal Introduction:
IEEE Transactions on Components, Packaging, and Manufacturing Technology publishes research and application articles on modeling, design, building blocks, technical infrastructure, and analysis underpinning electronic, photonic and MEMS packaging, in addition to new developments in passive components, electrical contacts and connectors, thermal management, and device reliability; as well as the manufacture of electronics parts and assemblies, with broad coverage of design, factory modeling, assembly methods, quality, product robustness, and design-for-environment.