{"title":"基于卷积关注和双分支特征网络的工业桶标文本检测","authors":"Ling Wang, Jing Zhang, Peng Wang, Yane Bai","doi":"10.1002/tee.24231","DOIUrl":null,"url":null,"abstract":"<p>Industrial barrel labels generally have low visual contrast, uneven lighting, and cluttered background, making it challenging to accurately locate text regions. This paper proposes a text detection network to solve the inaccurate localization problem based on DBNet. First, a convolutional attention mechanism is applied to the feature extraction network to get more valuable text feature maps. Then, a dual-branch convolutional feature module is proposed in the feature pyramid to enrich contextual information. Besides, during the probability map generation stage, using a feature remodeling enhancement module to further distinguish text and text boundaries. This paper designs comparative experiments on ILTD, ICDAR2015 and MSRA-TD500 datasets, achieve F-measure of 92.3%, 86.0% and 84.1%, which are 2.2%, 2.3%, and 1.9% higher than DBNet, respectively. They demonstrate that our proposed method exhibits competitive performance and strong robustness. © 2024 Institute of Electrical Engineers of Japan and Wiley Periodicals LLC.</p>","PeriodicalId":13435,"journal":{"name":"IEEJ Transactions on Electrical and Electronic Engineering","volume":"20 4","pages":"526-536"},"PeriodicalIF":1.0000,"publicationDate":"2024-11-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Text Detection on Industrial Barrel Label with Convolutional Attention and Dual-Branch Feature Network\",\"authors\":\"Ling Wang, Jing Zhang, Peng Wang, Yane Bai\",\"doi\":\"10.1002/tee.24231\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Industrial barrel labels generally have low visual contrast, uneven lighting, and cluttered background, making it challenging to accurately locate text regions. This paper proposes a text detection network to solve the inaccurate localization problem based on DBNet. First, a convolutional attention mechanism is applied to the feature extraction network to get more valuable text feature maps. Then, a dual-branch convolutional feature module is proposed in the feature pyramid to enrich contextual information. Besides, during the probability map generation stage, using a feature remodeling enhancement module to further distinguish text and text boundaries. This paper designs comparative experiments on ILTD, ICDAR2015 and MSRA-TD500 datasets, achieve F-measure of 92.3%, 86.0% and 84.1%, which are 2.2%, 2.3%, and 1.9% higher than DBNet, respectively. They demonstrate that our proposed method exhibits competitive performance and strong robustness. 
© 2024 Institute of Electrical Engineers of Japan and Wiley Periodicals LLC.</p>\",\"PeriodicalId\":13435,\"journal\":{\"name\":\"IEEJ Transactions on Electrical and Electronic Engineering\",\"volume\":\"20 4\",\"pages\":\"526-536\"},\"PeriodicalIF\":1.0000,\"publicationDate\":\"2024-11-05\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEJ Transactions on Electrical and Electronic Engineering\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://onlinelibrary.wiley.com/doi/10.1002/tee.24231\",\"RegionNum\":4,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q4\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEJ Transactions on Electrical and Electronic Engineering","FirstCategoryId":"5","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1002/tee.24231","RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citations: 0
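The abstract names a convolutional attention mechanism and a dual-branch convolutional feature module but gives no implementation details on this page. As a rough illustration only, the sketch below shows what such components might look like, assuming a CBAM-style channel/spatial attention block and two parallel convolution branches (standard and dilated) fused together; all class names and structural choices are hypothetical, not the authors' actual design.

```python
# Illustrative sketch only: assumed CBAM-style attention and a parallel
# standard/dilated convolution pair, not the authors' implementation.
import torch
import torch.nn as nn


class ConvAttention(nn.Module):
    """Channel + spatial attention applied to a backbone feature map (assumption)."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.channel_mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
        )
        self.spatial_conv = nn.Conv2d(2, 1, kernel_size=7, padding=3)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Channel attention: pool over space, re-weight channels.
        avg = self.channel_mlp(torch.mean(x, dim=(2, 3), keepdim=True))
        mx = self.channel_mlp(torch.amax(x, dim=(2, 3), keepdim=True))
        x = x * torch.sigmoid(avg + mx)
        # Spatial attention: pool over channels, re-weight spatial positions.
        s = torch.cat([x.mean(dim=1, keepdim=True), x.amax(dim=1, keepdim=True)], dim=1)
        return x * torch.sigmoid(self.spatial_conv(s))


class DualBranchFeature(nn.Module):
    """Two parallel convolution branches fused to enrich contextual information (assumption)."""

    def __init__(self, channels: int):
        super().__init__()
        self.local_branch = nn.Conv2d(channels, channels, 3, padding=1)
        self.context_branch = nn.Conv2d(channels, channels, 3, padding=2, dilation=2)
        self.fuse = nn.Conv2d(2 * channels, channels, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.fuse(torch.cat([self.local_branch(x), self.context_branch(x)], dim=1))


if __name__ == "__main__":
    feat = torch.randn(1, 256, 40, 40)       # one pyramid-level feature map
    feat = ConvAttention(256)(feat)          # emphasize text-relevant responses
    feat = DualBranchFeature(256)(feat)      # add multi-receptive-field context
    print(feat.shape)                        # torch.Size([1, 256, 40, 40])
```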