Pulse Coupled Neural Network Edge-Based Algorithm for Image Text Locating*

Impact Factor 5.2 · CAS Tier 1 (Computer Science) · JCR Q1, Computer Science, Information Systems
Zhang Xin (张昕), Sun Fuchun (孙富春)
{"title":"Pulse Coupled Neural Network Edge-Based Algorithm for Image Text Locating*","authors":"Zhang Xin (张昕),&nbsp;Sun Fuchun (孙富春)","doi":"10.1016/S1007-0214(11)70004-9","DOIUrl":null,"url":null,"abstract":"<div><p>This paper presents a method for locating text based on a simplified pulse coupled neural network (PCNN). The PCNN generates a firings map in a similar way to the human visual system<span> with non-linear image processing<span>. The PCNN is used to segment the original image into different planes and edges detected using both the PCNN firings map and a phase congruency detector. The different edges are integrated using an automatically adjusted weighting coefficient. Both the simplified PCNN and the phase congruency energy model in the frequency domain imitate the human visual system. This paper shows how to use PCNN by changing the compute space from the spatial domain to the frequency domain for solving the text location problem. The algorithm is a simplified PCNN edge-based (PCNNE) algorithm. Three comparison tests are used to evaluate the algorithm. Tests on large data sets show PCNNE efficiently detects texts with various colors, font sizes, positions, and uneven illumination. This method outperforms several traditional methods both in text detection rate and text detection accuracy.</span></span></p></div>","PeriodicalId":60306,"journal":{"name":"Tsinghua Science and Technology","volume":null,"pages":null},"PeriodicalIF":5.2000,"publicationDate":"2011-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1016/S1007-0214(11)70004-9","citationCount":"6","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Tsinghua Science and Technology","FirstCategoryId":"1093","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1007021411700049","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
引用次数: 6

Abstract

This paper presents a method for locating text based on a simplified pulse coupled neural network (PCNN). The PCNN generates a firing map through non-linear image processing in a manner similar to the human visual system. The PCNN is used to segment the original image into different planes, and edges are detected using both the PCNN firing map and a phase congruency detector. The different edges are integrated using an automatically adjusted weighting coefficient. Both the simplified PCNN and the phase congruency energy model in the frequency domain imitate the human visual system. This paper shows how to apply the PCNN to the text location problem by shifting the computation from the spatial domain to the frequency domain. The resulting algorithm is a simplified PCNN edge-based (PCNNE) algorithm. Three comparison tests are used to evaluate the algorithm. Tests on large data sets show that PCNNE efficiently detects text with various colors, font sizes, and positions, and under uneven illumination. The method outperforms several traditional methods in both text detection rate and text detection accuracy.
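The abstract above describes a firing map produced by a simplified PCNN and later combined with a phase congruency detector. As a rough illustration of the PCNN part only, the following is a minimal sketch of a standard simplified PCNN iteration that accumulates a firing map; the parameter values (beta, alpha_theta, v_theta, n_iter) and the 3x3 linking kernel are illustrative assumptions, not the settings used in the paper, and the phase congruency and edge-integration steps are omitted.

```python
# Minimal sketch of a simplified PCNN firing map (illustrative parameters,
# not the paper's configuration).
import numpy as np
from scipy.ndimage import convolve

def pcnn_firing_map(image, beta=0.2, alpha_theta=0.2, v_theta=20.0, n_iter=10):
    """Return a map counting how many iterations each pixel fired."""
    S = image.astype(np.float64)
    S = (S - S.min()) / (S.ptp() + 1e-12)       # normalized stimulus (feeding input)

    kernel = np.array([[0.5, 1.0, 0.5],          # linking weights for the 8-neighborhood
                       [1.0, 0.0, 1.0],
                       [0.5, 1.0, 0.5]])

    Y = np.zeros_like(S)                         # pulses from the previous iteration
    theta = np.ones_like(S)                      # dynamic threshold
    firing_map = np.zeros_like(S)                # accumulated firings over all iterations

    for _ in range(n_iter):
        L = convolve(Y, kernel, mode='constant')             # linking input from neighboring fires
        U = S * (1.0 + beta * L)                              # internal activity (modulated feeding)
        Y = (U > theta).astype(np.float64)                    # pulse where activity exceeds threshold
        theta = np.exp(-alpha_theta) * theta + v_theta * Y    # decay threshold, then raise it where fired
        firing_map += Y

    return firing_map
```

In this kind of scheme, pixels with similar intensities tend to fire in the same iterations, so the accumulated firing map acts as a coarse segmentation of the image into planes, which is the role the abstract assigns to the PCNN before edge detection and weighting.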
