Industrial Manometer Detection and Reading for Autonomous Inspection Robots

Jonas Günther, Martin Oehler, S. Kohlbrecher, O. Stryk
{"title":"用于自主检测机器人的工业压力计检测和读取","authors":"Jonas Günther, Martin Oehler, S. Kohlbrecher, O. Stryk","doi":"10.1109/ecmr50962.2021.9568833","DOIUrl":null,"url":null,"abstract":"Autonomous mobile robots for industrial inspection can reduce cost for digitalization of existing plants by performing autonomous routine inspections. A frequent task is reading of analog gauges to monitor the health of the facility. Automating this process involves capturing image data with a camera sensor and processing the data to read the value. Detection algorithms deployed on a mobile robot have to deal with increased uncertainty regarding localization and environmental influences. This imposes increased requirements regarding robustness to viewing angle, lighting and scale variation on detection and reading. Current approaches based on conventional computer vision require high quality images or prior knowledge. We address these limitations by leveraging the advances of neural networks in the task of object detection and instance segmentation in a two-stage pipeline. Our method robustly detects and reads manometers without prior knowledge of object location or exact object type. In our evaluation we show that our approach can detect and read manometers from a distance of up to 3m and a viewing angle of up to 60° in different lighting conditions with needle angle estimation errors of ±2.2°. We publish the validation split of our training dataset for manometer and needle detection at https://tudatalib.ulb.tu-darmstadt.de/handle/tudatalib/2881.","PeriodicalId":200521,"journal":{"name":"2021 European Conference on Mobile Robots (ECMR)","volume":"8 3","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Industrial Manometer Detection and Reading for Autonomous Inspection Robots\",\"authors\":\"Jonas Günther, Martin Oehler, S. Kohlbrecher, O. Stryk\",\"doi\":\"10.1109/ecmr50962.2021.9568833\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Autonomous mobile robots for industrial inspection can reduce cost for digitalization of existing plants by performing autonomous routine inspections. A frequent task is reading of analog gauges to monitor the health of the facility. Automating this process involves capturing image data with a camera sensor and processing the data to read the value. Detection algorithms deployed on a mobile robot have to deal with increased uncertainty regarding localization and environmental influences. This imposes increased requirements regarding robustness to viewing angle, lighting and scale variation on detection and reading. Current approaches based on conventional computer vision require high quality images or prior knowledge. We address these limitations by leveraging the advances of neural networks in the task of object detection and instance segmentation in a two-stage pipeline. Our method robustly detects and reads manometers without prior knowledge of object location or exact object type. In our evaluation we show that our approach can detect and read manometers from a distance of up to 3m and a viewing angle of up to 60° in different lighting conditions with needle angle estimation errors of ±2.2°. 
We publish the validation split of our training dataset for manometer and needle detection at https://tudatalib.ulb.tu-darmstadt.de/handle/tudatalib/2881.\",\"PeriodicalId\":200521,\"journal\":{\"name\":\"2021 European Conference on Mobile Robots (ECMR)\",\"volume\":\"8 3\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-08-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 European Conference on Mobile Robots (ECMR)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ecmr50962.2021.9568833\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 European Conference on Mobile Robots (ECMR)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ecmr50962.2021.9568833","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Autonomous mobile robots for industrial inspection can reduce cost for digitalization of existing plants by performing autonomous routine inspections. A frequent task is reading of analog gauges to monitor the health of the facility. Automating this process involves capturing image data with a camera sensor and processing the data to read the value. Detection algorithms deployed on a mobile robot have to deal with increased uncertainty regarding localization and environmental influences. This imposes increased requirements regarding robustness to viewing angle, lighting and scale variation on detection and reading. Current approaches based on conventional computer vision require high quality images or prior knowledge. We address these limitations by leveraging the advances of neural networks in the task of object detection and instance segmentation in a two-stage pipeline. Our method robustly detects and reads manometers without prior knowledge of object location or exact object type. In our evaluation we show that our approach can detect and read manometers from a distance of up to 3m and a viewing angle of up to 60° in different lighting conditions with needle angle estimation errors of ±2.2°. We publish the validation split of our training dataset for manometer and needle detection at https://tudatalib.ulb.tu-darmstadt.de/handle/tudatalib/2881.
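To make the two-stage idea concrete, below is a minimal sketch of such a pipeline, not the authors' implementation. The detector and needle segmenter are assumed to be pre-trained models supplied by the caller (`detect_gauges`, `segment_needle` are placeholder callables), and the scale geometry (a linear scale sweeping from 225° to -45°, reading 0 to 10) is an illustrative assumption; only the angle-to-value geometry is worked out with NumPy.

```python
# Sketch of a two-stage gauge-reading pipeline (illustrative only, not the paper's code).
# Stage 1: detect manometers as bounding boxes. Stage 2: segment the needle in each crop,
# estimate its angle, and map the angle linearly onto the scale.

from dataclasses import dataclass
from typing import Callable, List, Tuple

import numpy as np


@dataclass
class GaugeReading:
    needle_angle_deg: float  # estimated needle angle in the gauge face frame
    value: float             # value after mapping the angle onto the scale


def needle_angle_from_mask(needle_mask: np.ndarray, center: Tuple[float, float]) -> float:
    """Estimate the needle angle from a binary needle mask.

    The needle tip is taken as the mask pixel farthest from the gauge center;
    the angle is measured from the positive x-axis, counter-clockwise.
    """
    ys, xs = np.nonzero(needle_mask)
    if xs.size == 0:
        raise ValueError("empty needle mask")
    cx, cy = center
    dists = (xs - cx) ** 2 + (ys - cy) ** 2
    tip_x, tip_y = xs[np.argmax(dists)], ys[np.argmax(dists)]
    # Image y grows downwards, so negate dy to obtain a mathematical angle.
    return float(np.degrees(np.arctan2(-(tip_y - cy), tip_x - cx)))


def angle_to_value(angle_deg: float,
                   angle_min_deg: float, angle_max_deg: float,
                   value_min: float, value_max: float) -> float:
    """Linearly interpolate the scale value from the needle angle.

    Assumes a linear scale between the angles of the minimum and maximum tick,
    swept clockwise (decreasing mathematical angle).
    """
    span = angle_min_deg - angle_max_deg
    frac = (angle_min_deg - angle_deg) / span
    return value_min + frac * (value_max - value_min)


def read_gauges(image: np.ndarray,
                detect_gauges: Callable[[np.ndarray], List[Tuple[int, int, int, int]]],
                segment_needle: Callable[[np.ndarray], np.ndarray],
                angle_min_deg: float = 225.0, angle_max_deg: float = -45.0,
                value_min: float = 0.0, value_max: float = 10.0) -> List[GaugeReading]:
    """Two-stage reading: detect gauges, then segment the needle in each crop."""
    readings = []
    for (x0, y0, x1, y1) in detect_gauges(image):    # stage 1: bounding boxes
        crop = image[y0:y1, x0:x1]
        mask = segment_needle(crop)                   # stage 2: needle instance mask
        center = ((x1 - x0) / 2.0, (y1 - y0) / 2.0)   # assumption: pivot at crop center
        angle = needle_angle_from_mask(mask, center)
        value = angle_to_value(angle, angle_min_deg, angle_max_deg, value_min, value_max)
        readings.append(GaugeReading(needle_angle_deg=angle, value=value))
    return readings
```

The split into a detection stage and a crop-level needle stage mirrors the abstract's description; everything downstream of the needle mask (tip-based angle estimation, linear scale mapping, the default scale endpoints) is a simplifying assumption made for the sake of a runnable example.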