Visual edge feature detection and guidance under 3D interference: A case study on deep groove edge features for manufacturing robots with 3D vision sensors
IF 4.1 · CAS Region 3 (Engineering & Technology) · JCR Q2, ENGINEERING, ELECTRICAL & ELECTRONIC
Zidong Wu, Hong Lu, Yongquan Zhang, He Huang, Zhi Liu, Jun Zhang, Xu Feng, Yongjie He, Yongjing Wang
{"title":"Visual edge feature detection and guidance under 3D interference: A case study on deep groove edge features for manufacturing robots with 3D vision sensors","authors":"Zidong Wu , Hong Lu , Yongquan Zhang , He Huang , Zhi Liu , Jun Zhang , Xu Feng , Yongjie He , Yongjing Wang","doi":"10.1016/j.sna.2024.116082","DOIUrl":null,"url":null,"abstract":"<div><div>For manufacturing robots equipped with 3D vision sensors, the presence of environmental interference significantly impedes the precision of edge extraction. Existing edge feature extraction methods often enhance adaptability to interference at the expense of final extraction precision. This paper introduces a novel 3D visual edge detection method that ensures greater precision while maintaining adaptability, capable of addressing various forms of interference in real manufacturing scenarios. To address the challenge, data-driven and traditional visual approaches are integrated. Deep groove edge feature extraction and guidance tasks are used as a case study. R-CNN and improved OTSU algorithm with adaptive threshold are combined to identify groove features. Subsequently, a scale adaptive average slope sliding window algorithm is devised to extract groove edge points, along with a corresponding continuity evaluation algorithm. Real data is used to validate the performance of the proposed method. The experiment results show that the average error in processing interfered data is 0.29 mm, with an average maximum error of 0.54 mm, exhibiting superior overall performance and precision compared to traditional and data-driven methods.</div></div>","PeriodicalId":21689,"journal":{"name":"Sensors and Actuators A-physical","volume":"381 ","pages":"Article 116082"},"PeriodicalIF":4.1000,"publicationDate":"2024-11-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Sensors and Actuators A-physical","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0924424724010768","RegionNum":3,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citations: 0
Abstract
For manufacturing robots equipped with 3D vision sensors, environmental interference significantly impedes the precision of edge extraction. Existing edge feature extraction methods often enhance adaptability to interference at the expense of final extraction precision. This paper introduces a novel 3D visual edge detection method that ensures greater precision while maintaining adaptability, capable of handling various forms of interference in real manufacturing scenarios. To address this challenge, data-driven and traditional visual approaches are integrated, with deep groove edge feature extraction and guidance tasks used as a case study. R-CNN and an improved Otsu algorithm with an adaptive threshold are combined to identify groove features. Subsequently, a scale-adaptive average-slope sliding-window algorithm is devised to extract groove edge points, along with a corresponding continuity evaluation algorithm. Real data are used to validate the performance of the proposed method. The experimental results show that the average error in processing interfered data is 0.29 mm, with an average maximum error of 0.54 mm, exhibiting superior overall performance and precision compared with traditional and data-driven methods.
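The abstract names two classical building blocks alongside the R-CNN detector: Otsu thresholding and a sliding-window slope test over the groove cross-section. The following is a minimal sketch of those two ideas, not the authors' implementation: it uses the standard Otsu criterion rather than the paper's improved adaptive-threshold variant, and a fixed rather than scale-adaptive window; the window size, slope-jump threshold, and synthetic 1D profile are illustrative assumptions.

```python
# A minimal sketch, not the authors' implementation: standard Otsu thresholding and a
# fixed-scale sliding-window average-slope test on a synthetic 1D groove cross-section.
# The improved adaptive-threshold Otsu and the scale-adaptive window of the paper are
# not specified in the abstract, so the parameters below are illustrative assumptions.
import numpy as np


def otsu_threshold(values: np.ndarray, bins: int = 256) -> float:
    """Return the cut that maximizes between-class variance (standard Otsu criterion)."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2.0
    w0 = np.cumsum(p)                      # probability of class 0 (below the cut)
    w1 = 1.0 - w0                          # probability of class 1 (above the cut)
    mu = np.cumsum(p * centers)            # cumulative mean
    with np.errstate(divide="ignore", invalid="ignore"):
        between = (mu[-1] * w0 - mu) ** 2 / (w0 * w1)
    return float(centers[np.argmax(np.nan_to_num(between))])


def edge_points_by_average_slope(profile: np.ndarray,
                                 window: int = 7,
                                 slope_jump: float = 0.2) -> list[int]:
    """Flag indices where the average slopes of the left and right windows differ
    sharply -- a simple stand-in for a sliding-window slope test on a depth profile."""
    slopes = np.diff(profile)
    hits = []
    for i in range(window, slopes.size - window):
        left = slopes[i - window:i].mean()
        right = slopes[i:i + window].mean()
        if abs(right - left) > slope_jump:
            hits.append(i)
    return hits


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic cross-section: flat surface with a 2 mm deep groove between x = 20 and x = 30.
    x = np.linspace(0, 50, 500)
    depth = np.where((x > 20) & (x < 30), -2.0, 0.0) + 0.02 * rng.standard_normal(x.size)
    print("Otsu split of depth values:", otsu_threshold(depth))
    print("Edge-point candidates (indices):", edge_points_by_average_slope(depth)[:6])
```

In the paper the corresponding steps run on 3D sensor data after R-CNN localization; the sketch only conveys the 1D intuition on a synthetic profile.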
About the journal
Sensors and Actuators A: Physical brings together multidisciplinary interests in one journal entirely devoted to disseminating information on all aspects of research and development of solid-state devices for transducing physical signals. Sensors and Actuators A: Physical regularly publishes original papers, letters to the Editors and from time to time invited review articles within the following device areas:
• Fundamentals and Physics, such as: classification of effects, physical effects, measurement theory, modelling of sensors, measurement standards, measurement errors, units and constants, time and frequency measurement. Modeling papers should bring new modeling techniques to the field and be supported by experimental results.
• Materials and their Processing, such as: piezoelectric materials, polymers, metal oxides, III-V and II-VI semiconductors, thick and thin films, optical glass fibres, amorphous, polycrystalline and monocrystalline silicon.
• Optoelectronic sensors, such as: photovoltaic diodes, photoconductors, photodiodes, phototransistors, position-sensitive photodetectors, optoisolators, photodiode arrays, charge-coupled devices, light-emitting diodes, injection lasers and liquid-crystal displays.
• Mechanical sensors, such as: metallic, thin-film and semiconductor strain gauges, diffused silicon pressure sensors, silicon accelerometers, solid-state displacement transducers, piezo junction devices, piezoelectric field-effect transducers (PiFETs), tunnel-diode strain sensors, surface acoustic wave devices, silicon micromechanical switches, solid-state flow meters and electronic flow controllers.
Etc...