Computer vision based wheel sinkage detection for robotic lunar exploration tasks
Guruprasad M. Hegde, Chris Robinson, C. Ye, A. Stroupe, E. Tunstel
2010 IEEE International Conference on Mechatronics and Automation, published 2010-10-07
DOI: 10.1109/ICMA.2010.5588685
Citations: 5
Abstract
This paper presents a wheel sinkage detection method that may be used in robotic lunar exploration tasks. The method extracts the boundary line between a robot wheel and lunar soil by segmenting wheel-soil images captured by a video camera that monitors wheel-soil interaction. The detected boundary is projected onto the soil-free image of the robot wheel to determine the depth of wheel sinkage. The segmentation method is based on graph theory. It first clusters a wheel-soil image into homogeneous regions called superpixels and constructs a graph on the superpixels. It then partitions the graph into segments by using normalized cuts. Compared with existing methods, the proposed algorithm is more robust to illumination conditions, shadows, and dust covering the wheel. The method's efficacy has been validated by experiments under various conditions.
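The core of the segmentation step described above, partitioning a graph built over superpixels with normalized cuts, can be sketched in a few lines of NumPy. This is an illustrative bipartition on a toy affinity matrix, not the authors' implementation: the superpixel count, affinity weights, and the `normalized_cut_bipartition` helper are all hypothetical. It uses the standard spectral relaxation of the normalized cut, thresholding the Fiedler vector of the generalized eigenproblem (D - W)x = λDx.

```python
import numpy as np

def normalized_cut_bipartition(W):
    """Split a weighted superpixel graph into two segments via the
    spectral relaxation of normalized cuts (Shi & Malik).

    W : (n, n) symmetric affinity matrix between superpixels
        (higher weight = more similar regions).
    Returns a boolean array assigning each superpixel to one of
    the two segments (e.g. wheel vs. soil).
    """
    d = W.sum(axis=1)                      # node degrees
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    # Symmetric normalized Laplacian: D^{-1/2} (D - W) D^{-1/2}
    L_sym = D_inv_sqrt @ (np.diag(d) - W) @ D_inv_sqrt
    eigvals, eigvecs = np.linalg.eigh(L_sym)   # ascending eigenvalues
    # Fiedler vector: eigenvector of the second-smallest eigenvalue,
    # mapped back through D^{-1/2}; its sign pattern gives the cut.
    fiedler = D_inv_sqrt @ eigvecs[:, 1]
    return fiedler > 0

# Toy graph: superpixels 0-2 (say, wheel) are strongly connected to
# each other, 3-5 (say, soil) likewise, with weak cross-boundary links.
W = np.array([
    [0, 9, 9, 1, 0, 0],
    [9, 0, 9, 0, 1, 0],
    [9, 9, 0, 0, 0, 1],
    [1, 0, 0, 0, 9, 9],
    [0, 1, 0, 9, 0, 9],
    [0, 0, 1, 9, 9, 0],
], dtype=float)

labels = normalized_cut_bipartition(W)
```

In the paper's setting, `W` would be built from superpixels of the wheel-soil image (e.g. with affinities based on color similarity and adjacency), and the cut recovers the wheel/soil boundary used to measure sinkage depth.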