Far-Field Image-Based Traversability Mapping for a Priori Unknown Natural Environments

Ethan Fahnestock; Erick Fuentes; Samuel Prentice; Vasileios Vasilopoulos; Philip R. Osteen; Thomas Howard; Nicholas Roy

IEEE Robotics and Automation Letters, vol. 10, no. 6, pp. 6039-6046
DOI: 10.1109/LRA.2025.3563808
Published: 2025-04-22
URL: https://ieeexplore.ieee.org/document/10974571/
Citations: 0
Abstract
While navigating unknown environments, robots rely primarily on proximate features for guidance in decision making, such as depth information from lidar to build a costmap, or local semantic information from images. The limited range over which these features can be used may result in poor robot behavior when assumptions about the cost of the map beyond the range of proximate features misguide the robot. Integrating “far-field” image features that originate beyond these proximate features into the mapping pipeline has the promise of enabling more intelligent navigation through unknown terrain. To navigate with far-field features, key challenges must be overcome. As far-field features are typically too distant to localize precisely, they are difficult to place in a map. Additionally, the large distance between the robot and these features makes connecting these features to their navigation implications difficult. We propose FITAM, an approach that learns to use far-field features to predict costs to guide navigation through unknown environments in a self-supervised manner. Unlike previous work, our approach does not rely on flat ground plane assumptions or range sensors to localize observations. We demonstrate the benefits of our approach through simulated trials and real-world deployment on a Clearpath Robotics Warthog navigating through a forest environment.
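The abstract describes the core idea of FITAM only at a high level: costs experienced during traversal are used as self-supervised labels for far-field image regions that viewed that terrain earlier. The paper's actual pipeline is not given here, so the sketch below is only an illustrative reconstruction of that labeling step. All names (`ImageObservation`, `label_past_images`), the coarse bearing-sector binning standing in for image regions, and the assumed field of view are hypothetical, not the authors' code; notably, it localizes observations by bearing alone, without a ground-plane assumption or range sensor.

```python
"""Minimal sketch (not the authors' implementation) of self-supervised
far-field cost labeling: costs the robot experiences while driving are
associated with the image regions of earlier views that saw that terrain."""
import math
from dataclasses import dataclass, field


@dataclass
class ImageObservation:
    timestamp: float
    robot_xy: tuple          # (x, y) robot position when the image was taken
    robot_yaw: float         # heading in radians
    num_sectors: int = 8     # coarse horizontal bins standing in for image columns
    sector_labels: list = field(default_factory=list)  # costs observed per sector

    def __post_init__(self):
        self.sector_labels = [[] for _ in range(self.num_sectors)]

    def sector_of(self, world_xy):
        """Map a world point to the horizontal image sector that viewed it."""
        dx = world_xy[0] - self.robot_xy[0]
        dy = world_xy[1] - self.robot_xy[1]
        bearing = math.atan2(dy, dx) - self.robot_yaw          # relative bearing
        bearing = (bearing + math.pi) % (2 * math.pi) - math.pi
        half_fov = math.pi / 4                                 # assumed 90-degree FOV
        if abs(bearing) > half_fov:
            return None                                        # out of view
        frac = (bearing + half_fov) / (2 * half_fov)
        return min(int(frac * self.num_sectors), self.num_sectors - 1)


def label_past_images(images, traversal_log, min_range=5.0):
    """Back-project experienced costs onto earlier, far-field views.

    traversal_log: list of (timestamp, world_xy, experienced_cost), where the
    cost could come from, e.g., achieved speed or wheel slip (an assumption).
    """
    for t, xy, cost in traversal_log:
        for obs in images:
            if obs.timestamp >= t:
                continue                          # only label earlier views
            if math.dist(obs.robot_xy, xy) < min_range:
                continue                          # keep only far-field views
            sector = obs.sector_of(xy)
            if sector is not None:
                obs.sector_labels[sector].append(cost)


# Usage: one image taken at the origin facing +x, later traversal ahead of it.
obs = ImageObservation(timestamp=0.0, robot_xy=(0.0, 0.0), robot_yaw=0.0)
log = [(10.0, (20.0, 2.0), 0.3), (12.0, (25.0, -3.0), 0.9)]
label_past_images([obs], log)
print(obs.sector_labels)  # costs binned by the image sector that viewed them
```

The labeled pairs (image region, experienced cost) would then train a regressor that predicts traversal cost directly from far-field image features, which is the self-supervised signal the abstract refers to; the regression model itself is omitted here.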
Journal Description:
This journal publishes peer-reviewed articles that provide a timely and concise account of innovative research ideas and application results, reporting significant theoretical findings and application case studies in areas of robotics and automation.