Insect-inspired Embodied Visual Route Following

Yihe Lu, Jiahao Cen, Rana Alkhoury Maroun, Barbara Webb

Journal of Bionic Engineering, 22(3): 1167–1193 (published 20 April 2025). DOI: 10.1007/s42235-025-00695-8
Full text: https://link.springer.com/article/10.1007/s42235-025-00695-8

Abstract
In the visual ‘teach-and-repeat’ task, a mobile robot is expected to perform path following based on visual memory acquired along a route that it has traversed. Following a visually familiar route is also a critical navigation skill for foraging insects, which they accomplish robustly despite tiny brains. Inspired by the mushroom body structure in the insect brain and its well-understood associative learning ability, we develop an embodied model that can accomplish visual teach-and-repeat efficiently. Critical to the performance is steering the robot body reflexively based on the relative familiarity of left and right visual fields, eliminating the need for stopping and scanning regularly for optimal directions. The model is robust against noise in visual processing and motor control and can produce performance comparable to pure pursuit or visual localisation methods that rely heavily on the estimation of positions. The model is tested on a real robot and also shown to be able to correct for significant intrinsic steering bias.
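The abstract describes the mechanism but not the implementation. As a rough illustration of the core idea, here is a minimal Python sketch of familiarity-based reflexive steering: a mushroom-body-style associative memory learns route views by depressing the output synapses of the cells a view activates, and the steering signal follows whichever visual hemifield the memory scores as more familiar. All names, network sizes and learning-rule details below are assumptions made for illustration, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(seed=42)


class MushroomBody:
    """Toy familiarity memory loosely inspired by the insect mushroom body.

    Views are projected onto a large population of sparsely active
    'Kenyon cells'; during teaching, the output synapses of active cells
    are depressed (anti-Hebbian learning), so previously seen views drive
    the output weakly -- low output is read as high familiarity.
    """

    def __init__(self, n_input: int, n_cells: int = 2000, sparsity: float = 0.05):
        # Fixed random binary projection from input pixels to Kenyon cells.
        self.proj = (rng.random((n_cells, n_input)) < 0.1).astype(float)
        self.w_out = np.ones(n_cells)              # output synapse weights
        self.k = max(1, int(sparsity * n_cells))   # number of active winners

    def _sparse_code(self, view: np.ndarray) -> np.ndarray:
        act = self.proj @ view.ravel()
        code = np.zeros_like(act)
        code[np.argsort(act)[-self.k:]] = 1.0      # k-winners-take-all
        return code

    def teach(self, view: np.ndarray) -> None:
        # Depress the output synapses of the cells coding this view.
        self.w_out[self._sparse_code(view) > 0] = 0.0

    def familiarity(self, view: np.ndarray) -> float:
        # Negated, normalised output: 0 = fully familiar, -1 = novel.
        return -float(self.w_out @ self._sparse_code(view)) / self.k


def steering_command(mb: MushroomBody, left_view: np.ndarray,
                     right_view: np.ndarray, gain: float = 1.0) -> float:
    """Reflexive steering: turn towards the more familiar hemifield."""
    delta = mb.familiarity(left_view) - mb.familiarity(right_view)
    return gain * delta   # positive -> steer left, negative -> steer right


# --- minimal usage sketch ---
n_pix = 32 * 8                       # e.g. a downsampled 8x32 hemifield image
mb = MushroomBody(n_input=n_pix)
taught_view = rng.random(n_pix)      # stand-in for a view seen during 'teach'
mb.teach(taught_view)

novel_view = rng.random(n_pix)       # stand-in for an off-route view
turn = steering_command(mb, left_view=taught_view, right_view=novel_view)
# turn > 0: the left hemifield is more familiar, so the sketch steers left.
```

Because the turn command is computed continuously from the left/right familiarity difference, a robot driven this way never has to stop and scan candidate headings, which is the efficiency gain the abstract emphasises.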
Journal Introduction:
The Journal of Bionic Engineering (JBE) is a peer-reviewed journal that publishes original research papers and reviews that apply the knowledge learned from nature and biological systems to solve concrete engineering problems. The topics that JBE covers include but are not limited to:
Mechanisms, kinematics and control of animal locomotion; development of mobile robots with walking (running and crawling), swimming or flying abilities inspired by animal locomotion.
Structures, morphologies, composition and physical properties of natural materials and biomaterials; fabrication of new materials mimicking the properties and functions of natural materials and biomaterials.
Biomedical materials, artificial organs and tissue engineering for medical applications; rehabilitation equipment and devices.
Development of bioinspired computation methods and artificial intelligence for engineering applications.