Towards cognitive navigation: A biologically inspired calibration mechanism for the head direction cell network
Zhenshan Bing, Dominik Nitschke, Genghang Zhuang, Kai Huang, Alois Knoll
Journal of Automation and Intelligence, Vol. 2, No. 1, pp. 31–41, February 2023
DOI: 10.1016/j.jai.2023.100020
URL: https://www.sciencedirect.com/science/article/pii/S2949855423000035
Citations: 2
Abstract
To derive meaningful navigation strategies, animals must estimate their heading direction in the environment. In mammalian brains, this function is served by head direction cells, whose neural activity encodes the animal's heading. Such head direction information is believed to be generated by integrating self-motion cues, which introduces cumulative errors over the long term. To eliminate these errors, this paper presents an efficient calibration model that mimics animal behavior by exploiting visual cues in a biologically plausible way, and implements it in robotic navigation tasks. The proposed calibration model allows the agent to associate its head direction and the perceived egocentric direction of a visual cue with its position and orientation, and thereby to calibrate its head direction when the same cue is viewed again. We examine the proposed head direction calibration model in extensive simulations and real-world experiments and demonstrate its excellent performance in terms of quick association of information with proximal or distal cues, as well as the accuracy with which it calibrates the integration errors of the head direction. Videos can be viewed at https://videoviewsite.wixsite.com/hdc-calibration.
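The underlying geometry of the calibration idea can be illustrated with a minimal sketch. This is not the paper's head direction cell network model; it is a hypothetical plain-Python analogue of the association-and-correction step, under the simplifying assumption of a distal cue (so the agent's position can be ignored and only angles matter). The class name `HeadDirectionCalibrator` and its methods are invented for illustration: on first sighting, the cue's allocentric bearing (head direction plus egocentric bearing) is stored; on a re-sighting, the drifted heading estimate is replaced by the stored allocentric bearing minus the currently perceived egocentric bearing.

```python
import math

def wrap_angle(a):
    """Wrap an angle in radians to the interval (-pi, pi]."""
    return math.atan2(math.sin(a), math.cos(a))

class HeadDirectionCalibrator:
    """Hypothetical sketch of visual-cue-based heading calibration.

    Assumes a distal cue, so the cue's allocentric bearing is treated as
    independent of the agent's position. The paper's model additionally
    associates the agent's position and orientation with the cue, which
    this sketch omits.
    """

    def __init__(self):
        # Maps cue id -> allocentric bearing stored at first sighting.
        self.cue_allocentric = {}

    def observe(self, cue_id, egocentric_bearing, heading_estimate):
        """Return a (possibly calibrated) heading after seeing a cue.

        egocentric_bearing: direction of the cue relative to the head.
        heading_estimate: current (drift-prone) head direction estimate.
        """
        # Allocentric bearing of the cue = heading + egocentric bearing.
        allocentric = wrap_angle(heading_estimate + egocentric_bearing)
        if cue_id not in self.cue_allocentric:
            # First sighting: associate the cue with the current estimate.
            self.cue_allocentric[cue_id] = allocentric
            return heading_estimate
        # Re-sighting: calibrated heading = stored allocentric bearing
        # minus the currently perceived egocentric bearing.
        return wrap_angle(self.cue_allocentric[cue_id] - egocentric_bearing)
```

For example, if the cue is first seen at bearing 0.5 rad while the heading estimate is 0.0, its allocentric bearing 0.5 is stored; if the estimate later drifts to 0.2 while the cue is still perceived at bearing 0.5 (i.e., the true heading is unchanged), the sketch restores the heading to 0.0.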