Opportunistic Radio Assisted Navigation for Autonomous Ground Vehicles

Hongkai Wen, Yiran Shen, Savvas Papaioannou, W. Churchill, A. Trigoni, P. Newman
{"title":"自主地面车辆的机会无线电辅助导航","authors":"Hongkai Wen, Yiran Shen, Savvas Papaioannou, W. Churchill, A. Trigoni, P. Newman","doi":"10.1109/DCOSS.2015.22","DOIUrl":null,"url":null,"abstract":"Navigating autonomous ground vehicles with visual sensors has many advantages - it does not rely on global maps, yet is accurate and reliable even in GPS-denied environments. However, due to the limitation of the camera field of view, one typically has to record a large number of visual experiences for practical navigation. In this paper, we explore new avenues in linking together visual experiences, by opportunistically harvesting and sharing a variety of radio signals emitted by surrounding stationary access points and mobile devices. We propose a novel navigation approach, which exploits side-channel information of co-location to thread up visually-separated experiences with short exploration phases. The proposed approach empowers users to trade travel time for manual navigation effort, allowing them to choose the itinerary that best serves their needs. We evaluate the proposed approach with data collected from a typical urban area, and show that it achieves much better navigation performance in both reach ability and cost, comparing with the state of the arts that only use visual information.","PeriodicalId":332746,"journal":{"name":"2015 International Conference on Distributed Computing in Sensor Systems","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2015-06-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Opportunistic Radio Assisted Navigation for Autonomous Ground Vehicles\",\"authors\":\"Hongkai Wen, Yiran Shen, Savvas Papaioannou, W. Churchill, A. Trigoni, P. Newman\",\"doi\":\"10.1109/DCOSS.2015.22\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Navigating autonomous ground vehicles with visual sensors has many advantages - it does not rely on global maps, yet is accurate and reliable even in GPS-denied environments. However, due to the limitation of the camera field of view, one typically has to record a large number of visual experiences for practical navigation. In this paper, we explore new avenues in linking together visual experiences, by opportunistically harvesting and sharing a variety of radio signals emitted by surrounding stationary access points and mobile devices. We propose a novel navigation approach, which exploits side-channel information of co-location to thread up visually-separated experiences with short exploration phases. The proposed approach empowers users to trade travel time for manual navigation effort, allowing them to choose the itinerary that best serves their needs. 
We evaluate the proposed approach with data collected from a typical urban area, and show that it achieves much better navigation performance in both reach ability and cost, comparing with the state of the arts that only use visual information.\",\"PeriodicalId\":332746,\"journal\":{\"name\":\"2015 International Conference on Distributed Computing in Sensor Systems\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2015-06-10\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2015 International Conference on Distributed Computing in Sensor Systems\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/DCOSS.2015.22\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2015 International Conference on Distributed Computing in Sensor Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/DCOSS.2015.22","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2

Abstract

Navigating autonomous ground vehicles with visual sensors has many advantages: it does not rely on global maps, yet is accurate and reliable even in GPS-denied environments. However, due to the limited field of view of the camera, one typically has to record a large number of visual experiences for practical navigation. In this paper, we explore new avenues for linking visual experiences together, by opportunistically harvesting and sharing a variety of radio signals emitted by surrounding stationary access points and mobile devices. We propose a novel navigation approach, which exploits side-channel co-location information to thread together visually-separated experiences with short exploration phases. The proposed approach empowers users to trade travel time for manual navigation effort, allowing them to choose the itinerary that best serves their needs. We evaluate the proposed approach with data collected from a typical urban area, and show that it achieves much better navigation performance in both reachability and cost, compared with state-of-the-art methods that only use visual information.
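
The abstract describes linking recorded visual experiences through radio co-location rather than visual overlap. The sketch below is an illustration only, not the paper's actual algorithm: it assumes each experience carries the set of radio source identifiers (e.g. Wi-Fi access point or Bluetooth MAC addresses) heard along it, and connects two experiences when their radio signatures overlap strongly, suggesting a short exploration phase could bridge them. The names `radio_similarity` and `link_experiences` and the 0.4 threshold are hypothetical choices made for this sketch.

```python
from itertools import combinations

def radio_similarity(sig_a, sig_b):
    """Jaccard overlap between two sets of observed radio source IDs."""
    if not sig_a and not sig_b:
        return 0.0
    return len(sig_a & sig_b) / len(sig_a | sig_b)

def link_experiences(experiences, threshold=0.4):
    """Build an adjacency map over visual experiences.

    Two experiences are treated as co-located (and hence linkable by a
    short exploration phase) when their radio signatures overlap by at
    least `threshold`.  `experiences` maps an experience ID to the set
    of radio source IDs heard along that experience.
    """
    graph = {eid: set() for eid in experiences}
    for a, b in combinations(experiences, 2):
        if radio_similarity(experiences[a], experiences[b]) >= threshold:
            graph[a].add(b)
            graph[b].add(a)
    return graph

# Toy usage: three recorded experiences and the radio sources seen on each.
exps = {
    "exp1": {"ap:01", "ap:02", "phone:aa"},
    "exp2": {"ap:02", "phone:aa", "ap:07"},
    "exp3": {"ap:99"},
}
print(link_experiences(exps))  # exp1 and exp2 become linked; exp3 stays isolated
```

Set overlap is just one simple proxy for co-location; the paper's method may well use richer features of the harvested signals (e.g. signal strength or timing), so the threshold and similarity measure here should be read as placeholders.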