RING-Net: road inference from GPS trajectories using a deep segmentation network

E. Eftelioglu, Ravi Garg, Vaibhav Kango, Chintan Gohil, Amber Roy Chowdhury
{"title":"RING-Net: road inference from GPS trajectories using a deep segmentation network","authors":"E. Eftelioglu, Ravi Garg, Vaibhav Kango, Chintan Gohil, Amber Roy Chowdhury","doi":"10.1145/3557917.3567617","DOIUrl":null,"url":null,"abstract":"Accurate and rich representation of roads in a map is critical for safe and efficient navigation experience. Often, open source road data is incomplete and manually adding roads is labor intensive and consequently expensive. In this paper, we propose RING-Net, an approach for Road INference from GPS trajectories using a deep image segmentation Network. Previous work on road inference is either focused on satellite images or GPS trajectories, but they are not compatible with each other when there is a lack of high quality data from either of the source types. Even though it is primarily focused on using GPS trajectories as its input, RING-Net architecture is flexible enough to be used with multiple data sources with minimal effort. More specifically, RING-Net converts raw GPS trajectories into multi-band raster images with trip related features, and infers roads with high precision. Experiments on public data show that Ring-Net can be used to improve the completeness of a road network. Our approach is promising to bring us one step closer to fully automated map updates.","PeriodicalId":152788,"journal":{"name":"Proceedings of the 10th ACM SIGSPATIAL International Workshop on Analytics for Big Geospatial Data","volume":"37 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 10th ACM SIGSPATIAL International Workshop on Analytics for Big Geospatial Data","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3557917.3567617","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2

Abstract

An accurate and rich representation of roads in a map is critical for a safe and efficient navigation experience. Open-source road data is often incomplete, and manually adding roads is labor intensive and consequently expensive. In this paper, we propose RING-Net, an approach for Road INference from GPS trajectories using a deep image segmentation Network. Previous work on road inference focuses on either satellite images or GPS trajectories, and such approaches break down when high-quality data from one of these sources is unavailable. Although RING-Net primarily uses GPS trajectories as its input, its architecture is flexible enough to incorporate multiple data sources with minimal effort. More specifically, RING-Net converts raw GPS trajectories into multi-band raster images with trip-related features and infers roads with high precision. Experiments on public data show that RING-Net can be used to improve the completeness of a road network. Our approach is a promising step toward fully automated map updates.
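The abstract describes converting raw GPS trajectories into multi-band raster images with trip-related features before feeding them to a segmentation network. The sketch below illustrates one plausible way such rasterization could work; the band choices (point count, mean speed, mean heading), the tile parameters, and the rasterize_trajectories helper are assumptions for illustration, not the paper's actual feature set or implementation.

```python
import numpy as np

def rasterize_trajectories(points, bounds, cell_size=2.0):
    """Bin GPS points into a multi-band raster tile.

    points: iterable of (x, y, speed, heading) in projected metres.
    bounds: (xmin, ymin, xmax, ymax) of the tile.
    Returns an array of shape (3, H, W) with bands:
      0 - point count, 1 - mean speed, 2 - mean heading (degrees).
    """
    xmin, ymin, xmax, ymax = bounds
    w = int(np.ceil((xmax - xmin) / cell_size))
    h = int(np.ceil((ymax - ymin) / cell_size))
    count = np.zeros((h, w))
    speed_sum = np.zeros((h, w))
    heading_sum = np.zeros((h, w))
    for x, y, speed, heading in points:
        col = int((x - xmin) / cell_size)
        row = int((ymax - y) / cell_size)  # row 0 at the top of the tile
        if 0 <= row < h and 0 <= col < w:
            count[row, col] += 1
            speed_sum[row, col] += speed
            heading_sum[row, col] += heading
    # Average per-cell features where at least one point fell in the cell.
    denom = np.maximum(count, 1)
    mean_speed = np.where(count > 0, speed_sum / denom, 0.0)
    mean_heading = np.where(count > 0, heading_sum / denom, 0.0)
    return np.stack([count, mean_speed, mean_heading])
```

A raster built this way can be treated like a 3-channel image and passed to any off-the-shelf segmentation model, which is consistent with the abstract's claim that the architecture can accept additional data sources (e.g., satellite imagery) as extra bands with minimal effort.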