A Hybrid Timestamping Approach for Multi-Sensor Perception Systems

Josef Steinbaeck, C. Steger, E. Brenner, N. Druml
{"title":"A Hybrid Timestamping Approach for Multi-Sensor Perception Systems","authors":"Josef Steinbaeck, C. Steger, E. Brenner, N. Druml","doi":"10.1109/DSD51259.2020.00077","DOIUrl":null,"url":null,"abstract":"Synchronized and precisely timestamped data from perception sensors is highly advantageous for the low-level fusion of multiple sensor data. Many open-available, low-cost perception sensors do neither provide hardware support for precise clock synchronization, nor provide timestamps with their measurement data. In this work, we present an approach to enable synchronization and accurate timestamping of hardware-triggerable sensors in multi-sensor perception systems.We utilize a hybrid timestamping approach, taking into account the timestamp of a hardware trigger and the software timestamp. The presented timestamping approach utilizes the trigger time to assign precise timestamps to the data streams of the perception sensors. Precise timestamps are mandatory in order to achieve a high perception performance in dynamic applications which utilize low-level data streams.Additionally, we present an implementation of the approach on a multi-sensor perception platform, archiving a timestamp precision in the range of 2 ms. An existing Robot Operating System (ROS) architecture of the platform is extended to assign hybrid timestamps to the data streams. Additionally, we present a pedestrian detection implementation which fuses the timestamped data into a representation.","PeriodicalId":128527,"journal":{"name":"2020 23rd Euromicro Conference on Digital System Design (DSD)","volume":"9 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 23rd Euromicro Conference on Digital System Design (DSD)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/DSD51259.2020.00077","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2

Abstract

Synchronized and precisely timestamped data from perception sensors is highly advantageous for the low-level fusion of multiple sensor data. Many openly available, low-cost perception sensors provide neither hardware support for precise clock synchronization nor timestamps with their measurement data. In this work, we present an approach to enable synchronization and accurate timestamping of hardware-triggerable sensors in multi-sensor perception systems. We utilize a hybrid timestamping approach, taking into account both the timestamp of a hardware trigger and the software timestamp. The presented timestamping approach utilizes the trigger time to assign precise timestamps to the data streams of the perception sensors. Precise timestamps are mandatory in order to achieve high perception performance in dynamic applications which utilize low-level data streams. Additionally, we present an implementation of the approach on a multi-sensor perception platform, achieving a timestamp precision in the range of 2 ms. An existing Robot Operating System (ROS) architecture of the platform is extended to assign hybrid timestamps to the data streams. Finally, we present a pedestrian detection implementation which fuses the timestamped data into a representation.
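To make the hybrid idea in the abstract concrete, the sketch below shows one plausible way such a timestamper could be structured: hardware trigger times are recorded as they occur, and each sensor message, which only carries a late and jittery software arrival timestamp, is re-stamped with the most recent preceding trigger time when the two are close enough to belong to the same measurement cycle. This is a minimal illustration under assumed names (HybridTimestamper, on_trigger, max_offset_s are hypothetical), not the authors' ROS implementation; in a ROS node the trigger time would come from the hardware trigger source and the software time from the message receipt time.

```python
from collections import deque


class HybridTimestamper:
    """Hypothetical sketch: combine hardware trigger times with software timestamps."""

    def __init__(self, max_offset_s=0.05, history=64):
        # max_offset_s: largest plausible delay between a trigger pulse and the
        # software timestamp of the corresponding measurement (assumed value).
        self.max_offset_s = max_offset_s
        self.trigger_times = deque(maxlen=history)

    def on_trigger(self, trigger_time_s):
        """Record the timestamp of a hardware trigger pulse."""
        self.trigger_times.append(trigger_time_s)

    def stamp(self, software_time_s):
        """Return a hybrid timestamp for a message received at software_time_s."""
        # Candidate triggers are those that fired before the message arrived.
        candidates = [t for t in self.trigger_times if t <= software_time_s]
        if candidates:
            trigger = max(candidates)  # most recent preceding trigger
            if software_time_s - trigger <= self.max_offset_s:
                return trigger  # use the precise hardware trigger time
        # Fall back to the software timestamp if no plausible trigger matches.
        return software_time_s


if __name__ == "__main__":
    ts = HybridTimestamper(max_offset_s=0.05)
    ts.on_trigger(100.000)    # trigger pulse fired at t = 100.000 s
    print(ts.stamp(100.012))  # message arrives 12 ms later -> stamped 100.000
    print(ts.stamp(100.900))  # no plausible trigger -> keeps 100.900
```

The fallback to the software timestamp matters in practice: if a trigger pulse is missed or a message is delayed beyond the tolerance, the data stream still carries a usable, if less precise, timestamp.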