Towards Real-Time On-Drone Pedestrian Tracking in 4K Inputs

IF 4.4 | CAS Tier 2 (Earth Science) | JCR Q1 (Remote Sensing)
Drones | Pub Date: 2023-10-06 | DOI: 10.3390/drones7100623
Chanyoung Oh, Moonsoo Lee, Chaedeok Lim
{"title":"在4K输入中实现实时无人机行人跟踪","authors":"Chanyoung Oh, Moonsoo Lee, Chaedeok Lim","doi":"10.3390/drones7100623","DOIUrl":null,"url":null,"abstract":"Over the past several years, significant progress has been made in object tracking, but challenges persist in tracking objects in high-resolution images captured from drones. Such images usually contain very tiny objects, and the movement of the drone causes rapid changes in the scene. In addition, the computing power of mission computers on drones is often insufficient to achieve real-time processing of deep learning-based object tracking. This paper presents a real-time on-drone pedestrian tracker that takes as the input 4K aerial images. The proposed tracker effectively hides the long latency required for deep learning-based detection (e.g., YOLO) by exploiting both the CPU and GPU equipped in the mission computer. We also propose techniques to minimize detection loss in drone-captured images, including a tracker-assisted confidence boosting and an ensemble for identity association. In our experiments, using real-world inputs captured by drones at a height of 50 m, the proposed method with an NVIDIA Jetson TX2 proves its efficacy by achieving real-time detection and tracking in 4K video streams.","PeriodicalId":36448,"journal":{"name":"Drones","volume":"55 1","pages":"0"},"PeriodicalIF":4.4000,"publicationDate":"2023-10-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Towards Real-Time On-Drone Pedestrian Tracking in 4K Inputs\",\"authors\":\"Chanyoung Oh, Moonsoo Lee, Chaedeok Lim\",\"doi\":\"10.3390/drones7100623\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Over the past several years, significant progress has been made in object tracking, but challenges persist in tracking objects in high-resolution images captured from drones. Such images usually contain very tiny objects, and the movement of the drone causes rapid changes in the scene. In addition, the computing power of mission computers on drones is often insufficient to achieve real-time processing of deep learning-based object tracking. This paper presents a real-time on-drone pedestrian tracker that takes as the input 4K aerial images. The proposed tracker effectively hides the long latency required for deep learning-based detection (e.g., YOLO) by exploiting both the CPU and GPU equipped in the mission computer. We also propose techniques to minimize detection loss in drone-captured images, including a tracker-assisted confidence boosting and an ensemble for identity association. 
In our experiments, using real-world inputs captured by drones at a height of 50 m, the proposed method with an NVIDIA Jetson TX2 proves its efficacy by achieving real-time detection and tracking in 4K video streams.\",\"PeriodicalId\":36448,\"journal\":{\"name\":\"Drones\",\"volume\":\"55 1\",\"pages\":\"0\"},\"PeriodicalIF\":4.4000,\"publicationDate\":\"2023-10-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Drones\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.3390/drones7100623\",\"RegionNum\":2,\"RegionCategory\":\"地球科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"REMOTE SENSING\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Drones","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3390/drones7100623","RegionNum":2,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"REMOTE SENSING","Score":null,"Total":0}
Citations: 0

Abstract

Over the past several years, significant progress has been made in object tracking, but challenges persist in tracking objects in high-resolution images captured from drones. Such images usually contain very tiny objects, and the movement of the drone causes rapid changes in the scene. In addition, the computing power of mission computers on drones is often insufficient for real-time processing of deep learning-based object tracking. This paper presents a real-time on-drone pedestrian tracker that takes 4K aerial images as input. The proposed tracker effectively hides the long latency required for deep learning-based detection (e.g., YOLO) by exploiting both the CPU and GPU equipped in the mission computer. We also propose techniques to minimize detection loss in drone-captured images, including tracker-assisted confidence boosting and an ensemble for identity association. In our experiments, using real-world inputs captured by drones at a height of 50 m, the proposed method running on an NVIDIA Jetson TX2 proves its efficacy by achieving real-time detection and tracking in 4K video streams.
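
The abstract's central idea, overlapping slow GPU detection with a cheap per-frame CPU tracker and boosting the confidence of weak detections that agree with existing tracks, can be illustrated with a short sketch. This is not the authors' code: the detector stub, the greedy IoU association, the constant-velocity track update, and all thresholds are illustrative assumptions standing in for YOLO, the paper's ensemble identity association, and its tuned parameters.

```python
# Minimal sketch (assumptions noted above) of the latency-hiding scheme: a slow deep
# detector runs asynchronously (as it would on the GPU) while a cheap per-frame tracker
# keeps boxes current on the CPU.
import threading
import queue
from dataclasses import dataclass


@dataclass
class Box:
    x: float
    y: float
    w: float
    h: float
    score: float = 0.0


@dataclass
class Track:
    tid: int
    box: Box
    velocity: tuple = (0.0, 0.0)


def iou(a: Box, b: Box) -> float:
    """Intersection-over-union of two axis-aligned boxes."""
    x1, y1 = max(a.x, b.x), max(a.y, b.y)
    x2, y2 = min(a.x + a.w, b.x + b.w), min(a.y + a.h, b.y + b.h)
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    union = a.w * a.h + b.w * b.h - inter
    return inter / union if union > 0 else 0.0


def detector_worker(frame_q: queue.Queue, det_q: queue.Queue, detect):
    """GPU-side worker: consumes the newest frame, emits a list of Box detections.
    `detect` stands in for a deep detector such as YOLO running on 4K input."""
    while True:
        frame_id, frame = frame_q.get()
        det_q.put((frame_id, detect(frame)))  # slow call, hidden from the main loop


def boost_confidence(dets, tracks, boost=0.2, iou_thr=0.5):
    """Tracker-assisted confidence boosting (illustrative): a weak detection that
    overlaps an existing track is likely a real pedestrian, so raise its score."""
    for d in dets:
        if any(iou(d, t.box) >= iou_thr for t in tracks):
            d.score = min(1.0, d.score + boost)
    return dets


def associate(dets, tracks, next_id, iou_thr=0.3):
    """Greedy IoU association; unmatched detections start new tracks."""
    updated = []
    for d in dets:
        best = max(tracks, key=lambda t: iou(d, t.box), default=None)
        if best is not None and iou(d, best.box) >= iou_thr:
            best.velocity = (d.x - best.box.x, d.y - best.box.y)
            best.box = d
            tracks.remove(best)
            updated.append(best)
        else:
            updated.append(Track(next_id, d))
            next_id += 1
    return updated, next_id


def track_stream(frames, detect, conf_thr=0.4):
    """Per-frame loop: hand frames to the async detector without waiting for it,
    advance tracks with a cheap motion model, and fold in results when they arrive."""
    frame_q, det_q = queue.Queue(maxsize=1), queue.Queue()
    threading.Thread(target=detector_worker,
                     args=(frame_q, det_q, detect), daemon=True).start()
    tracks, next_id = [], 0
    for frame_id, frame in enumerate(frames):
        if frame_q.empty():                 # detector idle: give it the newest frame;
            frame_q.put((frame_id, frame))  # otherwise skip this frame for detection
        for t in tracks:                    # cheap CPU-side per-frame update
            t.box.x += t.velocity[0]
            t.box.y += t.velocity[1]
        while not det_q.empty():            # merge any detection finished meanwhile
            _, dets = det_q.get()
            dets = [d for d in boost_confidence(dets, tracks) if d.score >= conf_thr]
            tracks, next_id = associate(dets, tracks, next_id)
        yield frame_id, tracks
```

The single-slot frame queue is the key scheduling choice in this sketch: the per-frame loop never blocks on the detector, and frames that arrive while the detector is busy are simply skipped for detection (but still tracked), which is how the long detection latency stays hidden from the tracking output.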
Source journal
Drones (Engineering: Aerospace Engineering)
CiteScore: 5.60 | Self-citation rate: 18.80% | Publication volume: 331