FPGA-Based CNN for Real-Time UAV Tracking and Detection

P. Hobden, S. Srivastava, Edmond Nurellari
DOI: 10.3389/frspt.2022.878010
Journal: Frontiers in Space Technologies
Published: 2022-05-25
Citations: 3

Abstract

Neural networks (NNs) are now extensively utilized in artificial intelligence platforms, particularly for image classification and real-time object tracking. We propose a novel design to address the problem of real-time unmanned aerial vehicle (UAV) monitoring and detection using a Zynq UltraScale FPGA-based convolutional neural network (CNN). The biggest challenge when implementing real-time algorithms on FPGAs is the limited DSP hardware resources available on these platforms. Our proposed design overcomes the challenge of autonomous real-time UAV detection and tracking using Xilinx's Zynq UltraScale XCZU9EG system-on-a-chip (SoC) platform, and provides a solution for overcoming the challenge of limited floating-point resources while maintaining real-time performance. The solution consists of two modules: a UAV tracking module and a neural network–based UAV detection module. The tracking module uses our novel background-differencing algorithm, while UAV detection is based on a modified CNN algorithm designed to maximize field-programmable gate array (FPGA) performance. The two modules complement each other and run simultaneously to provide enhanced real-time UAV detection for any given video input. The proposed system has been tested on real-life flying UAVs, achieving an accuracy of 82% while running at the full frame rate of the input camera for both tracking and neural network (NN) detection. Our results show performance similar to that of an equivalent deep learning processor unit (DPU) with an UltraScale FPGA-based HD video and tracking implementation, but with lower resource utilization.
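The abstract describes a tracking module built on background differencing: a motion mask is obtained by comparing each incoming frame against a maintained background model, and the moving object is localized from that mask. The paper's actual algorithm is not given here, so the following is only a minimal illustrative sketch of the general technique (running-average background, per-pixel thresholding, bounding box), using hypothetical function names and parameter values:

```python
# Illustrative sketch of background-differencing-based motion tracking.
# The alpha and thresh values and all function names are assumptions,
# not the paper's actual algorithm. Frames are 2D lists of grayscale values.

def update_background(bg, frame, alpha=0.05):
    """Exponential running-average background model."""
    return [[(1 - alpha) * b + alpha * f for b, f in zip(brow, frow)]
            for brow, frow in zip(bg, frame)]

def motion_mask(bg, frame, thresh=30):
    """Binary mask marking pixels that differ from the background."""
    return [[1 if abs(f - b) > thresh else 0 for b, f in zip(brow, frow)]
            for brow, frow in zip(bg, frame)]

def bounding_box(mask):
    """Axis-aligned box (r0, c0, r1, c1) around all set pixels, or None."""
    pts = [(r, c) for r, row in enumerate(mask)
           for c, v in enumerate(row) if v]
    if not pts:
        return None
    rows = [p[0] for p in pts]
    cols = [p[1] for p in pts]
    return (min(rows), min(cols), max(rows), max(cols))

# Example: a static background with a bright 2x2 blob appearing in one frame.
bg = [[0] * 4 for _ in range(4)]
frame = [row[:] for row in bg]
for r in (1, 2):
    for c in (1, 2):
        frame[r][c] = 255

box = bounding_box(motion_mask(bg, frame))  # box encloses the blob
```

On an FPGA this per-pixel pipeline maps naturally onto streaming logic, which is one reason differencing-style trackers are attractive on resource-constrained hardware.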
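The abstract's stated challenge is scarce floating-point/DSP resources on the FPGA. A standard way to work around this (not necessarily the paper's exact scheme, which is not detailed here) is to quantize CNN weights and activations to narrow fixed-point integers so the multiply-accumulate operations run in integer logic. A minimal sketch of symmetric linear int8 quantization, with an illustrative bit width:

```python
# Sketch of symmetric linear quantization to signed integers, a common
# technique for avoiding floating-point on FPGAs. The 8-bit width and
# the rescaling scheme are illustrative assumptions.

def quantize(values, bits=8):
    """Map floats to signed ints in [-2^(bits-1), 2^(bits-1)-1] with one scale."""
    qmax = 2 ** (bits - 1) - 1          # 127 for int8
    scale = (max(abs(v) for v in values) / qmax) or 1.0
    q = [max(-qmax - 1, min(qmax, round(v / scale))) for v in values]
    return q, scale

def int_dot(qw, qx, scale_w, scale_x):
    """Integer multiply-accumulate; rescale to float only at the end."""
    acc = sum(a * b for a, b in zip(qw, qx))  # pure integer MACs
    return acc * scale_w * scale_x

weights = [0.5, -1.0, 0.25]
inputs = [1.0, 2.0, 3.0]
qw, sw = quantize(weights)
qx, sx = quantize(inputs)
approx = int_dot(qw, qx, sw, sx)  # close to the float dot product of -0.75
```

Keeping the accumulation entirely in integers is what lets the MACs be implemented in plain FPGA logic rather than floating-point DSP slices; a single rescale recovers an approximation of the floating-point result.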