Density and Context Aware Network with Hierarchical Head for Traffic Scene Detection

Zuhao Ge, Wenhao Yu, Xian Liu, Lizhe Qi, Yunquan Sun
DOI: 10.1109/IJCNN55064.2022.9892125
Published in: 2022 International Joint Conference on Neural Networks (IJCNN), 2022-07-18
Citations: 1

Abstract

We investigate traffic scene detection from surveillance cameras and UAVs. This task is rather challenging, mainly due to the spatially non-uniform gathering, large scale variance, and instance-level imbalanced distribution of vehicles. Most existing methods that employ FPN to enrich features are prone to failure in this scenario. To mitigate these influences, we propose a novel detector called the Density and Context Aware Network (DCANet) that can focus on dense regions and adaptively aggregate context features. Specifically, DCANet consists of three components: Density Map Supervision (DMP), Context Feature Aggregation (CFA), and a Hierarchical Head Module (HHM). DMP is designed to capture the gathering information of objects, supervised by density maps. CFA exploits the relationships between adjacent feature layers to achieve ROI-level contextual-information enhancement. Finally, HHM is introduced to classify and locate imbalanced objects with hierarchical heads. Without bells and whistles, DCANet can be used in any two-stage detector. Extensive experiments are carried out on two widely used traffic detection datasets, CityCam and VisDrone, and DCANet reports new state-of-the-art scores on CityCam.
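The abstract does not specify how the supervising density maps are constructed. A common practice in density-supervised detection and crowd counting, which the DMP component plausibly follows, is to place a unit-mass Gaussian at each annotated object center so that the map integrates to the object count. The function names and kernel parameters below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def gaussian_kernel(size, sigma):
    """2-D Gaussian kernel normalized to sum to 1."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx ** 2 + yy ** 2) / (2.0 * sigma ** 2))
    return k / k.sum()

def make_density_map(h, w, centers, size=15, sigma=4.0):
    """Ground-truth density map: one unit-mass Gaussian per object center.

    centers: list of (row, col) object centers. Because each kernel sums
    to 1, the map integrates to the object count, so a density branch can
    be supervised with a simple pixel-wise MSE loss against it.
    """
    dmap = np.zeros((h, w), dtype=np.float64)
    kernel = gaussian_kernel(size, sigma)
    r = size // 2
    for cy, cx in centers:
        # Clip the kernel placement to the image bounds.
        y1, y2 = max(cy - r, 0), min(cy + r + 1, h)
        x1, x2 = max(cx - r, 0), min(cx + r + 1, w)
        ky1, ky2 = y1 - (cy - r), size - ((cy + r + 1) - y2)
        kx1, kx2 = x1 - (cx - r), size - ((cx + r + 1) - x2)
        dmap[y1:y2, x1:x2] += kernel[ky1:ky2, kx1:kx2]
    return dmap
```

For interior objects the map sums exactly to the count (mass is lost only where kernels are clipped at image borders), which is what makes it a usable regression target for a density head.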
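The abstract only states that CFA "exploits adjacent feature layers' relationships" for ROI-level enhancement. One minimal reading, sketched below purely as an assumption, is to pool the same ROI from the adjacent pyramid levels and fuse the resulting feature maps with data-dependent weights; `fuse_adjacent_levels` is a hypothetical helper, not the paper's module:

```python
import numpy as np

def fuse_adjacent_levels(roi_feats):
    """Fuse ROI features pooled from adjacent FPN levels (hypothetical sketch).

    roi_feats: list of arrays, each (C, H, W) -- the same ROI pooled from
    levels k-1, k, k+1. Each level is weighted by the softmax of its global
    average activation, then the levels are summed, letting context from
    neighboring scales modulate the ROI feature.
    """
    stacked = np.stack(roi_feats)              # (L, C, H, W)
    scores = stacked.mean(axis=(1, 2, 3))      # (L,) per-level activation
    w = np.exp(scores - scores.max())
    w = w / w.sum()                            # softmax weights over levels
    return np.tensordot(w, stacked, axes=1)    # weighted sum -> (C, H, W)
```

In a real detector the fusion weights would be learned (e.g. by a small gating layer) rather than derived from raw activations; this sketch only illustrates the scale-aggregation idea at the ROI level.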