MGFNet: Cross-scene crowd counting via multistage gated fusion network

IF 5.5 · CAS Rank 2, Computer Science · Q1 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Journal: Neurocomputing
DOI: 10.1016/j.neucom.2024.128431
Published: 2024-08-23 (Journal Article)
Full text: https://www.sciencedirect.com/science/article/pii/S0925231224012025
Citations: 0

Abstract

Existing crowd counting methods are mainly trained and tested on similar scenes. When the testing scenario differs from the training scenario, their counting accuracy drops sharply, which seriously limits practical application. To address this problem, we propose a multistage gated fusion network (MGFNet) for cross-scene crowd counting. MGFNet is primarily composed of dynamic gated convolution units (DGCU) and multilevel scale attention blocks (MSAB). Specifically, the DGCU uses a dynamic gating path to supplement detailed information, reducing the loss of crowd information and the overestimation of background across scenes. The MSAB calibrates crowd information at different scales and perspectives by generating attention maps with discriminative information. In addition, we use a new global-local consistency loss to optimize the model to adapt to changes in crowd density and distribution. Extensive experiments on four different types of scene counting benchmarks show that the proposed MGFNet achieves superior cross-scene counting performance.
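The abstract names two mechanisms: a gated path that re-injects detail into the main feature stream, and a global-local consistency loss over predicted counts. The paper's exact formulations are not given here, so the sketch below is only one plausible reading: the function names `gated_fusion` and `global_local_consistency_loss`, the sigmoid gate, and the 4×4 patch grid are all assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def sigmoid(x):
    # Elementwise logistic function, squashes gate logits into (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def gated_fusion(main_feat, detail_feat, gate_logits):
    """Blend a main feature map with a detail path through a per-position gate.

    g = sigmoid(gate_logits); where g is near 1 the detail path dominates,
    where g is near 0 the main path passes through unchanged.
    """
    g = sigmoid(gate_logits)
    return g * detail_feat + (1.0 - g) * main_feat

def global_local_consistency_loss(pred_density, gt_density, grid=4):
    """Global term: absolute error of the total count over the whole map.
    Local term: mean absolute count error over a grid x grid tiling,
    which penalizes density maps whose total is right but whose
    spatial distribution is wrong."""
    global_err = abs(pred_density.sum() - gt_density.sum())
    h, w = pred_density.shape
    ph, pw = h // grid, w // grid
    local_errs = []
    for i in range(grid):
        for j in range(grid):
            p = pred_density[i * ph:(i + 1) * ph, j * pw:(j + 1) * pw].sum()
            t = gt_density[i * ph:(i + 1) * ph, j * pw:(j + 1) * pw].sum()
            local_errs.append(abs(p - t))
    return global_err + float(np.mean(local_errs))
```

In the actual network the gate logits would presumably come from a learned convolution over the fused features; the fixed logits in this sketch only show how the gating arithmetic behaves.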

Source journal
Neurocomputing (Engineering & Technology — Computer Science: Artificial Intelligence)
CiteScore: 13.10
Self-citation rate: 10.00%
Articles per year: 1382
Review time: 70 days
Aims and scope: Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. Neurocomputing theory, practice and applications are the essential topics being covered.