RGB-D road segmentation based on cross-modality feature maintenance and encouragement

IF 2.3 · JCR Q2 (ENGINEERING, ELECTRICAL & ELECTRONIC) · CAS Tier 4 (Engineering & Technology)
Xia Yuan, Xinyi Wu, Yanchao Cui, Chunxia Zhao
{"title":"RGB-D road segmentation based on cross-modality feature maintenance and encouragement","authors":"Xia Yuan,&nbsp;Xinyi Wu,&nbsp;Yanchao Cui,&nbsp;Chunxia Zhao","doi":"10.1049/itr2.12515","DOIUrl":null,"url":null,"abstract":"<p>Deep images can provide rich spatial structure information, which can effectively exclude the interference of illumination and road texture in road scene segmentation and make better use of the prior knowledge of road area. This paper first proposes a new cross-modal feature maintenance and encouragement network. It includes a quantization statistics module as well as a maintenance and encouragement module for effective fusion between multimodal data. Meanwhile, for the problem that if the road segmentation is performed directly using a segmentation network, there will be a lack of supervised guidance with clear physical meaningful information and poor interpretability of learning features, this paper proposes two road segmentation models based on prior knowledge of deep image: disparity information and surface normal vector information. Then, a two-branch neural network is used to process the colour image and the processed depth image separately, to achieve the full utilization of the complementary features of the two modalities. The experimental results on the KITTI road dataset and Cityscapes dataset show that the method in this paper has good road segmentation performance and high computational efficiency.</p>","PeriodicalId":50381,"journal":{"name":"IET Intelligent Transport Systems","volume":"18 7","pages":"1355-1368"},"PeriodicalIF":2.3000,"publicationDate":"2024-05-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1049/itr2.12515","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IET Intelligent Transport Systems","FirstCategoryId":"5","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1049/itr2.12515","RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citations: 0

Abstract

Depth images provide rich spatial structure information, which can effectively suppress the interference of illumination and road texture in road scene segmentation and make better use of prior knowledge about the road area. This paper first proposes a new cross-modal feature maintenance and encouragement network. It includes a quantization statistics module and a maintenance and encouragement module for effective fusion of multimodal data. Meanwhile, because performing road segmentation directly with a segmentation network lacks supervisory guidance that carries clear physical meaning and yields poorly interpretable learned features, this paper proposes two road segmentation models based on prior knowledge derived from the depth image: disparity information and surface normal vector information. A two-branch neural network then processes the colour image and the processed depth image separately, so that the complementary features of the two modalities are fully exploited. Experimental results on the KITTI road dataset and the Cityscapes dataset show that the proposed method achieves good road segmentation performance and high computational efficiency.
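The abstract names surface normal vector information as one of the two depth-based priors. As a minimal sketch of what such a prior can look like (not the authors' implementation, which is not given here), the snippet below back-projects a depth map through an assumed pinhole camera model and estimates per-pixel normals from finite differences; the function name `normals_from_depth` and the KITTI-like intrinsic values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def normals_from_depth(depth, fx, fy, cx, cy):
    """Estimate per-pixel surface normals from a metric depth map of shape (H, W)."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    # Back-project every pixel to 3-D camera coordinates (pinhole model).
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    pts = np.dstack([x, y, depth])              # (H, W, 3) point map
    # Tangent vectors from finite differences along image columns and rows.
    du = np.gradient(pts, axis=1)
    dv = np.gradient(pts, axis=0)
    n = np.cross(du, dv)                        # normal = du x dv
    n /= np.linalg.norm(n, axis=2, keepdims=True) + 1e-8
    return n                                    # unit normals, (H, W, 3)

if __name__ == "__main__":
    # Synthetic depth map at a KITTI-like resolution; intrinsics are illustrative only.
    depth = np.random.uniform(2.0, 30.0, size=(375, 1242)).astype(np.float32)
    normals = normals_from_depth(depth, fx=721.5, fy=721.5, cx=609.6, cy=172.9)
    print(normals.shape)   # -> (375, 1242, 3)
```

In the pipeline described by the abstract, a normal map like this (together with disparity) would form the processed depth input to the depth branch of the two-branch network; the exact preprocessing the authors use may differ.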


Source journal: IET Intelligent Transport Systems (Engineering & Technology - Transportation Science & Technology)
CiteScore: 6.50
Self-citation rate: 7.40%
Articles published: 159
Review time: 3 months
Journal description: IET Intelligent Transport Systems is an interdisciplinary journal devoted to research into the practical applications of ITS and infrastructures. The scope of the journal includes the following: sustainable traffic solutions; deployments with enabling technologies; pervasive monitoring; applications, demonstrations and evaluation; economic and behavioural analyses of ITS services and scenarios; data integration and analytics; information collection and processing; image processing applications in ITS; ITS aspects of electric vehicles; autonomous vehicles; connected vehicle systems; in-vehicle ITS, safety and vulnerable road user aspects; mobility as a service systems; traffic management and control; public transport systems technologies; fleet and public transport logistics; emergency and incident management; demand management and electronic payment systems; traffic related air pollution management; policy and institutional issues; interoperability, standards and architectures; funding scenarios; enforcement; human machine interaction; education, training and outreach.
Current special issue calls for papers:
Intelligent Transportation Systems in Smart Cities for Sustainable Environment - https://digital-library.theiet.org/files/IET_ITS_CFP_ITSSCSE.pdf
Sustainably Intelligent Mobility (SIM) - https://digital-library.theiet.org/files/IET_ITS_CFP_SIM.pdf
Traffic Theory and Modelling in the Era of Artificial Intelligence and Big Data (in collaboration with World Congress for Transport Research, WCTR 2019) - https://digital-library.theiet.org/files/IET_ITS_CFP_WCTR.pdf