MStereoNet: A Lightweight Stereo Matching Network Using MobileNet

Han Yu, Ke Wang, Lun Zhou, Zhen Wang
{"title":"MStereoNet: A Lightweight Stereo Matching Network Using MobileNet","authors":"Han Yu, Ke Wang, Lun Zhou, Zhen Wang","doi":"10.1109/AICIT55386.2022.9930293","DOIUrl":null,"url":null,"abstract":"Deep learning model-based approaches to stereo matching challenges are more accurate than conventional feature-based techniques created by hand. This leads to the issue that deploying applications on devices with restricted resources is not friendly to employing complicated networks and total cost space to increase performance. To minimize processing effort without sacrificing matching accuracy, we propose MStereoNet in this study, a more effective stereo network. It has been demonstrated experimentally that the network in this research significantly lowers the requirement for computing power. The network in this article also employs a multiscale loss to enhance the reliability of the detail. The approach in this study delivers a performance comparable to the best existing algorithms in the Sceneflow dataset when compared to other low-cost dense stereo depth estimation techniques. Research demonstrates that the network suggested in this study can reduce up to 72.5% and 87% of the parameters and operations than the largest volume of the methods involved in the comparison. Our network also marginally outperforms other lightweight binocular matching networks in terms of accuracy.","PeriodicalId":231070,"journal":{"name":"2022 International Conference on Artificial Intelligence and Computer Information Technology (AICIT)","volume":"6 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-09-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 International Conference on Artificial Intelligence and Computer Information Technology (AICIT)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/AICIT55386.2022.9930293","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

Deep learning approaches to stereo matching are more accurate than conventional hand-crafted, feature-based techniques, but the complex networks and full cost volumes they use to gain performance make them impractical to deploy on resource-constrained devices. To minimize computation without sacrificing matching accuracy, we propose MStereoNet, a more efficient stereo network. Experiments demonstrate that the network significantly lowers the required computing power. The network also employs a multiscale loss to improve the reliability of fine detail. Compared with other low-cost dense stereo depth-estimation techniques, our approach delivers performance on the SceneFlow dataset comparable to the best existing algorithms. The proposed network reduces parameters and operations by up to 72.5% and 87%, respectively, relative to the largest of the compared methods, and it also marginally outperforms other lightweight binocular matching networks in accuracy.
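The page gives no implementation details, but the two techniques the abstract names, MobileNet-style depthwise separable convolutions and a multiscale loss, are standard and can be sketched. The PyTorch fragment below is a minimal illustration under those assumptions; the class and function names, loss weights, and masking convention are hypothetical, not taken from the paper.

```python
# Illustrative sketch (PyTorch). Names, weights, and conventions here are
# hypothetical -- this is not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DepthwiseSeparableConv(nn.Module):
    """MobileNet-style building block: a per-channel (depthwise) 3x3
    convolution followed by a 1x1 (pointwise) convolution. For wide layers
    this cuts parameters and FLOPs by roughly a factor of 8-9 versus a
    standard 3x3 convolution, the kind of saving behind the reported
    parameter/operation reductions."""

    def __init__(self, in_ch: int, out_ch: int, stride: int = 1):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, 3, stride=stride,
                                   padding=1, groups=in_ch, bias=False)
        self.pointwise = nn.Conv2d(in_ch, out_ch, 1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return F.relu(self.bn(self.pointwise(self.depthwise(x))))


def multiscale_disparity_loss(preds, gt_disp, max_disp=192,
                              weights=(0.5, 0.7, 1.0)):
    """Weighted smooth-L1 loss over disparity maps predicted at several
    scales (coarse to fine). Each prediction is upsampled to the
    ground-truth resolution, with disparity values rescaled by the width
    ratio, and invalid ground-truth pixels are masked out."""
    mask = (gt_disp > 0) & (gt_disp < max_disp)
    loss = gt_disp.new_zeros(())
    for pred, w in zip(preds, weights):
        scale = gt_disp.shape[-1] / pred.shape[-1]
        up = F.interpolate(pred.unsqueeze(1), size=gt_disp.shape[-2:],
                           mode="bilinear", align_corners=False)
        up = up.squeeze(1) * scale
        loss = loss + w * F.smooth_l1_loss(up[mask], gt_disp[mask])
    return loss
```

Supervising coarse intermediate predictions alongside the final one is a common way to stabilize training and sharpen detail in lightweight stereo networks; the specific weights and the max-disparity cutoff above are placeholders.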