A point cloud completion network integrating Mamba and transformer architectures

Impact Factor: 7.5 · CAS Tier 1 (Computer Science) · JCR Q1 (Computer Science, Artificial Intelligence)
Expert Systems with Applications · Pub Date: 2026-06-01 · Epub Date: 2025-12-15 · DOI: 10.1016/j.eswa.2025.130826
Weichao Wu , Yongyang Xu , Zhong Xie
Expert Systems with Applications, Volume 313, Article 130826.
Citations: 0

Abstract

Point cloud completion aims to reconstruct complete structures from incomplete point clouds by extracting fine-grained local details and global features. Current state-of-the-art methods rely on Transformer architectures, which suffer from quadratic complexity, leading to high computational costs and trade-offs between resolution and feature extraction. To address this limitation, we propose a novel point cloud completion network that integrates the Mamba model, a state space framework with linear complexity, for feature extraction in the encoding phase. Our approach replaces the self-attention module with Mamba and introduces a multi-scale encoding network to enhance the extraction and fusion of features from incomplete point clouds. A cross-attention decoding module processes centre points and incomplete features to predict a complete point cloud. Experiments on synthetic and real-world datasets show that our method matches existing state-of-the-art approaches on benchmark datasets, with an average CDL1 score of 6.50 on the PCN dataset. In addition, our method demonstrates superior accuracy when processing large-volume point cloud data, highlighting Mamba’s effectiveness in handling such challenges compared with Transformer-based models.
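For context on the reported metric: the CDL1 score is a Chamfer-distance variant, commonly defined as the symmetric mean nearest-neighbour Euclidean distance between predicted and ground-truth clouds (PCN-benchmark scores such as 6.50 are conventionally reported ×10³). A minimal NumPy sketch of this common definition, not the authors' implementation:

```python
import numpy as np

def chamfer_l1(pred, gt):
    """Symmetric Chamfer distance (L1 variant): mean nearest-neighbour
    Euclidean distance from pred to gt, averaged with the reverse direction."""
    # Pairwise Euclidean distances via broadcasting, shape (N, M)
    d = np.linalg.norm(pred[:, None, :] - gt[None, :, :], axis=-1)
    # For each predicted point, distance to its nearest ground-truth point,
    # and vice versa; average the two directional means
    return 0.5 * (d.min(axis=1).mean() + d.min(axis=0).mean())

pred = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
gt = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
print(chamfer_l1(pred, gt))  # identical clouds -> 0.0
```

Note this brute-force O(N·M) distance matrix is only for illustration; practical evaluation pipelines use KD-trees or GPU kernels for nearest-neighbour search.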
Source journal
Expert Systems with Applications (Engineering Technology – Engineering: Electrical & Electronic)
CiteScore: 13.80
Self-citation rate: 10.60%
Articles published: 2045
Review time: 8.7 months
Journal description: Expert Systems With Applications is an international journal dedicated to the exchange of information on expert and intelligent systems used globally in industry, government, and universities. The journal emphasizes original papers covering the design, development, testing, implementation, and management of these systems, offering practical guidelines. It spans various sectors such as finance, engineering, marketing, law, project management, information management, medicine, and more. The journal also welcomes papers on multi-agent systems, knowledge management, neural networks, knowledge discovery, data mining, and other related areas, excluding applications to military/defense systems.