Yutong Hou, Shiqiang Du, Huaikun Zhang, Jizhao Liu, Jinying Liu, Jing Lian
{"title":"基于纹理和几何特征融合的敦煌壁画绘制网络","authors":"Yutong Hou , Shiqiang Du , Huaikun Zhang , Jizhao Liu , Jinying Liu , Jing Lian","doi":"10.1016/j.sigpro.2025.110096","DOIUrl":null,"url":null,"abstract":"<div><div>In recent years, deep-learning-based image inpainting methods have emerged as a popular research area. However, the application of such methods to mural inpainting presents several challenges. First, mural images often contain complex textures and rich details. Traditional inpainting methods struggle to preserve texture and detail information effectively and cannot ensure consistency between restored areas and the original mural. Second, the missing regions of murals often contain complex geometric structures and artistic styles, requiring mural inpainting algorithms to understand an image’s global semantics and accurately capture local details. This paper proposes a texture and geometric feature fusion network for Dunhuang mural inpainting consisting of two subnetworks: a Primary Inpainting Network (PIN) and a Refinement Enhancement Network (REN). The PIN extracts geometric features by incorporating fresco line drawings. It utilizes a Mamba-enhanced encoding module and gated convolution in its encoder to capture image texture features effectively, thereby enhancing the clarity of texture details. Then, the Dynamic Multi-scale Semantic Fusion Module (DMSFM) combines global and local information from texture and geometric features, completing the initial inpainting of a damaged mural. The REN specializes in inpainting image details by recovering complex textures, fine edges, and local structures. Randomly selected Narrative, Buddhist, and Caisson murals from a Dunhuang mural painting dataset were used to test the proposed network. Comparative experimental results demonstrate that the proposed method achieves superior mural inpainting outcomes compared with popular existing methods.</div></div>","PeriodicalId":49523,"journal":{"name":"Signal Processing","volume":"238 ","pages":"Article 110096"},"PeriodicalIF":3.4000,"publicationDate":"2025-05-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Texture and geometric feature-fusion-based network for Dunhuang mural inpainting\",\"authors\":\"Yutong Hou , Shiqiang Du , Huaikun Zhang , Jizhao Liu , Jinying Liu , Jing Lian\",\"doi\":\"10.1016/j.sigpro.2025.110096\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>In recent years, deep-learning-based image inpainting methods have emerged as a popular research area. However, the application of such methods to mural inpainting presents several challenges. First, mural images often contain complex textures and rich details. Traditional inpainting methods struggle to preserve texture and detail information effectively and cannot ensure consistency between restored areas and the original mural. Second, the missing regions of murals often contain complex geometric structures and artistic styles, requiring mural inpainting algorithms to understand an image’s global semantics and accurately capture local details. This paper proposes a texture and geometric feature fusion network for Dunhuang mural inpainting consisting of two subnetworks: a Primary Inpainting Network (PIN) and a Refinement Enhancement Network (REN). The PIN extracts geometric features by incorporating fresco line drawings. 
It utilizes a Mamba-enhanced encoding module and gated convolution in its encoder to capture image texture features effectively, thereby enhancing the clarity of texture details. Then, the Dynamic Multi-scale Semantic Fusion Module (DMSFM) combines global and local information from texture and geometric features, completing the initial inpainting of a damaged mural. The REN specializes in inpainting image details by recovering complex textures, fine edges, and local structures. Randomly selected Narrative, Buddhist, and Caisson murals from a Dunhuang mural painting dataset were used to test the proposed network. Comparative experimental results demonstrate that the proposed method achieves superior mural inpainting outcomes compared with popular existing methods.</div></div>\",\"PeriodicalId\":49523,\"journal\":{\"name\":\"Signal Processing\",\"volume\":\"238 \",\"pages\":\"Article 110096\"},\"PeriodicalIF\":3.4000,\"publicationDate\":\"2025-05-29\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Signal Processing\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0165168425002105\",\"RegionNum\":2,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Signal Processing","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0165168425002105","RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Texture and geometric feature-fusion-based network for Dunhuang mural inpainting
In recent years, deep-learning-based image inpainting methods have emerged as a popular research area. However, the application of such methods to mural inpainting presents several challenges. First, mural images often contain complex textures and rich details. Traditional inpainting methods struggle to preserve texture and detail information effectively and cannot ensure consistency between restored areas and the original mural. Second, the missing regions of murals often contain complex geometric structures and artistic styles, requiring mural inpainting algorithms to understand an image’s global semantics and accurately capture local details. This paper proposes a texture and geometric feature fusion network for Dunhuang mural inpainting consisting of two subnetworks: a Primary Inpainting Network (PIN) and a Refinement Enhancement Network (REN). The PIN extracts geometric features by incorporating fresco line drawings. It utilizes a Mamba-enhanced encoding module and gated convolution in its encoder to capture image texture features effectively, thereby enhancing the clarity of texture details. Then, the Dynamic Multi-scale Semantic Fusion Module (DMSFM) combines global and local information from texture and geometric features, completing the initial inpainting of a damaged mural. The REN specializes in inpainting image details by recovering complex textures, fine edges, and local structures. Randomly selected Narrative, Buddhist, and Caisson murals from a Dunhuang mural painting dataset were used to test the proposed network. Comparative experimental results demonstrate that the proposed method achieves superior mural inpainting outcomes compared with popular existing methods.
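The gated convolution mentioned in the PIN's encoder is a standard building block in learning-based inpainting: a feature branch is modulated by a learned soft mask so that the network can down-weight damaged pixels. As a rough illustration only, the sketch below shows a generic gated convolution layer in PyTorch; the class name, channel sizes, and ELU activation are assumptions for the example and do not reproduce the paper's actual PIN encoder, its Mamba-enhanced encoding module, or the DMSFM.

```python
import torch
import torch.nn as nn


class GatedConv2d(nn.Module):
    """Generic gated convolution: features modulated by a learned per-pixel gate.

    This follows the common formulation used in free-form inpainting networks;
    the paper's exact layer configuration is not specified here.
    """

    def __init__(self, in_channels: int, out_channels: int,
                 kernel_size: int = 3, stride: int = 1, padding: int = 1):
        super().__init__()
        # One convolution produces features, a parallel one produces gates.
        self.feature = nn.Conv2d(in_channels, out_channels, kernel_size,
                                 stride=stride, padding=padding)
        self.gate = nn.Conv2d(in_channels, out_channels, kernel_size,
                              stride=stride, padding=padding)
        self.activation = nn.ELU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # A sigmoid gate in [0, 1] decides, per pixel and channel, how much of
        # the activated feature passes through -- useful around masked regions.
        return self.activation(self.feature(x)) * torch.sigmoid(self.gate(x))


if __name__ == "__main__":
    # Toy input: a 4-channel tensor (e.g. an RGB mural patch plus a binary damage mask).
    x = torch.randn(1, 4, 256, 256)
    block = GatedConv2d(in_channels=4, out_channels=64)
    print(block(x).shape)  # torch.Size([1, 64, 256, 256])
```

In a two-stage pipeline like the one described above, layers of this kind would sit in the first-stage encoder so that the gate can suppress responses from missing regions before the fused texture and geometric features reach the refinement stage.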
Journal introduction:
Signal Processing covers all aspects of the theory and practice of signal processing. It features original research work, tutorial and review articles, and accounts of practical developments. It is intended for the rapid dissemination of knowledge and experience to engineers and scientists working in the research, development, or practical application of signal processing.
Subject areas covered by the journal include: Signal Theory; Stochastic Processes; Detection and Estimation; Spectral Analysis; Filtering; Signal Processing Systems; Software Developments; Image Processing; Pattern Recognition; Optical Signal Processing; Digital Signal Processing; Multi-dimensional Signal Processing; Communication Signal Processing; Biomedical Signal Processing; Geophysical and Astrophysical Signal Processing; Earth Resources Signal Processing; Acoustic and Vibration Signal Processing; Data Processing; Remote Sensing; Signal Processing Technology; Radar Signal Processing; Sonar Signal Processing; Industrial Applications; New Applications.