Ancient paintings inpainting based on dual encoders and contextual information

IF 2.6 · Region 1 (Arts) · JCR Q2 (Chemistry, Analytical)
Zengguo Sun, Yanyan Lei, Xiaojun Wu
DOI: 10.1186/s40494-024-01391-2
Journal: Heritage Science, published 2024-07-31
Citations: 0

Abstract

Deep learning-based inpainting models have achieved success in restoring natural images, yet their application to ancient paintings is challenged by the loss of texture, lines, and color. To address these issues, we introduce an ancient-painting inpainting model based on dual encoders and contextual information, which compensates for insufficient feature extraction and detail-texture recovery when restoring ancient paintings. Specifically, the proposed model employs a gated encoding branch that minimizes information loss and effectively captures semantic information from ancient paintings. A dense multi-scale feature fusion module extracts texture and detail information at various scales, while dilated depthwise separable convolutions reduce the parameter count and improve computational efficiency. Furthermore, a contextual feature aggregation module extracts contextual features, improving the overall consistency of the inpainting results. Finally, a color loss function ensures that the color of the restored area is consistent and harmonized with the surrounding region. The experimental results indicate that the proposed model effectively restores the texture details of ancient paintings, outperforming other methods both qualitatively and quantitatively. Additionally, the model is tested on real damaged ancient paintings to validate its practicality and efficacy.
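Two of the abstract's most concrete claims can be illustrated with a small sketch: depthwise separable convolution reduces parameters (dilation then enlarges the receptive field at no parameter cost), and a color loss penalizes mismatch between the restored region and its surroundings. The function names and the exact loss formulation below are hypothetical, since the paper's implementation details are not given here; the sketch shows only the standard parameter arithmetic for depthwise separable convolution and one plausible per-channel mean-color penalty.

```python
def conv_params(c_in, c_out, k):
    """Weights in a standard k x k convolution (bias omitted)."""
    return c_in * c_out * k * k

def depthwise_separable_params(c_in, c_out, k):
    """Depthwise k x k convolution (one filter per input channel)
    followed by a 1 x 1 pointwise convolution. Dilation enlarges the
    receptive field without changing this parameter count."""
    return c_in * k * k + c_in * c_out

# A 3 x 3 layer mapping 256 -> 256 channels:
# standard: 589,824 weights; separable: 67,840 (about 11.5%).
savings = depthwise_separable_params(256, 256, 3) / conv_params(256, 256, 3)

def channel_means(pixels):
    """Per-channel means of a region given as a list of [r, g, b] pixels."""
    n = len(pixels)
    return [sum(p[c] for p in pixels) / n for c in range(3)]

def color_loss(restored, surrounding):
    """One plausible color-consistency penalty (not the paper's exact
    formulation): mean absolute difference between the per-channel means
    of the restored region and of its surrounding region."""
    return sum(abs(a - b)
               for a, b in zip(channel_means(restored),
                               channel_means(surrounding))) / 3.0
```

As the comments note, the separable layer needs roughly an order of magnitude fewer weights than the standard one, and the penalty falls to zero exactly when the restored region's average color matches its surroundings.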


Source journal: Heritage Science (Arts and Humanities – Conservation)
CiteScore: 4.00
Self-citation rate: 20.00%
Articles per year: 183
Review time: 19 weeks
Journal description: Heritage Science is an open access journal publishing original peer-reviewed research covering:
- Understanding of the manufacturing processes, provenances, and environmental contexts of material types, objects, and buildings of cultural significance, including their historical significance.
- Understanding and prediction of the physico-chemical and biological degradation processes of cultural artefacts, including climate change, and predictive heritage studies.
- Development and application of analytical and imaging methods or equipment for non-invasive, non-destructive, or portable analysis of artwork and objects of cultural significance, to identify component materials, degradation products, and deterioration markers.
- Development and application of invasive and destructive methods for understanding the provenance of objects of cultural significance.
- Development and critical assessment of treatment materials and methods for artwork and objects of cultural significance.
- Development and application of statistical methods and algorithms for data analysis to further the understanding of culturally significant objects.
- Publication of reference and corpus datasets as supplementary information to the statistical and analytical studies above.
- Description of novel technologies that can assist in the understanding of cultural heritage.