An adaptive-stylization-based dynamic residual-multi-stage and multi-pooling approach to arbitrary image style transfer

Wenrui Yi, Anmin Zhu
DOI: 10.1109/CTISC52352.2021.00070
Published in: 2021 3rd International Conference on Advances in Computer Technology, Information Science and Communication (CTISC), April 2021

Abstract

Style transfer takes a given content image and style image and transfers the characteristic information of the style image onto the content image, while keeping the result faithful to the content. Currently, transferred images suffer from problems such as artifacts and distortion of the spatial structure. To address these problems, this paper proposes a dynamic residual-multi-level and multi-pooling network, combined with an improved style transfer algorithm, to achieve better arbitrary image style transfer. First, we add a dynamic-residual layer to the network, which speeds up training of the network model and improves its robustness. Second, we use a multi-level, multi-pooling layer to extract more detailed spatial structure, edge texture, and other information from the image; this layer also has an additional denoising effect. Third, the improved style transfer algorithm aligns the mean and variance of the content and style images from a statistical point of view, realizing adaptive arbitrary style transfer of the original input image features. Finally, during model training the proposed approach converges faster than other current advanced methods, and the transferred renderings outperform other network models in quantitative comparisons such as SSIM and Gram-based metrics. Notably, our model has the fastest real-time transfer speed and can perform arbitrary image style transfer.
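The statistical alignment described above, matching the channel-wise mean and variance of content features to those of the style features, is the core of adaptive-instance-normalization-style transfer. A minimal NumPy sketch of that alignment step (an illustration, not the authors' implementation; feature maps are assumed to have shape (C, H, W)):

```python
import numpy as np

def adaptive_align(content_feat, style_feat, eps=1e-5):
    """Align channel-wise mean and variance of content features
    to those of style features (AdaIN-style statistical alignment).
    Both inputs have shape (C, H, W)."""
    c_mean = content_feat.mean(axis=(1, 2), keepdims=True)
    c_std = content_feat.std(axis=(1, 2), keepdims=True) + eps
    s_mean = style_feat.mean(axis=(1, 2), keepdims=True)
    s_std = style_feat.std(axis=(1, 2), keepdims=True) + eps
    # Normalize the content statistics, then re-scale and re-center
    # with the style statistics.
    return s_std * (content_feat - c_mean) / c_std + s_mean
```

The aligned features are then passed through a decoder to synthesize the stylized image; because the alignment is purely statistical, it applies to arbitrary style inputs without per-style training.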
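The quantitative comparison mentions Gram-based metrics: the Gram matrix of a feature map captures channel-channel correlations and is the standard statistic for measuring style similarity. A short sketch of how such a metric could be computed (hypothetical helper names, not from the paper):

```python
import numpy as np

def gram_matrix(feat):
    """Channel-channel correlation matrix of a (C, H, W) feature map,
    normalized by the number of spatial positions."""
    c, h, w = feat.shape
    f = feat.reshape(c, h * w)
    return f @ f.T / (h * w)

def gram_distance(feat_a, feat_b):
    """Frobenius distance between Gram matrices -- a common style
    discrepancy measure between two images' feature maps."""
    return np.linalg.norm(gram_matrix(feat_a) - gram_matrix(feat_b))
```

A lower Gram distance between the stylized output's features and the style image's features indicates a closer style match; SSIM against the content image would measure structural fidelity in a complementary way.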