{"title":"基于自适应风格的动态残差多阶段多池化任意图像风格转移方法","authors":"Wenrui Yi, Anmin Zhu","doi":"10.1109/CTISC52352.2021.00070","DOIUrl":null,"url":null,"abstract":"Style transfer means that the characteristic information of the style image is transferred to the content image under a given content and style picture. Meanwhile, the transferred image is faithful to the content image. Currently, the transferred image has many problems such as artifacts and distortion of the spatial structure. To solve these problems, a dynamic residual-multi-level and multi-pooling network combined with our improved style transfer algorithm is proposed in this paper to achieve better effect of arbitrary image style transfer. To elaborate more specifically, First of all, We add a dynamic-residual network layer to the network to speed up the training speed of the network model and improve the robustness of the network. Secondly, we use a multi-level and multi-pooling layer to extract more specific spatial structure, edge texture and other information in the image, and the network layer also has an additional denoising effect. Third, the improved style transfer algorithm aligns the mean and variance of the content and style images from a statistical point of view, and realizes adaptive arbitrary style transfer of the original input image features. Finally, during model training, the convergence speed of the proposed approach is faster than other current advanced methods, and the transferred renderings are better than other network models in quantitative comparisons such as SSIM and Gram. It is worth mentioning that our model has the fastest real-time transfer speed and can realize any image style transfer.","PeriodicalId":268378,"journal":{"name":"2021 3rd International Conference on Advances in Computer Technology, Information Science and Communication (CTISC)","volume":"35 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"An adaptive-stylization-based dynamic residual-multi-stage and multi-pooling approach to arbitrary image style transfer\",\"authors\":\"Wenrui Yi, Anmin Zhu\",\"doi\":\"10.1109/CTISC52352.2021.00070\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Style transfer means that the characteristic information of the style image is transferred to the content image under a given content and style picture. Meanwhile, the transferred image is faithful to the content image. Currently, the transferred image has many problems such as artifacts and distortion of the spatial structure. To solve these problems, a dynamic residual-multi-level and multi-pooling network combined with our improved style transfer algorithm is proposed in this paper to achieve better effect of arbitrary image style transfer. To elaborate more specifically, First of all, We add a dynamic-residual network layer to the network to speed up the training speed of the network model and improve the robustness of the network. Secondly, we use a multi-level and multi-pooling layer to extract more specific spatial structure, edge texture and other information in the image, and the network layer also has an additional denoising effect. Third, the improved style transfer algorithm aligns the mean and variance of the content and style images from a statistical point of view, and realizes adaptive arbitrary style transfer of the original input image features. 
Finally, during model training, the convergence speed of the proposed approach is faster than other current advanced methods, and the transferred renderings are better than other network models in quantitative comparisons such as SSIM and Gram. It is worth mentioning that our model has the fastest real-time transfer speed and can realize any image style transfer.\",\"PeriodicalId\":268378,\"journal\":{\"name\":\"2021 3rd International Conference on Advances in Computer Technology, Information Science and Communication (CTISC)\",\"volume\":\"35 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-04-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 3rd International Conference on Advances in Computer Technology, Information Science and Communication (CTISC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CTISC52352.2021.00070\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 3rd International Conference on Advances in Computer Technology, Information Science and Communication (CTISC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CTISC52352.2021.00070","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
An adaptive-stylization-based dynamic residual-multi-stage and multi-pooling approach to arbitrary image style transfer
Style transfer means that, given a content image and a style image, the characteristic information of the style image is transferred onto the content image while the result remains faithful to the content image. Currently, transferred images suffer from problems such as artifacts and distortion of the spatial structure. To solve these problems, this paper proposes a dynamic residual, multi-level and multi-pooling network combined with an improved style transfer algorithm to achieve better arbitrary image style transfer. Specifically, first, we add a dynamic-residual layer to the network to speed up training of the model and improve its robustness. Second, we use multi-level and multi-pooling layers to extract more detailed spatial structure, edge texture and other information from the image; these layers also provide an additional denoising effect. Third, the improved style transfer algorithm aligns the mean and variance of the content and style features from a statistical point of view, realizing adaptive arbitrary style transfer of the original input image features. Finally, during model training the proposed approach converges faster than other current advanced methods, and the transferred results outperform other network models in quantitative comparisons such as SSIM and Gram loss. It is worth noting that our model has the fastest real-time transfer speed and can realize arbitrary image style transfer.
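The abstract does not spell out the "improved" algorithm itself, only that it statistically aligns the mean and variance of content and style features. As a point of reference, the sketch below shows the standard adaptive instance normalization (AdaIN) step that this kind of alignment builds on: content features are whitened by their own channel-wise statistics and then re-colored with the style features' statistics. The function name, tensor shapes and the final usage example are illustrative assumptions, not the authors' implementation.

```python
# Minimal AdaIN-style sketch (assumed reference implementation, not the paper's exact method).
import torch


def adain(content_feat: torch.Tensor, style_feat: torch.Tensor, eps: float = 1e-5) -> torch.Tensor:
    """Align per-channel mean/std of content features to those of style features.

    Both tensors are assumed to be encoder activations of shape (N, C, H, W),
    e.g. VGG feature maps in a typical style-transfer pipeline.
    """
    n, c = content_feat.shape[:2]
    # Flatten spatial dimensions and compute channel-wise statistics.
    c_flat = content_feat.view(n, c, -1)
    s_flat = style_feat.view(n, c, -1)
    c_mean = c_flat.mean(dim=2, keepdim=True)
    c_std = c_flat.std(dim=2, keepdim=True) + eps
    s_mean = s_flat.mean(dim=2, keepdim=True)
    s_std = s_flat.std(dim=2, keepdim=True) + eps
    # Whiten content statistics, then re-color with style statistics.
    normalized = (c_flat - c_mean) / c_std
    return (normalized * s_std + s_mean).view_as(content_feat)


if __name__ == "__main__":
    # Random feature maps standing in for encoder outputs (shapes are assumptions).
    content = torch.randn(1, 512, 32, 32)
    style = torch.randn(1, 512, 32, 32)
    stylized = adain(content, style)
    print(stylized.shape)  # torch.Size([1, 512, 32, 32])
```

In this formulation the stylization is "adaptive" because no style-specific parameters are learned: any new style image simply supplies new target statistics, which is what makes arbitrary style transfer possible at inference time.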