{"title":"基于双注意模块的任意风格迁移网络","authors":"Yueming Wang","doi":"10.1109/IMCEC51613.2021.9482055","DOIUrl":null,"url":null,"abstract":"Arbitrary style transfer means that stylized images can be generated from a set of arbitrary input image pairs of content images and style images. Recent arbitrary style transfer algorithms lead to distortion of content or incompletion of style transfer because network need to make a balance between the content structure and style. In this paper, we introduce a dual attention network based on style attention and channel attention, which can flexibly transfer local styles, pay more attention to content structure, keep content structure intact and reduce unnecessary style transfer. Experimental results show that the network can synthesize high quality stylized images while maintaining real-time performance.","PeriodicalId":240400,"journal":{"name":"2021 IEEE 4th Advanced Information Management, Communicates, Electronic and Automation Control Conference (IMCEC)","volume":"58 4 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-06-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"An Arbitrary Style Transfer Network based on Dual Attention Module\",\"authors\":\"Yueming Wang\",\"doi\":\"10.1109/IMCEC51613.2021.9482055\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Arbitrary style transfer means that stylized images can be generated from a set of arbitrary input image pairs of content images and style images. Recent arbitrary style transfer algorithms lead to distortion of content or incompletion of style transfer because network need to make a balance between the content structure and style. 
In this paper, we introduce a dual attention network based on style attention and channel attention, which can flexibly transfer local styles, pay more attention to content structure, keep content structure intact and reduce unnecessary style transfer. Experimental results show that the network can synthesize high quality stylized images while maintaining real-time performance.\",\"PeriodicalId\":240400,\"journal\":{\"name\":\"2021 IEEE 4th Advanced Information Management, Communicates, Electronic and Automation Control Conference (IMCEC)\",\"volume\":\"58 4 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-06-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 IEEE 4th Advanced Information Management, Communicates, Electronic and Automation Control Conference (IMCEC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IMCEC51613.2021.9482055\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE 4th Advanced Information Management, Communicates, Electronic and Automation Control Conference (IMCEC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IMCEC51613.2021.9482055","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
An Arbitrary Style Transfer Network based on Dual Attention Module
Arbitrary style transfer means generating a stylized image from an arbitrary input pair of a content image and a style image. Recent arbitrary style transfer algorithms distort the content or transfer the style incompletely, because the network must trade off content structure against style. In this paper, we introduce a dual attention network based on style attention and channel attention, which can flexibly transfer local styles, attend more closely to content structure, keep the content structure intact, and reduce unnecessary style transfer. Experimental results show that the network can synthesize high-quality stylized images while maintaining real-time performance.
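The two attention branches described above can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the style-attention branch follows the common SANet-style formulation (each content position attends over all style positions), and the channel-attention branch is a hypothetical squeeze-and-excitation-like gating that reweights content channels by per-channel style statistics. The learned 1x1 projections and decoder of the actual network are omitted.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def mean_std_norm(f, eps=1e-5):
    # normalize each channel over spatial positions (instance-norm style)
    mu = f.mean(axis=1, keepdims=True)
    sigma = f.std(axis=1, keepdims=True)
    return (f - mu) / (sigma + eps)

def style_attention(fc, fs):
    """Spatial style attention (SANet-style sketch): queries come from
    normalized content features, keys from normalized style features;
    the learned projection layers of the paper's module are omitted."""
    # fc: (C, Nc) flattened content features, fs: (C, Ns) style features
    q = mean_std_norm(fc)            # (C, Nc) queries
    k = mean_std_norm(fs)            # (C, Ns) keys
    attn = softmax(q.T @ k, axis=1)  # (Nc, Ns): each content position
                                     # attends over all style positions
    return fs @ attn.T               # (C, Nc) re-assembled style features

def channel_attention(fc, fs):
    """Hypothetical channel-attention branch: gate content channels by
    the style's per-channel mean activation (not the paper's exact form)."""
    w = softmax(fs.mean(axis=1))          # (C,) channel weights from style
    return fc * w[:, None] * fc.shape[0]  # rescale to preserve magnitude

# toy feature maps standing in for VGG encoder outputs
rng = np.random.default_rng(0)
C, Nc, Ns = 8, 16, 20
fc = rng.normal(size=(C, Nc))
fs = rng.normal(size=(C, Ns))
out = style_attention(fc, fs) + channel_attention(fc, fs)
print(out.shape)  # (8, 16): same spatial layout as the content features
```

The key property the sketch preserves is that the output keeps the content image's spatial layout (one column per content position) while its values are drawn from style-feature statistics, which is what lets the network stylize locally without distorting content structure.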