Unsupervised Embroidery Generation Using Embroidery Channel Attention
Chen Yang, Xinrong Hu, Yangjun Ou, Saishang Zhong, Tao Peng, Lei Zhu, P. Li, Bin Sheng
Proceedings of the 18th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and its Applications in Industry, published 2022-12-27
DOI: 10.1145/3574131.3574430 (https://doi.org/10.1145/3574131.3574430)
Abstract
Synthesizing an embroidery image with complex texture from a color image is a challenging task. Existing style transfer methods, when used to synthesize embroidery images, suffer from color shift and texture clutter. In this paper, a generative adversarial network architecture with embroidery channel attention is proposed to synthesize embroidery images from an unaligned dataset. Our method synthesizes the color image and the texture image separately from the features of the input image, without extra data or a cycle network. The generator with embroidery channel attention produces three attention masks (a texture attention mask, a color attention mask, and an original attention mask) and two content masks (a color content mask and a texture content mask). The color image and the texture image of the embroidery are synthesized separately from these masks. Furthermore, a color loss function is proposed to keep the color of the generated image close to that of the original image. In addition, a white-padding preprocessing technique is proposed to improve the stability of global embroidery texture synthesis. Extensive experiments show that our method synthesizes embroidery images with realistic color and stable texture, resolving the color shift and texture clutter. While preserving the content of the input images, the results synthesized by our method are closer to real embroidery.
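
The abstract does not give the exact equations for the mask composition or the color loss, so the following PyTorch sketch is only an illustration under assumed forms: the attention masks are treated as soft weights that blend each content mask with the original image, and the color loss is taken as an L1 penalty on per-channel means. Function names, tensor shapes, and the specific loss formulation are assumptions, not the paper's definitions.

```python
import torch
import torch.nn.functional as F


def compose_outputs(x, a_texture, a_color, a_orig, c_color, c_texture):
    """Illustrative composition of attention masks and content masks.

    x          : input image, shape (B, 3, H, W)
    a_*        : attention masks in [0, 1], shape (B, 1, H, W)
    c_color    : color content mask, shape (B, 3, H, W)
    c_texture  : texture content mask, shape (B, 3, H, W)
    """
    # Assumed blending: each synthesized image mixes its content mask with the
    # original image, weighted by the corresponding attention masks.
    color_img = a_color * c_color + a_orig * x
    texture_img = a_texture * c_texture + a_orig * x
    return color_img, texture_img


def color_loss(generated, original):
    # Illustrative color loss: L1 distance between per-channel means, which
    # pushes the overall color of the generated image toward the input's.
    # The paper's actual color loss may be defined differently.
    gen_mean = generated.mean(dim=(2, 3))
    orig_mean = original.mean(dim=(2, 3))
    return F.l1_loss(gen_mean, orig_mean)
```

In this reading, the original attention mask decides which regions are copied from the input unchanged, while the color and texture attention masks gate where the two generated content masks contribute, which is one plausible way to synthesize the color image and texture image separately as the abstract describes.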