StylishGAN: Toward Fashion Illustration Generation

Impact Factor 0.6 · CAS Tier 4 (Engineering Technology) · JCR Q4 (Materials Science, Textiles)
Xingxing Zou, W. Wong
DOI: 10.1177/24723444221147972
Published: 2023-02-10, AATCC Journal of Research (Journal Article)
Citations: 0

Abstract

In this article, we propose StylishGAN, a generative adversarial network that generates a fashion illustration sketch from an actual photo of a human model. The generated stylish sketches not only transfer the image style from real photos to hand drawings with a cleaner background, but also adjust the model’s body into a perfectly proportioned shape. StylishGAN learns proportional transformation and texture information through a proposed body-shaping attentional module. Furthermore, we introduce a contextual fashionable loss that augments the design details of the clothing, especially the fabric texture. To implement our method, we prepare a new fashion dataset, named StylishU, which consists of 3578 paired photo–sketch images. Each pair contains one real photo collected from a fashion show and one corresponding illustration sketch created by professional fashion illustrators. Extensive experiments demonstrate the performance of our method qualitatively and quantitatively.
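The abstract describes a paired photo-to-sketch translation GAN trained with an adversarial term plus an additional detail-preserving loss. The paper's exact formulation (the body-shaping attentional module and the contextual fashionable loss) is not given here, so the following is only a minimal sketch of a generic pix2pix-style paired objective; `lam_l1` and the patch-discriminator shapes are illustrative assumptions, not the authors' settings.

```python
import numpy as np

def adversarial_loss(d_fake: np.ndarray) -> float:
    """Non-saturating generator term: -log D(G(x)), averaged over patch scores."""
    eps = 1e-8  # numerical guard against log(0)
    return float(-np.mean(np.log(d_fake + eps)))

def l1_loss(fake: np.ndarray, real: np.ndarray) -> float:
    """Pixel-wise L1 between the generated sketch and the ground-truth sketch."""
    return float(np.mean(np.abs(fake - real)))

def generator_objective(d_fake: np.ndarray, fake: np.ndarray,
                        real: np.ndarray, lam_l1: float = 100.0) -> float:
    """Combined objective: adversarial term plus weighted reconstruction term.
    In the paper this would also include the contextual fashionable loss,
    which is not reproduced here."""
    return adversarial_loss(d_fake) + lam_l1 * l1_loss(fake, real)

# Toy example on random tensors standing in for a 64x64 RGB image pair
rng = np.random.default_rng(0)
fake = rng.random((1, 3, 64, 64))          # generated sketch
real = rng.random((1, 3, 64, 64))          # illustrator's sketch
d_fake = rng.random((1, 1, 8, 8))          # patch-discriminator scores in (0, 1)
print(generator_objective(d_fake, fake, real))
```

The paired structure of the StylishU dataset (photo in, sketch out) is what makes a supervised reconstruction term like the L1 above possible at all; an unpaired setup would instead need a cycle-consistency-style constraint.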
Source journal: AATCC Journal of Research (Materials Science, Textiles)
CiteScore: 1.30 · Self-citation rate: 0.00% · Articles per year: 34
About the journal: AATCC Journal of Research. This textile research journal has a broad scope: from advanced materials, fibers, and textile and polymer chemistry, to color science, apparel design, and sustainability. Now indexed by Science Citation Index Expanded (SCIE) and discoverable in the Clarivate Analytics Web of Science Core Collection. The Journal’s impact factor is available in Journal Citation Reports.