Title: Cvstgan: A Controllable Generative Adversarial Network for Video Style Transfer of Chinese Painting
Authors: Zunfu Wang, Fang Liu, Changjuan Ran
Journal: Multimedia Systems (Journal Article)
DOI: 10.1007/s00530-024-01457-y (https://doi.org/10.1007/s00530-024-01457-y)
Published: 2024-08-30
Citations: 0
Abstract
Style transfer aims to apply the stylistic characteristics of a reference image to a target image or video. Existing style transfer methods suffer either from a fixed, non-adjustable style or from unclear stylistic patterns in their output. Moreover, in video style transfer, issues such as discontinuity in content and time, flickering, and local distortions are common. Current research on artistic image style transfer focuses mainly on Western painting; given the differences between Eastern and Western painting, existing methods cannot be applied directly to the style transfer of Chinese painting. To address these issues, we propose a controllable style transfer method based on generative adversarial networks. The method operates directly in the feature space of the style and content domains, synthesizing target images by merging style features with content features. To enhance the stylization of Chinese painting, we incorporate stroke constraints and ink-diffusion constraints that improve visual quality. To mitigate artifacts such as blank spaces, highlights, and color confusion, which cause flickering and noise in Chinese-painting-style videos, we propose a flow-based stylized video optimization strategy that ensures consistency in content and time. Qualitative and quantitative experimental results show that our method outperforms state-of-the-art style transfer methods.
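The abstract describes two generic building blocks: merging style and content features directly in feature space with a controllable degree of stylization, and an optical-flow-based penalty that keeps consecutive stylized frames consistent. The sketch below illustrates both ideas in NumPy under stated assumptions; it is not the paper's actual network. The `adain_merge` function (an AdaIN-style statistic transfer) and the `alpha` control knob are illustrative stand-ins, and `temporal_loss` assumes the previous stylized frame has already been warped by optical flow with an occlusion mask available.

```python
import numpy as np

def adain_merge(content, style, alpha=1.0):
    """Illustrative feature-space style/content merge (AdaIN-style).

    Not the paper's exact method: we simply re-normalize the content
    features to the style features' mean and standard deviation.
    `alpha` stands in for the controllable degree of stylization
    (0 = pure content, 1 = fully stylized).
    """
    c_mean, c_std = content.mean(), content.std() + 1e-8
    s_mean, s_std = style.mean(), style.std()
    stylized = (content - c_mean) / c_std * s_std + s_mean
    return alpha * stylized + (1.0 - alpha) * content

def temporal_loss(curr_stylized, prev_stylized_warped, mask):
    """Flow-based temporal consistency penalty (generic form).

    Penalizes the masked squared difference between the current
    stylized frame and the previous stylized frame after warping it
    by optical flow; `mask` is 0 at occluded/unreliable pixels.
    """
    diff = (curr_stylized - prev_stylized_warped) ** 2
    return float((mask * diff).sum() / max(mask.sum(), 1.0))

rng = np.random.default_rng(0)
content = rng.normal(0.0, 1.0, size=(16, 16))  # toy content feature map
style = rng.normal(2.0, 3.0, size=(16, 16))    # toy style feature map

out = adain_merge(content, style, alpha=1.0)
# A perfectly consistent pair of frames incurs zero temporal penalty.
zero = temporal_loss(out, out, np.ones_like(out))
```

With `alpha=1.0` the merged features adopt the style statistics exactly, and `alpha=0.0` returns the content features unchanged, which is the sense in which such a merge is "controllable".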
Journal description:
This journal details innovative research ideas, emerging technologies, state-of-the-art methods and tools in all aspects of multimedia computing, communication, storage, and applications. It features theoretical, experimental, and survey articles.