Using Decoupled Features for Photorealistic Style Transfer

Trevor Canham, Adrián Martín Fernández, M. Bertalmío, J. Portilla
Journal: SIAM J. Imaging Sci.
DOI: 10.1137/22m1512491
Published: 2022-12-05
Citations: 1

Abstract

In this work we propose a photorealistic style transfer method for image and video that is based on vision science principles and on a recent mathematical formulation for the deterministic decoupling of sample statistics. The novel aspects of our approach include matching decoupled moments of higher order than in common style transfer approaches, and matching a descriptor of the power spectrum so as to characterize and transfer diffusion effects between source and target, which has not been considered before in the literature. The results are of high visual quality, without spatio-temporal artifacts, and validation tests in the form of observer preference experiments show that our method compares very well with the state-of-the-art. The computational complexity of the algorithm is low, and we propose a numerical implementation that is amenable to real-time video applications. Finally, another contribution of our work is to point out that current deep learning approaches for photorealistic style transfer do not really achieve photorealistic quality outside of limited examples, because the results too often show unacceptable visual artifacts.
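The abstract describes style transfer as the matching of sample statistics between a source and a target image. As a simplified, hedged illustration of this idea (not the paper's actual method, which matches decoupled higher-order moments and a power-spectrum descriptor), the sketch below matches only the first two moments, i.e. the per-channel mean and standard deviation:

```python
import numpy as np

def match_moments(source: np.ndarray, target: np.ndarray) -> np.ndarray:
    """Match the per-channel mean and standard deviation of `source`
    to those of `target`.

    This is only a first/second-moment analogue of the statistic
    matching the paper describes; the actual method additionally
    matches decoupled higher-order moments and a spectral descriptor.
    Images are assumed to be float arrays in [0, 1] of shape (H, W, C).
    """
    out = source.astype(np.float64).copy()
    tgt = target.astype(np.float64)
    for c in range(out.shape[-1]):
        s_mu, s_sigma = out[..., c].mean(), out[..., c].std()
        t_mu, t_sigma = tgt[..., c].mean(), tgt[..., c].std()
        if s_sigma > 1e-8:  # avoid division by zero on flat channels
            out[..., c] = (out[..., c] - s_mu) / s_sigma * t_sigma + t_mu
    return np.clip(out, 0.0, 1.0)
```

After this operation the output inherits the target's global color statistics while keeping the source's spatial content; the higher-order and spectral matching described in the abstract refines this further, e.g. to transfer diffusion effects.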