AttentionPainter: An Efficient and Adaptive Stroke Predictor for Scene Painting

Yizhe Tang, Yue Wang, Teng Hu, Ran Yi, Xin Tan, Lizhuang Ma, Yu-Kun Lai, Paul L. Rosin
IEEE Transactions on Visualization and Computer Graphics (Impact Factor: 6.5)
DOI: 10.1109/TVCG.2025.3618184 · Published 2025-10-06 · Cited by: 0

Abstract

Stroke-based Rendering (SBR) aims to decompose an input image into a sequence of parameterized strokes, which can be rendered into a painting that resembles the input image. Recently, Neural Painting methods that use deep learning and reinforcement learning models to predict the stroke sequences have been developed, but they suffer from long inference times or unstable training. To address these issues, we propose AttentionPainter, an efficient and adaptive model for single-step neural painting. First, we propose a novel scalable stroke predictor that predicts a large number of stroke parameters within a single forward pass, rather than the iterative prediction used by previous reinforcement learning or auto-regressive methods; this makes AttentionPainter faster than previous neural painting methods. To further increase training efficiency, we propose a Fast Stroke Stacking algorithm, which accelerates training by a factor of 13. Moreover, we propose a Stroke-density Loss, which encourages the model to use small strokes for detailed information and thereby improves reconstruction quality. Finally, we design a Stroke Diffusion Model as an application of AttentionPainter, which performs the denoising process in the stroke parameter space and enables stroke-based inpainting and editing applications that are helpful for human artists' design work. Extensive experiments show that AttentionPainter outperforms state-of-the-art neural painting methods.
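To make the abstract's central contrast concrete, the PyTorch sketch below illustrates (a) a predictor that emits all stroke parameters in one forward pass, as opposed to RL or auto-regressive painters that emit one stroke per step, and (b) the naive sequential stroke stacking (back-to-front alpha compositing) that a Fast Stroke Stacking algorithm would replace. This is a minimal sketch under stated assumptions, not the authors' implementation: the module names, the toy CNN encoder, the stroke count, and the 13-parameter stroke layout are all hypothetical.

```python
# Minimal sketch (not the paper's code): single-pass stroke prediction and
# naive stroke stacking. All names and sizes here are illustrative assumptions.
import torch
import torch.nn as nn

N_STROKES = 64          # strokes predicted in one forward pass (assumed)
PARAMS_PER_STROKE = 13  # e.g. position, shape, color, alpha (assumed layout)

class SingleStepStrokePredictor(nn.Module):
    """Maps an image to N stroke-parameter vectors in a single forward pass,
    in contrast to RL/auto-regressive painters that emit strokes one by one."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(  # toy CNN feature extractor (assumed)
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(64, N_STROKES * PARAMS_PER_STROKE)

    def forward(self, img):                      # img: (B, 3, H, W)
        feats = self.encoder(img)                # (B, 64)
        params = torch.sigmoid(self.head(feats)) # normalize params to [0, 1]
        return params.view(-1, N_STROKES, PARAMS_PER_STROKE)

def stack_strokes_sequentially(stroke_rgbs, stroke_alphas):
    """Naive O(N) back-to-front alpha compositing: the per-stroke loop that a
    Fast Stroke Stacking algorithm would replace with a parallel formulation.
    stroke_rgbs: (N, 3, H, W); stroke_alphas: (N, 1, H, W), values in [0, 1]."""
    canvas = torch.zeros_like(stroke_rgbs[0])
    for rgb, alpha in zip(stroke_rgbs, stroke_alphas):
        canvas = alpha * rgb + (1.0 - alpha) * canvas  # standard "over" blend
    return canvas

if __name__ == "__main__":
    model = SingleStepStrokePredictor()
    image = torch.rand(1, 3, 128, 128)
    strokes = model(image)        # (1, 64, 13): all strokes in one pass
    print(strokes.shape)
```

Note that the compositing loop above runs once per stroke, so its cost grows linearly with the stroke count and cannot be parallelized across strokes as written; removing that sequential dependency during training is the kind of bottleneck the reported 13x speed-up from Fast Stroke Stacking plausibly targets.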
