Decoupling Contact for Fine-Grained Motion Style Transfer

Xiangjun Tang, Linjun Wu, He Wang, Yiqian Wu, Bo Hu, Songnan Li, Xu Gong, Yuchen Liao, Qilong Kou, Xiaogang Jin
arXiv:2409.05387 [cs.GR] · arXiv - CS - Graphics · Published 2024-09-09 · Citations: 0

Abstract

Motion style transfer changes the style of a motion while retaining its content and is useful in computer animations and games. Contact is an essential component of motion style transfer that should be controlled explicitly in order to express the style vividly while enhancing motion naturalness and quality. However, it is unknown how to decouple and control contact to achieve fine-grained control in motion style transfer. In this paper, we present a novel style transfer method for fine-grained control over contacts while achieving both motion naturalness and spatio-temporal variations of style. Based on our empirical evidence, we propose controlling contact indirectly through the hip velocity, which can be further decomposed into the trajectory and the contact timing. To this end, we propose a new model that explicitly models the correlations between motions and trajectory/contact timing/style, allowing us to decouple and control each separately. Our approach is built around a motion manifold, where hip controls can be easily integrated into a Transformer-based decoder. It is versatile in that it can generate motions directly as well as be used as post-processing for existing methods to improve quality and contact controllability. In addition, we propose a new metric that measures a correlation pattern of motions based on our empirical evidence, aligning well with human perception in terms of motion naturalness. Based on extensive evaluation, our method outperforms existing methods in terms of style expressivity and motion quality.
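The abstract's key idea is that contact can be controlled indirectly through hip velocity, which in turn splits into a spatial trajectory and a contact-timing signal. The paper does not publish this decomposition as code; the sketch below is only an illustrative reading of that idea, where the function name, the foot-speed thresholding rule, and the threshold value are all assumptions rather than the authors' formulation.

```python
import numpy as np

def decompose_hip_control(hip_velocity, foot_speed, dt=1 / 30, contact_thresh=0.05):
    """Illustrative split of a hip-velocity sequence into (a) a spatial
    trajectory and (b) a binary contact-timing mask.

    hip_velocity : (T, D) per-frame hip velocity in metres per second
    foot_speed   : (T,)   per-frame foot speed in metres per second
    """
    # Trajectory: integrate per-frame hip velocity over time (cumulative sum).
    trajectory = np.cumsum(hip_velocity * dt, axis=0)
    # Contact timing: a common heuristic treats frames where the foot speed
    # drops below a small threshold as ground-contact frames.
    contact_mask = foot_speed < contact_thresh
    return trajectory, contact_mask
```

Under this reading, a controller could edit the two signals independently: reshaping `trajectory` changes where the character goes, while shifting `contact_mask` changes when footfalls land, which is the fine-grained control the abstract argues for.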