Slippage-Preserving Reshaping of Human-Made 3D Content

Chrystiano Araújo, Nicholas Vining, Silver Burla, Manuel Ruivo De Oliveira, Enrique Rosales, Alla Sheffer
{"title":"Slippage-Preserving Reshaping of Human-Made 3D Content","authors":"Chrystiano Araújo, Nicholas Vining, Silver Burla, Manuel Ruivo De Oliveira, Enrique Rosales, Alla Sheffer","doi":"10.1145/3618391","DOIUrl":null,"url":null,"abstract":"Artists often need to reshape 3D models of human-made objects by changing the relative proportions or scales of different model parts or elements while preserving the look and structure of the inputs. Manually reshaping inputs to satisfy these criteria is highly time-consuming; the edit in our teaser took an artist 5 hours to complete. However, existing methods for 3D shape editing are largely designed for other tasks and produce undesirable outputs when repurposed for reshaping. Prior work on 2D curve network reshaping suggests that in 2D settings the user-expected outcome is achieved when the reshaping edit keeps the orientations of the different model elements and when these elements scale as-locally-uniformly-as-possible (ALUP). However, our observations suggest that in 3D viewers are tolerant of non-uniform tangential scaling if and when this scaling preserves slippage and reduces changes in element size, or scale, relative to the input. Slippage preservation requires surfaces which are locally slippable with respect to a given rigid motion to retain this property post-reshaping (a motion is slippable if when applied to the surface, it slides the surface along itself without gaps). We build on these observations by first extending the 2D ALUP framework to 3D and then modifying it to allow non-uniform scaling while promoting slippage and scale preservation. Our 3D ALUP extension produces reshaped outputs better aligned with viewer expectations than prior alternatives; our slippage-aware method further improves the outcome producing results on par with manual reshaping ones. Our method does not require any user input beyond specifying control handles and their target locations. 
We validate our method by applying it to over one hundred diverse inputs and by comparing our results to those generated by alternative approaches and manually. Comparative study participants preferred our outputs over the best performing traditional deformation method by a 65% margin and over our 3D ALUP extension by a 61% margin; they judged our outputs as at least on par with manually produced ones.","PeriodicalId":7077,"journal":{"name":"ACM Transactions on Graphics (TOG)","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2023-12-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACM Transactions on Graphics (TOG)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3618391","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Artists often need to reshape 3D models of human-made objects by changing the relative proportions or scales of different model parts or elements while preserving the look and structure of the inputs. Manually reshaping inputs to satisfy these criteria is highly time-consuming; the edit in our teaser took an artist five hours to complete. However, existing methods for 3D shape editing are largely designed for other tasks and produce undesirable outputs when repurposed for reshaping. Prior work on 2D curve network reshaping suggests that in 2D settings the user-expected outcome is achieved when the reshaping edit keeps the orientations of the different model elements and when these elements scale as-locally-uniformly-as-possible (ALUP). However, our observations suggest that in 3D, viewers are tolerant of non-uniform tangential scaling when this scaling preserves slippage and reduces changes in element size, or scale, relative to the input. Slippage preservation requires surfaces that are locally slippable with respect to a given rigid motion to retain this property post-reshaping (a motion is slippable if, when applied to the surface, it slides the surface along itself without gaps). We build on these observations by first extending the 2D ALUP framework to 3D and then modifying it to allow non-uniform scaling while promoting slippage and scale preservation. Our 3D ALUP extension produces reshaped outputs better aligned with viewer expectations than prior alternatives; our slippage-aware method further improves the outcome, producing results on par with manually reshaped ones. Our method does not require any user input beyond specifying control handles and their target locations. We validate our method by applying it to over one hundred diverse inputs and by comparing our results to those generated by alternative approaches and manually. Comparative-study participants preferred our outputs over the best-performing traditional deformation method by a 65% margin and over our 3D ALUP extension by a 61% margin; they judged our outputs as at least on par with manually produced ones.
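The abstract's notion of a slippable motion (a rigid motion that slides a surface along itself without gaps) matches the classical slippable-motion analysis from geometry processing: for a sampled surface with points p_i and normals n_i, a rigid motion with rotational part r and translational part t is slippable when (r × p_i + t) · n_i = 0 for all samples, so slippable motions are the near-null eigenvectors of the 6×6 matrix C = Σ c_i c_iᵀ with c_i = [p_i × n_i ; n_i]. The sketch below is a minimal illustration of that analysis, not the paper's actual reshaping algorithm; the function names are our own.

```python
import numpy as np

def slippage_matrix(points, normals):
    """Build the 6x6 slippage matrix C = sum c_i c_i^T, where
    c_i = [p_i x n_i ; n_i]. Near-zero eigenvalues of C correspond
    to rigid motions that slide the sampled surface along itself."""
    cross = np.cross(points, normals)      # rotational part p_i x n_i
    c = np.hstack([cross, normals])        # (N, 6) constraint rows
    return c.T @ c

def slippable_motions(points, normals, tol=1e-6):
    """Return the 6-vectors (r, t) of rigid motions under which the
    surface is slippable, i.e. eigenvectors of C with relatively
    near-zero eigenvalues."""
    C = slippage_matrix(points, normals)
    w, v = np.linalg.eigh(C)               # eigenvalues in ascending order
    scale = max(w[-1], 1e-12)              # normalize by largest eigenvalue
    return [v[:, i] for i in range(6) if w[i] / scale < tol]

# Example: a unit cylinder about the z-axis is slippable under exactly
# two motions -- rotation about z and translation along z.
theta = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
z = np.linspace(-1.0, 1.0, 16)
T, Z = np.meshgrid(theta, z)
pts = np.stack([np.cos(T).ravel(), np.sin(T).ravel(), Z.ravel()], axis=1)
nrm = np.stack([np.cos(T).ravel(), np.sin(T).ravel(), np.zeros(T.size)], axis=1)
print(len(slippable_motions(pts, nrm)))  # prints 2 for the cylinder
```

A plane yields three slippable motions (two translations, one in-plane rotation), a sphere three rotations, and a generic free-form patch none; slippage preservation, as described above, asks that such motions survive the reshaping edit.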