Chengan He, Xin Sun, Zhixin Shu, Fujun Luan, Sören Pirk, Jorge Alejandro Amador Herrera, Dominik L. Michels, Tuanfeng Y. Wang, Meng Zhang, Holly Rushmeier, Yi Zhou
{"title":"\\textsc{Perm}:多风格三维发型建模的参数表示法","authors":"Chengan He, Xin Sun, Zhixin Shu, Fujun Luan, Sören Pirk, Jorge Alejandro Amador Herrera, Dominik L. Michels, Tuanfeng Y. Wang, Meng Zhang, Holly Rushmeier, Yi Zhou","doi":"arxiv-2407.19451","DOIUrl":null,"url":null,"abstract":"We present \\textsc{Perm}, a learned parametric model of human 3D hair\ndesigned to facilitate various hair-related applications. Unlike previous work\nthat jointly models the global hair shape and local strand details, we propose\nto disentangle them using a PCA-based strand representation in the frequency\ndomain, thereby allowing more precise editing and output control. Specifically,\nwe leverage our strand representation to fit and decompose hair geometry\ntextures into low- to high-frequency hair structures. These decomposed textures\nare later parameterized with different generative models, emulating common\nstages in the hair modeling process. We conduct extensive experiments to\nvalidate the architecture design of \\textsc{Perm}, and finally deploy the\ntrained model as a generic prior to solve task-agnostic problems, further\nshowcasing its flexibility and superiority in tasks such as 3D hair\nparameterization, hairstyle interpolation, single-view hair reconstruction, and\nhair-conditioned image generation. Our code and data will be available at:\n\\url{https://github.com/c-he/perm}.","PeriodicalId":501174,"journal":{"name":"arXiv - CS - Graphics","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-07-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"\\\\textsc{Perm}: A Parametric Representation for Multi-Style 3D Hair Modeling\",\"authors\":\"Chengan He, Xin Sun, Zhixin Shu, Fujun Luan, Sören Pirk, Jorge Alejandro Amador Herrera, Dominik L. Michels, Tuanfeng Y. 
Wang, Meng Zhang, Holly Rushmeier, Yi Zhou\",\"doi\":\"arxiv-2407.19451\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We present \\\\textsc{Perm}, a learned parametric model of human 3D hair\\ndesigned to facilitate various hair-related applications. Unlike previous work\\nthat jointly models the global hair shape and local strand details, we propose\\nto disentangle them using a PCA-based strand representation in the frequency\\ndomain, thereby allowing more precise editing and output control. Specifically,\\nwe leverage our strand representation to fit and decompose hair geometry\\ntextures into low- to high-frequency hair structures. These decomposed textures\\nare later parameterized with different generative models, emulating common\\nstages in the hair modeling process. We conduct extensive experiments to\\nvalidate the architecture design of \\\\textsc{Perm}, and finally deploy the\\ntrained model as a generic prior to solve task-agnostic problems, further\\nshowcasing its flexibility and superiority in tasks such as 3D hair\\nparameterization, hairstyle interpolation, single-view hair reconstruction, and\\nhair-conditioned image generation. 
Our code and data will be available at:\\n\\\\url{https://github.com/c-he/perm}.\",\"PeriodicalId\":501174,\"journal\":{\"name\":\"arXiv - CS - Graphics\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-07-28\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - CS - Graphics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2407.19451\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Graphics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2407.19451","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
\textsc{Perm}: A Parametric Representation for Multi-Style 3D Hair Modeling
We present \textsc{Perm}, a learned parametric model of human 3D hair
designed to facilitate various hair-related applications. Unlike previous work
that jointly models the global hair shape and local strand details, we propose
to disentangle them using a PCA-based strand representation in the frequency
domain, thereby allowing more precise editing and output control. Specifically,
we leverage our strand representation to fit and decompose hair geometry
textures into low- to high-frequency hair structures. These decomposed textures
are later parameterized with different generative models, emulating common
stages in the hair modeling process. We conduct extensive experiments to
validate the architecture design of \textsc{Perm}, and finally deploy the
trained model as a generic prior to solve task-agnostic problems, further
showcasing its flexibility and superiority in tasks such as 3D hair
parameterization, hairstyle interpolation, single-view hair reconstruction, and
hair-conditioned image generation. Our code and data will be available at:
\url{https://github.com/c-he/perm}.
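The core idea of the abstract — representing strands with a PCA basis in the frequency domain so that low- and high-frequency structure can be separated and edited independently — can be illustrated with a small sketch. This is not Perm's actual architecture; the toy strand data, FFT encoding, and component count below are all assumptions made purely for illustration:

```python
import numpy as np

# Hypothetical toy data: 200 strands, each a polyline of 64 points in 3D.
rng = np.random.default_rng(0)
strands = np.cumsum(rng.normal(size=(200, 64, 3)), axis=1)  # random-walk "strands"

# Encode each strand in the frequency domain via a real FFT along the point
# dimension -- a stand-in for the paper's frequency-domain strand representation.
freq = np.fft.rfft(strands, axis=1)                  # (200, 33, 3), complex
feats = np.concatenate([freq.real, freq.imag], axis=1).reshape(200, -1)

# PCA via SVD on the centered frequency features; keep the top-k components
# as a compact, editable strand parameterization.
mean = feats.mean(axis=0)
U, S, Vt = np.linalg.svd(feats - mean, full_matrices=False)
k = 10
coeffs = (feats - mean) @ Vt[:k].T                   # (200, k) strand parameters

# Decode: reconstruct frequency features from the k coefficients,
# then invert the FFT to recover 3D strand geometry.
recon_feats = coeffs @ Vt[:k] + mean
rf = recon_feats.reshape(200, 66, 3)
recon_freq = rf[:, :33] + 1j * rf[:, 33:]
recon = np.fft.irfft(recon_freq, n=64, axis=1)
print(recon.shape)  # (200, 64, 3)
```

Truncating to the leading components keeps the smooth, low-frequency shape of each strand while discarding high-frequency detail; editing the two ranges of coefficients separately is what enables the kind of disentangled control the abstract describes.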