Pose Capturing and Evaluation in a VR Environment

Karn Kiattikunrat, T. Leelasawassuk, S. Hasegawa, C. Mitsantisuk
{"title":"姿态捕捉和评估在VR环境","authors":"Karn Kiattikunrat, T. Leelasawassuk, S. Hasegawa, C. Mitsantisuk","doi":"10.1109/ITC-CSCC58803.2023.10212946","DOIUrl":null,"url":null,"abstract":"This proposed method contributes to the existing body of knowledge and lays a strong foundation for future advancements in the field of VR-based pose analysis and interaction. As the technology continues to evolve, it is expected that further improvements and innovations will emerge, further enhancing the capabilities and applications of pose capturing and evaluation in VR environments. In this study, we introduce an innovative method for evaluating and comparing human postures by harnessing the power of optical motion capture and virtual reality (VR) technologies. Our cutting-edge pose matching algorithm enables users to refine their performance by providing real-time feedback on their postural alignment with a template pose. Capitalizing on the high accuracy and low latency of optical motion capture systems, our approach records detailed posture information, including time stamps, frame indices, and the orientation and position of multiple body joints in Cartesian coordinates. The algorithm computes a similarity score between the user's pose and the template by calculating a normalized loss function based on their 3D posture data. We seamlessly integrate the evaluation model into a VR environment tailored for posture imitation exercises. The template pose is pre-recorded, and the user's pose is dynamically synchronized with the physical world, visualized as an interactive 3D humanoid model within the virtual space. As users mimic the displayed postures and movements, the system generates instantaneous feedback on the similarity score, empowering them to refine their technique and enhance their performance, all within a safe and immersive virtual setting. As a result, we have developed a versatile VR application that successfully compares the similarity of postures and provides users with valuable feedback to improve their skills. The application demonstrates the efficacy of our pose matching algorithm and serves as a foundation for further development and expansion into various domains, including sports training, rehabilitation, and performance arts.","PeriodicalId":220939,"journal":{"name":"2023 International Technical Conference on Circuits/Systems, Computers, and Communications (ITC-CSCC)","volume":"49 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Pose Capturing and Evaluation in a VR Environment\",\"authors\":\"Karn Kiattikunrat, T. Leelasawassuk, S. Hasegawa, C. Mitsantisuk\",\"doi\":\"10.1109/ITC-CSCC58803.2023.10212946\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This proposed method contributes to the existing body of knowledge and lays a strong foundation for future advancements in the field of VR-based pose analysis and interaction. As the technology continues to evolve, it is expected that further improvements and innovations will emerge, further enhancing the capabilities and applications of pose capturing and evaluation in VR environments. In this study, we introduce an innovative method for evaluating and comparing human postures by harnessing the power of optical motion capture and virtual reality (VR) technologies. 
Our cutting-edge pose matching algorithm enables users to refine their performance by providing real-time feedback on their postural alignment with a template pose. Capitalizing on the high accuracy and low latency of optical motion capture systems, our approach records detailed posture information, including time stamps, frame indices, and the orientation and position of multiple body joints in Cartesian coordinates. The algorithm computes a similarity score between the user's pose and the template by calculating a normalized loss function based on their 3D posture data. We seamlessly integrate the evaluation model into a VR environment tailored for posture imitation exercises. The template pose is pre-recorded, and the user's pose is dynamically synchronized with the physical world, visualized as an interactive 3D humanoid model within the virtual space. As users mimic the displayed postures and movements, the system generates instantaneous feedback on the similarity score, empowering them to refine their technique and enhance their performance, all within a safe and immersive virtual setting. As a result, we have developed a versatile VR application that successfully compares the similarity of postures and provides users with valuable feedback to improve their skills. The application demonstrates the efficacy of our pose matching algorithm and serves as a foundation for further development and expansion into various domains, including sports training, rehabilitation, and performance arts.\",\"PeriodicalId\":220939,\"journal\":{\"name\":\"2023 International Technical Conference on Circuits/Systems, Computers, and Communications (ITC-CSCC)\",\"volume\":\"49 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-06-25\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2023 International Technical Conference on Circuits/Systems, Computers, and Communications (ITC-CSCC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ITC-CSCC58803.2023.10212946\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 International Technical Conference on Circuits/Systems, Computers, and Communications (ITC-CSCC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ITC-CSCC58803.2023.10212946","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

In this study, we introduce an innovative method for evaluating and comparing human postures by harnessing the power of optical motion capture and virtual reality (VR) technologies. Our pose matching algorithm enables users to refine their performance by providing real-time feedback on their postural alignment with a template pose. Capitalizing on the high accuracy and low latency of optical motion capture systems, our approach records detailed posture information, including time stamps, frame indices, and the orientation and position of multiple body joints in Cartesian coordinates. The algorithm computes a similarity score between the user's pose and the template by calculating a normalized loss function based on their 3D posture data. We seamlessly integrate the evaluation model into a VR environment tailored for posture imitation exercises. The template pose is pre-recorded, and the user's pose is dynamically synchronized with the physical world, visualized as an interactive 3D humanoid model within the virtual space. As users mimic the displayed postures and movements, the system generates instantaneous feedback on the similarity score, empowering them to refine their technique and enhance their performance, all within a safe and immersive virtual setting. As a result, we have developed a versatile VR application that successfully compares the similarity of postures and provides users with valuable feedback to improve their skills. The application demonstrates the efficacy of our pose matching algorithm and serves as a foundation for further development and expansion into various domains, including sports training, rehabilitation, and performance arts. The proposed method contributes to the existing body of knowledge and lays a strong foundation for future advancements in the field of VR-based pose analysis and interaction. As the technology continues to evolve, it is expected that further improvements and innovations will emerge, further enhancing the capabilities and applications of pose capturing and evaluation in VR environments.
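The abstract does not give the exact form of the normalized loss or the frame record layout. A minimal sketch of how a per-frame similarity score could be computed from joint positions in Cartesian coordinates is shown below; the function name, joint count, root-joint normalization, and the `scale` constant are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def pose_similarity(user_joints, template_joints, root_index=0, scale=1.0):
    """Illustrative per-frame pose similarity score in [0, 1].

    user_joints, template_joints: (J, 3) arrays of joint positions in
    Cartesian coordinates for a single captured frame.
    root_index: joint used to remove global translation (e.g. the pelvis).
    scale: assumed normalization constant (e.g. a characteristic body length)
    that maps the mean joint error onto a 0-1 range.
    """
    # Remove global translation so only the posture itself is compared.
    user = user_joints - user_joints[root_index]
    template = template_joints - template_joints[root_index]

    # Mean Euclidean distance over all joints: a simple normalized loss.
    loss = np.linalg.norm(user - template, axis=1).mean() / scale

    # Convert the loss into a similarity score: 1 = identical, 0 = dissimilar.
    return float(max(0.0, 1.0 - loss))


# Hypothetical frame record mirroring the fields mentioned in the abstract
# (time stamp, frame index, per-joint position and orientation).
frame = {
    "timestamp": 0.016,
    "frame_index": 1,
    "positions": np.random.rand(21, 3),     # 21 joints, x/y/z in meters
    "orientations": np.random.rand(21, 4),  # one quaternion per joint
}

template_positions = np.random.rand(21, 3)
print(pose_similarity(frame["positions"], template_positions, scale=1.8))
```

In a real-time VR loop, a score like this would be recomputed every captured frame and displayed to the user as instantaneous feedback; the root-relative normalization is one common way to make the comparison insensitive to where the user stands in the tracking volume.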