Karn Kiattikunrat, T. Leelasawassuk, S. Hasegawa, C. Mitsantisuk
Pose Capturing and Evaluation in a VR Environment
2023 International Technical Conference on Circuits/Systems, Computers, and Communications (ITC-CSCC)
Published: 2023-06-25
DOI: 10.1109/ITC-CSCC58803.2023.10212946
Citations: 0
Abstract
In this study, we introduce a method for evaluating and comparing human postures using optical motion capture and virtual reality (VR) technologies. Our pose-matching algorithm provides users with real-time feedback on how closely their posture aligns with a template pose, enabling them to refine their performance. Capitalizing on the high accuracy and low latency of optical motion capture systems, the approach records detailed posture information, including timestamps, frame indices, and the orientation and position of multiple body joints in Cartesian coordinates. The algorithm computes a similarity score between the user's pose and the template by evaluating a normalized loss function over their 3D posture data. We integrate this evaluation model into a VR environment tailored for posture-imitation exercises: the template pose is pre-recorded, while the user's pose is synchronized with the physical world in real time and visualized as an interactive 3D humanoid model in the virtual space. As users mimic the displayed postures and movements, the system gives instantaneous feedback on the similarity score, allowing them to refine their technique and improve their performance within a safe, immersive virtual setting.
As a result, we have developed a versatile VR application that compares the similarity of postures and provides users with actionable feedback to improve their skills. The application demonstrates the efficacy of our pose-matching algorithm and serves as a foundation for extension into domains such as sports training, rehabilitation, and the performing arts. More broadly, the proposed method contributes to the existing body of knowledge on VR-based pose analysis and interaction, and as the underlying technologies evolve, further improvements to pose capturing and evaluation in VR environments can build on this foundation.
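The abstract describes computing a similarity score from a normalized loss over 3D joint positions but does not give the exact formulation. The sketch below is a minimal, hedged illustration of that idea: each pose is a list of (x, y, z) joint positions, the loss is the mean per-joint Euclidean distance to the template, and a hypothetical tolerance constant maps it into a [0, 1] score. The function name, the normalization scheme, and the `SCALE` constant are all illustrative assumptions, not the authors' actual algorithm.

```python
import math

def pose_similarity(user_joints, template_joints):
    """Return a similarity score in [0, 1] between two poses.

    Each pose is a list of (x, y, z) joint positions in Cartesian
    coordinates, with corresponding joints at matching indices.
    This is an illustrative sketch, not the paper's exact method.
    """
    if len(user_joints) != len(template_joints):
        raise ValueError("poses must have the same number of joints")

    # Mean Euclidean distance between corresponding joints.
    total = 0.0
    for (ux, uy, uz), (tx, ty, tz) in zip(user_joints, template_joints):
        total += math.sqrt((ux - tx) ** 2 + (uy - ty) ** 2 + (uz - tz) ** 2)
    mean_dist = total / len(user_joints)

    # Normalize the loss by a hypothetical tolerance (same units as the
    # joint positions) and clamp, so identical poses score 1.0 and poses
    # beyond the tolerance score 0.0.
    SCALE = 0.5
    return max(0.0, 1.0 - mean_dist / SCALE)

# Identical poses yield a perfect score; a small offset lowers it.
template = [(0.0, 1.0, 0.0), (0.2, 1.4, 0.0)]
shifted = [(0.1, 1.0, 0.0), (0.3, 1.4, 0.0)]
print(pose_similarity(template, template))  # 1.0
print(pose_similarity(shifted, template))
```

In a real-time loop, this score would be recomputed each motion-capture frame and surfaced in the VR scene as the instantaneous feedback the abstract describes.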