{"title":"Markerless Optical Motion Capture System for Asymmetrical Swimming Stroke","authors":"F. Ferryanto, A. Mahyuddin, M. Nakashima","doi":"10.5614/j.eng.technol.sci.2022.54.5.3","DOIUrl":null,"url":null,"abstract":"This work presents the development of a markerless optical motion capture system of the front-crawl swimming stroke. The system only uses one underwater camera to record swimming motion in the sagittal plane. The participant in this experiment was a swimmer who is active in the university’s swimming club. The recorded images were then segmented to obtain silhouettes of the participant by a Gaussian Mixture Model. One of the swimming images was employed to generate a human body model that consists of 15 segments. The silhouette and model of the participant were subjected to an image matching process. The shape of the body segment was used as the feature in the image matching. The model was transformed to estimate the pose of the participant. The intraclass correlation coefficient between the results of the developed system and references were evaluated. In general, all body segments, except head and trunk, had a correlation coefficient higher than 0.95. Then, dynamics analysis by SWUM was conducted based on the joint angle acquired by the present work. The simulation implied that the developed system was suitable for daily training of athletes and coaches due to its simplicity and accuracy.","PeriodicalId":15689,"journal":{"name":"Journal of Engineering and Technological Sciences","volume":" ","pages":""},"PeriodicalIF":0.9000,"publicationDate":"2022-09-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Engineering and Technological Sciences","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.5614/j.eng.technol.sci.2022.54.5.3","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"ENGINEERING, MULTIDISCIPLINARY","Score":null,"Total":0}
Abstract
This work presents the development of a markerless optical motion capture system for the front-crawl swimming stroke. The system uses only one underwater camera to record the swimming motion in the sagittal plane. The participant in this experiment was a swimmer active in the university's swimming club. The recorded images were segmented with a Gaussian Mixture Model to obtain silhouettes of the participant. One of the swimming images was used to generate a human body model consisting of 15 segments. The participant's silhouette and the model were then subjected to an image-matching process, with the shape of each body segment serving as the matching feature. The model was transformed to estimate the participant's pose. The intraclass correlation coefficient between the results of the developed system and the reference data was evaluated; in general, all body segments except the head and trunk had a correlation coefficient higher than 0.95. A dynamics analysis with SWUM (SWimming hUman Model) was then conducted based on the joint angles acquired in the present work. The simulation implied that, owing to its simplicity and accuracy, the developed system is suitable for the daily training of athletes and coaches.
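For readers who want to reproduce the segmentation step, the following is a minimal sketch of GMM-based silhouette extraction. It assumes OpenCV's MOG2 background subtractor as a stand-in for the paper's (unspecified) Gaussian Mixture Model implementation; the file name and parameter values are illustrative, not taken from the paper.

```python
# Sketch of GMM-based silhouette extraction (assumed implementation:
# OpenCV's MOG2 subtractor; parameters are illustrative, not from the paper).
import cv2
import numpy as np

def extract_silhouettes(video_path: str):
    """Yield a binary silhouette mask for each frame of an underwater video."""
    cap = cv2.VideoCapture(video_path)
    # Each pixel is modeled as a mixture of Gaussians; pixels that fit none
    # of the background components are labeled foreground (the swimmer).
    subtractor = cv2.createBackgroundSubtractorMOG2(
        history=200, varThreshold=25, detectShadows=False)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = subtractor.apply(frame)
        # Morphological opening/closing to suppress bubbles and pixel noise.
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
        mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
        yield (mask > 0).astype(np.uint8) * 255
    cap.release()

# Usage (hypothetical file name):
# for silhouette in extract_silhouettes("front_crawl_sagittal.mp4"):
#     ...  # match the 15-segment body model against this silhouette
```

The online-updated mixture model tolerates the slow illumination drift typical of underwater footage, which is one reason GMM segmentation suits a single fixed camera.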
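The agreement evaluation can likewise be sketched as an intraclass correlation computation. The paper does not state which ICC form was used; the ICC(3,1) variant below (two-way mixed model, consistency, single measures) is one common choice when comparing a new measurement system against a reference.

```python
# Hedged sketch: ICC(3,1) between the system's joint-angle series and a
# reference measurement. The specific ICC form is an assumption.
import numpy as np

def icc_3_1(system: np.ndarray, reference: np.ndarray) -> float:
    """ICC(3,1) for two measurement methods over n time samples."""
    X = np.column_stack([system, reference])  # n samples x k=2 methods
    n, k = X.shape
    grand = X.mean()
    ss_total = ((X - grand) ** 2).sum()
    ss_rows = k * ((X.mean(axis=1) - grand) ** 2).sum()  # between samples
    ss_cols = n * ((X.mean(axis=0) - grand) ** 2).sum()  # between methods
    ms_rows = ss_rows / (n - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Example: compare elbow-angle time series (degrees) from both methods:
# icc = icc_3_1(system_angles, reference_angles)
```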
About the Journal
Journal of Engineering and Technological Sciences welcomes full research articles in the engineering sciences from the following subject areas: Aerospace Engineering, Biotechnology, Chemical Engineering, Civil Engineering, Electrical Engineering, Engineering Physics, Environmental Engineering, Industrial Engineering, Information Engineering, Mechanical Engineering, Material Science and Engineering, Manufacturing Processes, Microelectronics, Mining Engineering, Petroleum Engineering, and other applications of the physical, biological, chemical, and mathematical sciences in engineering. Authors are invited to submit articles that have not been published previously and are not under consideration elsewhere.