ValS: A Leading Visual and Inertial Dataset of Squats
A. Fayez, A. Sharshar, Ahmed Hesham, Islam Eldifrawi, W. Gomaa
2022 16th International Conference on Ubiquitous Information Management and Communication (IMCOM), published 2022-01-03
DOI: 10.1109/IMCOM53663.2022.9721738
Abstract
Human movement recognition has attracted considerable attention because of its wide range of applications in sports, animation, simulation, and entertainment. These applications require datasets from which motions can be distinguished in either visual or inertial data. The use of cameras and motion sensors, particularly the gyroscope and accelerometer, has expanded recently with the availability of smartphones, smartwatches, and similar devices, and such sensors are now employed in real-world applications. In this paper, we present ValS, a new visual and inertial dataset focused on the squat exercise. The same actors and activities were captured with multiple hardware systems over two capture rounds, including video from mobile cameras and inertial measurement units (IMUs), and the IMU data are synchronized with the videos. Squats were performed by 24 males and 3 females. To maximize data diversity, we recorded some sessions outdoors in daylight and at night, and others indoors. We describe the nature of the data and the benefits the dataset can offer, as well as how it was collected and post-processed, including the synchronization process. To evaluate the quality of the data, we ran a series of tests and report the resulting analysis.
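The abstract states that the IMU streams are synchronized with the videos but does not specify the method. As a minimal sketch of what such an alignment can look like in practice (this is an illustration, not the authors' procedure), one common approach is to resample the higher-rate IMU signal onto the video frame timestamps by linear interpolation; the function and variable names below are hypothetical:

```python
import numpy as np

def align_imu_to_frames(imu_t, imu_acc, frame_t):
    """Interpolate IMU accelerometer samples onto video frame timestamps.

    imu_t:   (N,) IMU sample times in seconds
    imu_acc: (N, 3) accelerometer readings (x, y, z)
    frame_t: (M,) video frame times in seconds
    Returns an (M, 3) array with one interpolated reading per frame.
    """
    return np.stack(
        [np.interp(frame_t, imu_t, imu_acc[:, axis]) for axis in range(imu_acc.shape[1])],
        axis=1,
    )

# Toy example: a 100 Hz IMU aligned to 30 fps video over one second.
imu_t = np.linspace(0.0, 1.0, 101)
imu_acc = np.column_stack([np.sin(imu_t), np.cos(imu_t), imu_t])
frame_t = np.linspace(0.0, 1.0, 31)
aligned = align_imu_to_frames(imu_t, imu_acc, frame_t)
print(aligned.shape)  # one 3-axis sample per video frame
```

This assumes both streams share a common clock (or have had a known offset removed); in practice that offset is often estimated from a shared event, such as a sharp motion visible in both modalities.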