Eye Contact Measurement using NAO Robot Vision for Autism Intervention
Muhammad Aliff Rosly, H. Yussof, Syamimi Shamsuddin, N. I. Zahari, Ahmad Zamir Che Daud
2022 IEEE 12th International Conference on Control System, Computing and Engineering (ICCSCE), 21 October 2022
DOI: 10.1109/ICCSCE54767.2022.9935637
Abstract
Eye-tracking is regarded as a valuable instrument for evaluating intervention programmes, especially those in the social or communication categories. This includes robot-mediated intervention, in which a robot is used to converse with children during therapy. Nevertheless, recent robot-mediated interventions still measure eye contact manually from video recordings for evaluation purposes. Relying on a measuring device other than the robot itself is inefficient and leaves the robot's advanced sensing capabilities unexplored. This research therefore proposes measuring eye contact using the NAO robot's own vision and compares it to conventional recorded-video analysis. During a therapy session, the NAO robot's cameras automatically measure and compute eye contact data, with the NAOqi PeoplePerception ALGazeAnalysis API analysing the detected person's gaze direction. ‘Look’ and ‘not look’ events are raised alternately until the end of the module, and each eye contact duration is added to a running total. The code also filters out spurious event changes caused by momentary gaze aversions or glances, giving a more accurate assessment. An experiment then compares this measurement with the traditional recorded-video approach at each range. The differences in measured eye contact (‘ON’) durations were plotted on a Bland-Altman graph to determine the degree of agreement between the two approaches. Even their 95 per cent confidence intervals fall well inside the maximum allowed variance. This indicates that the two methods are in excellent agreement, with no noticeable difference between them. Consequently, it can be argued that the NAO robot can replace the traditional recorded-video methodology, i.e. the two methods are interchangeable.
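To make the measurement loop concrete, below is a minimal sketch, not the authors' code, of how eye contact duration could be accumulated on the robot. It assumes the NAOqi Python SDK, the PeoplePerception/ALGazeAnalysis memory key "GazeAnalysis/PeopleLookingAtRobot", and placeholder values for the robot address, module length, and aversion grace period.

```python
# -*- coding: utf-8 -*-
# Sketch: accumulate eye contact duration by polling the ALGazeAnalysis
# memory key. Assumptions: NAOqi Python SDK (Python 2), robot reachable
# at ROBOT_IP, and the "GazeAnalysis/PeopleLookingAtRobot" memory key.
import time
from naoqi import ALProxy

ROBOT_IP = "192.168.1.10"   # placeholder address
PORT = 9559
MODULE_TIME = 60.0          # length of one intervention module, seconds
AVERSION_GRACE = 1.0        # ignore 'not look' gaps shorter than this (glances)

memory = ALProxy("ALMemory", ROBOT_IP, PORT)
gaze = ALProxy("ALGazeAnalysis", ROBOT_IP, PORT)
gaze.subscribe("EyeContactMeasure")  # start the gaze analysis extractor
gaze.setTolerance(0.3)               # how strictly 'looking at robot' is judged

total_contact = 0.0
look_start = None           # timestamp when the current 'look' event began
not_look_since = None       # timestamp when the person last looked away

t0 = time.time()
while time.time() - t0 < MODULE_TIME:
    looking_ids = memory.getData("GazeAnalysis/PeopleLookingAtRobot") or []
    looking = len(looking_ids) > 0

    if looking:
        if look_start is None:          # 'look' event raised
            look_start = time.time()
        not_look_since = None
    elif look_start is not None:
        if not_look_since is None:      # candidate 'not look' event
            not_look_since = time.time()
        # Only close the 'look' event after a sustained aversion,
        # so momentary glances away are not counted as lost contact.
        elif time.time() - not_look_since > AVERSION_GRACE:
            total_contact += not_look_since - look_start
            look_start = None
            not_look_since = None
    time.sleep(0.1)

if look_start is not None:              # module ended mid-'look'
    total_contact += time.time() - look_start

gaze.unsubscribe("EyeContactMeasure")
print("Eye contact: %.1f s (%.0f%% of module)"
      % (total_contact, 100.0 * total_contact / MODULE_TIME))
```

The grace period mirrors the abstract's handling of momentary aversions: a brief ‘not look’ gap does not close the current ‘look’ event, so quick glances away still count towards sustained eye contact.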
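The agreement analysis itself is standard Bland-Altman statistics. The sketch below, using hypothetical numbers rather than the paper's data, shows how the bias, the 95 per cent limits of agreement, and the confidence interval of the bias would be computed and then checked against a hypothetical maximum allowed difference.

```python
# Sketch: Bland-Altman agreement between robot-measured and video-coded
# eye contact durations (seconds). All numbers below are illustrative.
import numpy as np
from scipy import stats

robot = np.array([12.4, 8.1, 15.0, 9.7, 11.2])   # hypothetical NAO measurements
video = np.array([12.0, 8.5, 14.6, 10.1, 11.0])  # hypothetical video coding
MAX_ALLOWED = 2.0                                # hypothetical acceptable difference, s

diff = robot - video
mean_diff = diff.mean()                 # bias between the two methods
sd_diff = diff.std(ddof=1)
loa_low = mean_diff - 1.96 * sd_diff    # 95% limits of agreement
loa_high = mean_diff + 1.96 * sd_diff

# 95% confidence interval of the bias (t-based, small sample)
n = len(diff)
ci = stats.t.interval(0.95, n - 1, loc=mean_diff, scale=sd_diff / np.sqrt(n))

agree = loa_low > -MAX_ALLOWED and loa_high < MAX_ALLOWED
print("bias = %.2f s, LoA = [%.2f, %.2f], 95%% CI of bias = [%.2f, %.2f], agree = %s"
      % (mean_diff, loa_low, loa_high, ci[0], ci[1], agree))
```

If the limits of agreement and their confidence intervals stay inside the pre-set maximum allowed difference, the two measurement methods can be treated as interchangeable, which is the conclusion the abstract reports.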