Muhammad Aliff Rosly, H. Yussof, Svamimi Shamsuddin, N. I. Zahari, Ahmad Zamir Che Daud
2022 IEEE 12th International Conference on Control System, Computing and Engineering (ICCSCE), 21 October 2022. DOI: 10.1109/ICCSCE54767.2022.9935637
Citations: 0
Eye Contact Measurement using NAO Robot Vision for Autism Intervention
Eye-tracking is regarded as a valuable instrument for evaluating intervention programmes, especially those targeting social or communication skills. This includes robot-mediated intervention, in which a robot is used to converse with children during therapy. Nevertheless, recent robot-mediated interventions still measure eye contact manually from video recordings for evaluation purposes. Using an additional measuring device other than the robot itself is inefficient, since it leaves the robot's advanced perception capabilities unexplored. This research therefore proposes measuring eye contact with the NAO robot's own vision and compares the result to conventional recorded-video analysis. During a therapy session, the NAO robot's cameras automatically measure and compute eye-contact data, with the NAOqi PeoplePerception ALGazeAnalysis API analysing the detected individual's gaze direction. 'Look' and 'not look' events are raised alternately until the end of the module time, and each eye-contact duration is added to a running total. The code was refined to discard spurious detections during momentary eye-contact aversions or glances, yielding a more accurate assessment. An experiment was then undertaken to compare this measurement against the traditional recorded-video approach at each tested range. The differences in 'ON' (eye-contact) durations were plotted on a Bland-Altman graph to determine the degree of agreement between the two approaches; even their 95 per cent confidence intervals fall well inside the maximum difference allowed. This indicates that the two methods are in excellent agreement, with no noticeable difference between them. Consequently, it may be argued that the NAO robot can replace the traditional recorded-video methodology, i.e. the two methods are interchangeable.
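The event-accumulation logic described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' code: it assumes a fixed-rate stream of boolean "looking at robot" samples (such as ALGazeAnalysis-style gaze detections would yield) and a hypothetical `aversion_tolerance` parameter for the refinement that ignores momentary aversions or glances.

```python
def total_eye_contact(samples, dt=0.1, aversion_tolerance=0.5):
    """Accumulate total eye-contact time from boolean 'looking' samples
    taken every `dt` seconds.

    'Not looking' gaps shorter than `aversion_tolerance` seconds are
    treated as continued eye contact, mirroring the paper's refinement
    that discards spurious detections during momentary aversions.
    """
    total = 0.0        # accumulated eye-contact duration (seconds)
    gap = 0.0          # length of the current 'not looking' run
    in_contact = False
    for looking in samples:
        if looking:
            if in_contact and gap > 0.0:
                # Gap stayed below the tolerance, so count it as contact.
                total += gap
            total += dt
            gap = 0.0
            in_contact = True
        else:
            gap += dt
            if gap >= aversion_tolerance:
                # Aversion long enough to end the eye-contact event.
                in_contact = False
                gap = 0.0
    return total
```

For example, a 0.3 s glance away between two looking periods is absorbed into the total, whereas a 0.6 s aversion ends the event and the gap is not counted.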
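The Bland-Altman comparison mentioned above rests on the standard limits of agreement, the mean of the paired differences plus or minus 1.96 times their standard deviation. A minimal sketch, using hypothetical paired measurements rather than the paper's data:

```python
import statistics

def bland_altman_limits(a, b):
    """Return (mean difference, lower limit, upper limit) for paired
    measurements a and b, using the 95% limits of agreement
    mean ± 1.96 * SD of the differences."""
    diffs = [x - y for x, y in zip(a, b)]
    mean = statistics.mean(diffs)
    sd = statistics.stdev(diffs)   # sample standard deviation
    return mean, mean - 1.96 * sd, mean + 1.96 * sd
```

Agreement is judged by checking that the differences (here, the robot-measured versus video-measured 'ON' durations) lie within limits narrower than the maximum difference deemed acceptable.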