Development of an algorithm for automating gaze data analysis gathered while using upper limb prostheses

A. A. Zaid, Mohammad Sobuh, Musa Al Yaman

2017 7th International Conference on Modeling, Simulation, and Applied Optimization (ICMSAO), April 2017
DOI: 10.1109/ICMSAO.2017.7934914
Citations: 0

Abstract

Studying gaze behavior (where someone looks) is an emerging area of research, particularly in the domain of behavioral psychology. Gaze behavior can, for instance, indicate the amount of attention paid to certain tasks and whether performance is natural. It can also indicate the user's level of competence and experience. It therefore has the potential to be used as an evaluative tool in the upper limb prosthetics field. To obtain informative results from gaze data, the visual scene needs to be divided into areas of interest (AOIs) and the duration of gaze fixation (eye fixation) has to be calculated. This process is usually completed manually by a rater, who goes through the gaze trial frame by frame and reports at which AOI the gaze is fixated in each frame. The process is therefore extremely tedious and highly subjective, and thus prone to unsystematic error. The aim of this paper is to report the development of an automated algorithm used to analyze gaze data, and to obtain an initial indication of its validity and reliability.
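The abstract does not give implementation details; purely as an illustration, the frame-by-frame labeling it describes (assign each frame's gaze point to an AOI, then total the dwell time per AOI) might be sketched as follows, assuming rectangular AOIs and a constant frame rate. All names here (`AOI`, `label_frames`, `fixation_time`) are hypothetical, not taken from the paper:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AOI:
    # Axis-aligned bounding box of an area of interest, in scene-video pixels.
    name: str
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

def label_frames(gaze_points, aois) -> list:
    """Assign each per-frame gaze point (x, y) to the first AOI that contains
    it, or None when the gaze falls outside every AOI (what a human rater
    would otherwise report frame by frame)."""
    labels: list[Optional[str]] = []
    for x, y in gaze_points:
        labels.append(next((a.name for a in aois if a.contains(x, y)), None))
    return labels

def fixation_time(labels, frame_duration_s: float) -> dict:
    """Total dwell time per AOI, assuming every frame lasts frame_duration_s."""
    totals: dict[str, float] = {}
    for name in labels:
        if name is not None:
            totals[name] = totals.get(name, 0.0) + frame_duration_s
    return totals
```

For example, with two AOIs ("hand" and "target") and a 25 fps scene video, `fixation_time(label_frames(points, aois), 1 / 25)` yields the per-AOI fixation durations that would otherwise be tallied manually.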