Real-time activity prediction: a gaze-based approach for early recognition of pen-based interaction tasks

Çagla Çig, T. M. Sezgin
DOI: 10.2312/EXP.20151179 (https://doi.org/10.2312/EXP.20151179)
Published: 2015-06-20, International Symposium on Sketch-Based Interfaces and Modeling
Citations: 1

Abstract

Recently there has been a growing interest in sketch recognition technologies for facilitating human-computer interaction. Existing sketch recognition studies mainly focus on recognizing pre-defined symbols and gestures. However, just as there is a need for systems that can automatically recognize symbols and gestures, there is also a pressing need for systems that can automatically recognize pen-based manipulation activities (e.g. dragging, maximizing, minimizing, scrolling). There are two main challenges in classifying manipulation activities. First is the inherent lack of characteristic visual appearances of pen inputs that correspond to manipulation activities. Second is the necessity of real-time classification based upon the principle that users must receive immediate and appropriate visual feedback about the effects of their actions. In this paper (1) an existing activity prediction system for pen-based devices is modified for real-time activity prediction and (2) an alternative time-based activity prediction system is introduced. Both systems use eye gaze movements that naturally accompany pen-based user interaction for activity classification. The results of our comprehensive experiments demonstrate that the newly developed alternative system is a more successful candidate (in terms of prediction accuracy and early prediction speed) than the existing system for real-time activity prediction. More specifically, midway through an activity, the alternative system reaches 66% of its maximum accuracy value (i.e. 66% of 70.34%) whereas the existing system reaches only 36% of its maximum accuracy value (i.e. 36% of 55.69%).
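To make the early-prediction idea concrete, below is a minimal, self-contained sketch of the general technique the abstract describes: classifying an activity from the gaze data observed so far, and measuring how accuracy grows with the fraction of the activity completed. Everything here is hypothetical (the synthetic gaze features, the class means, the nearest-centroid classifier) and stands in for the paper's actual feature extraction and classification pipeline, which the abstract does not specify.

```python
import random

random.seed(0)

ACTIVITIES = ["drag", "maximize", "minimize", "scroll"]

# Hypothetical per-activity mean gaze displacement per time step;
# a synthetic stand-in for the paper's gaze-movement features.
MEANS = {"drag": (1.0, 0.0), "maximize": (1.0, 1.0),
         "minimize": (-1.0, -1.0), "scroll": (0.0, -1.0)}

def gaze_sample(activity, n_steps=20, noise=0.5):
    """Simulate a noisy gaze trajectory for one activity."""
    mx, my = MEANS[activity]
    return [(mx + random.gauss(0, noise), my + random.gauss(0, noise))
            for _ in range(n_steps)]

def features(prefix):
    """Aggregate a gaze prefix into a fixed-length feature vector
    (mean displacement in x and y), so the classifier can run on
    partial data at any point during the activity."""
    n = len(prefix)
    return (sum(p[0] for p in prefix) / n, sum(p[1] for p in prefix) / n)

def predict(prefix):
    """Nearest-centroid early prediction on the gaze data seen so far."""
    fx, fy = features(prefix)
    return min(ACTIVITIES,
               key=lambda a: (MEANS[a][0] - fx) ** 2 + (MEANS[a][1] - fy) ** 2)

def accuracy_at(fraction, trials=200):
    """Classification accuracy when only `fraction` of each activity
    has been observed -- the quantity the abstract reports at the
    midway point of an activity."""
    correct = 0
    for _ in range(trials):
        act = random.choice(ACTIVITIES)
        sample = gaze_sample(act)
        k = max(1, int(len(sample) * fraction))
        correct += predict(sample[:k]) == act
    return correct / trials

# Accuracy improves as more of the activity is observed.
acc_early = accuracy_at(0.1)
acc_mid = accuracy_at(0.5)
acc_full = accuracy_at(1.0)
print(acc_early, acc_mid, acc_full)
```

The key design point mirrored from the paper: the classifier must accept an arbitrary prefix of the input stream, so a prediction is available at any moment during the interaction rather than only after the pen stroke completes.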