Saliency and optical flow for gaze guidance in videos

S. Sridharan, Reynold J. Bailey
{"title":"Saliency and optical flow for gaze guidance in videos","authors":"S. Sridharan, Reynold J. Bailey","doi":"10.1145/2931002.2948725","DOIUrl":null,"url":null,"abstract":"Computer-based gaze guidance techniques have important applications in computer graphics, data visualization, image analysis, and training. Bailey et al. [2009] showed that it is possible to influence exactly where attention is allocated using a technique called Subtle Gaze Direction (SGD). The SGD approach combines eye tracking with brief image-space modulations in the peripheral regions of the field of view to guide viewer gaze about a scene. A fast eye-tracker is used to monitor gaze in real-time and the modulations are terminated before they can be scrutinized by the viewer's high acuity foveal vision. The SGD technique has been shown to improve spatial learning, visual search task performance, and problem solving in static digital imagery [Sridharan et al. 2012]. However, guiding attention in videos is challenging due to competing motion cues in the visual stimuli. We propose a novel method that uses scene saliency (spatial information) and optical flow (temporal information) to enable gaze guidance in dynamic scenes. The results of a user study show that the accuracy of responses to questions related to target regions in videos was higher among subjects who were gaze guided with our approach compared to a control group that was not actively guided.","PeriodicalId":102213,"journal":{"name":"Proceedings of the ACM Symposium on Applied Perception","volume":"18 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-07-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the ACM Symposium on Applied Perception","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2931002.2948725","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Computer-based gaze guidance techniques have important applications in computer graphics, data visualization, image analysis, and training. Bailey et al. [2009] showed that it is possible to influence exactly where attention is allocated using a technique called Subtle Gaze Direction (SGD). The SGD approach combines eye tracking with brief image-space modulations in the peripheral regions of the field of view to guide viewer gaze about a scene. A fast eye tracker monitors gaze in real time, and the modulations are terminated before they can be scrutinized by the viewer's high-acuity foveal vision. The SGD technique has been shown to improve spatial learning, visual search task performance, and problem solving in static digital imagery [Sridharan et al. 2012]. However, guiding attention in videos is challenging due to competing motion cues in the visual stimuli. We propose a novel method that uses scene saliency (spatial information) and optical flow (temporal information) to enable gaze guidance in dynamic scenes. The results of a user study show that the accuracy of responses to questions about target regions in videos was higher among subjects who were gaze-guided with our approach than in a control group that was not actively guided.
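
To make the combination of spatial and temporal cues concrete, the sketch below computes a per-frame map that blends static saliency with optical-flow magnitude using OpenCV (spectral-residual saliency from opencv-contrib and Farneback dense flow). This is a minimal illustration only: the equal weighting, the specific saliency and flow estimators, and the input path are assumptions, not the authors' exact pipeline.

```python
# Minimal sketch: blend a static saliency map (spatial cue) with optical-flow
# magnitude (temporal cue) into one per-pixel map per frame. High values mark
# regions whose motion/saliency would compete with a peripheral modulation.
# Assumes opencv-contrib-python is installed (for cv2.saliency); the weights
# and the video path are illustrative, not the authors' method.
import cv2
import numpy as np

def combined_map(prev_gray, curr_gray, curr_bgr, w_saliency=0.5, w_flow=0.5):
    # Spatial cue: spectral-residual static saliency on the current frame.
    detector = cv2.saliency.StaticSaliencySpectralResidual_create()
    ok, saliency = detector.computeSaliency(curr_bgr)
    if not ok:
        saliency = np.zeros(curr_gray.shape, np.float32)

    # Temporal cue: dense Farneback optical flow between consecutive frames.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    flow_mag = np.linalg.norm(flow, axis=2)

    # Normalize each cue to [0, 1] and blend them into a single map.
    saliency = cv2.normalize(saliency.astype(np.float32), None,
                             0.0, 1.0, cv2.NORM_MINMAX)
    flow_mag = cv2.normalize(flow_mag.astype(np.float32), None,
                             0.0, 1.0, cv2.NORM_MINMAX)
    return w_saliency * saliency + w_flow * flow_mag

if __name__ == "__main__":
    cap = cv2.VideoCapture("input.mp4")  # hypothetical input video
    ok, prev = cap.read()
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    while True:
        ok, curr = cap.read()
        if not ok:
            break
        curr_gray = cv2.cvtColor(curr, cv2.COLOR_BGR2GRAY)
        dmap = combined_map(prev_gray, curr_gray, curr)
        prev_gray = curr_gray
    cap.release()
```

A gaze-guidance system could consult such a map to favor modulation targets in regions where competing motion and saliency are low, though the actual selection and timing logic in the paper is not reproduced here.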