AR-SSVEP for brain-machine interface: Estimating user's gaze in head-mounted display with USB camera

S. Horii, S. Nakauchi, M. Kitazaki
{"title":"AR-SSVEP for brain-machine interface: Estimating user's gaze in head-mounted display with USB camera","authors":"S. Horii, S. Nakauchi, M. Kitazaki","doi":"10.1109/VR.2015.7223361","DOIUrl":null,"url":null,"abstract":"We aim to develop a brain-machine interface (BMI) system that estimates user's gaze or attention on an object to pick it up in the real world. In Experiment 1 and 2 we measured steady-state visual evoked potential (SSVEP) using luminance and/or contrast modulated flickers of photographic scenes presented on a head-mounted display (HMD). We applied multiclass SVM to estimate gaze locations for every 2s time-window data, and obtained significantly good classifications of gaze locations with the leave-one-session-out cross validation. In Experiment 3 we measured SSVEP using luminance and contrast modulated flickers of real scenes that were online captured by a USB camera and presented on the HMD. We put AR markers on real objects and made their locations flickering on HMD. We obtained the best performance of gaze classification with highest luminance and contrast modulation (73-91% accuracy at chance level 33%), and significantly good classification with low (25% of the highest) luminance and contrast modulation (42-50% accuracy). 
These results suggest that the luminance-modulated flickers of real scenes through USB camera can be applied to BMI by using augmented reality technology.","PeriodicalId":231501,"journal":{"name":"2015 IEEE Virtual Reality (VR)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2015-03-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"7","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2015 IEEE Virtual Reality (VR)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/VR.2015.7223361","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 7

Abstract

We aim to develop a brain-machine interface (BMI) system that estimates a user's gaze or attention on an object so that the object can be picked up in the real world. In Experiments 1 and 2 we measured steady-state visual evoked potentials (SSVEP) using luminance- and/or contrast-modulated flickers of photographic scenes presented on a head-mounted display (HMD). We applied a multiclass SVM to estimate gaze locations from each 2 s time window of data, and obtained significantly above-chance classification of gaze locations under leave-one-session-out cross-validation. In Experiment 3 we measured SSVEP using luminance- and contrast-modulated flickers of real scenes that were captured online by a USB camera and presented on the HMD. We placed AR markers on real objects and made their locations flicker on the HMD. We obtained the best gaze-classification performance with the highest luminance and contrast modulation (73-91% accuracy at a chance level of 33%), and significantly above-chance classification with low (25% of the highest) luminance and contrast modulation (42-50% accuracy). These results suggest that luminance-modulated flickers of real scenes viewed through a USB camera can be applied to BMI using augmented-reality technology.
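The classification pipeline the abstract describes (frequency-tagged flickers, spectral power features, a multiclass SVM, leave-one-session-out cross-validation) can be sketched as follows on synthetic EEG data. The flicker frequencies, sampling rate, epoch counts, and SVM kernel below are illustrative assumptions, not the paper's actual parameters:

```python
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)

FS = 256                    # sampling rate (Hz) -- assumed
FREQS = [6.0, 7.5, 10.0]    # hypothetical flicker frequencies, one per gaze target
WIN = 2 * FS                # 2-second analysis window, as in the paper

def synth_epoch(target):
    """Synthesize one EEG epoch: sinusoid at the gazed target's flicker
    frequency plus Gaussian noise (a stand-in for a real SSVEP response)."""
    t = np.arange(WIN) / FS
    return np.sin(2 * np.pi * FREQS[target] * t) + 0.5 * rng.standard_normal(WIN)

def band_power_features(epoch):
    """Power spectral density sampled at each candidate flicker frequency."""
    f, pxx = welch(epoch, fs=FS, nperseg=WIN)
    return [pxx[np.argmin(np.abs(f - fr))] for fr in FREQS]

# Build a toy dataset: 4 "sessions" x 30 epochs, labels = gazed target (0..2)
X, y, sessions = [], [], []
for s in range(4):
    for _ in range(30):
        target = int(rng.integers(0, 3))
        X.append(band_power_features(synth_epoch(target)))
        y.append(target)
        sessions.append(s)
X, y, sessions = np.array(X), np.array(y), np.array(sessions)

# Leave-one-session-out cross-validation with a multiclass SVM:
# each session is held out in turn, mirroring the paper's evaluation scheme.
scores = cross_val_score(SVC(kernel="rbf"), X, y,
                         cv=LeaveOneGroupOut(), groups=sessions)
print(f"mean accuracy: {scores.mean():.2f} (chance = 0.33)")
```

On this clean synthetic signal the SVM separates the three targets easily; real SSVEP data would be noisier and typically uses harmonics of the flicker frequencies as additional features.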