Combining EEG and eye-tracking in virtual reality: Obtaining fixation-onset event-related potentials and event-related spectral perturbations.

Impact Factor: 1.7 · JCR Q3, Psychology (CAS Tier 4)
Debora Nolte, Marc Vidal De Palol, Ashima Keshava, John Madrid-Carvajal, Anna L Gert, Eva-Marie von Butler, Pelin Kömürlüoğlu, Peter König
DOI: 10.3758/s13414-024-02917-3
Journal: Attention, Perception, & Psychophysics · Published 2024-07-08
Citations: 0

Abstract

Extensive research conducted in controlled laboratory settings has prompted an inquiry into how results can be generalized to real-world situations influenced by the subjects' actions. Virtual reality lends itself ideally to investigating complex situations but requires accurate classification of eye movements, especially when combining it with time-sensitive data such as EEG. We recorded eye-tracking data in virtual reality and classified it into gazes and saccades using a velocity-based classification algorithm, and we cut the continuous data into smaller segments to deal with varying noise levels, as introduced in the REMoDNav algorithm. Furthermore, we corrected for participants' translational movement in virtual reality. Various measures, including visual inspection, event durations, and the velocity and dispersion distributions before and after gaze onset, indicate that we can accurately classify the continuous, free-exploration data. Combining the classified eye-tracking with the EEG data, we generated fixation-onset event-related potentials (ERPs) and event-related spectral perturbations (ERSPs), providing further evidence for the quality of the eye-movement classification and timing of the onset of events. Finally, investigating the correlation between single trials and the average ERP and ERSP identified that fixation-onset ERSPs are less time sensitive, require fewer repetitions of the same behavior, and are potentially better suited to study EEG signatures in naturalistic settings. We modified, designed, and tested an algorithm that allows the combination of EEG and eye-tracking data recorded in virtual reality.
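The core preprocessing step the abstract describes is a velocity-based classification of eye-tracking samples into gazes and saccades. A minimal sketch of that idea is shown below, assuming gaze directions are unit vectors sampled with timestamps; the function name and the 75 deg/s threshold are illustrative choices, not values taken from the paper, and the segment-wise adaptive thresholding of REMoDNav is omitted for brevity.

```python
import numpy as np

def classify_gaze_saccade(gaze_dirs, timestamps, vel_threshold=75.0):
    """Label inter-sample intervals as 'gaze' or 'saccade' from angular
    velocity. Simplified sketch of a velocity-based classifier; the
    threshold is a hypothetical fixed value, whereas the paper's approach
    (following REMoDNav) adapts to noise levels in smaller data segments.
    """
    # Normalize to unit gaze-direction vectors
    v = gaze_dirs / np.linalg.norm(gaze_dirs, axis=1, keepdims=True)
    # Angle between consecutive samples, in degrees
    cos_ang = np.clip(np.sum(v[:-1] * v[1:], axis=1), -1.0, 1.0)
    ang = np.degrees(np.arccos(cos_ang))
    # Angular velocity in deg/s; high velocity marks a saccade
    dt = np.diff(timestamps)
    velocity = ang / dt
    labels = np.where(velocity > vel_threshold, "saccade", "gaze")
    return velocity, labels
```

Fixation (gaze) onsets recovered this way can then serve as time-locking events for cutting the continuous EEG into epochs, from which the fixation-onset ERPs and ERSPs described above are averaged.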

Journal metrics: CiteScore 3.60 · Self-citation rate 17.60% · Articles per year: 197 · Review time: 4-8 weeks
About the journal: The journal Attention, Perception, & Psychophysics is an official journal of the Psychonomic Society. It spans all areas of research in sensory processes, perception, attention, and psychophysics. Most articles published are reports of experimental work; the journal also presents theoretical, integrative, and evaluative reviews. Commentary on issues of importance to researchers appears in a special section of the journal. Founded in 1966 as Perception & Psychophysics, the journal assumed its present name in 2009.