Exploring temporal sensitivity in the brain using multi-timescale language models: an EEG decoding study

Impact Factor: 9.3 · CAS Zone 2 · Computer Science
Sijie Ling, Alex Murphy, Alona Fyshe
{"title":"Exploring temporal sensitivity in the brain using multi-timescale language models: an EEG decoding study","authors":"Sijie Ling, Alex Murphy, Alona Fyshe","doi":"10.1162/coli_a_00533","DOIUrl":null,"url":null,"abstract":"The brain’s ability to perform complex computations at varying timescales is crucial, ranging from understanding single words to grasping the overarching narrative of a story. Recently, multi-timescale long short-term memory (MT-LSTM) models (Mahto et al. 2020; Jain et al. 2020) have been introduced, which use temporally-tuned parameters to induce sensitivity to different timescales of language processing (i.e. related to near/distant words). However, there has not been an exploration of the relation between such temporally-tuned information processing in MT-LSTMs and the brain’s language processing using high temporal resolution recording modalities, such as electroencephalography (EEG). To bridge this gap, we used an EEG dataset recorded while participants listened to Chapter 1 of “Alice in Wonderland” and trained ridge regression models to predict the temporally-tuned MT-LSTM embeddings from EEG responses. Our analysis reveals that EEG signals can be used to predict MT-LSTM embeddings across various timescales. For longer timescales, our models produced accurate predictions within an extended time window of ±2 s around word onset, while for shorter timescales, significant predictions are confined to a narrow window ranging from −180 ms to 790 ms. Intriguingly, we observed that short timescale information is not only processed in the vicinity of word onset but also at distant time points. These observations underscore the parallels and discrepancies between computational models and the neural mechanisms of the brain. As word embeddings are used more as in silico models of semantic representation in the brain, a more explicit consideration of timescale-dependent processing enables more targeted explorations of language processing in humans and machines.","PeriodicalId":49089,"journal":{"name":"Computational Linguistics","volume":null,"pages":null},"PeriodicalIF":9.3000,"publicationDate":"2024-07-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computational Linguistics","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1162/coli_a_00533","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

The brain’s ability to perform complex computations at varying timescales is crucial, ranging from understanding single words to grasping the overarching narrative of a story. Recently, multi-timescale long short-term memory (MT-LSTM) models (Mahto et al. 2020; Jain et al. 2020) have been introduced, which use temporally tuned parameters to induce sensitivity to different timescales of language processing (i.e., sensitivity related to near/distant words). However, the relation between such temporally tuned information processing in MT-LSTMs and the brain’s language processing has not been explored using high-temporal-resolution recording modalities such as electroencephalography (EEG). To bridge this gap, we used an EEG dataset recorded while participants listened to Chapter 1 of “Alice in Wonderland” and trained ridge regression models to predict the temporally tuned MT-LSTM embeddings from EEG responses. Our analysis reveals that EEG signals can be used to predict MT-LSTM embeddings across various timescales. For longer timescales, our models produced accurate predictions within an extended time window of ±2 s around word onset, while for shorter timescales, significant predictions were confined to a narrow window ranging from −180 ms to 790 ms. Intriguingly, we observed that short-timescale information is processed not only in the vicinity of word onset but also at distant time points. These observations underscore the parallels and discrepancies between computational models and the neural mechanisms of the brain. As word embeddings are increasingly used as in silico models of semantic representation in the brain, a more explicit consideration of timescale-dependent processing enables more targeted explorations of language processing in humans and machines.
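The core analysis the abstract describes is a decoding model: a ridge regression mapping a window of EEG activity around each word onset to that word's MT-LSTM embedding, scored separately for embedding dimensions tuned to short versus long timescales. The sketch below illustrates that setup on synthetic arrays; the shapes, variable names, regularization value, and per-dimension correlation scoring are illustrative assumptions, not the authors' released pipeline.

```python
# A minimal sketch of the decoding setup described in the abstract: ridge
# regression maps EEG activity around each word onset to the word's MT-LSTM
# embedding. All shapes and parameters below are illustrative assumptions.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)

# Hypothetical data: for each of n_words word onsets, an EEG window of
# n_channels x n_times samples (flattened into one feature vector), paired
# with a d-dimensional MT-LSTM embedding for that word.
n_words, n_channels, n_times, d = 500, 32, 64, 100
eeg = rng.standard_normal((n_words, n_channels * n_times))  # EEG features
emb = rng.standard_normal((n_words, d))                     # MT-LSTM embeddings


def decode(eeg, emb, alpha=1.0, n_splits=5):
    """Cross-validated ridge decoding: predict embeddings from EEG and
    return the mean per-dimension Pearson correlation on held-out words."""
    corrs = []
    for train, test in KFold(n_splits, shuffle=True, random_state=0).split(eeg):
        model = Ridge(alpha=alpha).fit(eeg[train], emb[train])
        pred = model.predict(eeg[test])
        # Correlate predicted and true values for each embedding dimension.
        for j in range(emb.shape[1]):
            corrs.append(np.corrcoef(pred[:, j], emb[test, j])[0, 1])
    return float(np.mean(corrs))


print(f"mean decoding correlation: {decode(eeg, emb):.3f}")
```

In the study itself, a window of this kind is evaluated at different lags relative to word onset, which is how the contrast between the wide ±2 s window for long timescales and the narrower −180 ms to 790 ms window for short timescales is obtained.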
Source Journal

Computational Linguistics (Computer Science – Artificial Intelligence)
Self-citation rate: 0.00% · Publication volume: 45

Journal description: Computational Linguistics is the longest-running publication devoted exclusively to the computational and mathematical properties of language and the design and analysis of natural language processing systems. This highly regarded quarterly offers university and industry linguists, computational linguists, artificial intelligence and machine learning investigators, cognitive scientists, speech specialists, and philosophers the latest information about the computational aspects of all the facets of research on language.