Latent variable sequence identification for cognitive models with neural network estimators.

IF 3.9 | Tier 2, Psychology | Q1, PSYCHOLOGY, EXPERIMENTAL
Ti-Fen Pan, Jing-Jing Li, Bill Thompson, Anne Ge Collins
{"title":"基于神经网络估计器的认知模型潜变量序列辨识。","authors":"Ti-Fen Pan, Jing-Jing Li, Bill Thompson, Anne Ge Collins","doi":"10.3758/s13428-025-02794-0","DOIUrl":null,"url":null,"abstract":"<p><p>Extracting time-varying latent variables from computational cognitive models plays a key role in uncovering the dynamic cognitive processes that drive behaviors. However, existing methods are limited to inferring latent variable sequences in a relatively narrow class of cognitive models. For example, a broad class of relevant cognitive models with intractable likelihood is currently out of reach of standard techniques, based on maximum a posteriori parameter estimation. Here, we present a simulation-based approach that leverages recurrent neural networks to map experimental data directly to the targeted latent variable space. We first show in simulations that our approach achieves competitive performance in inferring latent variable sequences in both likelihood-tractable and intractable models. We then demonstrate its applicability in real world datasets. Furthermore, the approach is practical to standard-size, individual data, generalizable across different computational models, and adaptable for continuous and discrete latent spaces. Our work underscores that combining recurrent neural networks and simulated data to identify model latent variable sequences broadens the scope of cognitive models researchers can explore, enabling testing a wider range of theories.</p>","PeriodicalId":8717,"journal":{"name":"Behavior Research Methods","volume":"57 10","pages":"272"},"PeriodicalIF":3.9000,"publicationDate":"2025-08-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12394392/pdf/","citationCount":"0","resultStr":"{\"title\":\"Latent variable sequence identification for cognitive models with neural network estimators.\",\"authors\":\"Ti-Fen Pan, Jing-Jing Li, Bill Thompson, Anne Ge Collins\",\"doi\":\"10.3758/s13428-025-02794-0\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Extracting time-varying latent variables from computational cognitive models plays a key role in uncovering the dynamic cognitive processes that drive behaviors. However, existing methods are limited to inferring latent variable sequences in a relatively narrow class of cognitive models. For example, a broad class of relevant cognitive models with intractable likelihood is currently out of reach of standard techniques, based on maximum a posteriori parameter estimation. Here, we present a simulation-based approach that leverages recurrent neural networks to map experimental data directly to the targeted latent variable space. We first show in simulations that our approach achieves competitive performance in inferring latent variable sequences in both likelihood-tractable and intractable models. We then demonstrate its applicability in real world datasets. Furthermore, the approach is practical to standard-size, individual data, generalizable across different computational models, and adaptable for continuous and discrete latent spaces. 
Our work underscores that combining recurrent neural networks and simulated data to identify model latent variable sequences broadens the scope of cognitive models researchers can explore, enabling testing a wider range of theories.</p>\",\"PeriodicalId\":8717,\"journal\":{\"name\":\"Behavior Research Methods\",\"volume\":\"57 10\",\"pages\":\"272\"},\"PeriodicalIF\":3.9000,\"publicationDate\":\"2025-08-28\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12394392/pdf/\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Behavior Research Methods\",\"FirstCategoryId\":\"102\",\"ListUrlMain\":\"https://doi.org/10.3758/s13428-025-02794-0\",\"RegionNum\":2,\"RegionCategory\":\"心理学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"PSYCHOLOGY, EXPERIMENTAL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Behavior Research Methods","FirstCategoryId":"102","ListUrlMain":"https://doi.org/10.3758/s13428-025-02794-0","RegionNum":2,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"PSYCHOLOGY, EXPERIMENTAL","Score":null,"Total":0}
Citations: 0

Abstract

Extracting time-varying latent variables from computational cognitive models plays a key role in uncovering the dynamic cognitive processes that drive behaviors. However, existing methods are limited to inferring latent variable sequences in a relatively narrow class of cognitive models. For example, a broad class of relevant cognitive models with intractable likelihood is currently out of reach of standard techniques based on maximum a posteriori parameter estimation. Here, we present a simulation-based approach that leverages recurrent neural networks to map experimental data directly to the targeted latent variable space. We first show in simulations that our approach achieves competitive performance in inferring latent variable sequences in both likelihood-tractable and intractable models. We then demonstrate its applicability in real-world datasets. Furthermore, the approach is practical for standard-size individual datasets, generalizable across different computational models, and adaptable for continuous and discrete latent spaces. Our work underscores that combining recurrent neural networks and simulated data to identify model latent variable sequences broadens the scope of cognitive models researchers can explore, enabling testing of a wider range of theories.
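The abstract only describes the approach at a high level. The toy example below is a minimal sketch of the general simulation-based idea it outlines: simulate many agents from a cognitive model, record both their observable behavior and the model's latent variable sequence, then train a recurrent network to map the former onto the latter. The Q-learning bandit task, the GRU architecture, and all hyperparameters here are illustrative assumptions, not the authors' actual implementation; see the paper and the PMC PDF for their method.

# Minimal sketch, assuming a Q-learning agent on a two-armed bandit as the
# cognitive model and a GRU as the neural network estimator (both illustrative).
import numpy as np
import torch
import torch.nn as nn

def simulate_q_learner(n_trials=200, rng=None):
    """Simulate a two-armed-bandit Q-learner; return (behavior, latent Q-value sequence)."""
    if rng is None:
        rng = np.random.default_rng()
    alpha = rng.uniform(0.1, 0.9)      # learning rate, sampled per simulated agent
    beta = rng.uniform(1.0, 10.0)      # softmax inverse temperature
    p_reward = np.array([0.8, 0.2])    # reward probability of each arm
    q = np.zeros(2)
    choices, rewards, q_seq = [], [], []
    for _ in range(n_trials):
        q_seq.append(q.copy())                          # latent state before this trial's choice
        p = np.exp(beta * q) / np.exp(beta * q).sum()   # softmax choice policy
        c = int(rng.choice(2, p=p))
        r = float(rng.random() < p_reward[c])
        q[c] += alpha * (r - q[c])                      # Q-learning update
        choices.append(c)
        rewards.append(r)
    x = np.column_stack([np.eye(2)[choices], rewards])  # per-trial features: choice one-hot + reward
    return x.astype(np.float32), np.array(q_seq, dtype=np.float32)

class LatentEstimator(nn.Module):
    """GRU that reads trial-by-trial behavior and predicts the latent Q-values."""
    def __init__(self, n_in=3, n_hidden=64, n_latent=2):
        super().__init__()
        self.gru = nn.GRU(n_in, n_hidden, batch_first=True)
        self.head = nn.Linear(n_hidden, n_latent)

    def forward(self, x):              # x: (batch, trials, features)
        h, _ = self.gru(x)
        return self.head(h)            # (batch, trials, n_latent)

def make_batch(batch_size=32, n_trials=200, rng=None):
    xs, qs = zip(*(simulate_q_learner(n_trials, rng) for _ in range(batch_size)))
    return torch.from_numpy(np.stack(xs)), torch.from_numpy(np.stack(qs))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    model = LatentEstimator()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for step in range(500):            # train on freshly simulated agents each step
        x, q_true = make_batch(rng=rng)
        loss = nn.functional.mse_loss(model(x), q_true)
        opt.zero_grad()
        loss.backward()
        opt.step()
        if step % 100 == 0:
            print(f"step {step:4d}  MSE {loss.item():.4f}")
    # A trained network can then infer latent Q-value sequences directly from a
    # participant's observed choices and rewards.

Because the network is trained purely on simulated (behavior, latent-sequence) pairs, it never needs to evaluate the cognitive model's likelihood, which is what makes this style of approach attractive for likelihood-intractable models.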

Source journal: Behavior Research Methods
CiteScore: 10.30
Self-citation rate: 9.30%
Articles published: 266
About the journal: Behavior Research Methods publishes articles concerned with the methods, techniques, and instrumentation of research in experimental psychology. The journal focuses particularly on the use of computer technology in psychological research. An annual special issue is devoted to this field.