Sequence-to-sequence models with attention mechanistically map to the architecture of human memory search.

Nikolaus Salvatore, Qiong Zhang
DOI: 10.1038/s44271-025-00322-6
Journal: Communications Psychology, 3(1), 146
Publication date: 2025-10-14
Publication type: Journal Article
Open access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12521410/pdf/
Citations: 0

Abstract

Past work has long recognized the important role of context in guiding how humans search their memory. While context-based memory models can explain many memory phenomena, it remains unclear why humans develop such architectures over possible alternatives in the first place. In this work, we demonstrate that foundational architectures in neural machine translation - specifically, recurrent neural network (RNN)-based sequence-to-sequence models with attention - exhibit mechanisms that directly correspond to those specified in the Context Maintenance and Retrieval (CMR) model of human memory. Since neural machine translation models have evolved to optimize task performance, their convergence with human memory models provides a deeper understanding of the functional role of context in human memory, as well as presenting alternative ways to model human memory. Leveraging this convergence, we implement a neural machine translation model as a cognitive model of human memory search that is both interpretable and capable of capturing complex dynamics of learning. We show that our model accounts for both averaged and optimal human behavioral patterns as effectively as context-based memory models using a publicly available free recall experiment dataset involving 171 participants. Further, we demonstrate additional strengths of the proposed model by evaluating how memory search performance emerges from the interaction of different model components.
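The correspondence the abstract describes hinges on attention: the decoder's state acts as a cue that is compared against stored encoder states, much as CMR's drifting context vector cues retrieval of studied items. A minimal sketch of that mechanism (not the authors' code; the function names and dimensions here are illustrative assumptions) might look like:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(query, encoder_states):
    """Dot-product attention, read as context-based cueing.

    query          : (d,)   decoder state, playing the role of the
                            current context cue in CMR.
    encoder_states : (n, d) one stored state per input/studied item.
    Returns the normalized retrieval weights and the blended readout
    (analogous to reinstated context).
    """
    scores = encoder_states @ query      # similarity of cue to each item
    weights = softmax(scores)            # normalized retrieval strengths
    readout = weights @ encoder_states   # context-weighted blend of items
    return weights, readout

rng = np.random.default_rng(0)
states = rng.standard_normal((5, 8))     # 5 stored items, 8-dim states
w, r = attend(states[2], states)         # cue with the 3rd item's own state
```

Under this reading, the attention weights play the role of the retrieval distribution over studied items, and the weighted readout is what the decoder (or the retrieval process) consumes next.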
