The Missing Half of Language Learning in Current Developmental Language Models: Exogenous and Endogenous Linguistic Input.

Open Mind · Q1 (Social Sciences)
Publication date: 2025-09-17 (eCollection date: 2025-01-01) · DOI: 10.1162/OPMI.a.33
Nan Zhao, Xufeng Duan, Zhenguang G Cai
Volume 9, pages 1543-1549 · Citations: 0
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12506926/pdf/

Abstract

Developmental language models (DLMs) aim to replicate the efficiency of child language acquisition but often focus solely on the estimation of exogenous linguistic input. We argue that a child's linguistic growth is also critically shaped by endogenous processes, including (1) co-opting language in non-linguistic perception and cognition, (2) engaging in private and inner speech, and (3) benefiting from neural replay of linguistic information during sleep. These endogenous processes amplify and refine exogenous linguistic input in ways that current DLMs do not replicate. To align DLMs with child language acquisition, we propose redefining "linguistic exposure" to encompass both exogenous and endogenous linguistic input. By integrating label feedback, self-generated speech, and sleep-like consolidation, researchers can narrow the gap between artificial and human learning. Collaborations across machine learning, psychology, and linguistics will be essential to ground models in empirical data on child behavior and build DLMs that truly reflect the marvel of language acquisition.
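The three endogenous processes the abstract names (label feedback during perception, private/inner speech, and sleep-like replay) suggest a concrete training-loop structure. The sketch below is not from the paper: it uses a hypothetical character-bigram model as a stand-in for a DLM, and the weighting of self-generated versus replayed input is an assumed illustration of the idea that endogenous input supplements, rather than duplicates, exogenous exposure.

```python
import random
from collections import defaultdict

class ToyBigramLM:
    """Minimal character-bigram model, used only to illustrate the
    loop structure; real DLMs would be neural sequence models."""
    def __init__(self):
        self.counts = defaultdict(lambda: defaultdict(float))

    def update(self, text, weight=1.0):
        # Accumulate (possibly down-weighted) bigram evidence.
        for a, b in zip(text, text[1:]):
            self.counts[a][b] += weight

    def sample(self, start, length, rng):
        # Generate a short sequence by following bigram statistics.
        out = [start]
        for _ in range(length - 1):
            nxt = self.counts[out[-1]]
            if not nxt:
                break
            chars, weights = zip(*nxt.items())
            out.append(rng.choices(chars, weights=weights)[0])
        return "".join(out)

rng = random.Random(0)
model = ToyBigramLM()
replay_buffer = []

corpus = ["the cat sat", "the dog ran", "a cat ran"]
for utterance in corpus:
    # (1) Exogenous input: caregiver speech heard during the "day".
    model.update(utterance)
    replay_buffer.append(utterance)

    # (2) Endogenous input via private/inner speech: the model re-hears
    # its own productions, down-weighted (weight chosen arbitrarily).
    inner_speech = model.sample(utterance[0], 8, rng)
    model.update(inner_speech, weight=0.5)

    # (3) Sleep-like consolidation: replay a subset of buffered
    # utterances "at night", again at reduced weight.
    for past in rng.sample(replay_buffer, k=min(2, len(replay_buffer))):
        model.update(past, weight=0.25)

print(model.sample("t", 10, rng))
```

The point of the sketch is that steps (2) and (3) add learning updates without adding any new external utterances, which is the paper's sense in which "linguistic exposure" exceeds exogenous input.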

Source journal: Open Mind (Social Sciences – Linguistics and Language)
CiteScore: 3.20 · Self-citation rate: 0.00% · Articles per year: 15 · Review time: 53 weeks