MTT: Multi-Tier Transformer for Early Diagnosis of Autism Spectrum Disorder and Language Disorder in Infants

IF 1.7 · CAS Tier 4 (Computer Science) · JCR Q3, COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Jingyu Liu, Honghua Wang, Jialing Wang, Han Wu, Yifei Xu, Tao Zhang, Guangfeng Zhou
Computational Intelligence, vol. 41, no. 6. Published 2025-10-23. DOI: 10.1111/coin.70131
Citations: 0

Abstract



Current diagnosis of autism spectrum disorder (ASD) and developmental language/reading disorder (DLD/DD) relies predominantly on subjective behavioral assessments, underscoring the urgent need for objective biomarkers to enable early intervention. This study proposes a Multi-Tier Transformer (MTT) model for early identification of ASD and DLD/DD using resting-state EEG baseline power values. To address severe class imbalance, we augmented the dataset using the SMOTE method. The MTT architecture integrates a Feature-Embedding layer, a Feature-Attention mechanism that dynamically weights multi-spectral inputs, and a dual-attention encoding block comprising both self-attention and cross-attention to enhance contextual representation learning from limited samples. Transfer learning was further employed to improve robustness: the model was pre-trained on the augmented data and fine-tuned on the original samples. Evaluated on clinical infant EEG data, the proposed MTT achieved an accuracy of 0.91 (95% CI: 0.89–0.93), a recall of 0.89, and an AUC of 0.97, significantly outperforming the state-of-the-art FT-Transformer (p = 0.00091). The results indicate that MTT provides a robust and interpretable deep learning tool for the auxiliary diagnosis of neurodevelopmental disorders in infancy.
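The abstract states that SMOTE was used to counter severe class imbalance before pre-training. As a rough illustration of the interpolation idea behind SMOTE — not the authors' pipeline (in practice a library such as imbalanced-learn would typically be used) — the following minimal NumPy sketch synthesizes new minority-class samples by interpolating between each real sample and one of its nearest neighbors; the function name `smote_oversample` and its parameters are hypothetical:

```python
import numpy as np

def smote_oversample(X_min, n_new, k=5, rng=None):
    """Minimal SMOTE sketch (illustrative, not the paper's implementation):
    synthesize n_new minority-class samples by linear interpolation between
    each sample and one of its k nearest neighbors within the class."""
    rng = np.random.default_rng(rng)
    n = len(X_min)
    # pairwise Euclidean distances within the minority class
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)          # exclude each sample itself
    neighbors = np.argsort(d, axis=1)[:, :k]  # k nearest neighbors per sample
    out = []
    for _ in range(n_new):
        i = rng.integers(n)                          # pick a minority sample
        j = neighbors[i, rng.integers(min(k, n - 1))]  # one of its neighbors
        lam = rng.random()                           # interpolation factor in [0, 1)
        out.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.stack(out)
```

Because each synthetic point is a convex combination of two real minority samples, it stays inside the class's feature range; this is why SMOTE is commonly preferred over naive duplication for small clinical cohorts such as infant EEG datasets.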

Source journal: Computational Intelligence (Engineering & Technology - Computer Science: Artificial Intelligence)
CiteScore: 6.90
Self-citation rate: 3.60%
Annual output: 65 articles
Review time: >12 weeks
Journal description: This leading international journal promotes and stimulates research in the field of artificial intelligence (AI). Covering a wide range of issues - from the tools and languages of AI to its philosophical implications - Computational Intelligence provides a vigorous forum for the publication of both experimental and theoretical research, as well as surveys and impact studies. The journal is designed to meet the needs of a wide range of AI workers in academic and industrial research.