Large-scale foundation models and generative AI for BigData neuroscience.

IF 2.4 | Medicine, Region 4 (CAS) | Q3 NEUROSCIENCES
Ran Wang, Zhe Sage Chen
DOI: 10.1016/j.neures.2024.06.003
Journal: Neuroscience Research, published 2024-06-17
Citations: 0

Abstract

Recent advances in machine learning have led to revolutionary breakthroughs in computer games, image and natural language understanding, and scientific discovery. Foundation models and large-scale language models (LLMs) have recently achieved human-like intelligence thanks to BigData. With the help of self-supervised learning (SSL) and transfer learning, these models may reshape the landscape of neuroscience research and make a significant impact on the future. Here we present a mini-review on recent advances in foundation models and generative AI models as well as their applications in neuroscience, including natural language and speech, semantic memory, brain-machine interfaces (BMIs), and data augmentation. We argue that this paradigm-shifting framework will open new avenues for many neuroscience research directions, and we discuss the accompanying challenges and opportunities.

Source journal: Neuroscience Research (Medicine – Neuroscience)
CiteScore: 5.60
Self-citation rate: 3.40%
Articles per year: 136
Review time: 28 days
Journal description: An international journal publishing original full-length research articles, short communications, technical notes, and reviews on all aspects of neuroscience. Neuroscience Research is an international journal for high-quality articles in all branches of neuroscience, from the molecular to the behavioral levels. The journal is published in collaboration with the Japan Neuroscience Society and is open to all contributors in the world.