Nature Computational Science: Latest Articles

Author Correction: Approaching coupled-cluster accuracy for molecular electronic structures with multi-task learning.
IF 12
Nature computational science Pub Date : 2025-01-22 DOI: 10.1038/s43588-025-00767-z
Hao Tang, Brian Xiao, Wenhao He, Pero Subasic, Avetik R Harutyunyan, Yao Wang, Fang Liu, Haowei Xu, Ju Li
{"title":"Author Correction: Approaching coupled-cluster accuracy for molecular electronic structures with multi-task learning.","authors":"Hao Tang, Brian Xiao, Wenhao He, Pero Subasic, Avetik R Harutyunyan, Yao Wang, Fang Liu, Haowei Xu, Ju Li","doi":"10.1038/s43588-025-00767-z","DOIUrl":"10.1038/s43588-025-00767-z","url":null,"abstract":"","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":" ","pages":""},"PeriodicalIF":12.0,"publicationDate":"2025-01-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143026218","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Boosting AI with neuromorphic computing.
IF 12
Nature computational science Pub Date : 2025-01-21 DOI: 10.1038/s43588-025-00770-4
{"title":"Boosting AI with neuromorphic computing.","authors":"","doi":"10.1038/s43588-025-00770-4","DOIUrl":"https://doi.org/10.1038/s43588-025-00770-4","url":null,"abstract":"","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":" ","pages":""},"PeriodicalIF":12.0,"publicationDate":"2025-01-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143017818","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Memristors enabling probabilistic AI at the edge.
IF 12
Nature computational science Pub Date : 2025-01-17 DOI: 10.1038/s43588-024-00761-x
Damien Querlioz
{"title":"Memristors enabling probabilistic AI at the edge.","authors":"Damien Querlioz","doi":"10.1038/s43588-024-00761-x","DOIUrl":"https://doi.org/10.1038/s43588-024-00761-x","url":null,"abstract":"","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":" ","pages":""},"PeriodicalIF":12.0,"publicationDate":"2025-01-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143017826","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Efficient large language model with analog in-memory computing.
IF 12
Nature computational science Pub Date : 2025-01-17 DOI: 10.1038/s43588-024-00760-y
Anand Subramoney
{"title":"Efficient large language model with analog in-memory computing.","authors":"Anand Subramoney","doi":"10.1038/s43588-024-00760-y","DOIUrl":"https://doi.org/10.1038/s43588-024-00760-y","url":null,"abstract":"","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":" ","pages":""},"PeriodicalIF":12.0,"publicationDate":"2025-01-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143017821","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Energy-efficient multimodal zero-shot learning using in-memory reservoir computing.
IF 12
Nature computational science Pub Date : 2025-01-13 DOI: 10.1038/s43588-024-00762-w
{"title":"Energy-efficient multimodal zero-shot learning using in-memory reservoir computing.","authors":"","doi":"10.1038/s43588-024-00762-w","DOIUrl":"https://doi.org/10.1038/s43588-024-00762-w","url":null,"abstract":"","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":" ","pages":""},"PeriodicalIF":12.0,"publicationDate":"2025-01-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142980780","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Bridging generations and cultures in mathematics and computer science.
IF 12
Nature computational science Pub Date : 2025-01-09 DOI: 10.1038/s43588-024-00756-8
Alyssa April Dellow, Fatimah Abdul Razak
{"title":"Bridging generations and cultures in mathematics and computer science.","authors":"Alyssa April Dellow, Fatimah Abdul Razak","doi":"10.1038/s43588-024-00756-8","DOIUrl":"https://doi.org/10.1038/s43588-024-00756-8","url":null,"abstract":"","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":" ","pages":""},"PeriodicalIF":12.0,"publicationDate":"2025-01-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142959947","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
A new tool for shape and structure optimization of soft materials.
IF 12
Nature computational science Pub Date : 2025-01-09 DOI: 10.1038/s43588-024-00754-w
{"title":"A new tool for shape and structure optimization of soft materials.","authors":"","doi":"10.1038/s43588-024-00754-w","DOIUrl":"https://doi.org/10.1038/s43588-024-00754-w","url":null,"abstract":"","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":" ","pages":""},"PeriodicalIF":12.0,"publicationDate":"2025-01-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142959944","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Resistive memory-based zero-shot liquid state machine for multimodal event data learning.
IF 12
Nature computational science Pub Date : 2025-01-09 DOI: 10.1038/s43588-024-00751-z
Ning Lin, Shaocong Wang, Yi Li, Bo Wang, Shuhui Shi, Yangu He, Woyu Zhang, Yifei Yu, Yue Zhang, Xinyuan Zhang, Kwunhang Wong, Songqi Wang, Xiaoming Chen, Hao Jiang, Xumeng Zhang, Peng Lin, Xiaoxin Xu, Xiaojuan Qi, Zhongrui Wang, Dashan Shang, Qi Liu, Ming Liu
{"title":"Resistive memory-based zero-shot liquid state machine for multimodal event data learning.","authors":"Ning Lin, Shaocong Wang, Yi Li, Bo Wang, Shuhui Shi, Yangu He, Woyu Zhang, Yifei Yu, Yue Zhang, Xinyuan Zhang, Kwunhang Wong, Songqi Wang, Xiaoming Chen, Hao Jiang, Xumeng Zhang, Peng Lin, Xiaoxin Xu, Xiaojuan Qi, Zhongrui Wang, Dashan Shang, Qi Liu, Ming Liu","doi":"10.1038/s43588-024-00751-z","DOIUrl":"10.1038/s43588-024-00751-z","url":null,"abstract":"<p><p>The human brain is a complex spiking neural network (SNN) capable of learning multimodal signals in a zero-shot manner by generalizing existing knowledge. Remarkably, it maintains minimal power consumption through event-based signal propagation. However, replicating the human brain in neuromorphic hardware presents both hardware and software challenges. Hardware limitations, such as the slowdown of Moore's law and Von Neumann bottleneck, hinder the efficiency of digital computers. In addition, SNNs are characterized by their software training complexities. Here, to this end, we propose a hardware-software co-design on a 40 nm 256 kB in-memory computing macro that physically integrates a fixed and random liquid state machine SNN encoder with trainable artificial neural network projections. We showcase the zero-shot learning of multimodal events on the N-MNIST and N-TIDIGITS datasets, including visual and audio data association, as well as neural and visual data alignment for brain-machine interfaces. Our co-design achieves classification accuracy comparable to fully optimized software models, resulting in a 152.83- and 393.07-fold reduction in training costs compared with state-of-the-art spiking recurrent neural network-based contrastive learning and prototypical networks, and a 23.34- and 160-fold improvement in energy efficiency compared with cutting-edge digital hardware, respectively. These proof-of-principle prototypes demonstrate zero-shot multimodal events learning capability for emerging efficient and compact neuromorphic hardware.</p>","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":" ","pages":""},"PeriodicalIF":12.0,"publicationDate":"2025-01-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142959950","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
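The co-design in the abstract above pairs a fixed, random spiking reservoir (the liquid state machine encoder) with a small trainable projection. The snippet below is a minimal illustrative sketch of that encoding idea in NumPy, not the authors' implementation or hardware: the reservoir sizes, the leaky integrate-and-fire dynamics, the time constants and the synthetic event stream are all assumptions made for illustration.

```python
# Minimal sketch: a fixed random spiking reservoir encodes an event stream,
# and only a small linear projection on top would be trained.
import numpy as np

rng = np.random.default_rng(0)

N_IN, N_RES, N_PROJ = 64, 256, 32                 # toy sizes (assumed)
W_in = rng.normal(0, 0.5, (N_RES, N_IN))          # fixed random input weights
W_rec = rng.normal(0, 1.0 / np.sqrt(N_RES), (N_RES, N_RES))  # fixed random recurrence

def lsm_encode(events, tau=0.9, v_th=1.0):
    """Run leaky integrate-and-fire dynamics over binary event frames
    (shape: steps x N_IN) and return time-averaged spike counts."""
    steps = events.shape[0]
    v = np.zeros(N_RES)                           # membrane potentials
    spikes = np.zeros(N_RES)
    counts = np.zeros(N_RES)
    for t in range(steps):
        v = tau * v + W_in @ events[t] + W_rec @ spikes
        spikes = (v >= v_th).astype(float)
        v[spikes > 0] = 0.0                       # reset neurons that fired
        counts += spikes
    return counts / steps                         # rate-coded reservoir state

# Trainable part: a single linear projection into a shared embedding space.
W_proj = rng.normal(0, 0.1, (N_PROJ, N_RES))

toy_events = (rng.random((100, N_IN)) < 0.05).astype(float)  # synthetic event stream
embedding = W_proj @ lsm_encode(toy_events)
print(embedding.shape)  # (32,)
```

In the paper's setting, only the projection would be trained (for example to align visual and audio embeddings), while the random reservoir weights stay fixed, which is what makes the encoder amenable to in-memory computing hardware.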
Efficient scaling of large language models with mixture of experts and 3D analog in-memory computing.
IF 12
Nature computational science Pub Date : 2025-01-08 DOI: 10.1038/s43588-024-00753-x
Julian Büchel, Athanasios Vasilopoulos, William Andrew Simon, Irem Boybat, HsinYu Tsai, Geoffrey W Burr, Hernan Castro, Bill Filipiak, Manuel Le Gallo, Abbas Rahimi, Vijay Narayanan, Abu Sebastian
{"title":"Efficient scaling of large language models with mixture of experts and 3D analog in-memory computing.","authors":"Julian Büchel, Athanasios Vasilopoulos, William Andrew Simon, Irem Boybat, HsinYu Tsai, Geoffrey W Burr, Hernan Castro, Bill Filipiak, Manuel Le Gallo, Abbas Rahimi, Vijay Narayanan, Abu Sebastian","doi":"10.1038/s43588-024-00753-x","DOIUrl":"10.1038/s43588-024-00753-x","url":null,"abstract":"<p><p>Large language models (LLMs), with their remarkable generative capacities, have greatly impacted a range of fields, but they face scalability challenges due to their large parameter counts, which result in high costs for training and inference. The trend of increasing model sizes is exacerbating these challenges, particularly in terms of memory footprint, latency and energy consumption. Here we explore the deployment of 'mixture of experts' (MoEs) networks-networks that use conditional computing to keep computational demands low despite having many parameters-on three-dimensional (3D) non-volatile memory (NVM)-based analog in-memory computing (AIMC) hardware. When combined with the MoE architecture, this hardware, utilizing stacked NVM devices arranged in a crossbar array, offers a solution to the parameter-fetching bottleneck typical in traditional models deployed on conventional von-Neumann-based architectures. By simulating the deployment of MoEs on an abstract 3D AIMC system, we demonstrate that, due to their conditional compute mechanism, MoEs are inherently better suited to this hardware than conventional, dense model architectures. Our findings suggest that MoEs, in conjunction with emerging 3D NVM-based AIMC, can substantially reduce the inference costs of state-of-the-art LLMs, making them more accessible and energy-efficient.</p>","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":" ","pages":""},"PeriodicalIF":12.0,"publicationDate":"2025-01-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142959948","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
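As a rough illustration of the conditional-compute property the abstract above relies on, the sketch below routes each token to a single expert, so only that expert's weights need to be touched per token. This is an assumed, simplified top-1 softmax router in NumPy, not the paper's model or its AIMC hardware mapping; all dimensions and weights are toy values.

```python
# Minimal sketch of MoE conditional compute: the router picks one expert per
# token, so only a fraction of the total parameters is used for each input.
import numpy as np

rng = np.random.default_rng(1)
D, N_EXPERTS = 128, 8                              # toy sizes (assumed)

W_router = rng.normal(0, 0.02, (N_EXPERTS, D))     # routing weights
experts = [rng.normal(0, 0.02, (D, D)) for _ in range(N_EXPERTS)]

def moe_forward(x):
    """Route a single token vector x to its top-1 expert."""
    logits = W_router @ x
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()                           # softmax over experts
    k = int(np.argmax(probs))                      # only expert k is evaluated
    return probs[k] * (experts[k] @ x), k

y, chosen = moe_forward(rng.normal(size=D))
print(chosen, y.shape)                             # one of 8 experts; (128,)
```

The point carried over to the hardware discussion is that the unchosen experts' weights never need to be fetched, which is exactly the access pattern that stacked NVM crossbar arrays are argued to serve well.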
Decoupled peak property learning for efficient and interpretable electronic circular dichroism spectrum prediction.
IF 12
Nature computational science Pub Date : 2025-01-03 DOI: 10.1038/s43588-024-00757-7
Hao Li, Da Long, Li Yuan, Yu Wang, Yonghong Tian, Xinchang Wang, Fanyang Mo
{"title":"Decoupled peak property learning for efficient and interpretable electronic circular dichroism spectrum prediction.","authors":"Hao Li, Da Long, Li Yuan, Yu Wang, Yonghong Tian, Xinchang Wang, Fanyang Mo","doi":"10.1038/s43588-024-00757-7","DOIUrl":"https://doi.org/10.1038/s43588-024-00757-7","url":null,"abstract":"<p><p>Electronic circular dichroism (ECD) spectra contain key information about molecular chirality by discriminating the absolute configurations of chiral molecules, which is crucial in asymmetric organic synthesis and the drug industry. However, existing predictive approaches lack the consideration of ECD spectra owing to the data scarcity and the limited interpretability to achieve trustworthy prediction. Here we establish a large-scale dataset for chiral molecular ECD spectra and propose ECDFormer for accurate and interpretable ECD spectrum prediction. ECDFormer decomposes ECD spectra into peak entities, uses the QFormer architecture to learn peak properties and renders peaks into spectra. Compared with spectrum sequence prediction methods, our decoupled peak prediction approach substantially enhances both accuracy and efficiency, improving the peak symbol accuracy from 37.3% to 72.7% and decreasing the time cost from an average of 4.6 central processing unit hours to 1.5 s. Moreover, ECDFormer demonstrated its ability to capture molecular orbital information directly from spectral data using the explainable peak-decoupling approach. Furthermore, ECDFormer proved to be equally proficient at predicting various types of spectrum, including infrared and mass spectroscopies, highlighting its substantial generalization capabilities.</p>","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":" ","pages":""},"PeriodicalIF":12.0,"publicationDate":"2025-01-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142928880","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
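To make the peak-decoupled idea concrete, here is a small sketch of the final "render peaks into spectra" step only, under stated assumptions (Gaussian band shapes, a toy wavelength grid and made-up peak properties). It is not ECDFormer or its QFormer-based peak-property predictor; it only shows how predicted peak entities could be turned back into a smooth ECD-like curve.

```python
# Minimal sketch: reconstruct an ECD-like spectrum from predicted peak
# properties (center, signed amplitude, width) as a sum of signed Gaussians.
import numpy as np

def render_spectrum(peaks, wavelengths):
    """peaks: list of (center_nm, signed_amplitude, width_nm)."""
    spectrum = np.zeros_like(wavelengths, dtype=float)
    for center, amp, width in peaks:
        spectrum += amp * np.exp(-0.5 * ((wavelengths - center) / width) ** 2)
    return spectrum

wl = np.linspace(180, 400, 500)                      # toy wavelength grid, nm
toy_peaks = [(210, +1.0, 8.0), (245, -0.6, 12.0)]    # one positive and one negative band
ecd = render_spectrum(toy_peaks, wl)
print(ecd.shape)                                     # (500,)
```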