Nature Computational Science: Latest Articles

Integrative deep learning of spatial multi-omics with SWITCH
IF 18.3
Nature Computational Science Pub Date: 2025-10-29 DOI: 10.1038/s43588-025-00891-w
Zhongzhan Li, Sanqing Qu, Haixin Liang, Ruohui Tang, Xudong Zhang, Fan Lu, Jiani Yang, Ziling Gan, Shaorong Gao, Yanping Zhang, Guang Chen
Abstract: Advancements in spatial omics permit spatially resolved measurements across several biological modalities, but the high cost of acquiring co-profiled multimodal data limits such analyses. This underscores the need for computational methods that integrate unpaired spatial multi-omics data and perform cross-modal prediction on single-modality data, a task further complicated by the typically low signal-to-noise ratios of spatial omics. Here we introduce SWITCH (Spatially Weighted Multi-omics Integration and Cross-modal Translation with Cycle-mapping Harmonization), a deep generative model for spatial multi-omics integration. SWITCH uses a cycle-mapping mechanism that produces dependable cross-modal translations without requiring additional paired data; these translations serve as pseudo-pairs that provide supplementary training signals. Systematic evaluations demonstrate that SWITCH outperforms existing methods in integration accuracy and achieves more precise spatial domain delineation, resolving brain cortical structures at higher resolution. The reliability of its cross-modal translations was validated, enabling downstream analyses such as differential analysis, trajectory inference and gene regulatory network inference.
Editor's summary: The authors present SWITCH, a deep learning model that integrates unpaired spatial multi-omics data and enables unsupervised cross-modal prediction, aiding spatial domain identification and downstream biological analysis.
Volume 5 (11), pages 1051–1063.
Citations: 0
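The cycle-mapping idea behind SWITCH (translate modality A to modality B and back, then penalize how far the round trip drifts from the input) can be sketched with a minimal, hypothetical example. The linear "translators" below are placeholders standing in for the model's learned networks, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear cross-modal translators: f maps modality A -> B, g maps B -> A.
W_f = rng.normal(size=(3, 4))   # modality A has 4 features, modality B has 3
W_g = np.linalg.pinv(W_f)       # a near-inverse, so the cycle is almost closed

def translate_a_to_b(x_a):
    return x_a @ W_f.T

def translate_b_to_a(x_b):
    return x_b @ W_g.T

def cycle_loss(x_a):
    """Mean squared deviation of the A -> B -> A round trip from the input."""
    x_cycled = translate_b_to_a(translate_a_to_b(x_a))
    return float(np.mean((x_cycled - x_a) ** 2))

x = rng.normal(size=(100, 4))   # 100 spatial spots, 4 features in modality A
print(cycle_loss(x))            # small but nonzero: the pseudo-inverse only inverts the row space
```

Minimizing such a loss jointly over both translators is what lets cross-modal translations act as pseudo-pairs without paired training data.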
Quantum approximate multi-objective optimization
IF 18.3
Nature Computational Science Pub Date: 2025-10-24 DOI: 10.1038/s43588-025-00873-y
Ayse Kotil, Elijah Pelofske, Stephanie Riedmüller, Daniel J. Egger, Stephan Eidenbenz, Thorsten Koch, Stefan Woerner
Abstract: The goal of multi-objective optimization is to understand optimal trade-offs between competing objective functions by finding the Pareto front, that is, the set of all Pareto-optimal solutions, in which no objective can be improved without degrading another. Multi-objective optimization can be classically challenging even when the corresponding single-objective problems are efficiently solvable, making it a compelling problem class to analyze with quantum computers. Here we use a low-depth quantum approximate optimization algorithm to approximate the optimal Pareto front of certain multi-objective weighted maximum-cut problems. We demonstrate its performance on an IBM Quantum computer, as well as with matrix product state numerical simulation, and show its potential to outperform classical approaches.
Editor's summary: By using a low-depth quantum approximate optimization algorithm to approximate the optimal Pareto front of multi-objective weighted max-cut problems, the authors demonstrate promising results, both in simulation and on IBM Quantum hardware, surpassing classical approaches.
Open access PDF: https://www.nature.com/articles/s43588-025-00873-y.pdf
Volume 5 (12), pages 1168–1177.
Citations: 0
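The classical object the algorithm approximates, the Pareto front of a multi-objective weighted max-cut, can be computed exactly by brute force on a toy instance. The graph and weights below are illustrative, not taken from the paper:

```python
from itertools import product

# A toy 4-node graph with two weight sets (two objectives) per edge.
# Each edge: (u, v, weight_objective_1, weight_objective_2) -- made-up values.
edges = [(0, 1, 3.0, 1.0), (1, 2, 1.0, 4.0), (2, 3, 2.0, 2.0),
         (0, 3, 4.0, 1.0), (0, 2, 1.0, 3.0)]
n = 4

def cut_values(assignment):
    """Objective vector: total weight of cut edges under each weight set."""
    c1 = sum(w1 for u, v, w1, _ in edges if assignment[u] != assignment[v])
    c2 = sum(w2 for u, v, _, w2 in edges if assignment[u] != assignment[v])
    return (c1, c2)

def dominates(a, b):
    """a Pareto-dominates b if a is >= everywhere and > somewhere (maximization)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

# Enumerate all 2^n bipartitions and keep the non-dominated objective vectors.
points = {cut_values(bits) for bits in product((0, 1), repeat=n)}
pareto = sorted(p for p in points if not any(dominates(q, p) for q in points))
print(pareto)  # -> [(4.0, 9.0), (10.0, 8.0)]
```

The enumeration is exponential in the number of nodes, which is exactly why heuristic approximations of the front, quantum or classical, become interesting at scale.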
Discovering network dynamics with neural symbolic regression
IF 18.3
Nature Computational Science Pub Date: 2025-10-23 DOI: 10.1038/s43588-025-00893-8
Zihan Yu, Jingtao Ding, Yong Li
Abstract: Network dynamics are fundamental to analyzing the properties of high-dimensional complex systems and understanding their behavior. Despite the accumulation of observational data across many domains, mathematical models exist in only a few areas with clear underlying principles. Here we show that a neural symbolic regression approach can bridge this gap by automatically deriving formulas from data. Our method reduces searches on high-dimensional networks to equivalent one-dimensional systems and uses pretrained neural networks to guide accurate formula discovery. Applied to ten benchmark systems, it recovers the correct forms and parameters of the underlying dynamics. In two empirical natural systems, it corrects existing models of gene regulation and microbial communities, reducing prediction error by 59.98% and 55.94%, respectively. In epidemic transmission across human mobility networks of various scales, it discovers dynamics that exhibit the same power-law distribution of node correlations across scales and reveal country-level differences in intervention effects. These results demonstrate that machine-driven discovery of network dynamics can deepen our understanding of complex systems and advance complexity science.
Editor's summary: This study presents a neural symbolic regression approach that autonomously uncovers network dynamics from data. It is shown to refine existing models of gene regulation and ecology and to identify epidemic transmission patterns across spatial scales, yielding scientific insights.
Volume 6 (2), pages 156–168.
Citations: 0
Transferable neural wavefunctions for solids
IF 18.3
Nature Computational Science Pub Date: 2025-10-22 DOI: 10.1038/s43588-025-00872-z
L. Gerard, M. Scherbela, H. Sutterud, W. M. C. Foulkes, P. Grohs
Abstract: Deep-learning-based variational Monte Carlo has emerged as a highly accurate method for solving the many-electron Schrödinger equation. Despite favorable scaling with the number of electrons, $\mathcal{O}(n_{\mathrm{el}}^4)$, its practical value is limited by the high cost of optimizing the neural network weights for every system studied. Recent research has proposed optimizing a single neural network across multiple systems, reducing the cost per system. Here we extend this approach to solids, which require numerous calculations across different geometries, boundary conditions and supercell sizes. We demonstrate that optimizing a single ansatz across these variations significantly reduces the number of optimization steps. Furthermore, we successfully transfer a network trained on 2 × 2 × 2 supercells of LiH to 3 × 3 × 3 supercells, reducing the number of optimization steps required to simulate the large system by a factor of 50 compared with previous work.
Editor's summary: Investigating crystalline materials often requires calculations for many variations of a system, substantially increasing the computational burden. By training a transferable neural wavefunction across these variations, the cost can be reduced by approximately 50-fold for systems such as graphene and lithium hydride.
Open access PDF: https://www.nature.com/articles/s43588-025-00872-z.pdf
Volume 5 (12), pages 1147–1157.
Citations: 0
Down to one network for computing crystalline materials
IF 18.3
Nature Computational Science Pub Date: 2025-10-22 DOI: 10.1038/s43588-025-00877-8
Yubing Qian, Ji Chen
Abstract: A recent study proposes using a single neural network to model and compute a wide range of solid-state materials, demonstrating exceptional transferability and substantially reduced computational costs, a breakthrough that could accelerate the design of next-generation materials in applications from efficient solar cells to room-temperature superconductors.
Volume 5 (12), pages 1098–1099.
Citations: 0
Interpolating perturbations across contexts
IF 18.3
Nature Computational Science Pub Date: 2025-10-15 DOI: 10.1038/s43588-025-00830-9
Han Chen, Christina V. Theodoris
Abstract: The Large Perturbation Model (LPM) is a deep learning framework that predicts gene expression responses to chemical and genetic perturbations across diverse contexts. By modeling perturbation, readout and context jointly, LPM enables in silico hypothesis generation and drug repurposing.
Volume 5 (11), pages 992–993.
Citations: 0
In silico biological discovery with large perturbation models
IF 18.3
Nature Computational Science Pub Date: 2025-10-15 DOI: 10.1038/s43588-025-00870-1
Djordje Miladinovic, Tobias Höppe, Mathieu Chevalley, Andreas Georgiou, Lachlan Stuart, Arash Mehrjou, Marcus Bantscheff, Bernhard Schölkopf, Patrick Schwab
Abstract: Data generated in perturbation experiments link perturbations to the changes they elicit and therefore contain information relevant to numerous biological discovery tasks, from understanding the relationships between biological entities to developing therapeutics. However, these data encompass diverse perturbations and readouts, and the complex dependence of experimental outcomes on their biological context makes it challenging to integrate insights across experiments. Here we present the large perturbation model (LPM), a deep learning model that integrates multiple heterogeneous perturbation experiments by representing perturbation, readout and context as disentangled dimensions. LPM outperforms existing methods across multiple biological discovery tasks, including predicting post-perturbation transcriptomes of unseen experiments, identifying shared molecular mechanisms of action between chemical and genetic perturbations, and facilitating the inference of gene–gene interaction networks. LPM learns meaningful joint representations of perturbations, readouts and contexts, enables the study of biological relationships in silico and could considerably accelerate the derivation of insights from pooled perturbation experiments.
Editor's summary: A large perturbation model that integrates diverse laboratory experiments is presented to predict biological responses to chemical or genetic perturbations and to support various biological discovery tasks.
Open access PDF: https://www.nature.com/articles/s43588-025-00870-1.pdf
Volume 5 (11), pages 1029–1040.
Citations: 0
ECloudGen: leveraging electron clouds as a latent variable to scale up structure-based molecular design
IF 18.3
Nature Computational Science Pub Date: 2025-10-15 DOI: 10.1038/s43588-025-00886-7
Odin Zhang, Jieyu Jin, Zhenxing Wu, Jintu Zhang, Po Yuan, Yuntao Yu, Haitao Lin, Haiyang Zhong, Xujun Zhang, Chenqing Hua, Weibo Zhao, Zhengshuo Zhang, Kejun Ying, Yufei Huang, Huifeng Zhao, Yu Kang, Peichen Pan, Jike Wang, Dong Guo, Shuangjia Zheng, Chang-Yu Hsieh, Tingjun Hou
Abstract: Structure-based molecule generation represents a notable advancement in artificial intelligence-driven drug design. However, progress in this field is constrained by the scarcity of structural data on protein–ligand complexes. Here we propose a latent variable approach that bridges the gap between ligand-only data and protein–ligand complexes, enabling target-aware generative models to explore a broader chemical space and thereby enhancing the quality of molecular generation. Inspired by quantum molecular simulations, we introduce ECloudGen, a generative model that leverages electron clouds as meaningful latent variables. ECloudGen incorporates techniques such as latent diffusion models, Llama architectures and a contrastive learning task, which organizes the chemical space into a structured and highly interpretable latent representation. Benchmark studies demonstrate that ECloudGen outperforms state-of-the-art methods by generating more potent binders with superior physicochemical properties and by covering a broader chemical space. The incorporation of electron clouds as latent variables not only improves generative performance but also introduces model-level interpretability, as illustrated in our case studies.
Editor's summary: This study presents ECloudGen, which uses latent diffusion to generate electron clouds from protein pockets and decodes them into molecules. The adopted two-stage training expands the chemical space accessible to generative drug design.
Volume 5 (11), pages 1017–1028.
Citations: 0
How neural rhythms can guide word recognition
IF 18.3
Nature Computational Science Pub Date: 2025-10-10 DOI: 10.1038/s43588-025-00888-5
Sophie Slaats
Abstract: The recent computational model 'BRyBI' proposes that gamma, theta and delta neural oscillations can guide the process of word recognition by providing temporal windows for the integration of bottom-up input with top-down information.
Volume 5 (10), pages 848–849.
Citations: 0
Computational and ethical considerations for using large language models in psychotherapy
IF 18.3
Nature Computational Science Pub Date: 2025-10-10 DOI: 10.1038/s43588-025-00874-x
Renwen Zhang, Han Meng, Marion Neubronner, Yi-Chieh Lee
Abstract: Large language models (LLMs) hold great potential for augmenting psychotherapy by enhancing accessibility, personalization and engagement. However, a systematic understanding of the roles that LLMs can play in psychotherapy remains underexplored. In this Perspective, we propose a taxonomy of LLM roles in psychotherapy that delineates six specific roles across two key dimensions: artificial intelligence autonomy and emotional engagement. We discuss key computational and ethical challenges, such as emotion recognition, memory retention, privacy and emotional dependency, and offer recommendations to address these challenges.
Editor's summary: Large language models (LLMs) offer promising ways to enhance psychotherapy through greater accessibility, personalization and engagement. This Perspective introduces a typology that categorizes the roles of LLMs in psychotherapy along two critical dimensions: autonomy and emotional engagement.
Volume 5 (10), pages 854–862.
Citations: 0