Nature Computational Science: Latest Articles

Advancing neural decoding with deep learning.
IF 12
Nature Computational Science Pub Date: 2025-07-11 DOI: 10.1038/s43588-025-00837-2
Ma Feilong, Yuqi Zhang
{"title":"Advancing neural decoding with deep learning.","authors":"Ma Feilong, Yuqi Zhang","doi":"10.1038/s43588-025-00837-2","DOIUrl":"https://doi.org/10.1038/s43588-025-00837-2","url":null,"abstract":"","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":" ","pages":""},"PeriodicalIF":12.0,"publicationDate":"2025-07-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144621446","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Privacy-preserving multicenter differential protein abundance analysis with FedProt.
IF 12
Nature Computational Science Pub Date: 2025-07-11 DOI: 10.1038/s43588-025-00832-7
Yuliya Burankova, Miriam Abele, Mohammad Bakhtiari, Christine von Toerne, Teresa K Barth, Lisa Schweizer, Pieter Giesbertz, Johannes R Schmidt, Stefan Kalkhof, Janina Müller-Deile, Peter A van Veelen, Yassene Mohammed, Elke Hammer, Lis Arend, Klaudia Adamowicz, Tanja Laske, Anne Hartebrodt, Tobias Frisch, Chen Meng, Julian Matschinske, Julian Späth, Richard Röttger, Veit Schwämmle, Stefanie M Hauck, Stefan F Lichtenthaler, Axel Imhof, Matthias Mann, Christina Ludwig, Bernhard Kuster, Jan Baumbach, Olga Zolotareva
{"title":"Privacy-preserving multicenter differential protein abundance analysis with FedProt.","authors":"Yuliya Burankova, Miriam Abele, Mohammad Bakhtiari, Christine von Toerne, Teresa K Barth, Lisa Schweizer, Pieter Giesbertz, Johannes R Schmidt, Stefan Kalkhof, Janina Müller-Deile, Peter A van Veelen, Yassene Mohammed, Elke Hammer, Lis Arend, Klaudia Adamowicz, Tanja Laske, Anne Hartebrodt, Tobias Frisch, Chen Meng, Julian Matschinske, Julian Späth, Richard Röttger, Veit Schwämmle, Stefanie M Hauck, Stefan F Lichtenthaler, Axel Imhof, Matthias Mann, Christina Ludwig, Bernhard Kuster, Jan Baumbach, Olga Zolotareva","doi":"10.1038/s43588-025-00832-7","DOIUrl":"https://doi.org/10.1038/s43588-025-00832-7","url":null,"abstract":"<p><p>Quantitative mass spectrometry has revolutionized proteomics by enabling simultaneous quantification of thousands of proteins. Pooling patient-derived data from multiple institutions enhances statistical power but raises serious privacy concerns. Here we introduce FedProt, the first privacy-preserving tool for collaborative differential protein abundance analysis of distributed data, which utilizes federated learning and additive secret sharing. In the absence of a multicenter patient-derived dataset for evaluation, we created two: one at five centers from E. coli experiments and one at three centers from human serum. Evaluations using these datasets confirm that FedProt achieves accuracy equivalent to the DEqMS method applied to pooled data, with completely negligible absolute differences no greater than 4 × 10<sup>-12</sup>. By contrast, -log<sub>10</sub>P computed by the most accurate meta-analysis methods diverged from the centralized analysis results by up to 25-26.</p>","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":" ","pages":""},"PeriodicalIF":12.0,"publicationDate":"2025-07-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144621448","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
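The FedProt abstract above names federated learning and additive secret sharing as its privacy primitives. The sketch below illustrates only the additive secret-sharing step on toy per-center protein statistics; the NumPy arrays, the three-center setup and the choice of statistic are assumptions for illustration and do not reproduce the FedProt pipeline.

```python
# A minimal, illustrative sketch of additive secret sharing for summing
# per-center protein statistics, the privacy primitive mentioned in the
# FedProt abstract. This is NOT the FedProt implementation; the NumPy
# arrays, the three-center setup and the toy statistic are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def make_additive_shares(values: np.ndarray, n_parties: int) -> list[np.ndarray]:
    """Split a local vector into n_parties random shares that sum back to it."""
    shares = [rng.normal(size=values.shape) for _ in range(n_parties - 1)]
    shares.append(values - sum(shares))  # the last share completes the sum
    return shares

# Toy setup: 3 centers, 5 proteins; each center holds a local sum of log-intensities.
local_sums = [rng.normal(loc=20, scale=2, size=5) for _ in range(3)]

# Each center secret-shares its local sum; any single party only ever sees
# one random-looking share per center, never the raw per-center values.
all_shares = [make_additive_shares(s, n_parties=3) for s in local_sums]

# Each party adds the shares it received; combining the partial sums yields
# the pooled statistic without revealing any individual center's data.
partial_sums = [sum(all_shares[c][p] for c in range(3)) for p in range(3)]
pooled_sum = sum(partial_sums)

assert np.allclose(pooled_sum, sum(local_sums))
print("Pooled per-protein sums:", np.round(pooled_sum, 3))
```

The key property shown here is that the pooled sum is exact while no party ever observes another center's raw values; downstream differential abundance statistics would then be computed on such securely aggregated quantities.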
Inter-individual and inter-site neural code conversion without shared stimuli.
IF 12
Nature Computational Science Pub Date: 2025-07-11 DOI: 10.1038/s43588-025-00826-5
Haibao Wang, Jun Kai Ho, Fan L Cheng, Shuntaro C Aoki, Yusuke Muraki, Misato Tanaka, Jong-Yun Park, Yukiyasu Kamitani
{"title":"Inter-individual and inter-site neural code conversion without shared stimuli.","authors":"Haibao Wang, Jun Kai Ho, Fan L Cheng, Shuntaro C Aoki, Yusuke Muraki, Misato Tanaka, Jong-Yun Park, Yukiyasu Kamitani","doi":"10.1038/s43588-025-00826-5","DOIUrl":"https://doi.org/10.1038/s43588-025-00826-5","url":null,"abstract":"<p><p>Inter-individual variability in fine-grained functional topographies poses challenges for scalable data analysis and modeling. Functional alignment techniques can help mitigate these individual differences but they typically require paired brain data with the same stimuli between individuals, which are often unavailable. Here we present a neural code conversion method that overcomes this constraint by optimizing conversion parameters based on the discrepancy between the stimulus contents represented by original and converted brain activity patterns. This approach, combined with hierarchical features of deep neural networks as latent content representations, achieves conversion accuracies that are comparable with methods using shared stimuli. The converted brain activity from a source subject can be accurately decoded using the target's pre-trained decoders, producing high-quality visual image reconstructions that rival within-individual decoding, even with data across different sites and limited training samples. Our approach offers a promising framework for scalable neural data analysis and modeling and a foundation for brain-to-brain communication.</p>","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":" ","pages":""},"PeriodicalIF":12.0,"publicationDate":"2025-07-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144621447","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
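The abstract above describes fitting conversion parameters by matching the stimulus content decoded from original and converted activity rather than requiring shared stimuli. Below is a deliberately simplified linear sketch of that idea; the toy dimensions, the random "pre-trained" decoders and the least-squares solution are assumptions, and the published method instead uses hierarchical deep-network features and its own optimization scheme.

```python
# Illustrative sketch of content-loss-based neural code conversion: fit a
# source-to-target converter so that the *decoded content* of the converted
# activity matches the content decoded from the source activity. The linear
# decoders and toy dimensions are assumptions, not the paper's method.
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_src_vox, n_tgt_vox, n_feat = 200, 50, 60, 10

# Pre-trained content decoders for each subject (toy stand-ins).
D_src = rng.normal(size=(n_src_vox, n_feat))
D_tgt = rng.normal(size=(n_tgt_vox, n_feat))

# Source-subject activity for arbitrary (unshared) stimuli.
X_src = rng.normal(size=(n_samples, n_src_vox))

# Content targets: what the source decoder says the stimuli contain.
content_src = X_src @ D_src

# Fit converter W to minimize || (X_src @ W) @ D_tgt - content_src ||^2.
# W @ D_tgt acts as a single linear map, so first solve for M = W @ D_tgt ...
M, *_ = np.linalg.lstsq(X_src, content_src, rcond=None)   # (n_src_vox, n_feat)
# ... then recover a consistent W via the pseudoinverse of D_tgt.
W = M @ np.linalg.pinv(D_tgt)                              # (n_src_vox, n_tgt_vox)

X_converted = X_src @ W
content_of_converted = X_converted @ D_tgt
err = np.linalg.norm(content_of_converted - content_src) / np.linalg.norm(content_src)
print(f"Relative content discrepancy after conversion: {err:.3f}")
```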
A large-scale replication of scenario-based experiments in psychology and management using large language models.
IF 12
Nature Computational Science Pub Date: 2025-07-09 DOI: 10.1038/s43588-025-00840-7
Ziyan Cui, Ning Li, Huaikang Zhou
{"title":"A large-scale replication of scenario-based experiments in psychology and management using large language models.","authors":"Ziyan Cui, Ning Li, Huaikang Zhou","doi":"10.1038/s43588-025-00840-7","DOIUrl":"https://doi.org/10.1038/s43588-025-00840-7","url":null,"abstract":"<p><p>We conducted a large-scale study replicating 156 psychological experiments from top social science journals using three state-of-the-art large language models (LLMs). Our results reveal that, while LLMs demonstrated high replication rates for main effects (73-81%) and moderate to strong success with interaction effects (46-63%), they consistently produced larger effect sizes than human studies. Notably, LLMs showed significantly lower replication rates for studies involving socially sensitive topics such as race, gender and ethics. When original studies reported null findings, LLMs produced significant results at remarkably high rates (68-83%); while this could reflect cleaner data with less noise, it also suggests potential risks of effect size overestimation. Our results demonstrate both the promises and the challenges of LLMs in psychological research: while LLMs are efficient tools for pilot testing and rapid hypothesis validation, enriching rather than replacing traditional human-participant studies, they require more nuanced interpretation and human validation for complex social phenomena and culturally sensitive research questions.</p>","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":" ","pages":""},"PeriodicalIF":12.0,"publicationDate":"2025-07-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144602414","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
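For readers wondering what such an LLM-based replication loop looks like in practice, the skeleton below simulates one two-condition scenario study and computes the statistics one would compare against the human original. The `query_llm` function is a hypothetical stub rather than a real API call, and the rating scale, sample size and effect-size formula are illustrative assumptions, not the authors' protocol.

```python
# Skeleton of a scenario-based replication: present each condition's scenario
# to an LLM-simulated participant, collect numeric responses, and compute the
# effect size to compare against the original human study. `query_llm` is a
# hypothetical stand-in; no real LLM API is called here.
import random
from statistics import mean, stdev
from scipy.stats import ttest_ind

def query_llm(scenario: str, seed: int) -> float:
    """Hypothetical stub returning a 1-7 rating an LLM might give."""
    random.seed(hash((scenario, seed)) % (2**32))
    base = 4.5 if "high responsibility" in scenario else 3.5
    return min(7.0, max(1.0, random.gauss(base, 1.0)))

conditions = {
    "control": "You read about a manager with low responsibility ...",
    "treatment": "You read about a manager with high responsibility ...",
}

n_simulated_participants = 100
responses = {
    name: [query_llm(text, seed=i) for i in range(n_simulated_participants)]
    for name, text in conditions.items()
}

# Significance test and Cohen's d for the simulated "study".
t_stat, p_value = ttest_ind(responses["treatment"], responses["control"])
pooled_sd = ((stdev(responses["treatment"]) ** 2 + stdev(responses["control"]) ** 2) / 2) ** 0.5
cohens_d = (mean(responses["treatment"]) - mean(responses["control"])) / pooled_sd
print(f"t = {t_stat:.2f}, p = {p_value:.4f}, Cohen's d = {cohens_d:.2f}")
# In an actual replication, this d would be compared with the human effect
# size for direction, significance and magnitude.
```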
Investigating the volume and diversity of data needed for generalizable antibody-antigen ΔΔG prediction.
IF 12
Nature Computational Science Pub Date: 2025-07-08 DOI: 10.1038/s43588-025-00823-8
Alissa M Hummer, Constantin Schneider, Lewis Chinery, Charlotte M Deane
{"title":"Investigating the volume and diversity of data needed for generalizable antibody-antigen ΔΔG prediction.","authors":"Alissa M Hummer, Constantin Schneider, Lewis Chinery, Charlotte M Deane","doi":"10.1038/s43588-025-00823-8","DOIUrl":"https://doi.org/10.1038/s43588-025-00823-8","url":null,"abstract":"<p><p>Antibody-antigen binding affinity lies at the heart of therapeutic antibody development: efficacy is guided by specific binding and control of affinity. Here we present Graphinity, an equivariant graph neural network architecture built directly from antibody-antigen structures that achieves test Pearson's correlations of up to 0.87 on experimental change in binding affinity (ΔΔG) prediction. However, our model, like previous methods, appears to be overtraining on the few hundred experimental data points available and performance is not robust to train-test cut-offs. To investigate the amount and type of data required to generalizably predict ΔΔG, we built synthetic datasets of nearly 1 million FoldX-generated and >20,000 Rosetta Flex ddG-generated ΔΔG values. Our results indicate that there are currently insufficient experimental data to accurately and robustly predict ΔΔG, with orders of magnitude more likely needed. Dataset size is not the only consideration; diversity is also an important factor for model predictiveness. These findings provide a lower bound on data requirements to inform future method development and data collection efforts.</p>","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":" ","pages":""},"PeriodicalIF":12.0,"publicationDate":"2025-07-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144593098","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
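The abstract's warning that performance is not robust to train-test cut-offs can be reproduced in miniature with the sketch below, which compares Pearson correlations under a random split versus a group-held-out split on toy data. The synthetic data, the RandomForest baseline and the grouping scheme are assumptions for illustration; Graphinity itself is an equivariant graph neural network trained on antibody-antigen structures.

```python
# Illustrative sketch of the evaluation concern: a model can look strong under
# random train/test splits yet degrade when related complexes are held out
# together. The toy data and model are assumptions, not the paper's setup.
import numpy as np
from scipy.stats import pearsonr
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split, GroupShuffleSplit

rng = np.random.default_rng(2)
n_groups, per_group, n_feat = 40, 25, 16

# Toy "complexes": samples within a group share a latent offset, mimicking
# mutations measured on closely related antibody-antigen complexes.
group_effects = rng.normal(scale=2.0, size=n_groups)
groups = np.repeat(np.arange(n_groups), per_group)
X = rng.normal(size=(n_groups * per_group, n_feat))
y = X[:, 0] + group_effects[groups] + rng.normal(scale=0.5, size=len(groups))

def evaluate(train_idx, test_idx):
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X[train_idx], y[train_idx])
    r, _ = pearsonr(y[test_idx], model.predict(X[test_idx]))
    return r

# Random split: related complexes leak across train and test.
tr, te = train_test_split(np.arange(len(y)), test_size=0.2, random_state=0)
print(f"Pearson r, random split:      {evaluate(tr, te):.2f}")

# Group-held-out split: whole complexes are excluded from training.
gss = GroupShuffleSplit(n_splits=1, test_size=0.2, random_state=0)
tr, te = next(gss.split(X, y, groups=groups))
print(f"Pearson r, group-based split: {evaluate(tr, te):.2f}")
```

The gap between the two correlations is the kind of train-test cut-off sensitivity the authors report for experimental ΔΔG data.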
Unbalanced gene-level batch effects in single-cell data.
IF 12
Nature Computational Science Pub Date: 2025-07-01 DOI: 10.1038/s43588-025-00829-2
{"title":"Unbalanced gene-level batch effects in single-cell data.","authors":"","doi":"10.1038/s43588-025-00829-2","DOIUrl":"https://doi.org/10.1038/s43588-025-00829-2","url":null,"abstract":"","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":" ","pages":""},"PeriodicalIF":12.0,"publicationDate":"2025-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144546425","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Predicting adverse drug reactions for combination pharmacotherapy with cross-scale associative learning via attention modules.
IF 12
Nature Computational Science Pub Date: 2025-06-30 DOI: 10.1038/s43588-025-00816-7
Boyang Li, Yifan Qi, Bo Li, Xiaoqiong Li
{"title":"Predicting adverse drug reactions for combination pharmacotherapy with cross-scale associative learning via attention modules.","authors":"Boyang Li, Yifan Qi, Bo Li, Xiaoqiong Li","doi":"10.1038/s43588-025-00816-7","DOIUrl":"https://doi.org/10.1038/s43588-025-00816-7","url":null,"abstract":"<p><p>The rapid emergence of combination pharmacotherapies offers substantial therapeutic advantages but also poses risks of adverse drug reactions (ADRs). The accurate prediction of ADRs with interpretable computational methods is crucial for clinical medication management, drug development and precision medicine. Machine-learning and recently developed deep learning architectures struggle to effectively elucidate the key protein-protein interactions underlying ADRs from an organ perspective and to explicitly represent ADR associations. Here we propose OrganADR, an associative learning-enhanced model to predict ADRs at the organ level for emerging combination pharmacotherapy. It incorporates ADR information at the organ level, drug information at the molecular level and network-based biomedical knowledge into integrated representations with multi-interpretable modules. Evaluation across 15 organs demonstrates that OrganADR not only achieves state-of-the-art performance but also delivers both interpretable insights at the organ level and network-based perspectives. Overall, OrganADR represents a useful tool for cross-scale biomedical information integration and could be used to prevent ADRs during clinical precision medicine.</p>","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":" ","pages":""},"PeriodicalIF":12.0,"publicationDate":"2025-06-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144531465","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
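As a rough intuition for the cross-scale attention idea mentioned in the abstract, the sketch below lets organ-level query vectors attend over molecular-level drug tokens with single-head scaled dot-product attention. The dimensions, random weights and single-head form are assumptions; this is not OrganADR's architecture.

```python
# A minimal NumPy sketch of cross-scale attention in the spirit the abstract
# describes: organ-level queries attend over molecular-level drug features to
# produce a fused representation. All shapes and weights are toy assumptions.
import numpy as np

rng = np.random.default_rng(3)
d_model = 32
n_organs, n_drug_tokens = 15, 8   # e.g. 15 organs; 8 molecular substructure tokens

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(queries, keys, values):
    """Single-head scaled dot-product attention: organs attend to drug tokens."""
    scores = queries @ keys.T / np.sqrt(keys.shape[-1])    # (n_organs, n_tokens)
    weights = softmax(scores, axis=-1)
    return weights @ values, weights                        # fused organ features

organ_embed = rng.normal(size=(n_organs, d_model))          # organ-level ADR context
drug_tokens = rng.normal(size=(n_drug_tokens, d_model))     # molecular-level features

W_q, W_k, W_v = (rng.normal(size=(d_model, d_model)) * 0.1 for _ in range(3))
fused, attn = cross_attention(organ_embed @ W_q, drug_tokens @ W_k, drug_tokens @ W_v)

print("Fused organ-drug representation:", fused.shape)      # (15, 32)
print("Attention weights per organ sum to 1:", np.allclose(attn.sum(axis=1), 1.0))
```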
Iterative variational learning of committor-consistent transition pathways using artificial neural networks.
IF 12
Nature Computational Science Pub Date: 2025-06-30 DOI: 10.1038/s43588-025-00828-3
Alberto Megías, Sergio Contreras Arredondo, Cheng Giuseppe Chen, Chenyu Tang, Benoît Roux, Christophe Chipot
{"title":"Iterative variational learning of committor-consistent transition pathways using artificial neural networks.","authors":"Alberto Megías, Sergio Contreras Arredondo, Cheng Giuseppe Chen, Chenyu Tang, Benoît Roux, Christophe Chipot","doi":"10.1038/s43588-025-00828-3","DOIUrl":"10.1038/s43588-025-00828-3","url":null,"abstract":"<p><p>Discovering transition pathways that are physically meaningful and committor-consistent has long been a challenge in studying rare events in complex systems. Here we introduce a neural network-based strategy that learns simultaneously the committor function and the associated committor-consistent string, offering an unprecedented view of transition processes. Built on the committor time-correlation function, this method operates across diverse dynamical regimes, and extends beyond traditional approaches relying on infinitesimal time-lag approximations, valid only in the overdamped diffusive limit. It also distinguishes multiple competing pathways, crucial for understanding complex biomolecular transformations. Demonstrated on benchmark potentials and biological systems such as peptide isomerization and protein-model folding, this approach robustly reproduces established dynamics, rate constants and transition mechanisms. Its adaptability to collective variables and resilience across neural architectures make it a powerful and versatile tool for enhanced-sampling simulations of rare events, enabling insights into the intricate landscapes of biomolecular systems.</p>","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":" ","pages":""},"PeriodicalIF":12.0,"publicationDate":"2025-06-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144531464","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
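To make the committor-learning idea concrete, the sketch below fits a committor-like sigmoid on an overdamped double-well trajectory by minimizing a time-lagged increment objective with soft boundary penalties. The potential, lag time, parameterization and loss are illustrative assumptions; the published method uses artificial neural networks, a committor time-correlation functional, and also learns the associated transition pathway.

```python
# Heavily simplified sketch of learning a committor-like function from
# trajectory data with a time-lagged objective plus boundary penalties.
# The 1D double well and sigmoid ansatz are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(4)

# Overdamped Langevin dynamics in the double-well potential U(x) = (x^2 - 1)^2.
def force(x):
    return -4.0 * x * (x**2 - 1.0)

dt, beta, n_steps = 1e-3, 2.0, 100_000
x = np.empty(n_steps)
x[0] = -1.0
noise = rng.normal(size=n_steps - 1)
for t in range(n_steps - 1):
    x[t + 1] = x[t] + force(x[t]) * dt + np.sqrt(2 * dt / beta) * noise[t]

# Committor ansatz q(x) = sigmoid(a*x + b); basins A: x < -0.8, B: x > 0.8.
def q(x, a, b):
    return 1.0 / (1.0 + np.exp(-(a * x + b)))

tau = 100                      # time lag in steps
x0, x1 = x[:-tau], x[tau:]
x_A, x_B = x[x < -0.8], x[x > 0.8]

def loss(params):
    a, b = params
    lagged = np.mean((q(x1, a, b) - q(x0, a, b)) ** 2)        # time-lagged increment
    boundary = np.mean(q(x_A, a, b) ** 2) + np.mean((q(x_B, a, b) - 1.0) ** 2)
    return lagged + 10.0 * boundary

# Crude grid search instead of gradient descent, to keep the sketch dependency-free.
grid = [(a, b) for a in np.linspace(0.5, 10, 30) for b in np.linspace(-2, 2, 11)]
a_best, b_best = min(grid, key=loss)
print(f"Fitted committor parameters: a = {a_best:.2f}, b = {b_best:.2f}")
print("q at the barrier top (x = 0) ≈", round(float(q(0.0, a_best, b_best)), 2))
```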
Quantifying batch effects for individual genes in single-cell data.
IF 12
Nature Computational Science Pub Date: 2025-06-27 DOI: 10.1038/s43588-025-00824-7
Yang Zhou, Qiongyu Sheng, Guohua Wang, Li Xu, Shuilin Jin
{"title":"Quantifying batch effects for individual genes in single-cell data.","authors":"Yang Zhou, Qiongyu Sheng, Guohua Wang, Li Xu, Shuilin Jin","doi":"10.1038/s43588-025-00824-7","DOIUrl":"10.1038/s43588-025-00824-7","url":null,"abstract":"<p><p>Batch effects substantially impede the comparison of multiple single-cell experiment batches. Existing methods for batch effect removal and quantification primarily emphasize cell alignment across batches, often overlooking gene-level batch effects. Here we introduce group technical effects (GTE)-a quantitative metric to assess batch effects on individual genes. Using GTE, we show that batch effects unevenly impact genes within the dataset. A portion of highly batch-sensitive genes (HBGs) differ between datasets and dominate the batch effects, whereas non-HBGs exhibit low batch effects. We demonstrate that as few as three HBGs are sufficient to introduce substantial batch effects. Our method also enables the assessment of cell-level batch effects, outperforming existing batch effect quantification methods. We also observe that biologically similar cell types undergo similar batch effects, informing the development of data integration strategies. The GTE method is versatile and applicable to various single-cell omics data types.</p>","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":" ","pages":""},"PeriodicalIF":12.0,"publicationDate":"2025-06-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144512906","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
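Since the abstract does not spell out the GTE formula here, the sketch below computes a GTE-like per-gene score, the ratio of between-batch to within-batch variance averaged over cell-type groups, on a toy single-cell table. The exact formula, the pandas layout and the simulated batch shift are assumptions and may differ from the paper's definition.

```python
# A hedged, minimal sketch of a per-gene batch-effect score in the spirit of
# the GTE idea: within each cell-type group, compare between-batch variance of
# a gene's expression to its within-batch variance, then average over groups.
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)

# Toy single-cell table: 3 batches x 2 cell types, 1200 cells, 4 genes.
n_cells = 1200
meta = pd.DataFrame({
    "batch": rng.choice(["b1", "b2", "b3"], size=n_cells),
    "cell_type": rng.choice(["T", "B"], size=n_cells),
})
expr = pd.DataFrame(rng.normal(size=(n_cells, 4)), columns=[f"g{i}" for i in range(4)])
# Make g0 highly batch-sensitive by adding a batch-specific shift.
expr["g0"] += meta["batch"].map({"b1": 0.0, "b2": 1.5, "b3": -1.5}).to_numpy()

def batch_effect_score(expr, meta):
    """Per-gene score: mean over cell-type groups of var(batch means) / within-batch var."""
    scores = {}
    for gene in expr.columns:
        ratios = []
        for _, grp in meta.join(expr[gene]).groupby("cell_type"):
            batch_means = grp.groupby("batch")[gene].mean()
            within_var = grp.groupby("batch")[gene].var().mean()
            ratios.append(batch_means.var() / within_var)
        scores[gene] = float(np.mean(ratios))
    return pd.Series(scores).sort_values(ascending=False)

print(batch_effect_score(expr, meta))
# g0 should stand out with a much larger score than the batch-insensitive genes.
```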
The impact of language models on the humanities and vice versa.
IF 12
Nature Computational Science Pub Date: 2025-06-25 DOI: 10.1038/s43588-025-00819-4
Ted Underwood
{"title":"The impact of language models on the humanities and vice versa.","authors":"Ted Underwood","doi":"10.1038/s43588-025-00819-4","DOIUrl":"https://doi.org/10.1038/s43588-025-00819-4","url":null,"abstract":"","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":" ","pages":""},"PeriodicalIF":12.0,"publicationDate":"2025-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144499776","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0