Title: Generalized design of sequence-ensemble-function relationships for intrinsically disordered proteins
Authors: Ryan K Krueger, Michael P Brenner, Krishna Shrinivas
Journal: Nature Computational Science (published 2025-10-06). DOI: 10.1038/s43588-025-00881-y
Abstract: The design of folded proteins has advanced substantially in recent years. However, many proteins and protein regions are intrinsically disordered and lack a stable fold; that is, the sequence of an intrinsically disordered protein (IDP) encodes a vast ensemble of spatial conformations that specify its biological function. This conformational plasticity and heterogeneity makes IDP design challenging. Here we introduce a computational framework for de novo design of IDPs through rational and efficient inversion of molecular simulations that approximate the underlying sequence-ensemble relationship. We highlight the versatility of this approach by designing IDPs with diverse properties and arbitrary sequence constraints. These include IDPs with target ensemble dimensions, loops and linkers, highly sensitive sensors of physicochemical stimuli, and binders to target disordered substrates with distinct conformational biases. Overall, our method provides a general framework for designing sequence-ensemble-function relationships of biological macromolecules.

Title: Predicting drug responses of unseen cell types through transfer learning with foundation models
Authors: Yixuan Wang, Xinyuan Liu, Yimin Fan, Binghui Xie, James Cheng, Kam Chung Wong, Peter Cheung, Irwin King, Yu Li
Journal: Nature Computational Science (published 2025-10-03). DOI: 10.1038/s43588-025-00887-6
Abstract: Drug repurposing through single-cell perturbation response prediction provides a cost-effective approach for drug development, but accurately predicting responses in unseen cell types that emerge during disease progression remains challenging. Existing methods struggle to achieve generalizable cell-type-specific predictions. To address these limitations, we introduce the cell-type-specific drug perturbatIon responses predictor (CRISP), a framework for predicting perturbation responses in previously unseen cell types at single-cell resolution. CRISP leverages foundation models and cell-type-specific learning strategies to enable effective transfer of information from control to perturbed states even with limited empirical data. Through systematic evaluation across increasingly challenging scenarios, from unseen cell types to cross-platform predictions, CRISP shows generalizability and performance improvements. We demonstrate CRISP's drug repurposing potential through zero-shot prediction from solid tumor data to sorafenib's therapeutic effects in chronic myeloid leukemia. The predicted anti-tumor mechanisms, including CXCR4 pathway inhibition, are supported by independent studies as an effective therapeutic strategy in chronic myeloid leukemia, aligning with past studies and clinical trials.

Title: Proteoform search from protein database with top-down mass spectra
Authors: Kunyi Li, Baozhen Shan, Lei Xin, Ming Li, Lusheng Wang
Journal: Nature Computational Science (published 2025-10-03). DOI: 10.1038/s43588-025-00880-z
Abstract: Here we propose a search algorithm for proteoform identification that computes the largest-size error-correction alignments between a protein mass graph and a spectrum mass graph. Our combined method uses a filtering algorithm to identify candidates and then applies a search algorithm to report the final results. Our exact search method is 3.9 to 9.0 times faster than popular methods such as TopMG and TopPIC. Our combined method can further speed up the running time of sTopMG without affecting the search accuracy. We develop a pipeline for generating simulated top-down spectra on the basis of input protein sequences with modifications. Experiments on simulated datasets show that our combined method has 95% accuracy, exceeding existing methods. Experiments on real annotated datasets show that our method has ≥97.1% accuracy using the deconvolution method FLASHDeconv.

{"title":"Boosting power for time-to-event GWAS analysis affected by case ascertainment.","authors":"","doi":"10.1038/s43588-025-00892-9","DOIUrl":"https://doi.org/10.1038/s43588-025-00892-9","url":null,"abstract":"","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":" ","pages":""},"PeriodicalIF":18.3,"publicationDate":"2025-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145214719","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Self-driving labs for biotechnology.","authors":"Evan Collins, Robert Langer, Daniel G Anderson","doi":"10.1038/s43588-025-00885-8","DOIUrl":"https://doi.org/10.1038/s43588-025-00885-8","url":null,"abstract":"","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":" ","pages":""},"PeriodicalIF":18.3,"publicationDate":"2025-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145208389","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Title: Predicting the regulatory impacts of noncoding variants on gene expression through epigenomic integration across tissues and single-cell landscapes
Authors: Zhe Liu, Yihang Bao, An Gu, Weichen Song, Guan Ning Lin
Journal: Nature Computational Science 5(10), 927-939 (published 2025-09-26). DOI: 10.1038/s43588-025-00878-7
Abstract: Noncoding mutations play a critical role in regulating gene expression, yet predicting their effects across diverse tissues and cell types remains a challenge. Here we present EMO, a transformer-based model that integrates DNA sequence with chromatin accessibility data (assay for transposase-accessible chromatin with sequencing) to predict the regulatory impact of noncoding single nucleotide polymorphisms on gene expression. A key component of EMO is its ability to incorporate personalized functional genomic profiles, enabling individual-level and disease-contextual predictions and addressing critical limitations of current approaches. EMO generalizes across tissues and cell types by modeling both short- and long-range regulatory interactions and capturing dynamic gene expression changes associated with disease progression. In benchmark evaluations, the pretraining-based EMO framework outperformed existing models, with fine-tuning on small-sample tissues enhancing the model's ability to fit target tissues. In single-cell contexts, EMO accurately identified cell-type-specific regulatory patterns and successfully captured the effects of disease-associated single nucleotide polymorphisms in disease conditions, linking genetic variation to disease-relevant pathways. EMO integrates DNA sequence and chromatin accessibility data to predict how noncoding variants regulate gene expression across tissues and single cells, enabling context-aware personalized insights into genetic effects for precision medicine.

Title: Rhythm-based hierarchical predictive computations support acoustic-semantic transformation in speech processing
Authors: Olesia Dogonasheva, Keith B. Doelling, Denis Zakharov, Anne-Lise Giraud, Boris Gutkin
Journal: Nature Computational Science 5(10), 915-926 (published 2025-09-26). DOI: 10.1038/s43588-025-00876-9
Abstract: Unraveling how humans understand speech despite distortions has long intrigued researchers. A prominent hypothesis highlights the role of multiple endogenous brain rhythms in forming the computational context to predict speech structure and content. Yet how neural processes may implement rhythm-based context formation remains unclear. Here we propose the brain rhythm-based inference model (BRyBI) as a possible neural implementation of speech processing in the auditory cortex based on the interaction of endogenous brain rhythms in a predictive coding framework. BRyBI encodes key rhythmic processes for parsing spectro-temporal representations of the speech signal into phoneme sequences and for governing the formation of phrasal context. BRyBI matches patterns of human performance in speech recognition tasks and explains contradictory experimental observations of rhythms during speech listening and their dependence on the informational aspects of speech (uncertainty and surprise). This work highlights the computational role of multiscale brain rhythms in predictive speech processing. The study shows how rhythmic neural activity enables robust speech processing by dynamically predicting context and elucidates mechanistic principles that allow robust speech parsing in the brain.

{"title":"The rise of large language models","authors":"","doi":"10.1038/s43588-025-00890-x","DOIUrl":"10.1038/s43588-025-00890-x","url":null,"abstract":"This issue of Nature Computational Science features a Focus that highlights both the promises and perils of large language models, their emerging applications across diverse scientific domains, and the opportunities to overcome the challenges that lie ahead.","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":"5 9","pages":"689-690"},"PeriodicalIF":18.3,"publicationDate":"2025-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.nature.comhttps://www.nature.com/articles/s43588-025-00890-x.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145129541","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Neuromorphic principles in self-attention hardware for efficient transformers","authors":"Nathan Leroux, Jan Finkbeiner, Emre Neftci","doi":"10.1038/s43588-025-00868-9","DOIUrl":"10.1038/s43588-025-00868-9","url":null,"abstract":"Strong barriers remain between neuromorphic engineering and machine learning, especially with regard to recent large language models (LLMs) and transformers. This Comment makes the case that neuromorphic engineering may hold the keys to more efficient inference with transformer-like models.","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":"5 9","pages":"708-710"},"PeriodicalIF":18.3,"publicationDate":"2025-09-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145076704","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"On the compatibility of generative AI and generative linguistics","authors":"Eva Portelance, Masoud Jasbi","doi":"10.1038/s43588-025-00861-2","DOIUrl":"10.1038/s43588-025-00861-2","url":null,"abstract":"Chomsky’s generative linguistics has made substantial contributions to cognitive science and symbolic artificial intelligence. With the rise of neural language models, however, the compatibility between generative artificial intelligence and generative linguistics has come under debate. Here we outline three ways in which generative artificial intelligence aligns with and supports the core ideas of generative linguistics. In turn, generative linguistics can provide criteria to evaluate and improve neural language models as models of human language and cognition. This Perspective discusses that generative AI aligns with generative linguistics by showing that neural language models (NLMs) are formal generative models. Furthermore, generative linguistics offers a framework for evaluating and improving NLMs.","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":"5 9","pages":"745-753"},"PeriodicalIF":18.3,"publicationDate":"2025-09-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145076692","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}