{"title":"Boosting power for time-to-event GWAS analysis affected by case ascertainment.","authors":"","doi":"10.1038/s43588-025-00892-9","DOIUrl":"https://doi.org/10.1038/s43588-025-00892-9","url":null,"abstract":"","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":" ","pages":""},"PeriodicalIF":18.3,"publicationDate":"2025-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145214719","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Self-driving labs for biotechnology.","authors":"Evan Collins, Robert Langer, Daniel G Anderson","doi":"10.1038/s43588-025-00885-8","DOIUrl":"https://doi.org/10.1038/s43588-025-00885-8","url":null,"abstract":"","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":" ","pages":""},"PeriodicalIF":18.3,"publicationDate":"2025-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145208389","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Rhythm-based hierarchical predictive computations support acoustic-semantic transformation in speech processing.","authors":"Olesia Dogonasheva, Keith B Doelling, Denis Zakharov, Anne-Lise Giraud, Boris Gutkin","doi":"10.1038/s43588-025-00876-9","DOIUrl":"https://doi.org/10.1038/s43588-025-00876-9","url":null,"abstract":"<p><p>Unraveling how humans understand speech despite distortions has long intrigued researchers. A prominent hypothesis highlights the role of multiple endogenous brain rhythms in forming the computational context to predict speech structure and content. Yet how neural processes may implement rhythm-based context formation remains unclear. Here we propose the brain rhythm-based inference model (BRyBI) as a possible neural implementation of speech processing in the auditory cortex based on the interaction of endogenous brain rhythms in a predictive coding framework. BRyBI encodes key rhythmic processes for parsing spectro-temporal representations of the speech signal into phoneme sequences and to govern the formation of the phrasal context. BRyBI matches patterns of human performance in speech recognition tasks and explains contradictory experimental observations of rhythms during speech listening and their dependence on the informational aspect of speech (uncertainty and surprise). This work highlights the computational role of multiscale brain rhythms in predictive speech processing.</p>","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":" ","pages":""},"PeriodicalIF":18.3,"publicationDate":"2025-09-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145180835","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Predicting the regulatory impacts of noncoding variants on gene expression through epigenomic integration across tissues and single-cell landscapes.","authors":"Zhe Liu, Yihang Bao, An Gu, Weichen Song, Guan Ning Lin","doi":"10.1038/s43588-025-00878-7","DOIUrl":"https://doi.org/10.1038/s43588-025-00878-7","url":null,"abstract":"<p><p>Noncoding mutations play a critical role in regulating gene expression, yet predicting their effects across diverse tissues and cell types remains a challenge. Here we present EMO, a transformer-based model that integrates DNA sequence with chromatin accessibility data (assay for transposase-accessible chromatin with sequencing) to predict the regulatory impact of noncoding single nucleotide polymorphisms on gene expression. A key component of EMO is its ability to incorporate personalized functional genomic profiles, enabling individual-level and disease-contextual predictions and addressing critical limitations of current approaches. EMO generalizes across tissues and cell types by modeling both short- and long-range regulatory interactions and capturing dynamic gene expression changes associated with disease progression. In benchmark evaluations, the pretraining-based EMO framework outperformed existing models, with fine-tuning small-sample tissues enhancing the model's ability to fit target tissues. In single-cell contexts, EMO accurately identified cell-type-specific regulatory patterns and successfully captured the effects of disease-associated single nucleotide polymorphisms in conditions, linking genetic variation to disease-relevant pathways.</p>","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":" ","pages":""},"PeriodicalIF":18.3,"publicationDate":"2025-09-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145180816","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The rise of large language models","authors":"","doi":"10.1038/s43588-025-00890-x","DOIUrl":"10.1038/s43588-025-00890-x","url":null,"abstract":"This issue of Nature Computational Science features a Focus that highlights both the promises and perils of large language models, their emerging applications across diverse scientific domains, and the opportunities to overcome the challenges that lie ahead.","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":"5 9","pages":"689-690"},"PeriodicalIF":18.3,"publicationDate":"2025-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.nature.comhttps://www.nature.com/articles/s43588-025-00890-x.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145129541","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Neuromorphic principles in self-attention hardware for efficient transformers","authors":"Nathan Leroux, Jan Finkbeiner, Emre Neftci","doi":"10.1038/s43588-025-00868-9","DOIUrl":"10.1038/s43588-025-00868-9","url":null,"abstract":"Strong barriers remain between neuromorphic engineering and machine learning, especially with regard to recent large language models (LLMs) and transformers. This Comment makes the case that neuromorphic engineering may hold the keys to more efficient inference with transformer-like models.","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":"5 9","pages":"708-710"},"PeriodicalIF":18.3,"publicationDate":"2025-09-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145076704","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"On the compatibility of generative AI and generative linguistics","authors":"Eva Portelance, Masoud Jasbi","doi":"10.1038/s43588-025-00861-2","DOIUrl":"10.1038/s43588-025-00861-2","url":null,"abstract":"Chomsky’s generative linguistics has made substantial contributions to cognitive science and symbolic artificial intelligence. With the rise of neural language models, however, the compatibility between generative artificial intelligence and generative linguistics has come under debate. Here we outline three ways in which generative artificial intelligence aligns with and supports the core ideas of generative linguistics. In turn, generative linguistics can provide criteria to evaluate and improve neural language models as models of human language and cognition. This Perspective discusses that generative AI aligns with generative linguistics by showing that neural language models (NLMs) are formal generative models. Furthermore, generative linguistics offers a framework for evaluating and improving NLMs.","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":"5 9","pages":"745-753"},"PeriodicalIF":18.3,"publicationDate":"2025-09-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145076692","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Increasing alignment of large language models with language processing in the human brain.","authors":"Changjiang Gao, Zhengwu Ma, Jiajun Chen, Ping Li, Shujian Huang, Jixing Li","doi":"10.1038/s43588-025-00863-0","DOIUrl":"https://doi.org/10.1038/s43588-025-00863-0","url":null,"abstract":"<p><p>Transformer-based large language models (LLMs) have considerably advanced our understanding of how meaning is represented in the human brain; however, the validity of increasingly large LLMs is being questioned due to their extensive training data and their ability to access context thousands of words long. In this study we investigated whether instruction tuning-another core technique in recent LLMs that goes beyond mere scaling-can enhance models' ability to capture linguistic information in the human brain. We compared base and instruction-tuned LLMs of varying sizes against human behavioral and brain activity measured with eye-tracking and functional magnetic resonance imaging during naturalistic reading. We show that simply making LLMs larger leads to a closer match with the human brain than fine-tuning them with instructions. These finding have substantial implications for understanding the cognitive plausibility of LLMs and their role in studying naturalistic language comprehension.</p>","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":" ","pages":""},"PeriodicalIF":18.3,"publicationDate":"2025-09-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145076701","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Applying weighted Cox regression to genome-wide association studies of time-to-event phenotypes.","authors":"Ying Li, Yuzhuo Ma, He Xu, Yaoyao Sun, Min Zhu, Weihua Yue, Wei Zhou, Wenjian Bi","doi":"10.1038/s43588-025-00864-z","DOIUrl":"10.1038/s43588-025-00864-z","url":null,"abstract":"<p><p>With the growing availability of time-stamped electronic health records linked to genetic data in large biobanks and cohorts, time-to-event phenotypes are increasingly studied in genome-wide association studies. Although numerous Cox-regression-based methods have been proposed for a large-scale genome-wide association study, case ascertainment in time-to-event phenotypes has not been well addressed. Here we propose a computationally efficient Cox-based method, named WtCoxG, that accounts for case ascertainment by fitting a weighted Cox proportional hazards null model. A hybrid strategy incorporating saddlepoint approximation largely increases its accuracy when analyzing low-frequency and rare variants. Notably, by leveraging external minor allele frequencies from public resources, WtCoxG further boosts statistical power. Extensive simulation studies demonstrated that WtCoxG is more powerful than ADuLT and other Cox-based methods, while effectively controlling type I error rates. UK Biobank real data analysis validated that leveraging external minor allele frequencies contributes to the power gains of WtCoxG compared with ADuLT in the analysis of type 2 diabetes and coronary atherosclerosis.</p>","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":" ","pages":""},"PeriodicalIF":18.3,"publicationDate":"2025-09-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145056460","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Real-time raw signal genomic analysis using fully integrated memristor hardware.","authors":"Peiyi He, Shengbo Wang, Ruibin Mao, Mingrui Jiang, Sebastian Siegel, Giacomo Pedretti, Jim Ignowski, John Paul Strachan, Ruibang Luo, Can Li","doi":"10.1038/s43588-025-00867-w","DOIUrl":"10.1038/s43588-025-00867-w","url":null,"abstract":"<p><p>Advances in third-generation sequencing have enabled portable and real-time genomic sequencing, but real-time data processing remains a bottleneck, hampering on-site genomic analysis. These technologies generate noisy analog signals that traditionally require basecalling and read mapping, both demanding costly data movement on von Neumann hardware. Here, to overcome this, we present a memristor-based hardware-software codesign that processes raw sequencer signals directly in analog memory, combining the two separated steps. By exploiting intrinsic device noise for locality-sensitive hashing and implementing parallel approximate searches in content-addressable memory, we experimentally showcase on-site applications, including infectious disease detection and metagenomic classification on a fully integrated memristor chip. Our experimentally validated analysis confirms the effectiveness of this approach on real-world tasks, achieving a 97.15% F1 score in virus raw signal mapping, with 51× speed-up and 477× energy saving over an application-specific integrated circuit. These results demonstrate that in-memory computing hardware provides a viable solution for integration with portable sequencers, enabling real-time and on-site genomic analysis.</p>","PeriodicalId":74246,"journal":{"name":"Nature computational science","volume":" ","pages":""},"PeriodicalIF":18.3,"publicationDate":"2025-09-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145056487","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}