Jacob A Zavatone-Veth, Blake Bordelon, Cengiz Pehlevan
{"title":"学习连接改变神经表征到行为的汇总统计。","authors":"Jacob A Zavatone-Veth, Blake Bordelon, Cengiz Pehlevan","doi":"10.3389/fncir.2025.1618351","DOIUrl":null,"url":null,"abstract":"<p><p>How can we make sense of large-scale recordings of neural activity across learning? Theories of neural network learning with their origins in statistical physics offer a potential answer: for a given task, there are often a small set of summary statistics that are sufficient to predict performance as the network learns. Here, we review recent advances in how summary statistics can be used to build theoretical understanding of neural network learning. We then argue for how this perspective can inform the analysis of neural data, enabling better understanding of learning in biological and artificial neural networks.</p>","PeriodicalId":12498,"journal":{"name":"Frontiers in Neural Circuits","volume":"19 ","pages":"1618351"},"PeriodicalIF":3.0000,"publicationDate":"2025-08-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12426272/pdf/","citationCount":"0","resultStr":"{\"title\":\"Summary statistics of learning link changing neural representations to behavior.\",\"authors\":\"Jacob A Zavatone-Veth, Blake Bordelon, Cengiz Pehlevan\",\"doi\":\"10.3389/fncir.2025.1618351\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>How can we make sense of large-scale recordings of neural activity across learning? Theories of neural network learning with their origins in statistical physics offer a potential answer: for a given task, there are often a small set of summary statistics that are sufficient to predict performance as the network learns. Here, we review recent advances in how summary statistics can be used to build theoretical understanding of neural network learning. We then argue for how this perspective can inform the analysis of neural data, enabling better understanding of learning in biological and artificial neural networks.</p>\",\"PeriodicalId\":12498,\"journal\":{\"name\":\"Frontiers in Neural Circuits\",\"volume\":\"19 \",\"pages\":\"1618351\"},\"PeriodicalIF\":3.0000,\"publicationDate\":\"2025-08-29\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12426272/pdf/\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Frontiers in Neural Circuits\",\"FirstCategoryId\":\"3\",\"ListUrlMain\":\"https://doi.org/10.3389/fncir.2025.1618351\",\"RegionNum\":3,\"RegionCategory\":\"医学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2025/1/1 0:00:00\",\"PubModel\":\"eCollection\",\"JCR\":\"Q2\",\"JCRName\":\"NEUROSCIENCES\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Frontiers in Neural Circuits","FirstCategoryId":"3","ListUrlMain":"https://doi.org/10.3389/fncir.2025.1618351","RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2025/1/1 0:00:00","PubModel":"eCollection","JCR":"Q2","JCRName":"NEUROSCIENCES","Score":null,"Total":0}
Summary statistics of learning link changing neural representations to behavior.
How can we make sense of large-scale recordings of neural activity across learning? Theories of neural network learning with their origins in statistical physics offer a potential answer: for a given task, there are often a small set of summary statistics that are sufficient to predict performance as the network learns. Here, we review recent advances in how summary statistics can be used to build theoretical understanding of neural network learning. We then argue for how this perspective can inform the analysis of neural data, enabling better understanding of learning in biological and artificial neural networks.
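The kind of result the abstract alludes to can be illustrated with a minimal sketch (not taken from the review itself): in a classic teacher-student setup with a linear student, Gaussian inputs, and online SGD, the generalization error depends on the high-dimensional weights only through three summary statistics, the teacher-student overlap R = w·w*/N, the student norm Q = w·w/N, and the teacher norm T = w*·w*/N, via e_g = (Q - 2R + T)/2. The simulation below, with hypothetical parameter choices, checks that this prediction from the summary statistics matches a Monte Carlo estimate of the error as the student learns.

```python
# Minimal sketch: a few order parameters (Q, R, T) predict the generalization
# error of a linear student learning a linear teacher from Gaussian inputs.
# All settings (N, eta, steps) are illustrative assumptions, not from the paper.
import numpy as np

rng = np.random.default_rng(0)
N, eta, steps = 500, 0.1, 2001

w_star = rng.standard_normal(N)      # teacher weights
w = np.zeros(N)                      # student weights
T = w_star @ w_star / N              # teacher norm order parameter

def predicted_error(w):
    """Generalization error predicted from the summary statistics (Q, R, T)."""
    Q = w @ w / N
    R = w @ w_star / N
    return 0.5 * (Q - 2 * R + T)

def empirical_error(w, n_test=5000):
    """Monte Carlo estimate of the same error on held-out Gaussian inputs."""
    X = rng.standard_normal((n_test, N))
    y_student = X @ w / np.sqrt(N)
    y_teacher = X @ w_star / np.sqrt(N)
    return 0.5 * np.mean((y_student - y_teacher) ** 2)

for t in range(steps):
    x = rng.standard_normal(N)                # one fresh example per step (online SGD)
    err = (w @ x - w_star @ x) / np.sqrt(N)   # student output minus teacher output
    w -= eta * err * x / np.sqrt(N)           # gradient step on squared loss
    if t % 500 == 0:
        print(f"step {t:5d}  predicted e_g = {predicted_error(w):.4f}  "
              f"empirical e_g = {empirical_error(w):.4f}")
```

Running this, the error predicted from (Q, R, T) tracks the empirical error throughout training: the N-dimensional weight trajectory is summarized, for the purpose of predicting behavior, by a handful of numbers. This toy case stands in for the more general statistical-physics analyses the review surveys.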
About the journal:
Frontiers in Neural Circuits publishes rigorously peer-reviewed research on the emergent properties of neural circuits - the elementary modules of the brain. Specialty Chief Editors Takao K. Hensch and Edward Ruthazer, at Harvard University and McGill University respectively, are supported by an outstanding Editorial Board of international experts. This multidisciplinary open-access journal is at the forefront of disseminating and communicating scientific knowledge and impactful discoveries to researchers, academics and the public worldwide.
Frontiers in Neural Circuits launched in 2011 with great success and remains a "central watering hole" for research in neural circuits, serving the community worldwide to share data, ideas and inspiration. Articles revealing the anatomy, physiology, development or function of any neural circuitry in any species (from sponges to humans) are welcome. Our common thread seeks the computational strategies used by different circuits to link their structure with function (perceptual, motor, or internal), the general rules by which they operate, and how their particular designs lead to the emergence of complex properties and behaviors. Submissions focused on synaptic, cellular and connectivity principles in neural microcircuits using multidisciplinary approaches, especially newer molecular, developmental and genetic tools, are encouraged. Studies with an evolutionary perspective to better understand how circuit design and capabilities evolved to produce progressively more complex properties and behaviors are especially welcome. The journal is further interested in research revealing how plasticity shapes the structural and functional architecture of neural circuits.