Claudio Runfola, Matteo Neri, Daniele Schön, Benjamin Morillon, Agnès Trébuchon, Giovanni Rabuffo, Pierpaolo Sorrentino, Viktor Jirsa
Network Neuroscience, 9(1), 146–158. Published 2025-03-05 (eCollection 2025/1/1). DOI: 10.1162/netn_a_00422. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11949541/pdf/
Complexity in speech and music listening via neural manifold flows.
Understanding the neural mechanisms underlying speech and music perception remains a multifaceted challenge. In this study, we investigated neural dynamics using human intracranial recordings. Employing a novel approach based on low-dimensional reduction techniques, the Manifold Density Flow (MDF), we quantified the complexity of brain dynamics during naturalistic speech and music listening and during resting state. Our results reveal greater complexity in the patterns of interdependence between brain regions during speech and music listening than during rest, suggesting that the cognitive demands of listening drive brain dynamics toward states not observed at rest. Moreover, speech listening exhibits higher complexity than music listening, highlighting nuanced differences in the cognitive demands of these two auditory domains. Additionally, we validated the MDF method on a toy model and compared its ability to capture task-induced changes in the complexity of brain dynamics with that of an established technique from the literature. Overall, our findings provide a new method for quantifying the complexity of brain activity by studying its temporal evolution on a low-dimensional manifold, yielding insights that are invisible to traditional methodologies in the context of speech and music perception.
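The abstract does not specify the MDF pipeline itself, but its two conceptual steps are clear: project high-dimensional recordings onto a low-dimensional manifold, then quantify the complexity of the trajectory's evolution on that manifold. The sketch below illustrates this idea only: it uses PCA (via SVD) for the reduction step and an occupancy-entropy proxy (`occupancy_entropy`, a hypothetical helper, not the paper's MDF measure) for the complexity step, applied to simulated multichannel data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated multichannel recording: 2000 time points x 64 channels
# (a random-walk surrogate, standing in for intracranial data).
X = rng.standard_normal((2000, 64)).cumsum(axis=0)

# Step 1: low-dimensional reduction via PCA, computed with an SVD
# of the mean-centered data. Y holds the 3-D manifold coordinates.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Y = Xc @ Vt[:3].T

# Step 2: a proxy complexity measure, NOT the paper's MDF: the entropy
# of the distribution of states visited on the low-dimensional manifold.
# Broader, more uniform exploration of the manifold yields higher entropy.
def occupancy_entropy(coords, bins=10):
    hist, _ = np.histogramdd(coords, bins=bins)
    p = hist.ravel() / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

print(occupancy_entropy(Y))
```

Under the paper's framing, one would compare this kind of statistic across conditions (speech, music, rest); the actual MDF quantity tracks density flow on the manifold rather than static occupancy, so this proxy only conveys the general approach.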