John P Veillette, Jacob Rosen, Daniel Margoliash, Howard C Nusbaum
{"title":"大脑和声门的言语时间和运动控制中的反馈延迟问题。","authors":"John P Veillette, Jacob Rosen, Daniel Margoliash, Howard C Nusbaum","doi":"10.1523/JNEUROSCI.2294-24.2025","DOIUrl":null,"url":null,"abstract":"<p><p>To learn complex motor skills, an organism must be able to assign sensory feedback events to the actions that caused them. This matching problem would be simple if motor neuron output led to sensory feedback with a fixed, predictable lag. However, nonlinear dynamics in the brain and the body's periphery can decouple the timing of critical events from that of the motor output which caused them. During human speech production, for example, phonation from the glottis (a sound source for speech) begins suddenly when subglottal pressure and laryngeal tension cross a sharp threshold (i.e., a bifurcation). Only if the brain can predict the timing of these discrete peripheral events resulting from motor output, then, would it be possible to match sensory feedback to movements based on temporal coherence. We show that event onsets in the (male and female) human glottal waveform, measured using electroglottography, are reflected in the electroencephalogram during speech production, leading up to the time of the event itself. Conversely, glottal event times can be decoded from the electroencephalogram. After prolonged exposure to delayed auditory feedback, subjects recalibrate their behavioral threshold for detecting temporal auditory-motor mismatches and decoded event times decouple from actual movements. This suggests decoding performance is driven by plastic predictions of peripheral timing, providing a missing component for hindsight credit assignment, in which specific feedback events are associated with the neural activity that gave rise to movements. We discuss parallel findings from the birdsong system suggesting that results may generalize across vocal learning species.</p>","PeriodicalId":50114,"journal":{"name":"Journal of Neuroscience","volume":" ","pages":""},"PeriodicalIF":4.0000,"publicationDate":"2025-05-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12096056/pdf/","citationCount":"0","resultStr":"{\"title\":\"Timing of Speech in Brain and Glottis and the Feedback Delay Problem in Motor Control.\",\"authors\":\"John P Veillette, Jacob Rosen, Daniel Margoliash, Howard C Nusbaum\",\"doi\":\"10.1523/JNEUROSCI.2294-24.2025\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>To learn complex motor skills, an organism must be able to assign sensory feedback events to the actions that caused them. This matching problem would be simple if motor neuron output led to sensory feedback with a fixed, predictable lag. However, nonlinear dynamics in the brain and the body's periphery can decouple the timing of critical events from that of the motor output which caused them. During human speech production, for example, phonation from the glottis (a sound source for speech) begins suddenly when subglottal pressure and laryngeal tension cross a sharp threshold (i.e., a bifurcation). Only if the brain can predict the timing of these discrete peripheral events resulting from motor output, then, would it be possible to match sensory feedback to movements based on temporal coherence. We show that event onsets in the (male and female) human glottal waveform, measured using electroglottography, are reflected in the electroencephalogram during speech production, leading up to the time of the event itself. 
Conversely, glottal event times can be decoded from the electroencephalogram. After prolonged exposure to delayed auditory feedback, subjects recalibrate their behavioral threshold for detecting temporal auditory-motor mismatches and decoded event times decouple from actual movements. This suggests decoding performance is driven by plastic predictions of peripheral timing, providing a missing component for hindsight credit assignment, in which specific feedback events are associated with the neural activity that gave rise to movements. We discuss parallel findings from the birdsong system suggesting that results may generalize across vocal learning species.</p>\",\"PeriodicalId\":50114,\"journal\":{\"name\":\"Journal of Neuroscience\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":4.0000,\"publicationDate\":\"2025-05-21\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12096056/pdf/\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Neuroscience\",\"FirstCategoryId\":\"3\",\"ListUrlMain\":\"https://doi.org/10.1523/JNEUROSCI.2294-24.2025\",\"RegionNum\":2,\"RegionCategory\":\"医学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"NEUROSCIENCES\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Neuroscience","FirstCategoryId":"3","ListUrlMain":"https://doi.org/10.1523/JNEUROSCI.2294-24.2025","RegionNum":2,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"NEUROSCIENCES","Score":null,"Total":0}
Timing of Speech in Brain and Glottis and the Feedback Delay Problem in Motor Control.
To learn complex motor skills, an organism must be able to assign sensory feedback events to the actions that caused them. This matching problem would be simple if motor neuron output led to sensory feedback with a fixed, predictable lag. However, nonlinear dynamics in the brain and the body's periphery can decouple the timing of critical events from that of the motor output which caused them. During human speech production, for example, phonation from the glottis (a sound source for speech) begins suddenly when subglottal pressure and laryngeal tension cross a sharp threshold (i.e., a bifurcation). Only if the brain can predict the timing of these discrete peripheral events resulting from motor output, then, would it be possible to match sensory feedback to movements based on temporal coherence. We show that event onsets in the (male and female) human glottal waveform, measured using electroglottography, are reflected in the electroencephalogram during speech production, leading up to the time of the event itself. Conversely, glottal event times can be decoded from the electroencephalogram. After prolonged exposure to delayed auditory feedback, subjects recalibrate their behavioral threshold for detecting temporal auditory-motor mismatches and decoded event times decouple from actual movements. This suggests decoding performance is driven by plastic predictions of peripheral timing, providing a missing component for hindsight credit assignment, in which specific feedback events are associated with the neural activity that gave rise to movements. We discuss parallel findings from the birdsong system suggesting that results may generalize across vocal learning species.
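The abstract describes discrete event onsets in the electroglottographic (EGG) waveform, with phonation beginning abruptly once a physiological threshold is crossed. The paper's actual analysis pipeline is not reproduced here, but as a rough illustration of that idea only, the minimal sketch below (hypothetical function name and parameters, assuming a sampled EGG trace that starts with a silent baseline) flags candidate phonation onsets as upward threshold crossings of the signal's short-time envelope:

```python
import numpy as np

def detect_phonation_onsets(egg, fs, win_ms=10.0, baseline_s=0.5, z_thresh=5.0):
    """Flag candidate phonation onsets in an electroglottography (EGG) trace.

    The short-time RMS envelope is z-scored against an assumed silent baseline
    at the start of the recording; an onset is marked wherever the envelope
    crosses `z_thresh` from below, i.e. where vocal-fold contact activity
    appears abruptly. Illustrative only; not the authors' method.
    """
    win = int(fs * win_ms / 1000.0)
    kernel = np.ones(win) / win
    envelope = np.sqrt(np.convolve(egg ** 2, kernel, mode="same"))  # short-time RMS

    base = envelope[: int(baseline_s * fs)]          # assumed silent baseline
    z = (envelope - base.mean()) / base.std()

    above = z >= z_thresh
    onsets = np.flatnonzero(~above[:-1] & above[1:]) + 1   # upward crossings
    return onsets / fs                                      # onset times in seconds

# Synthetic example: 1 s of near-silence followed by abrupt 120 Hz "phonation".
rng = np.random.default_rng(0)
fs = 2000.0
t = np.arange(0, 2.0, 1 / fs)
egg = np.where(t > 1.0, np.sin(2 * np.pi * 120 * t), 0.0) + 0.01 * rng.standard_normal(t.size)
print(detect_phonation_onsets(egg, fs))   # expect a single onset near t = 1.0 s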
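```

Such event times are the kind of discrete peripheral landmarks the study relates to the electroencephalogram; the decoding and delayed-feedback analyses themselves are described in the full article.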
Journal introduction:
JNeurosci (ISSN 0270-6474) is an official journal of the Society for Neuroscience. It is published weekly by the Society, fifty weeks a year, in one volume per year. JNeurosci publishes papers on a broad range of topics of general interest to those working on the nervous system. Authors now have an Open Choice option for their published articles.