The effect of speech-gesture asynchrony on the neural coupling of interlocutors in interpreter-mediated communication
Xu Duan, Yi Zhang, Yuan Liang, Yingying Huang, Jie Zhang, Hao Yan
Social Cognitive and Affective Neuroscience (IF 3.9, JCR Q2, Neurosciences) · Published 2023-06-05 · DOI: 10.1093/scan/nsad027
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10243907/pdf/
Citations: 0
Abstract
In everyday face-to-face communication, speakers use speech to convey information and rely on co-occurring nonverbal cues, such as hand and facial gestures. The integration of speech and gesture facilitates both language comprehension and theory-of-mind ability. Consecutive dialogue interpreting (DI) allows dyads from different linguistic backgrounds to communicate with each other. Because the interpreter interprets only after the interlocutor has finished a turn, the interlocutor sees the gesture first and hears the target language a few seconds later, producing speech-gesture asynchrony. In this study, we used functional near-infrared spectroscopy (fNIRS) hyperscanning to investigate the influence of speech-gesture asynchrony on different levels of communication. Twenty groups were recruited for the DI experiments. The results showed that when the interpreter performed consecutive interpreting, time-lagged neural coupling at the temporoparietal junction decreased compared with simultaneous interpreting. This suggests that speech-gesture asynchrony significantly weakened the interlocutors' ability to understand each other's mental states, and the decrease in neural coupling was significantly correlated with the interpreter's interpreting skill. In addition, time-aligned neural coupling at the left inferior frontal gyrus increased, which suggests that, as compensation, the interlocutor's verbal working memory engagement increased over the course of communication.
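The abstract does not specify how "time-lagged neural coupling" was computed; fNIRS hyperscanning studies commonly use wavelet transform coherence or lagged correlations between the two participants' hemodynamic signals. As a minimal illustration of the lagged-coupling idea only, the sketch below computes Pearson correlation between two signals at a range of time lags; the function name, the toy signals, and the sign convention are all hypothetical and not taken from the paper:

```python
import numpy as np

def lagged_coupling(x, y, fs, max_lag_s):
    """Pearson correlation between x and y at each integer-sample lag.

    Returns (lags_in_seconds, correlations). Under this (hypothetical)
    convention, a peak at a negative lag means y trails x in time.
    """
    max_lag = int(max_lag_s * fs)
    lags = np.arange(-max_lag, max_lag + 1)
    r = np.empty(lags.size)
    for i, lag in enumerate(lags):
        if lag < 0:
            a, b = x[:lag], y[-lag:]
        elif lag > 0:
            a, b = x[lag:], y[:-lag]
        else:
            a, b = x, y
        r[i] = np.corrcoef(a, b)[0, 1]
    return lags / fs, r

# Toy check: a "listener" signal that is an exact 0.5 s delayed copy of
# the "speaker" signal should show peak coupling at lag -0.5 s.
rng = np.random.default_rng(0)
fs, delay = 10, 5                      # 10 Hz sampling, 5-sample delay
x = rng.standard_normal(1000)
y = np.concatenate([np.zeros(delay), x[:-delay]])
lags, r = lagged_coupling(x, y, fs=fs, max_lag_s=1.0)
peak_lag = lags[np.argmax(r)]
```

Scanning over lags is what distinguishes "time-lagged" from "time-aligned" coupling: the latter corresponds to the single lag-zero correlation in this sketch.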
About the journal:
SCAN will consider research that uses neuroimaging (fMRI, MRI, PET, EEG, MEG), neuropsychological patient studies, animal lesion studies, single-cell recording, pharmacological perturbation, and transcranial magnetic stimulation. SCAN will also consider submissions that examine the mediational role of neural processes in linking social phenomena to physiological, neuroendocrine, immunological, developmental, and genetic processes. Additionally, SCAN will publish papers that address issues of mental and physical health as they relate to social and affective processes (e.g., autism, anxiety disorders, depression, stress, effects of child rearing) as long as cognitive neuroscience methods are used.