Extracting vocal characteristics and calculating vocal synchrony using Praat and R: A tutorial

Impact Factor: 2.0 · JCR Q2 (Psychology, Mathematical) · CAS Region 3 (Psychology)
Désirée Schoenherr, Alisa Shugaley, Franziska Roller, Lukas A. Knitter, Bernhard Strauss, Uwe Altmann
Journal: Methodology: European Journal of Research Methods for The Behavioral and Social Sciences
DOI: 10.5964/meth.9375
Published: 2023-09-29 (Journal Article)
Citations: 0

Abstract


In clinical research, the dependence of results on the methods used is frequently discussed. In research on nonverbal synchrony, human ratings and automated methods do not lead to congruent results. Even when automated methods are used, the choice of method and parameter settings is important for obtaining congruent results. However, these are often insufficiently reported and do not meet the standards of transparency and reproducibility. This tutorial is aimed at researchers who are not familiar with the software Praat and R, and shows in detail how to extract acoustic features such as fundamental frequency or speech rate from video or audio files of conversations. Furthermore, it presents how vocal synchrony indices can be calculated from these characteristics to represent how well two interaction partners vocally adapt to each other. All scripts used, as well as a minimal example, can be found on the Open Science Framework and GitHub.
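The tutorial's own scripts use Praat and R and are available on the Open Science Framework and GitHub. As a rough illustration of the kind of computation involved (not the authors' code), the sketch below uses Python/NumPy to correlate two speakers' fundamental-frequency contours in sliding windows and takes the peak absolute correlation over small lags as a simple synchrony index. All function names, window sizes, and lag limits are illustrative assumptions.

```python
import numpy as np

def windowed_synchrony(f0_a, f0_b, win=50, step=25, max_lag=10):
    """Illustrative synchrony index: for each window, the peak absolute
    Pearson correlation between the two f0 contours across small lags,
    averaged over all windows."""
    peaks = []
    for start in range(0, len(f0_a) - win, step):
        a = f0_a[start:start + win]
        best = 0.0
        for lag in range(-max_lag, max_lag + 1):
            b = f0_b[start + lag:start + lag + win]
            # Skip lags that run off either end of the series.
            if start + lag < 0 or len(b) < win:
                continue
            r = np.corrcoef(a, b)[0, 1]
            if np.isfinite(r):
                best = max(best, abs(r))
        peaks.append(best)
    return float(np.mean(peaks)) if peaks else float("nan")

# Toy example: speaker B echoes speaker A's pitch contour with a short delay.
t = np.linspace(0, 10, 500)
f0_a = 120 + 20 * np.sin(2 * np.pi * 0.3 * t)
f0_b = np.roll(f0_a, 5) + np.random.default_rng(0).normal(0, 1, 500)
print(windowed_synchrony(f0_a, f0_b))  # close to 1: B tracks A closely
```

A windowed, lag-tolerant correlation of this sort is one common family of synchrony measures; the tutorial itself discusses how the choice of index and parameter settings affects the results, so the values here should not be read as the authors' recommended settings.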

Journal metrics: CiteScore 2.70 · Self-citation rate 6.50% · Annual articles: 16 · Review time: 36 weeks