Gesture: Latest Publications

Join ISGS
IF 1.0 | Q4 | Literature
Gesture · Pub Date: 2023-12-31 · DOI: 10.1075/gest.00078.isg
{"title":"Join ISGS","authors":"","doi":"10.1075/gest.00078.isg","DOIUrl":"https://doi.org/10.1075/gest.00078.isg","url":null,"abstract":"<div></div>","PeriodicalId":35125,"journal":{"name":"Gesture","volume":null,"pages":null},"PeriodicalIF":1.0,"publicationDate":"2023-12-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140571469","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Do teachers adapt their gestures in linguistically heterogeneous second language teaching to learners’ language proficiencies?
IF 1.0 | Q4 | Literature
Gesture · Pub Date: 2023-12-31 · DOI: 10.1075/gest.22023.sah
Moritz Sahlender, Inga ten Hagen
{"title":"Do teachers adapt their gestures in linguistically heterogeneous second language teaching to learners’ language proficiencies?","authors":"Moritz Sahlender, Inga ten Hagen","doi":"10.1075/gest.22023.sah","DOIUrl":"https://doi.org/10.1075/gest.22023.sah","url":null,"abstract":"Teachers’ use of gestures in the classroom can support the language acquisition of learners in learning a second language (Stam &amp; Tellier, 2022). Depending on learners’ language skills, different dimensions of gestures (e.g., deictic, metaphorical) are considered to facilitate successful language comprehension. This study investigates which gestures teachers use in German as a second language (GSL) classrooms and to what extent teachers adapt their gestures to learners’ language proficiency. Teacher gestures in 10 video-recorded integration and preparation classes were analyzed. Two coders reliably identified 4143 gestures. Results show that GSL teachers predominantly used deictic gestures, metaphorical gestures, and feedback by head movements. Moreover, between-learner variability in teachers’ use of deictic and metaphorical gestures was explained by teacher-perceived German language proficiency of learners. These results suggest that teachers systematically adapt some dimensions of gestures in GSL classes, thus emphasizing the importance of studying nonverbal interactions for a better understanding of language acquisition processes.","PeriodicalId":35125,"journal":{"name":"Gesture","volume":null,"pages":null},"PeriodicalIF":1.0,"publicationDate":"2023-12-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140627578","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Evidence of Zipfian distributions in three sign languages
IF 1.0 | Q4 | Literature
Gesture · Pub Date: 2023-12-31 · DOI: 10.1075/gest.23014.kim
Inbal Kimchi, Lucie Wolters, Rose Stamp, Inbal Arnon
{"title":"Evidence of Zipfian distributions in three sign languages","authors":"Inbal Kimchi, Lucie Wolters, Rose Stamp, Inbal Arnon","doi":"10.1075/gest.23014.kim","DOIUrl":"https://doi.org/10.1075/gest.23014.kim","url":null,"abstract":"One striking commonality between languages is their Zipfian distributions: A power-law distribution of word frequency. This distribution is found across languages, speech genres, and within different parts of speech. The recurrence of such distributions is thought to reflect cognitive and/or communicative pressures and to facilitate language learning. However, research on Zipfian distributions has mostly been limited to spoken languages. In this study, we ask whether Zipfian distributions are also found across signed languages, as expected if they reflect a universal property of human language. We find that sign frequencies and ranks in three sign language corpora (BSL, DGS and NGT) show a Zipfian relationship, similar to that found in spoken languages. These findings highlight the commonalities between spoken and signed languages, add to our understanding of the use of signs, and show the prevalence of Zipfian distributions across language modalities, supporting the idea that they facilitate language learning and communication.","PeriodicalId":35125,"journal":{"name":"Gesture","volume":null,"pages":null},"PeriodicalIF":1.0,"publicationDate":"2023-12-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140630425","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
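A Zipfian relationship means frequency falls off as a power of rank, f(r) ∝ r^(-a) with a near 1, so it can be checked by regressing log frequency on log rank. A minimal sketch of that check, using made-up counts rather than the paper's BSL/DGS/NGT corpus data:

```python
import numpy as np

def zipf_exponent(frequencies):
    """Estimate the exponent a in f(r) ~ C * r**(-a) by a
    least-squares fit of log frequency against log rank."""
    freqs = np.sort(np.asarray(frequencies, dtype=float))[::-1]
    ranks = np.arange(1, len(freqs) + 1)
    slope, _ = np.polyfit(np.log(ranks), np.log(freqs), 1)
    return -slope  # a Zipfian corpus yields a value close to 1

# Toy sign-frequency counts (illustrative only)
counts = [1200, 600, 400, 300, 240, 200, 171, 150, 133, 120]
print(f"estimated Zipf exponent: {zipf_exponent(counts):.2f}")
```

In practice one would also inspect the log-log plot rather than rely on the fitted slope alone, but the exponent is the headline statistic.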
Further information and weblinks
IF 1.0 | Q4 | Literature
Gesture · Pub Date: 2023-12-31 · DOI: 10.1075/gest.00081.fur
{"title":"Further information and weblinks","authors":"","doi":"10.1075/gest.00081.fur","DOIUrl":"https://doi.org/10.1075/gest.00081.fur","url":null,"abstract":"<div></div>","PeriodicalId":35125,"journal":{"name":"Gesture","volume":null,"pages":null},"PeriodicalIF":1.0,"publicationDate":"2023-12-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140627865","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Recent and forthcoming events
IF 1.0 | Q4 | Literature
Gesture · Pub Date: 2023-12-31 · DOI: 10.1075/gest.00083.eve
{"title":"Recent and forthcoming events","authors":"","doi":"10.1075/gest.00083.eve","DOIUrl":"https://doi.org/10.1075/gest.00083.eve","url":null,"abstract":"<div></div>","PeriodicalId":35125,"journal":{"name":"Gesture","volume":null,"pages":null},"PeriodicalIF":1.0,"publicationDate":"2023-12-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140628008","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
The road to language through gesture
IF 1.0 | Q4 | Literature
Gesture · Pub Date: 2023-11-27 · DOI: 10.1075/gest.22001.wil
Beatrijs Wille, Hilde Nyffels, O. Capirci
{"title":"The road to language through gesture","authors":"Beatrijs Wille, Hilde Nyffels, O. Capirci","doi":"10.1075/gest.22001.wil","DOIUrl":"https://doi.org/10.1075/gest.22001.wil","url":null,"abstract":"This study explores the role of gestures in Flemish Sign Language (VGT) development through a longitudinal observation of three deaf children’s early interactions. These children were followed over a period of one and a half year, at the ages of 6, 9, 12, 18 and 24 months. This research compares the communicative development of a deaf child growing up in a deaf family and two deaf children growing up in hearing families. The latter two children received early cochlear implants when they were respectively 10 and 7 months old. It is the first study describing the types and tokens of children’s gestures used in early dyadic interactions in Flanders (Belgium). The description of our observations shows three distinct developmental patterns in terms of the use of gestures and the production of combinations. The study supports the finding that children’s gestural output is subject to their parental language, and it further indicates an impact of age of cochlear implantation.","PeriodicalId":35125,"journal":{"name":"Gesture","volume":null,"pages":null},"PeriodicalIF":1.0,"publicationDate":"2023-11-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139228588","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Weakest link or strongest link?
Q4 | Literature
Gesture · Pub Date: 2023-11-14 · DOI: 10.1075/gest.21021.don
Andrea Marquardt Donovan, Sarah A. Brown, Martha W. Alibali
{"title":"Weakest link or strongest link?","authors":"Andrea Marquardt Donovan, Sarah A. Brown, Martha W. Alibali","doi":"10.1075/gest.21021.don","DOIUrl":"https://doi.org/10.1075/gest.21021.don","url":null,"abstract":"Teachers often use gestures to connect representations of mathematical ideas. This research examined (1) whether\u0000 such linking gestures help students understand connections among representations and (2) whether sets of gestures that include\u0000 repeated handshapes and motions – termed gestural catchments – are particularly beneficial. Undergraduates viewed\u0000 one of four video lessons connecting two representations of multiplication. In the control lesson, the instructor\u0000 produced beat gestures that did not link the representations. In the link-only lesson, the instructor used\u0000 gestures to link representations, but the gestures did not form a catchment. In the consistent-catchment lesson,\u0000 the instructor highlighted corresponding elements of the two representations using identical gestures. In the\u0000 inconsistent-catchment lesson, the instructor highlighted non-corresponding elements of the two\u0000 representations using identical gestures. Participants who saw the lesson with the consistent catchment – which highlighted\u0000 similarities between representations – were most likely to understand the novel representation and to report learning from the\u0000 lesson.","PeriodicalId":35125,"journal":{"name":"Gesture","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2023-11-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134901077","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Automatic tool to annotate smile intensities in conversational face-to-face interactions
IF 1.0 | Q4 | Literature
Gesture · Pub Date: 2023-09-01 · DOI: 10.1075/gest.22012.rau
S. Rauzy, Mary Amoyal
{"title":"Automatic tool to annotate smile intensities in conversational face-to-face interactions","authors":"S. Rauzy, Mary Amoyal","doi":"10.1075/gest.22012.rau","DOIUrl":"https://doi.org/10.1075/gest.22012.rau","url":null,"abstract":"\u0000 This study presents an automatic tool that allows to trace smile intensities along a video record of\u0000 conversational face-to-face interactions. The processed output proposes a sequence of adjusted time intervals labeled following\u0000 the Smiling Intensity Scale (Gironzetti, Attardo, and Pickering,\u0000 2016), a 5 levels scale varying from neutral facial expression to laughing smile. The underlying statistical model of this\u0000 tool is trained on a manually annotated corpus of conversations featuring spontaneous facial expressions. This model will be\u0000 detailed in this study. This tool can be used with benefits for annotating smile in interactions. The results are twofold. First,\u0000 the evaluation reveals an observed agreement of 68% between manual and automatic annotations. Second, manually correcting the\u0000 labels and interval boundaries of the automatic outputs reduces by a factor 10 the annotation time as compared with the time spent\u0000 for manually annotating smile intensities without pretreatment. Our annotation engine makes use of the state-of-the-art toolbox\u0000 OpenFace for tracking the face and for measuring the intensities of the facial Action Units of interest all along the video. The\u0000 documentation and the scripts of our tool, the SMAD software, are available to download at the HMAD open source project URL page\u0000 https://github.com/srauzy/HMAD (last access 31 July 2023).","PeriodicalId":35125,"journal":{"name":"Gesture","volume":null,"pages":null},"PeriodicalIF":1.0,"publicationDate":"2023-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"42991983","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
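The abstract names the ingredients without the model details: OpenFace emits per-frame Action Unit intensities (AU12, the lip-corner puller, is the smile-related unit), and those intensities must be mapped onto the five levels of the Smiling Intensity Scale. A deliberately naive sketch of that mapping, thresholding a single AU; the cut-offs are hypothetical and stand in for the trained statistical model SMAD actually uses:

```python
import pandas as pd

# Hypothetical cut-offs on AU12 intensity (OpenFace reports AU
# intensities on a 0-5 scale) separating five smile levels, from
# 0 (neutral) to 4 (laughing smile). SMAD instead learns a
# statistical model from manually annotated conversations.
LEVEL_CUTOFFS = [0.5, 1.5, 2.5, 3.5]

def smile_level(au12_intensity: float) -> int:
    """Map one AU12 intensity value to a smile level in 0..4."""
    return sum(au12_intensity > cut for cut in LEVEL_CUTOFFS)

def annotate_frames(csv_path: str) -> pd.DataFrame:
    """Label every frame of an OpenFace FeatureExtraction CSV."""
    df = pd.read_csv(csv_path)
    df.columns = df.columns.str.strip()  # OpenFace pads headers with spaces
    df["smile_level"] = df["AU12_r"].apply(smile_level)
    return df[["timestamp", "AU12_r", "smile_level"]]
```

A real pipeline would additionally smooth these per-frame labels into the adjusted time intervals the tool outputs.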
Review of Galhano-Rodrigues, Galvão & Cruz-Santos (2019): Recent perspectives on gesture and multimodality
IF 1.0 | Q4 | Literature
Gesture · Pub Date: 2023-08-31 · DOI: 10.1075/gest.20031.wan
Xi Wang, Fangfei Lv
{"title":"Review of Galhano-Rodrigues, Galvão & Cruz-Santos (2019): Recent perspectives on gesture and multimodality","authors":"Xi Wang, Fangfei Lv","doi":"10.1075/gest.20031.wan","DOIUrl":"https://doi.org/10.1075/gest.20031.wan","url":null,"abstract":"","PeriodicalId":35125,"journal":{"name":"Gesture","volume":null,"pages":null},"PeriodicalIF":1.0,"publicationDate":"2023-08-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"46986053","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Iconic gestures serve as primes for both auditory and visual word forms
IF 1.0 | Q4 | Literature
Gesture · Pub Date: 2023-08-24 · DOI: 10.1075/gest.20019.san
Iván Sánchez-Borges, C. J. Álvarez
{"title":"Iconic gestures serve as primes for both auditory and visual word forms","authors":"Iván Sánchez-Borges, C. J. Álvarez","doi":"10.1075/gest.20019.san","DOIUrl":"https://doi.org/10.1075/gest.20019.san","url":null,"abstract":"\u0000Previous studies using cross-modal semantic priming have found that iconic gestures prime target words that are related with the gestures. In the present study, two analogous experiments examined this priming effect presenting prime and targets in high synchrony. In Experiment 1, participants performed an auditory primed lexical decision task where target words (e.g., “push”) and pseudowords had to be discriminated, primed by overlapping iconic gestures that could be semantically related (e.g., moving both hands forward) or not with the words. Experiment 2 was similar but with both gestures and words presented visually. The grammatical category of the words was also manipulated: they were nouns and verbs. It was found that words related to gestures were recognized faster and with fewer errors than the unrelated ones in both experiments and similarly for both types of words.","PeriodicalId":35125,"journal":{"name":"Gesture","volume":null,"pages":null},"PeriodicalIF":1.0,"publicationDate":"2023-08-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"45461372","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
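The priming effect in a lexical decision task is conventionally the mean reaction-time difference between unrelated and related trials, computed over correct responses to word targets. A minimal sketch with fabricated illustrative RTs:

```python
import pandas as pd

# Hypothetical trial records: RTs in ms for correct "word" responses
trials = pd.DataFrame({
    "prime": ["related", "unrelated"] * 4,
    "rt_ms": [512, 561, 498, 547, 530, 575, 505, 552],
})

means = trials.groupby("prime")["rt_ms"].mean()
effect = means["unrelated"] - means["related"]
print(f"priming effect: {effect:.1f} ms faster for related primes")
```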