Gesture: Latest Publications

The road to language through gesture
IF 1.0 | Q4 (Literature)
Gesture | Pub Date: 2023-11-27 | DOI: 10.1075/gest.22001.wil
Beatrijs Wille, Hilde Nyffels, O. Capirci
{"title":"The road to language through gesture","authors":"Beatrijs Wille, Hilde Nyffels, O. Capirci","doi":"10.1075/gest.22001.wil","DOIUrl":"https://doi.org/10.1075/gest.22001.wil","url":null,"abstract":"This study explores the role of gestures in Flemish Sign Language (VGT) development through a longitudinal observation of three deaf children’s early interactions. These children were followed over a period of one and a half year, at the ages of 6, 9, 12, 18 and 24 months. This research compares the communicative development of a deaf child growing up in a deaf family and two deaf children growing up in hearing families. The latter two children received early cochlear implants when they were respectively 10 and 7 months old. It is the first study describing the types and tokens of children’s gestures used in early dyadic interactions in Flanders (Belgium). The description of our observations shows three distinct developmental patterns in terms of the use of gestures and the production of combinations. The study supports the finding that children’s gestural output is subject to their parental language, and it further indicates an impact of age of cochlear implantation.","PeriodicalId":35125,"journal":{"name":"Gesture","volume":"80 1","pages":""},"PeriodicalIF":1.0,"publicationDate":"2023-11-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139228588","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Weakest link or strongest link?
Q4 (Literature)
Gesture | Pub Date: 2023-11-14 | DOI: 10.1075/gest.21021.don
Andrea Marquardt Donovan, Sarah A. Brown, Martha W. Alibali
{"title":"Weakest link or strongest link?","authors":"Andrea Marquardt Donovan, Sarah A. Brown, Martha W. Alibali","doi":"10.1075/gest.21021.don","DOIUrl":"https://doi.org/10.1075/gest.21021.don","url":null,"abstract":"Teachers often use gestures to connect representations of mathematical ideas. This research examined (1) whether\u0000 such linking gestures help students understand connections among representations and (2) whether sets of gestures that include\u0000 repeated handshapes and motions – termed gestural catchments – are particularly beneficial. Undergraduates viewed\u0000 one of four video lessons connecting two representations of multiplication. In the control lesson, the instructor\u0000 produced beat gestures that did not link the representations. In the link-only lesson, the instructor used\u0000 gestures to link representations, but the gestures did not form a catchment. In the consistent-catchment lesson,\u0000 the instructor highlighted corresponding elements of the two representations using identical gestures. In the\u0000 inconsistent-catchment lesson, the instructor highlighted non-corresponding elements of the two\u0000 representations using identical gestures. Participants who saw the lesson with the consistent catchment – which highlighted\u0000 similarities between representations – were most likely to understand the novel representation and to report learning from the\u0000 lesson.","PeriodicalId":35125,"journal":{"name":"Gesture","volume":"31 6","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-11-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134901077","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Automatic tool to annotate smile intensities in conversational face-to-face interactions
IF 1.0 | Q4 (Literature)
Gesture | Pub Date: 2023-09-01 | DOI: 10.1075/gest.22012.rau
S. Rauzy, Mary Amoyal
{"title":"Automatic tool to annotate smile intensities in conversational face-to-face interactions","authors":"S. Rauzy, Mary Amoyal","doi":"10.1075/gest.22012.rau","DOIUrl":"https://doi.org/10.1075/gest.22012.rau","url":null,"abstract":"\u0000 This study presents an automatic tool that allows to trace smile intensities along a video record of\u0000 conversational face-to-face interactions. The processed output proposes a sequence of adjusted time intervals labeled following\u0000 the Smiling Intensity Scale (Gironzetti, Attardo, and Pickering,\u0000 2016), a 5 levels scale varying from neutral facial expression to laughing smile. The underlying statistical model of this\u0000 tool is trained on a manually annotated corpus of conversations featuring spontaneous facial expressions. This model will be\u0000 detailed in this study. This tool can be used with benefits for annotating smile in interactions. The results are twofold. First,\u0000 the evaluation reveals an observed agreement of 68% between manual and automatic annotations. Second, manually correcting the\u0000 labels and interval boundaries of the automatic outputs reduces by a factor 10 the annotation time as compared with the time spent\u0000 for manually annotating smile intensities without pretreatment. Our annotation engine makes use of the state-of-the-art toolbox\u0000 OpenFace for tracking the face and for measuring the intensities of the facial Action Units of interest all along the video. The\u0000 documentation and the scripts of our tool, the SMAD software, are available to download at the HMAD open source project URL page\u0000 https://github.com/srauzy/HMAD (last access 31 July 2023).","PeriodicalId":35125,"journal":{"name":"Gesture","volume":" ","pages":""},"PeriodicalIF":1.0,"publicationDate":"2023-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"42991983","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Review of Galhano-Rodrigues, Galvão & Cruz-Santos (2019): Recent perspectives on gesture and multimodality
IF 1.0 | Q4 (Literature)
Gesture | Pub Date: 2023-08-31 | DOI: 10.1075/gest.20031.wan
Xi Wang, Fangfei Lv
{"title":"Review of Galhano-Rodrigues, Galvão & Cruz-Santos (2019): Recent perspectives on gesture and multimodality","authors":"Xi Wang, Fangfei Lv","doi":"10.1075/gest.20031.wan","DOIUrl":"https://doi.org/10.1075/gest.20031.wan","url":null,"abstract":"","PeriodicalId":35125,"journal":{"name":"Gesture","volume":" ","pages":""},"PeriodicalIF":1.0,"publicationDate":"2023-08-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"46986053","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Iconic gestures serve as primes for both auditory and visual word forms
IF 1.0 | Q4 (Literature)
Gesture | Pub Date: 2023-08-24 | DOI: 10.1075/gest.20019.san
Iván Sánchez-Borges, C. J. Álvarez
{"title":"Iconic gestures serve as primes for both auditory and visual word forms","authors":"Iván Sánchez-Borges, C. J. Álvarez","doi":"10.1075/gest.20019.san","DOIUrl":"https://doi.org/10.1075/gest.20019.san","url":null,"abstract":"\u0000Previous studies using cross-modal semantic priming have found that iconic gestures prime target words that are related with the gestures. In the present study, two analogous experiments examined this priming effect presenting prime and targets in high synchrony. In Experiment 1, participants performed an auditory primed lexical decision task where target words (e.g., “push”) and pseudowords had to be discriminated, primed by overlapping iconic gestures that could be semantically related (e.g., moving both hands forward) or not with the words. Experiment 2 was similar but with both gestures and words presented visually. The grammatical category of the words was also manipulated: they were nouns and verbs. It was found that words related to gestures were recognized faster and with fewer errors than the unrelated ones in both experiments and similarly for both types of words.","PeriodicalId":35125,"journal":{"name":"Gesture","volume":" ","pages":""},"PeriodicalIF":1.0,"publicationDate":"2023-08-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"45461372","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
The Raised Index Finger gesture in Hebrew multimodal interaction
IF 1.0 | Q4 (Literature)
Gesture | Pub Date: 2023-08-24 | DOI: 10.1075/gest.21001.inb
Anna Inbar
{"title":"The Raised Index Finger gesture in Hebrew multimodal interaction","authors":"Anna Inbar","doi":"10.1075/gest.21001.inb","DOIUrl":"https://doi.org/10.1075/gest.21001.inb","url":null,"abstract":"\u0000 The present study examines the roles that the gesture of the Raised Index Finger (RIF) plays in Hebrew multimodal\u0000 interaction. The study reveals that the RIF is associated with diverse linguistic phenomena and tends to appear in contexts in\u0000 which the speaker presents a message or speech act that violates the hearer’s expectations (based on either general knowledge or\u0000 prior discourse). The study suggests that the RIF serves the function of discourse deixis: Speakers point to\u0000 their message, creating a referent in the extralinguistic context to which they refer as an object of their stance, evaluating the\u0000 content of the utterance or speech act as unexpected by the hearer, and displaying epistemic authority. Setting up such a frame by\u0000 which the information is to be interpreted provides the basis for a swifter update of the common ground in situations of (assumed)\u0000 differences between the assumptions of the speaker and the hearer.","PeriodicalId":35125,"journal":{"name":"Gesture","volume":" ","pages":""},"PeriodicalIF":1.0,"publicationDate":"2023-08-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"43533089","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Co-speech gestures can interfere with learning foreign language words*
IF 1.0 | Q4 (Literature)
Gesture | Pub Date: 2023-08-21 | DOI: 10.1075/gest.18020.nic
E. Nicoladis, Paula Marentette, Candace Lam
{"title":"Co-speech gestures can interfere with learning foreign language words*","authors":"E. Nicoladis, Paula Marentette, Candace Lam","doi":"10.1075/gest.18020.nic","DOIUrl":"https://doi.org/10.1075/gest.18020.nic","url":null,"abstract":"\u0000 Co-speech gestures can help the learning, processing, and memory of words and concepts, particularly motoric and spatial\u0000 concepts such as verbs. The purpose of the present studies was to test whether co-speech gestures support the learning of words through gist\u0000 traces of movement. We asked English monolinguals to learn 40 Cantonese words (20 verbs and 20 nouns). In two studies, we found support for\u0000 the gist traces of congruent gestures being movement: participants who saw congruent gestures while hearing Cantonese words thought they had\u0000 seen more verbs than participants in any other condition. However, gist traces were unrelated to the accurate recall of either nouns or\u0000 verbs. In both studies, learning Cantonese words accompanied by congruent gestures tended to interfere with the learning of nouns (but not\u0000 verbs). In Study 2, we ruled out the possibility that this interference was due either to gestures conveying representational information in\u0000 another medium or to distraction from moving hands. We argue that gestures can interfere with learning foreign language words when they\u0000 represent the referents (e.g., show shape or size) because learners must interpret the hands as something other than hands.","PeriodicalId":35125,"journal":{"name":"Gesture","volume":" ","pages":""},"PeriodicalIF":1.0,"publicationDate":"2023-08-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"49444759","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Obituary
IF 1.0 | Q4 (Literature)
Gesture | Pub Date: 2023-08-21 | DOI: 10.1075/gest.00070.mul
C. Müller
{"title":"Obituary","authors":"C. Müller","doi":"10.1075/gest.00070.mul","DOIUrl":"https://doi.org/10.1075/gest.00070.mul","url":null,"abstract":"","PeriodicalId":35125,"journal":{"name":"Gesture","volume":" ","pages":""},"PeriodicalIF":1.0,"publicationDate":"2023-08-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"47010534","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
A recurring absence gesture in Northern Pastaza Kichwa
IF 1.0 | Q4 (Literature)
Gesture | Pub Date: 2023-07-25 | DOI: 10.1075/gest.21008.ric
Alexander Rice
{"title":"A recurring absence gesture in Northern Pastaza Kichwa","authors":"Alexander Rice","doi":"10.1075/gest.21008.ric","DOIUrl":"https://doi.org/10.1075/gest.21008.ric","url":null,"abstract":"\u0000 In this paper I posit the use of a spread-fingered hand torque gesture among speakers of Northern Pastaza Kichwa\u0000 (Quechuan, Ecuador) as a recurrent gesture conveying the semantic theme of absence. The data come from a documentary\u0000 video corpus collected by multiple researchers. The gesture prototypically takes the form of at least one pair of rapid rotations\u0000 of the palm (the torque). Fingers can be spread or slightly flexed towards the palm to varying degrees. This gesture is performed\u0000 in a consistent manner across speakers (and expressions) and co-occurs with a set of speech strings with related semantic\u0000 meanings. Taking a cognitive linguistic approach, I analyse the form, function, and contexts of this gesture and argue that, taken\u0000 together, it should be considered a recurrent gesture that indicates absence.","PeriodicalId":35125,"journal":{"name":"Gesture","volume":" ","pages":""},"PeriodicalIF":1.0,"publicationDate":"2023-07-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"43059345","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Coordinating and sharing gesture spaces in collaborative reasoning
IF 1.0 | Q4 (Literature)
Gesture | Pub Date: 2023-07-04 | DOI: 10.1075/gest.21005.wil
Robert F. Williams
{"title":"Coordinating and sharing gesture spaces in collaborative reasoning","authors":"Robert F. Williams","doi":"10.1075/gest.21005.wil","DOIUrl":"https://doi.org/10.1075/gest.21005.wil","url":null,"abstract":"\u0000 In collaborative reasoning about what causes the seasons, phases of the moon, and tides, participants (three to\u0000 four per group) introduce ideas by gesturing depictively in personal space. Other group members copy and vary these gestures,\u0000 imbuing their gesture spaces with similar conceptual properties. This leads at times to gestures being produced in shared space as\u0000 members elaborate and contest a developing group model. Gestures in the shared space mostly coincide with conversational turns;\u0000 more rarely, participants gesture collaboratively as they enact a joint conception. An emergent shared space is sustained by the\u0000 joint focus and actions of participants and may be repositioned, reoriented, or reshaped to meet changing representational demands\u0000 as the discourse develops. Shared space is used alongside personal spaces, and further research could shed light on how gesture\u0000 placement and other markers (such as eye gaze) contribute to the meaning or function of gestures in group activity.","PeriodicalId":35125,"journal":{"name":"Gesture","volume":" ","pages":""},"PeriodicalIF":1.0,"publicationDate":"2023-07-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"46155409","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0