Proceedings of the 3rd International Symposium on Movement and Computing: Latest Publications

Quantitative Evaluation of Percussive Gestures by Ranking Trainees versus Teacher
Proceedings of the 3rd International Symposium on Movement and Computing Pub Date : 2016-07-05 DOI: 10.1145/2948910.2948934
Lei Chen, S. Gibet, P. Marteau, F. Marandola, M. Wanderley
In this paper we characterize timpani gestures by temporal kinematic features, which contain most of the information responsible for the sound-producing actions. In order to evaluate the feature sets, a classification approach is conducted over three main attack categories (legato, accent, and vertical accent) and sub-categories (dynamics, striking position). Two studies are carried out: intra-subject and inter-subject classification. Results are presented as a quantitative ranking of students, using the professional's gestures as the training set and the students' gestures as the test set.
Citations: 6
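The teacher-as-training-set ranking scheme described above can be sketched with a toy nearest-centroid classifier over hand-picked kinematic features; the feature set, the classifier, and all names below are illustrative assumptions, not the authors' method:

```python
# Toy sketch: rank trainees by how well a classifier trained on the
# teacher's gestures recognises each trainee's attack type. Features
# and classifier are illustrative assumptions, not the paper's method.

def kinematic_features(trajectory, dt=0.01):
    """Peak velocity and peak acceleration of a 1-D stick-tip trajectory."""
    vel = [(b - a) / dt for a, b in zip(trajectory, trajectory[1:])]
    acc = [(b - a) / dt for a, b in zip(vel, vel[1:])]
    return (max(abs(v) for v in vel), max(abs(a) for a in acc))

def train_centroids(labelled_gestures):
    """Mean feature vector per attack category, from the teacher's data."""
    sums, counts = {}, {}
    for label, traj in labelled_gestures:
        f = kinematic_features(traj)
        s = sums.setdefault(label, [0.0, 0.0])
        s[0] += f[0]; s[1] += f[1]
        counts[label] = counts.get(label, 0) + 1
    return {l: (s[0] / counts[l], s[1] / counts[l]) for l, s in sums.items()}

def classify(centroids, traj):
    """Assign the attack category whose centroid is nearest in feature space."""
    f = kinematic_features(traj)
    return min(centroids, key=lambda l: (centroids[l][0] - f[0]) ** 2
                                        + (centroids[l][1] - f[1]) ** 2)

def rank_trainees(centroids, trainees):
    """Sort trainees by accuracy of the teacher-trained classifier."""
    scores = {}
    for name, gestures in trainees.items():
        hits = sum(classify(centroids, t) == lbl for lbl, t in gestures)
        scores[name] = hits / len(gestures)
    return sorted(scores.items(), key=lambda kv: -kv[1])
```

The sketch only mirrors the evaluation logic (train on the expert, score each trainee on classification accuracy); the paper's feature sets are richer temporal descriptors covering dynamics and striking position as well.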
Motion, Captured: an Open Repository for Comparative Movement Studies
Proceedings of the 3rd International Symposium on Movement and Computing Pub Date : 2016-07-05 DOI: 10.1145/2948910.2948938
Varsha Iyengar, Grisha Coleman, David Tinapple, P. Turaga
This paper begins to describe a new kind of database, one that explores a diverse range of movement in the field of dance through the capture of different bodies and different backgrounds, or what we term movement vernaculars. We re-purpose Ivan Illich's concept of 'vernacular work' [11] to refer to those everyday forms of dance and organized movement that are informal, refractory (resistant to formal analysis), yet socially reproduced and derived from a commons. The project investigates the notion of vernaculars in movement that is intentional and aesthetic through the development of a computational approach that highlights both similarities and differences, thereby revealing the specificities of each individual mover. This paper presents an example of how this movement database is used as a research tool, and how the fruits of that research can be added back to the database, adding a novel layer of annotation and further enriching the collection. Future researchers can then benefit from this layer, further refining and building upon these techniques. The creation of a robust, open-source movement lexicon repository will allow for observation, speculation, and contextualization, along with the provision of clean and complex data sets for new forms of creative expression.
Citations: 1
Error proving and sensorimotor feedback for singing voice
Proceedings of the 3rd International Symposium on Movement and Computing Pub Date : 2016-07-05 DOI: 10.1145/2948910.2948952
K. Kokkinidis, A. Stergiaki, A. Tsagaris
This paper presents a sensorimotor feedback system for Byzantine music. The main goal of this research is to detect pre-defined errors in a singing performance. After error detection, the system consults a pre-defined error dictionary in order to generate feedback, through which the trainee chanter can correct his or her performance. The system is trained on MFCC features extracted from a corpus of hymns performed by experts; recognition likewise uses MFCC features, extracted from the student's performance. The system can evaluate in real time the pitch distance, as well as the difference in duration, between the expert's and the student's performances. It can also evaluate the distance between two sequential musical gestures, from which the tempo of the hymn can be estimated. Once the pitch contours of the two performances have been compared, any identified errors trigger a feedback action to the student, drawn from the error dictionary.
Citations: 1
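The comparison-and-feedback loop the abstract outlines can be sketched as follows; the thresholds, error codes, and feedback messages are invented for illustration, and the real system operates on MFCC features in real time rather than on pre-segmented notes:

```python
import math

# Minimal sketch of the expert-vs-student comparison loop described in
# the abstract. Thresholds, error codes, and messages are invented; the
# actual system works on MFCC features in real time.

ERROR_DICTIONARY = {
    "pitch_sharp": "Pitch is too high: relax and aim slightly lower.",
    "pitch_flat":  "Pitch is too low: support the tone and aim higher.",
    "too_short":   "Note cut off early: hold it for its full duration.",
    "too_long":    "Note held too long: release it sooner.",
}

def cents(student_hz, expert_hz):
    """Pitch distance in cents (1200 cents = one octave)."""
    return 1200.0 * math.log2(student_hz / expert_hz)

def compare_notes(expert, student, pitch_tol=50.0, dur_tol=0.2):
    """Compare aligned (pitch_hz, duration_s) note lists; return feedback.

    Each entry in the result is (note_index, message from the dictionary).
    """
    feedback = []
    for i, ((ep, ed), (sp, sd)) in enumerate(zip(expert, student)):
        dc = cents(sp, ep)
        if dc > pitch_tol:
            feedback.append((i, ERROR_DICTIONARY["pitch_sharp"]))
        elif dc < -pitch_tol:
            feedback.append((i, ERROR_DICTIONARY["pitch_flat"]))
        ratio = sd / ed
        if ratio < 1.0 - dur_tol:
            feedback.append((i, ERROR_DICTIONARY["too_short"]))
        elif ratio > 1.0 + dur_tol:
            feedback.append((i, ERROR_DICTIONARY["too_long"]))
    return feedback
```

For example, a student singing B♭3 (233.08 Hz) where the expert sang B3 (246.94 Hz) is about 100 cents flat, exceeding the 50-cent tolerance, so the "pitch_flat" message is returned for that note.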
A 3D application to familiarize children with sign language and assess the potential of avatars and motion capture for learning movement
Proceedings of the 3rd International Symposium on Movement and Computing Pub Date : 2016-07-05 DOI: 10.1145/2948910.2948917
Rémi Brun, Ahmed Turki, A. Laville
In order to explore the potential of 3D tools for teaching and learning movements, we are developing a prototype 3D application that gives children a first contact with sign language through an interactive 3D signing avatar. It is not yet a finalized teaching tool, but rather the first step in exploring the possibilities offered by a 3D character moving in a 3D environment to "connect" a child with a world of movement that he or she has to memorize and become familiar with. Our approach relies heavily on a process we have implemented to record movements of the body, the face, the eyes, and the fingers, and to transpose them onto an avatar with maximal accuracy and fidelity [3,5].
Citations: 4
Moving Music: Exploring Movement-to-Sound Relationships
Proceedings of the 3rd International Symposium on Movement and Computing Pub Date : 2016-07-05 DOI: 10.1145/2948910.2948940
Jan C. Schacher
Relating movement to sound in an artistic context demands an understanding of the foundations of perception in both domains, and the elaboration of techniques that effectively create a link, by technical means, from body to sound. This article explores the strategies necessary in interactive dance work to successfully link movement to sound processes. This is done by reducing the dimensions of the observed elements to the fundamentals, while at the same time identifying target dimensions that allow the recreation of an equivalent expression. A categorisation helps elucidate the elements and characteristics that can be applied, and examines how they are perceived by the audience. The asymmetry that arises when using technical links to generate sound in interactive dance poses the question of dependency and exposes the limits and challenges of using technology in this performing-arts practice.
Citations: 3
Neural Narratives: Dance with Virtual Body Extensions
Proceedings of the 3rd International Symposium on Movement and Computing Pub Date : 2016-07-05 DOI: 10.1145/2948910.2948925
D. Bisig, Pablo Palacio
From the context of two dance productions, the Neural Narratives project has started to emerge as a comprehensive exploration of simulation-based approaches that enable the creation of artificial body extensions for dancers. The simulation, visualisation and sonification of these body extensions allow a dancer to alter and enlarge his or her bodily presence and movement possibilities. The main focus of this publication lies in the contextualisation and discussion of a number of questions that have arisen during the realisation of the dance productions. These questions relate to concepts of embodiment, agency, and creativity and their possible implications for the realisation of interactive systems for dance. We try to address these questions by drawing from ideas that originate from a wide range of fields, including dance and technology, cognitive science, systems science, and medical engineering. By connecting our own practical activities to a broad disciplinary context, we hope to contribute to a discourse concerning future directions for research and creation that deepen the integration of technology and dance.
Citations: 14
Towards a Multimodal Repository of Expressive Movement Qualities in Dance
Proceedings of the 3rd International Symposium on Movement and Computing Pub Date : 2016-07-05 DOI: 10.1145/2948910.2948931
Stefano Piana, P. Coletta, Simone Ghisio, Radoslaw Niewiadomski, M. Mancini, R. Sagoleo, G. Volpe, A. Camurri
In this paper, we present a new multimodal repository for the analysis of expressive movement qualities in dance. First, we discuss the guidelines and methodology that we applied to create this repository. Next, the technical setup of the recordings and the platform for capturing the synchronized audio-visual, physiological, and motion capture data are presented. The initial content of the repository consists of about 90 minutes of short dance performances, movement sequences, and improvisations performed by four dancers, displaying three expressive qualities: Fluidity, Impulsivity, and Rigidity.
Citations: 12
Automatic Affect Classification of Human Motion Capture Sequences in the Valence-Arousal Model
Proceedings of the 3rd International Symposium on Movement and Computing Pub Date : 2016-07-05 DOI: 10.1145/2948910.2948936
William Li, Philippe Pasquier
The problem that we are addressing is that of affect classification: analysing the emotions conveyed by input data. There are two parts to this study. In the first part, to achieve better recognition and classification of human movement, we verify that the labels on existing motion capture (MoCap) data are consistent with human perception to a reasonable extent. Specifically, we examine movement in terms of valence and arousal (emotion and energy). In the second part, we present machine learning techniques for affect classification of human motion capture sequences using both categorical and continuous approaches. For the categorical approach, we evaluate the performance of Hidden Markov Models (HMMs). For the continuous approach, we use stepwise linear regression models with the responses of participants from the first part as the ground-truth labels for each movement.
Citations: 7
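The categorical branch (score the sequence under one model per affect class and pick the most likely) can be sketched with a discrete-observation forward algorithm; the two-state models and the quantised movement-energy alphabet below are toy assumptions, not the trained models from the study:

```python
# Sketch of per-class HMM affect recognition: score a discretised
# movement sequence under one HMM per affect label and pick the most
# likely. Model parameters and the symbol alphabet are toy assumptions.

def forward_likelihood(obs, pi, A, B):
    """P(obs | model) for a discrete-emission HMM via the forward algorithm."""
    alpha = [pi[s] * B[s][obs[0]] for s in range(len(pi))]
    for o in obs[1:]:
        alpha = [sum(alpha[s] * A[s][t] for s in range(len(pi))) * B[t][o]
                 for t in range(len(pi))]
    return sum(alpha)

def classify_affect(obs, models):
    """models: {label: (pi, A, B)}; return the max-likelihood label."""
    return max(models, key=lambda l: forward_likelihood(obs, *models[l]))

# Two toy two-state models over symbols {0: low energy, 1: high energy}.
MODELS = {
    "high_arousal": ([0.5, 0.5],
                     [[0.7, 0.3], [0.3, 0.7]],
                     [[0.2, 0.8], [0.1, 0.9]]),   # emits mostly 1s
    "low_arousal":  ([0.5, 0.5],
                     [[0.7, 0.3], [0.3, 0.7]],
                     [[0.8, 0.2], [0.9, 0.1]]),   # emits mostly 0s
}
```

A sequence dominated by high-energy symbols scores higher under the high-arousal model, so it is assigned that label; longer sequences would normally be scored in log space to avoid underflow.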
A Novel Tool for Motion Capture Database Factor Statistical Exploration
Proceedings of the 3rd International Symposium on Movement and Computing Pub Date : 2016-07-05 DOI: 10.1145/2948910.2948923
M. Tits, J. Tilmanne, N. D'Alessandro
The recent rise of motion capture (MoCap) technologies provides new possibilities, but also new challenges, in human motion analysis. Indeed, the analysis of a motion database is a complex task, due to the high dimensionality of motion data and the number of independent factors that can affect movements. We addressed the first issue in earlier work by developing MotionMachine, a framework that helps overcome the problem of motion data interpretation through feature extraction and interactive visualization [20]. In this paper, we address the question of the relations between movements and some of the various factors (social, psychological, physiological, etc.) that can influence them. To that end, we propose a tool for rapid factor analysis of a MoCap database. This tool allows statistical exploration of the effect of any factor of the database on motion features. As a use case of this work, we present the analysis of a database of improvised contemporary dance, showing the capabilities and interest of our tool.
Citations: 5
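The core of such a factor exploration, testing whether a database factor has an effect on a motion feature, can be sketched as a one-way ANOVA F statistic; the factor levels and feature values in the example are invented for illustration:

```python
# Sketch of the core statistic behind "effect of a database factor on a
# motion feature": a one-way ANOVA F value computed across factor
# levels. Factor names and feature values are invented for illustration.

def one_way_anova_f(groups):
    """groups: {factor_level: [feature values]} -> F statistic.

    A large F suggests the factor separates the groups more than
    within-group variation would explain.
    """
    all_vals = [v for vals in groups.values() for v in vals]
    n, k = len(all_vals), len(groups)
    grand_mean = sum(all_vals) / n
    # Between-group sum of squares: spread of group means around the grand mean.
    ss_between = sum(len(vals) * (sum(vals) / len(vals) - grand_mean) ** 2
                     for vals in groups.values())
    # Within-group sum of squares: spread of values around their group mean.
    ss_within = sum((v - sum(vals) / len(vals)) ** 2
                    for vals in groups.values() for v in vals)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

For instance, grouping a hypothetical "peak hand speed" feature by a hypothetical "expertise" factor with well-separated group means yields a large F, while heavily overlapping groups yield an F below 1; in practice one would compare F against the F-distribution for a p-value (e.g. via `scipy.stats.f_oneway`).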
Hand gestures during collaborative problem solving
Proceedings of the 3rd International Symposium on Movement and Computing Pub Date : 2016-07-05 DOI: 10.1145/2948910.2948913
D. Anastasiou, Mehmetcan Fal, Eric Ras
In this paper we present the need for analyzing (hand) gestures in learning environments, and particularly in collaborative problem-solving tasks. Based on experimental user studies, we analyze gestures and their impact on technology-based assessment and 21st-century skills, as well as on collaboration and cognition.
Citations: 0