Title: How do experts observe movement?
Authors: S. Alaoui, Kristin Carlson, Shannon Cuykendall, Karen Bradley, K. Studd, T. Schiphorst
Published: Proceedings of the 2nd International Workshop on Movement and Computing, 2015-08-14
DOI: 10.1145/2790994.2791000
Abstract: Laban Movement Analysis (LMA) is an expert-based method by which Certified Movement Analysts observe and analyze movement. LMA is increasingly used in a variety of research fields, particularly when studying movement expressivity and computation where it is essential to generate an understanding of the observation process. In this paper we articulate the application of LMA as a tool for movement analysis in HCI research by using qualitative methods to deconstruct the observation process of LMA experts. We conducted a focus group in which 12 expert participants observed and annotated videos of movement according to LMA categories. We transcribed their observation process and analyzed it using grounded theory in order to extract categories, concepts and theories that best explain and describe the process of observation in LMA. By doing so, we open research perspectives in which LMA can be integrated as a method for observation in the design of movement-based computational systems.
Title: Conceptualising interaction in live performance: reflections on 'Encoded'
Authors: Andrew Johnston
Published: Proceedings of the 2nd International Workshop on Movement and Computing, 2015-08-14
DOI: 10.1145/2790994.2791003
Abstract: This paper presents a detailed examination of the experiences of the creative team responsible for the direction, choreography, interaction design and performance of a dance and physical theatre work, Encoded. Interviews, observations and reflection on personal experience have made visible a range of different perspectives on the design, use and creative exploration of the interactive systems that were created for the work. The work itself, and in particular the use of interactive systems, was overall considered to be successful and coherent, even while participants' approaches and concerns were often markedly different. A trajectory of creative development is described, in which exploratory improvisation and iterative design gradually became 'locked down' in preparation for final performance and touring.
Title: Sound space and spatial sampler
Authors: G. Beller
Published: Proceedings of the 2nd International Workshop on Movement and Computing, 2015-08-14
DOI: 10.1145/2790994.2791010
Abstract: The Spatial Sampler and the Sound Space are new musical instruments emerging from the Synekine Project, which combines performative and scientific research to create new ways of expressing ourselves. Just as a sampler is an empty box to be filled with sound files, the Spatial Sampler turns the physical space around the performer into a key area in which to place and play his own voice samples. With the Sound Space, a performer literally spreads his voice around him through gesture, creating an entire sound scene while playing with it. This gestural association of time and space also makes the two instruments suitable for dancers and body/movement artists, and for various other applications.
Title: Movement sequence analysis using hidden Markov models: a case study in Tai Chi performance
Authors: Jules Françoise, A. Roby-Brami, Natasha Riboud, Frédéric Bevilacqua
Published: Proceedings of the 2nd International Workshop on Movement and Computing, 2015-08-14
DOI: 10.1145/2790994.2791006
Abstract: Movement sequences are essential to dance and expressive movement practice; yet they remain underexplored in movement and computing research, where a focus on short gestures prevails. We propose a method for movement sequence analysis based on motion trajectory synthesis with Hidden Markov Models. The method uses Hidden Markov Regression to jointly synthesize motion feature trajectories and their associated variances, which serve as a basis for investigating performers' consistency across executions of a movement sequence. We illustrate the method with a use-case in Tai Chi performance, and we further extend the approach to cross-modal analysis of vocalized movements.
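As a rough illustration of the machinery the abstract names: a hidden Markov model scores an observation sequence by summing over hidden state paths. The sketch below implements the standard forward algorithm for a small discrete HMM with made-up "preparation"/"stroke" movement phases; it is not the authors' Hidden Markov Regression, which operates on continuous motion features and also synthesizes trajectories.

```python
# Minimal discrete-HMM forward algorithm: computes P(observations | model).
# The two-state movement model below is hypothetical, for illustration only.

def forward(obs, start_p, trans_p, emit_p):
    """Likelihood of an observation sequence under a discrete HMM.

    obs     : list of observation symbols
    start_p : {state: initial probability}
    trans_p : {state: {next_state: transition probability}}
    emit_p  : {state: {symbol: emission probability}}
    """
    states = list(start_p)
    # alpha[s] = P(observations so far, current hidden state = s)
    alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    for symbol in obs[1:]:
        alpha = {
            s: sum(alpha[r] * trans_p[r][s] for r in states) * emit_p[s][symbol]
            for s in states
        }
    return sum(alpha.values())

# Hypothetical phases of a movement sequence emitting coarse speed symbols:
start = {"prep": 0.6, "stroke": 0.4}
trans = {"prep": {"prep": 0.7, "stroke": 0.3},
         "stroke": {"prep": 0.4, "stroke": 0.6}}
emit = {"prep": {"slow": 0.8, "fast": 0.2},
        "stroke": {"slow": 0.1, "fast": 0.9}}

likelihood = forward(["slow", "fast", "fast"], start, trans, emit)
```

Comparing such likelihoods (or, in the paper's regression setting, synthesized variance envelopes) across repeated executions is one way to quantify a performer's consistency.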
Title: Kinesonic approaches to mapping movement and music with the remote electroacoustic kinesthetic sensing (RAKS) system
Authors: Aurie Y. Hsu, S. Kemper
Published: Proceedings of the 2nd International Workshop on Movement and Computing, 2015-08-14
DOI: 10.1145/2790994.2791020
Abstract: Sensor technologies allow for a direct link between a dancer's kinetic and kinesthetic experience and musical expression. The Remote electroAcoustic Kinesthetic Sensing (RAKS) system, a wearable wireless sensor interface designed specifically for belly dance, enables such a link. Teka Mori (2013), for belly dancer, RAKS system, and computer-generated sound, explores a kinesonic approach to mapping belly dance movement to timbral control in electronic music.
Title: Experimenting with noise in markerless motion capture
Authors: A. Miller
Published: Proceedings of the 2nd International Workshop on Movement and Computing, 2015-08-14
DOI: 10.1145/2790994.2791019
Abstract: Visual culture has embraced the visual glitch as just one of many aesthetics associated with digital media. A glitch is often associated with noise in a technological system. Some motion capture systems experience noise and glitches as they process human movement. Under normal conditions, a glitch is undesirable because it decreases the usability of the capture. This short paper and demonstration introduce our research into the non-traditional use of motion capture data for the generation of artistic animated works.
Title: Music means movement: musings on methods for movement analysis in music
Authors: Jan C. Schacher
Published: Proceedings of the 2nd International Workshop on Movement and Computing, 2015-08-14
DOI: 10.1145/2790994.2791001
Abstract: This article addresses the intersection of technical, analytical and artistic approaches to perceiving and measuring musical movement. The point of view taken is situated between the development and application of technological tools, the design and running of exploratory experiments, and the musical performance moment, where perception of the body and its movements constitutes an integral part of the experience. Through a use-case shared with other artists and researchers, a wide range of necessary developments, both conceptual and in software, is shown. The tools and methods generated are juxtaposed with the realisation that movement analysis is merely one possible use of the acquired data. Artistic translations provide alternate ways of generating meaning from movement data, in particular when translating musical actions into pieces that span multiple modalities. With the proposed multi-perspective methodology, ways and means are sketched out that address the inherent multiplicity of domains involved in music performance and perception.
Title: Interactive Tango Milonga: designing internal experience
Authors: Courtney Brown, G. Paine
Published: Proceedings of the 2nd International Workshop on Movement and Computing, 2015-08-14
DOI: 10.1145/2790994.2791013
Abstract: The Argentine tango concept of connection refers to the experience of complete synchronicity between self, partner, and music. This paper presents Interactive Tango Milonga, an interactive system giving tango dancers agency over the music in order to increase this sense of relation between both partners and the music. Like an improvising musician in an ensemble, each dancer receives musical feedback from both her own movements and her partner's. Thus, each dancer can respond to the music, driving musical feedback and thereby heightening awareness and agency in both the sound and her partner's movements. Via presentation of this system, this paper illustrates methods for developing interactive systems that engage with distinct musical, movement, and social traditions, as well as for composing sound-movement relationships that lead to specific internal experiences within these social contexts.
Title: Embodied design of full bodied interaction with virtual humans
Authors: M. Gillies, Harry Brenton, A. Kleinsmith
Published: Proceedings of the 2nd International Workshop on Movement and Computing, 2015-08-14
DOI: 10.1145/2790994.2790996
Abstract: This paper presents a system that allows end users to design full-body interactions with a 3D animated virtual character through a process we call Interactive Performance Capture. The process is embodied in the sense that users design directly by moving and interacting, using an interactive machine learning method. Two people improvise an interaction based only on their movements: one plays the part of the virtual character, the other a real person. Their movements are recorded and labeled with metadata identifying certain actions and responses. This labeled data is then used to train a Gaussian Mixture Model that is able to recognize new actions and generate suitable responses from the virtual character. A small study showed that users do indeed design in a very embodied way, using movement directly as a means of thinking through and designing interactions.
Title: A review of computable expressive descriptors of human motion
Authors: Caroline Larboulette, S. Gibet
Published: Proceedings of the 2nd International Workshop on Movement and Computing, 2015-08-14
DOI: 10.1145/2790994.2790998
Abstract: In this paper we present a review of computable descriptors of human motion. We first present low-level descriptors that compute quantities directly from the raw motion data. We then present higher-level descriptors that use the low-level ones to compute boolean, single-value or continuous quantities that can be interpreted, automatically or manually, to qualify the meaning, style or expressiveness of a motion. We provide formulas inspired by the state of the art that can be applied to 3D motion capture data.
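To give a concrete flavour of the low-level descriptors such a review covers: quantities like speed, acceleration and jerk magnitude can be computed from sampled 3D joint positions by finite differences. The sketch below is illustrative only; the descriptor names, the forward-difference scheme and the 100 Hz sampling rate are this example's assumptions, not formulas taken from the paper.

```python
import math

def finite_diff(points, dt):
    """First time-derivative of a sampled 3D trajectory (forward differences)."""
    return [tuple((b - a) / dt for a, b in zip(p, q))
            for p, q in zip(points, points[1:])]

def magnitudes(vectors):
    """Euclidean norm of each 3D vector in a list."""
    return [math.sqrt(x * x + y * y + z * z) for x, y, z in vectors]

def low_level_descriptors(points, dt):
    """Speed, acceleration and jerk magnitudes for one 3D joint trajectory.

    points : list of (x, y, z) position samples
    dt     : sampling interval in seconds
    """
    vel = finite_diff(points, dt)     # m/s
    acc = finite_diff(vel, dt)        # m/s^2
    jerk = finite_diff(acc, dt)       # m/s^3
    return {
        "speed": magnitudes(vel),
        "acceleration": magnitudes(acc),
        "jerk": magnitudes(jerk),
    }

# A hand moving at a constant 1 m/s along x, sampled at 100 Hz:
traj = [(0.01 * i, 0.0, 0.0) for i in range(10)]
desc = low_level_descriptors(traj, dt=0.01)
```

Higher-level descriptors in the paper's sense would then threshold or combine such per-frame series into boolean, single-value or continuous expressive quantities.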