{"title":"Live Electronic Music Performance: Embodied and Enactive Approaches","authors":"Lauren Hayes","doi":"10.1145/3212721.3212891","DOIUrl":"https://doi.org/10.1145/3212721.3212891","url":null,"abstract":"Mini Savior Opt. (2017) is a twenty-five minute live electronic performance which demonstrates an enactive and embodied approach to interactive and improvisational music systems. The piece was formed out of a playful exploration of my most recent hybrid analogue/digital performance system. An excessive number of components mutually affect each other through an ecological network of sound analysis and digital signal processing (DSP). Engaging with different parts of the instrument through tangible and haptic controllers, I bring a sense of immediacy into my hands: the slightest movement may trigger a drastic change in sound, which in turn may activate other processes within the network. Through the physical struggle, the performer, vulnerable to the fragile instabilities that have been potentialised, attempts to navigate the performance space.","PeriodicalId":330867,"journal":{"name":"Proceedings of the 5th International Conference on Movement and Computing","volume":"27 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131309792","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Improv","authors":"Alexandra Q. Nilles, M. Beckman, C. Gladish, A. LaViers","doi":"10.1145/3212721.3212882","DOIUrl":"https://doi.org/10.1145/3212721.3212882","url":null,"abstract":"Often, people such as educators, artists, and researchers wish to quickly generate robot motion. However, current toolchains for programming robots can be difficult to learn, especially for people without technical training. This paper presents the Improv system, a programming language for high-level description of robot motion with immediate visualization of the resulting motion on a physical or simulated robot. Improv includes a \"live coding\" wrapper for ROS (\"Robot Operating System\", an open-source robot software framework which is widely used in academia and industry, and integrated with many commercially available robots). Commands in Improv are compiled to ROS messages. The language is inspired by choreographic techniques, and allows the user to compose and transform movements in space and time. In this paper, we present our work on Improv so far, as well as the design decisions made throughout its creation.","PeriodicalId":330867,"journal":{"name":"Proceedings of the 5th International Conference on Movement and Computing","volume":"375 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124685555","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Web-based system for annotation of dance multimodal recordings by dance practitioners and experts","authors":"K. E. Raheb, Aristotelis Kasomoulis, A. Katifori, Marianna Rezkalla, Y. Ioannidis","doi":"10.1145/3212721.3212722","DOIUrl":"https://doi.org/10.1145/3212721.3212722","url":null,"abstract":"Recent advances in technologies for capturing, analyzing and visualizing movement can revolutionize the way we create, practice, learn dance, and transmit bodily knowledge. The need for creating meaningful, searchable and re-usable libraries of motion capture and video movement segments can only be fulfilled through the collaboration of both technologists and dance practitioners. Towards this direction, manual annotations of these segments by dance experts can play a four-fold role: a) enrich movement libraries with expert knowledge, b) create \"ground-truth\" datasets for comparing the results of automated algorithms, c) fertilize a dialogue across dance genres and disciplines on movement analysis and conceptualization, and d) raise questions on the subjectivity and diversity of characterizing movement segments using verbal descriptions. The web-based application presented in this work, is an archival system with, browsing, searching, visualization, personalization and textual annotation functionalities. Its main objective is to provide access to a repository of multimodal dance recordings including motion capture data, video, and audio, with the aim to also support dance education. The tool has been designed and developed within an interdisciplinary project, following a user-centered, iterative design approach involving dance researchers and practitioners of four different dance genres.","PeriodicalId":330867,"journal":{"name":"Proceedings of the 5th International Conference on Movement and Computing","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127055010","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Challenges of movement quality using motion capture in theatre","authors":"Georges Gagneré, A. Lavender, Cédric Plessiet, T. white","doi":"10.1145/3212721.3212883","DOIUrl":"https://doi.org/10.1145/3212721.3212883","url":null,"abstract":"We describe1 two case studies of AvatarStaging theatrical mixed reality framework combining avatars and performers acting in an artistic context. We outline a qualitative approach toward the condition for stage presence for the avatars. We describe the motion control solutions we experimented with from the perspective of building a protocol of avatar direction in a mixed reality appropriate to live performance.","PeriodicalId":330867,"journal":{"name":"Proceedings of the 5th International Conference on Movement and Computing","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129402618","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The role of emotion in movement segmentation","authors":"Eleonora Ceccaldi, G. Volpe","doi":"10.1145/3212721.3212850","DOIUrl":"https://doi.org/10.1145/3212721.3212850","url":null,"abstract":"Humans understand ongoing events by breaking reality into meaningful units through a process named \"event segmentation\". In psychology, theories such as the \"Event Segmentation theory\" have been proposed to illustrate how humans perceive the structure of ongoing behavior. Parsing discrete gestures from a continuous movement stream is also a necessary step for movement analysis. Many approaches towards automatic emotion recognition from full body movement leverage automatic segmentation methods. Nonetheless, to the best of my knowledge, no framework has applied Event Segmentation theory to the automatic segmentation of emotion conveying movements. In this paper, I propose the exploitation of a computational model of event segmentation to extend a movement-analysis framework with an event segmentation module.","PeriodicalId":330867,"journal":{"name":"Proceedings of the 5th International Conference on Movement and Computing","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125568784","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Lanterns","authors":"G. Johnson, B. Peterson, T. Ingalls, S. Wei","doi":"10.1145/3212721.3212848","DOIUrl":"https://doi.org/10.1145/3212721.3212848","url":null,"abstract":"This paper takes an empirical and processual approach to the study of coordinated group activity through enacted and experimental movement research with responsive media. We focus on the dynamic emergence of what we will refer to as ensemble. Informed by readings from process philosophy and new materialism, we take into account the vitality of matter in investigating a notion of subjectivity unhinged from compartmentalized conceptions of the human. Methodologically we suspend ontological assumptions which occlude, mask, or ignore emergent relations and unprestatable events. These concerns underline the design tactics of an immersive media environment and apparatus for research creation. The Lanterns digital-physical system employs a design ethos of semantically shallow computation towards an experientially responsive environment. We give an overview of the motivating questions for this line of enacted inquiry and describe our experimental process and outcomes with the Lanterns.","PeriodicalId":330867,"journal":{"name":"Proceedings of the 5th International Conference on Movement and Computing","volume":"82 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123937239","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Design Methodology for Abstracting Character Archetypes onto Robotic Systems","authors":"I. Pakrasi, N. Chakraborty, A. LaViers","doi":"10.1145/3212721.3212809","DOIUrl":"https://doi.org/10.1145/3212721.3212809","url":null,"abstract":"Given the increase in robotic systems for the household, it is imperative to design the expressive modes of these systems, that in turn engender likability, animacy, acceptance, and trust. This paper approaches this problem by proposing a design methodology that can be used to abstract archetypal characters across the system, including form factor, user instructions, and interactive modalities. This approach uses Laban Movement Analysis paired with the Kansei Engineering iterative design approach to dissect movement and visual traits of archetypal characters and marry them to features of the robot. These character traits are explored in a product, channel, consumer framework and are realized through interface elements, such as color, animated eyes, and character specific motion profiles. Finally, the use of priming using familiar characters from popular culture as a means to enhance the recognition of character traits is explored. Results show that users associated traits specific to each character archetype that were consistent with the intended design and that an aggregate measure of interaction success went up with character priming. This was bolstered in the priming cases, where users rated these traits more strongly. Thus, this methodology serves as a tool to create meaningful design variations to robotic systems using character archetypes, allowing us to design user-specific personality traits and interactive elements.","PeriodicalId":330867,"journal":{"name":"Proceedings of the 5th International Conference on Movement and Computing","volume":"31 8","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"120987144","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Gestural Control of Augmented Instrumental Performance: A Case Study of the Concert Harp","authors":"John Sullivan, Alexandra Tibbitts, Brice Gatinet, M. Wanderley","doi":"10.1145/3212721.3212814","DOIUrl":"https://doi.org/10.1145/3212721.3212814","url":null,"abstract":"We present a gestural control system to augment harp performance with real-time control of computer-based audio affects and processing. While the lightweight system was designed for use alongside any instrument, our choice of the concert harp represented a unique case study in gestural control of music. The instrument's large size and physically demanding playing technique leaves the performer with little spare bandwidth to devote to other tasks. A motion capture study analyzed instrumental and ancillary gestures of natural harp performances that could be mapped effectively to control of additional signal processing parameters. The initial findings of the study helped to guide the design of custom gesture control devices and user software, and a new work for solo harpist and electronics was created. We discuss our findings, successes and challenges in the study and design of gesture control for augmented instrumental performance, with particular focus on the concert harp.","PeriodicalId":330867,"journal":{"name":"Proceedings of the 5th International Conference on Movement and Computing","volume":"13 3","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"120991042","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Embodied Simulations of physical phenomena","authors":"C. Karner, C. Cardelli, Adrian Artacho","doi":"10.1145/3212721.3212894","DOIUrl":"https://doi.org/10.1145/3212721.3212894","url":null,"abstract":"At the crossroads between computational physics, human-computer interaction and mediated performance, the present proposal aims at presenting the vast complexity of physical phenomena in an intuitive, embodied manner. Specifically, we created an interactive system capable of accurately simulating the rules of physical phenomena in game form. Through the game, participants perform a dance-like, analogue computing of the Ising model, a model frequently used for phase transitions. A smartphone-based system guides the participants in a playful manner by means of visual and aural cues. The system mediates the multiple interactions happening among actants in a way consistent with the physical model, therefore enabling an embodied simulation of the phenomenon.","PeriodicalId":330867,"journal":{"name":"Proceedings of the 5th International Conference on Movement and Computing","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133840811","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Exploring different movement sonification strategies for rehabilitation in clinical settings","authors":"Frédéric Bevilacqua, M. Segalen, V. Marchand-Pauvert, I. Peyre, P. Pradat-Diehl, A. Roby-Brami","doi":"10.1145/3212721.3212881","DOIUrl":"https://doi.org/10.1145/3212721.3212881","url":null,"abstract":"We describe an interactive system that allows for sonifying arm movements. The aim is to support stroke patients going through rehabilitation by providing them with augmented auditory feedback that reacts to their movements. The system is based on IMU sensors (Inertial Measurements Unit) attached to each arm. The movement data are streamed in real-time to a laptop computer that generates various sounds or musical interactions using a program we developed. We tested different types of auditory feedback, each of them using a specific strategy for the sound-movement mapping. The first type of movement-sound mappings is based on direct relationships between the reaching distance and either the pitch of a continuous tone, or the tempo of a regular beat pattern. The second type of mapping is music-oriented: the user movement allows for controlling the tempo of musical pieces. The third type of mapping associates the hand position to specific environmental sounds. We report here on the technical system along with preliminary results in a clinical setting with both post-stroke patients and healthy users.","PeriodicalId":330867,"journal":{"name":"Proceedings of the 5th International Conference on Movement and Computing","volume":"84 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123934004","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}